KR20140101616A - Mobile terminal and screen restoring method thereof - Google Patents

Mobile terminal and screen restoring method thereof Download PDF

Info

Publication number
KR20140101616A
Authority
KR
South Korea
Prior art keywords
screen
image
information
capture
user
Prior art date
Application number
KR1020130014983A
Other languages
Korean (ko)
Inventor
Hwang Do-hyun (황도현)
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020130014983A priority Critical patent/KR20140101616A/en
Publication of KR20140101616A publication Critical patent/KR20140101616A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present invention relates to a mobile terminal and a screen restoring method thereof, capable of performing screen backup and restoration quickly and conveniently using screen capture. The method includes the steps of: displaying a screen including at least one object; storing a capture image of the screen and configuration information of the objects included in the screen when a screen capture command is input; and, when a screen restore command is input, restoring and displaying the screen as it was at the time of capture based on the stored capture image and the object configuration information.

Description

MOBILE TERMINAL AND SCREEN RESTORING METHOD THEREOF

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a mobile terminal and a screen restoring method thereof that can perform screen backup and restoration quickly and conveniently using screen capture.

The mobile terminal may be configured to perform various functions. Examples include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and displaying images or video. Some mobile terminals include additional functionality for playing games, and others are implemented as full multimedia devices. Moreover, recent mobile terminals can receive broadcast or multicast signals, allowing the user to watch video or television programs.

Efforts to support and expand the functions of mobile terminals are ongoing. Such efforts include not only changes and improvements to the structural components of the terminal but also improvements in software and hardware.

The mobile terminal provides a screen capture function that can save or share the currently displayed screen as a picture file. The screen capture function stores a specific application screen, including its text and images, as a single image.

However, the conventional screen capture function stores only the screen image; information beyond the image, that is, the objects included in the screen, their attribute information, and the user's input information, is not stored.

Accordingly, in the case of a shopping-mall site, returning to a previously found screen requires entering the same search term again, and even then a different product may appear in the results. In particular, a "best" or specially featured product is difficult to find again because it is exposed on the main page only for a short period. Even if the website is bookmarked as a favorite, it is not easy to identify the product from the text on the captured screen alone.

Further, in the related art, when the theme of the home menu, the position of an icon (application or widget), or a system setting (e.g., brightness) is deleted or changed by an unintended operation such as an accidental touch, it is difficult to restore the screen as it was before the change, because only the image is saved at capture time.

It is an object of the present invention to provide a screen restoring method of a mobile terminal that can easily restore a previous screen by storing, together with the captured screen, the various information constituting the screen at the time of capture.

It is another object of the present invention to provide a screen restoration method of a mobile terminal that can share a captured screen and related information with other users.

It is another object of the present invention to provide a screen restoring method of a mobile terminal capable of enhancing the recognition rate of objects during screen restoration and making searches easier by backing up the screen as a thumbnail image at the time of capture.

According to an aspect of the present invention, there is provided a method of restoring a screen of a mobile terminal, the method comprising: displaying a screen including at least one object; storing a capture image of the screen and configuration information of the objects included in the screen when a screen capture command is input; and restoring and displaying the screen as it was at the time of capture, based on the stored capture image and the object configuration information, when a screen restore command is input.
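The three steps above amount to a small backup store: a capture operation that persists the image alongside per-object configuration, and a restore operation that reads them back. The sketch below is illustrative only; every class and field name is an assumption, not terminology from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObjectConfig:
    """Per-object configuration stored with the capture (illustrative fields)."""
    obj_type: str              # e.g. "icon", "text", "image"
    position: Tuple[int, int]  # location on the screen
    version: str = ""
    link: str = ""

@dataclass
class ScreenBackup:
    capture_image: bytes                       # the stored capture (thumbnail) image
    objects: List[ObjectConfig] = field(default_factory=list)

class ScreenRestorer:
    def __init__(self) -> None:
        self._backups: List[ScreenBackup] = []

    def capture(self, image: bytes, objects: List[ObjectConfig]) -> None:
        # Step 2: store the capture image together with the object configuration
        self._backups.append(ScreenBackup(image, list(objects)))

    def restore_latest(self) -> ScreenBackup:
        # Step 3: return the screen state as it was at capture time
        if not self._backups:
            raise LookupError("no backup stored")
        return self._backups[-1]
```

Storing the configuration next to the image, rather than re-deriving it from pixels, is what lets the terminal rebuild the live screen rather than merely redisplay a picture.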

The at least one object may include an icon, a text, an image, and a combination thereof. The screen may be one of a menu screen, a music or video playback screen, a game execution screen, an application execution screen, and a web browser screen.

The screen capture command is input by one of a predetermined touch pattern, a soft key, and a combination of hard keys; the hard-key combination is, for example, volume-up key + power key.

Either a general capture mode, which captures the full screen, or a smart capture mode, which captures a portion of the screen selected by screen editing, is performed according to the pattern of the screen capture command, and that pattern may be determined by the touch time of the hard key used to input the command.
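One way to realize this mode selection is to compare the hard-key touch duration against a threshold. The 1-second threshold below is an assumption for illustration; the patent does not specify a concrete value.

```python
SMART_CAPTURE_THRESHOLD_S = 1.0  # assumed threshold; the patent gives no concrete value

def select_capture_mode(touch_time_s: float) -> str:
    """A longer hard-key press selects smart (partial) capture; a shorter one, normal."""
    return "smart" if touch_time_s >= SMART_CAPTURE_THRESHOLD_S else "normal"
```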

The captured image is generated and stored as a thumbnail image of a screen, and the configuration information of the object is stored as attribute information of a screen capture image.

The configuration information of an object includes the object's context information: the type, location, version, and link information of the object, as well as user setting information. The user setting information includes system settings such as zoom magnification and resolution, and user input information such as the quantity and color of the object.
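Storing this configuration as attribute information of the capture image could look like attaching a JSON sidecar to the image file. The key names below are illustrative assumptions, not a format from the patent.

```python
import json

def build_attribute_metadata(objects: list, system_settings: dict, user_inputs: dict) -> str:
    """Serialize the object configuration as attribute info for the capture image."""
    return json.dumps({
        "objects": objects,         # per-object type, location, version, link
        "system": system_settings,  # e.g. zoom magnification, resolution
        "user": user_inputs,        # e.g. quantity and color entered by the user
    })
```

At restore time the terminal would parse this metadata back and reapply the recorded settings instead of only redrawing the bitmap.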

The capture image may be an image in which the entire screen or a part of the screen is captured, and a part of the screen may include a single or at least one object selected by the user.

According to an exemplary embodiment, storing the capture image and the object configuration information includes: extracting image files from a browser screen; displaying those of the extracted images that are at least a predetermined size; receiving a selection of a specific image; generating a thumbnail image of the selected image; and setting and storing a tag on the generated thumbnail image. In this case, the tag may be a text tag or a color tag.
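The image-selection and tagging steps of this embodiment might be sketched as follows. The minimum size, the 4:1 thumbnail scale, and the dict-based image representation are all assumptions for illustration.

```python
MIN_W, MIN_H = 100, 100  # assumed "predetermined size"

def filter_images(images: list) -> list:
    """Keep only extracted images at least the predetermined size."""
    return [img for img in images if img["w"] >= MIN_W and img["h"] >= MIN_H]

def make_tagged_thumbnail(image: dict, tag_text: str = "", tag_color: str = "") -> dict:
    """Generate a thumbnail of the selected image and attach a text or color tag."""
    thumb = {"src": image["src"], "w": image["w"] // 4, "h": image["h"] // 4}
    if tag_text:
        thumb["tag"] = {"type": "text", "value": tag_text}
    elif tag_color:
        thumb["tag"] = {"type": "color", "value": tag_color}
    return thumb
```

Filtering out small images first keeps decorative icons and buttons out of the candidate list the user picks from.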

The screen restore command is input by one of a predetermined touch pattern, a soft key, and a combination of hard keys; the hard-key combination may be volume-down key + power key.

The restored screen is the entirety of the pre-stored captured image or a part of the captured image edited by the user.

The step of restoring and displaying the screen according to an exemplary embodiment may include: determining the type of the application currently being executed; reading the stored capture image and configuration information corresponding to the determined application; confirming the restore-mode setting; and restoring the screen using the read capture image and configuration information according to that setting.

When the restore mode is set to quick mode, restoring the screen comprises restoring it with the most recently captured image and configuration information. When the restore mode is set to normal mode, a thumbnail image list is displayed, and the screen is restored by selecting from the list the thumbnail image of the desired restore point. On a browser screen, the restored screen is displayed by accessing the link address stored with the thumbnail image.
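The quick/normal branching described above amounts to a small dispatch. Here `selected_index` stands in for the user's tap on the thumbnail list and is an illustrative parameter, not a term from the patent.

```python
def restore(backups: list, mode: str, selected_index: int = 0):
    """Quick mode restores the most recent backup; normal mode restores
    the backup chosen from the displayed thumbnail list."""
    if not backups:
        raise LookupError("no backups stored")
    if mode == "quick":
        return backups[-1]
    if mode == "normal":
        # on the terminal, the thumbnail list would be displayed here
        return backups[selected_index]
    raise ValueError("unknown restore mode: " + mode)
```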

The step of restoring and displaying the screen includes outputting either the entire pre-stored capture image or a part of it edited by the user, according to the pattern of the screen restore command; that pattern is determined by the touch time of the hard key used to input the command.

According to another aspect of the present invention, there is provided a mobile terminal including: a display unit that displays a screen including at least one object; a memory that stores a capture image of the screen and configuration information of the objects included in the screen when a screen capture command is input; and a controller that, when a screen restore command is input, restores and displays the screen as it was at the time of capture based on the stored capture image and the object configuration information, the capture image being generated and stored as a thumbnail image of the screen.

The at least one object may include an icon, a text, an image, and a combination thereof. The screen may be one of a menu screen, a music or video playback screen, a game execution screen, an application execution screen, and a web browser screen.

The screen capture command is input by one of a predetermined touch pattern, a soft key, and a combination of hard keys; the hard-key combination is, for example, volume-up key + power key.

The controller performs either a general capture mode for capturing the full screen or a smart capture mode for capturing a portion of the screen selected by screen editing, according to the pattern of the screen capture command, and the touch time for the smart capture mode is longer than the touch time for the general capture mode.

The configuration information of an object is stored as attribute information of the screen capture image and includes the object's context information: the type, location, version, and link information of the object, as well as user setting information. The user setting information includes system settings such as zoom magnification and resolution, and user input information such as the quantity and color of the object.

The capture image is an image in which the entire screen or a part of the screen is captured, and a part of the screen includes a single or at least one object selected by the user.

The controller displays those of the images extracted from the browser screen that are at least a predetermined size, generates a thumbnail image of the image selected by the user, and sets and stores a tag on it. The tag is set as a text tag or a color tag.

The screen restore command is input by one of a predetermined touch pattern, a soft key, and a combination of hard keys; the hard-key combination is volume-down key + power key.

The restored screen is the entirety of the pre-stored captured image or a part of the captured image edited by the user.

When the screen restore command is input, the controller determines the type of the application currently being executed, reads the stored capture image and configuration information corresponding to that application, confirms the predetermined restore-mode setting, and restores the screen using the read capture image and configuration information according to that setting.

When the restore mode is set to quick mode, the controller restores the screen with the most recently captured image and configuration information; when it is set to normal mode, the controller displays a thumbnail image list and restores the screen with the thumbnail image selected from the list as the desired restore point; on a browser screen, the controller displays the restored screen by accessing the link address of the thumbnail image.

If the connection to the link address of the thumbnail image fails, the controller outputs an error message, analyzes and displays the main keywords in the thumbnail image's link, and thereby guides the user to search using those keywords.
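This keyword fallback could be approximated by tokenizing the failed URL. The heuristic below (the choice of delimiters and the minimum token length) is an assumption for illustration, not the patent's method.

```python
import re
from urllib.parse import urlparse

def extract_keywords(link: str) -> list:
    """On a failed connection, pull likely search keywords out of the
    thumbnail's link address (its path and query)."""
    parsed = urlparse(link)
    tokens = re.split(r"[/\-_.?=&+\s]", parsed.path + " " + parsed.query)
    return [t for t in tokens if t.isalpha() and len(t) > 2]
```

For a product page whose address embeds the product name, the extracted words give the user a ready-made search query even after the original page has disappeared.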

The controller outputs either the entire pre-stored capture image or a part of it edited by the user to the restored screen, according to the pattern of the screen restore command; the pattern is determined by the touch time of the hard key used to input the command, and the touch time for restoring a part of the capture image is longer than that for restoring the entire image.

In the present invention, the information of the objects constituting the screen is stored together with the current screen image at the time of capture as a backup screen, and the stored backup screen can be restored and displayed at any time by a simple operation. Thus, regardless of subsequent updates, the desired application can be executed or an object included in the screen can be retrieved.

In addition, the present invention can increase the recognition rate of objects when a screen is restored by generating and storing the screen as a thumbnail image at the time of capture, and searching for a desired object becomes easier by performing a search using the URL link of the thumbnail image.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a block diagram of a wireless communication system in which a mobile terminal according to an embodiment of the present invention can operate.
FIG. 2B is a configuration diagram of a WiFi location tracking system in which a mobile terminal according to an embodiment of the present invention can operate.
FIG. 3 is a flowchart illustrating a screen capturing method of a mobile terminal according to an embodiment of the present invention.
FIG. 4 is a flowchart showing an operation of performing screen capture on a home screen.
FIG. 5 is a view for explaining the embodiment shown in FIG. 4.
FIG. 6 is a flowchart showing an operation of performing screen capture on a browser screen.
FIG. 7 is a view for explaining the embodiment shown in FIG. 6.
FIG. 8 is a view showing a thumbnail image generated when a screen is captured.
FIG. 9 is a flowchart showing an operation of generating a part of a screen as a capture image when capturing a browser screen.
FIG. 10 is a view for explaining the embodiment shown in FIG. 9.
FIG. 11 is a view showing an example in which a tag is set after generating a part of a screen as a capture image.
FIG. 12 is a flowchart illustrating an operation of performing a smart-mode screen capture in the present invention.
FIG. 13 is a view for explaining the embodiment shown in FIG. 12.
FIG. 14 is a flowchart showing a screen restoring method according to the present invention.
FIG. 15 is a flowchart showing a method of performing screen restoration on a home screen.
FIG. 16 is a view for explaining the embodiment shown in FIG. 15.
FIG. 17 is a flowchart showing a method of performing screen restoration in a browser.
FIG. 18 is a view for explaining the embodiment shown in FIG. 17.
FIGS. 19A and 19B are diagrams illustrating an example of providing additional information when the screen is restored to a page of interest.
FIG. 20 is a view showing an example of performing partial screen restoration in the present invention.
FIG. 21 is an exemplary diagram for explaining a method of sharing a captured screen image.
FIGS. 22A and 22B are views showing an example in which only a specific portion of a received shared screen is selected and restored.
FIG. 23 is a diagram showing an example of providing update information related to a backup screen stored in association with a cloud.
FIG. 24 is a flowchart illustrating a screen restoring method of a mobile terminal according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant descriptions thereof will be omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification and do not themselves have distinct meanings or roles. In describing the embodiments, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. In addition, the attached drawings are provided only for easy understanding of the embodiments disclosed in this specification and should not be construed as limiting the technical idea disclosed herein.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, and an ultrabook. However, it will be understood by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV and a desktop computer, except where a configuration is applicable only to a mobile terminal.

FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules enabling wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcast systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), Media Forward Link Only, DVB-H, and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may also be adapted to other broadcasting systems in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits radio signals to, and receives radio signals from, at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The mobile communication module 112 is configured to implement a video communication mode and a voice communication mode. The video call mode refers to a state of talking while viewing a video of the other party, and the voice call mode refers to a state in which a call is made without viewing the other party's video. In order to implement the video communication mode and the voice communication mode, the mobile communication module 112 is configured to transmit and receive at least one of voice and image.

The wireless Internet module 113 is a module for wireless Internet access and may be built into or externally attached to the mobile terminal 100. Wireless Internet technologies such as WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) can be used.

The short-range communication module 114 is a module for short-range communication. Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), and the like may be used as short-range communication technologies.

The position information module 115 is a module for obtaining the position of the mobile terminal, and representative examples thereof include a Global Position System (GPS) module or a Wireless Fidelity (WiFi) module.

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames may be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110. Further, the user's position information and the like may be calculated from the image frames obtained by the camera 121. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving an external sound signal.

The user input unit 130 generates input data according to control commands applied by the user to control the operation of the mobile terminal 100. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as its open/close state, its position, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, the sensing unit 140 may sense whether the slide phone is opened or closed. The sensing unit 140 may also sense whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses and may include a display unit 151, an audio output module 153, an alarm unit 154, a haptic module 155, and the like.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the display unit 151 displays the photographed and / or received video or UI and GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

Also, the display unit 151 may be configured as a stereoscopic display unit 152 for displaying a stereoscopic image.

Here, a stereoscopic image refers to a three-dimensional (3D) stereoscopic image, which is an image that allows the viewer to perceive the progressive depth and realism of an object located on a monitor or screen as if it were in real space. A 3D stereoscopic image is implemented using binocular disparity, the disparity caused by the two eyes being positioned apart from each other. When the two eyes see different two-dimensional images and those images are transmitted to the brain through the retinas and fused, the viewer can perceive the depth and realism of the stereoscopic image.

The stereoscopic display unit 152 may employ a 3D display scheme such as a stereoscopic scheme (glasses scheme), an autostereoscopic scheme (glasses-free scheme), or a projection scheme (holographic scheme). The stereoscopic scheme widely used in home television receivers includes the Wheatstone stereoscopic scheme.

Examples of the autostereoscopic method include a parallax barrier method, a lenticular method, an integral imaging method, and a switchable lens method. The projection method includes a reflection type holographic method and a transmission type holographic method.

Generally, a 3D stereoscopic image consists of a left image (left-eye image) and a right image (right-eye image). Methods of combining a left image and a right image into a 3D stereoscopic image include a top-down method in which the left and right images are arranged up and down in one frame, a left-to-right (side-by-side) method in which the left and right images are arranged left and right in one frame, a checker-board method in which pieces of the left and right images are arranged in a tile form, an interlaced method in which the left and right images are alternately arranged by columns or rows, and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed in time.

In addition, a 3D thumbnail image can be generated by creating a left-image thumbnail and a right-image thumbnail from the left and right images of the original image frame, respectively, and combining them. In general, a thumbnail refers to a reduced image or a reduced still image. The left-image thumbnail and right-image thumbnail thus generated are displayed on the screen with a left-right distance difference corresponding to the disparity between the left and right images, thereby producing a stereoscopic sense of space.

The left and right images necessary for realizing the three-dimensional stereoscopic image can be displayed on the stereoscopic display unit 152 by a stereoscopic processing unit (not shown). The stereoscopic processing unit receives a 3D image and extracts a left image and a right image from the 3D image, or receives a 2D image and converts it into a left image and a right image.

On the other hand, when the display unit 151 and a sensor that detects a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area where the touch object is touched on the touch sensor but also the pressure at the time of touch. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. Thus, the controller 180 can know which area of the display unit 151 has been touched.

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor 141 may be provided as an example of the sensing unit 140. The proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing in the vicinity of the detection surface, using the force of an electromagnetic field or infrared rays. The proximity sensor 141 has a longer life than a contact type sensor, and its utility is also high.

Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of a pointer by a change in the electric field according to the proximity of an object having conductivity (hereinafter, 'pointer'). In this case, the touch screen (touch sensor) may itself be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen is the position at which the pointer corresponds vertically to the touch screen.

The proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

In the case where the stereoscopic display unit 152 and a touch sensor form a mutual layer structure (hereinafter, 'stereoscopic touch screen'), or where the stereoscopic display unit 152 and a three-dimensional sensor that detects a touch operation are combined with each other, the stereoscopic display unit 152 may also be used as a three-dimensional input device.

The sensing unit 140 may include a proximity sensor 141, a three-dimensional touch sensing unit 142, an ultrasonic sensing unit 143, and a camera sensing unit 144 as an example of the three-dimensional sensor.

The proximity sensor 141 measures, without mechanical contact, the distance between the detection surface and the sensing object applying the touch (for example, a user's finger or a stylus pen), using the force of an electromagnetic field or infrared rays. The terminal recognizes which part of the stereoscopic image has been touched using this distance. In particular, when the touch screen is of the electrostatic type, the proximity of the sensing object is detected by a change in the electric field according to the object's proximity, and a touch on the three-dimensional image is recognized using that proximity.

The stereoscopic touch sensing unit 142 senses the strength or duration of a touch applied to the touch screen. For example, the stereoscopic touch sensing unit 142 senses the pressure with which a touch is applied; when the pressing force is strong, it recognizes the touch as a touch on an object located farther from the touch screen, toward the inside of the terminal.

The ultrasonic sensing unit 143 is configured to recognize the position information of the sensing target using ultrasonic waves.

The ultrasonic sensing unit 143 may include, for example, an optical sensor and a plurality of ultrasonic sensors. The optical sensor is configured to sense light, and the ultrasonic sensors are configured to sense ultrasonic waves. Since light is much faster than ultrasonic waves, the time it takes for light to reach the optical sensor is much shorter than the time it takes for the ultrasonic waves to reach the ultrasonic sensors. Therefore, with the light serving as a reference signal, the position of the wave source can be calculated using the difference between the arrival time of the ultrasonic wave and that of the light.
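The distance calculation implied above can be sketched as follows; this is an illustration of the principle rather than the patent's implementation, and the speed-of-sound constant is an assumption for air at room temperature.

```python
# Illustrative sketch (not from the patent): estimating the distance to a
# wave source from the arrival-time difference between light (treated as
# effectively instantaneous) and ultrasound, as described above.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed value)

def distance_from_time_gap(t_light_s, t_ultrasound_s):
    """Light arrives effectively immediately, so the gap between the two
    arrival times equals the ultrasound's travel time over the same path."""
    travel_time = t_ultrasound_s - t_light_s
    return SPEED_OF_SOUND * travel_time

# Example: ultrasound arrives 5 ms after the light pulse.
d = distance_from_time_gap(0.000, 0.005)
print(round(d, 3))  # 1.715
```

With several ultrasonic sensors at known positions, the same time-difference measurement per sensor yields several distances, from which the source position can be triangulated.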

The camera sensing unit 144 includes at least one of a camera 121, a photo sensor, and a laser sensor.

For example, the camera 121 and the laser sensor are combined with each other to sense a touch of a sensing target with respect to a three-dimensional stereoscopic image. When the distance information detected by the laser sensor is added to the two-dimensional image photographed by the camera, three-dimensional information can be obtained.

As another example, a photo sensor may be stacked on the display element. The photo sensor is configured to scan the movement of an object proximate to the touch screen. More specifically, the photo sensor has photodiodes and transistors (TRs) mounted in rows and columns, and scans the content placed on the photo sensor using an electrical signal that varies with the amount of light applied to the photodiodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, thereby acquiring the position information of the sensing object.

The audio output module 153 can output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 153 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound). The audio output module 153 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 154 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input. The alarm unit 154 may output a signal notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. Since the video signal or audio signal may also be output through the display unit 151 or the audio output module 153, the display unit 151 and the audio output module 153 may be classified as part of the alarm unit 154.

The haptic module 155 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 155 is vibration. The intensity and pattern of the vibration generated by the haptic module 155 can be controlled by the user's selection or by the setting of the controller. For example, the haptic module 155 may combine different vibrations and output them, or output them sequentially.

In addition to vibration, the haptic module 155 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sensation of cold or warmth using an element capable of absorbing or emitting heat.

The haptic module 155 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sense of the finger or arm. At least two haptic modules 155 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include a storage medium of at least one type among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, 'identification device') may be manufactured in the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 170.

The interface unit 170 may serve as a path through which power from a cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to an external cradle, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal 100. For example, it performs related control and processing for voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180, or may be implemented separately from the controller 180.

 In addition, the control unit 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

In addition, if the state of the mobile terminal meets a set condition, the controller 180 can execute a lock state for restricting input of a control command of the user to the applications. Also, the controller 180 may control the lock screen displayed in the locked state based on the touch input sensed through the display unit 151 in the locked state.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units designed to perform the functions described herein. In some cases, such embodiments may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.

The software code may be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

Next, a communication system that can be implemented through the mobile terminal 100 according to the present invention will be described. 2A and 2B are conceptual diagrams of a communication system in which the mobile terminal 100 according to the present invention can operate.

First, referring to FIG. 2A, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces that can be used by the communication system include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.

Hereinafter, for the sake of convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention can be applied to all communication systems including CDMA wireless communication systems.

As shown in FIG. 2A, a CDMA wireless communication system includes at least one terminal 100, at least one base station (BS) 270, at least one base station controller (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to be connected to a Public Switched Telephone Network (PSTN) 290 and to the BSCs 275. The BSCs 275 may be coupled with the BSs 270 in pairs through backhaul lines. The backhaul lines may be provided according to at least one of E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Thus, a plurality of BSCs 275 may be included in the system shown in FIG. 2A.

Each of the plurality of BSs 270 may include at least one sector, and each sector may include an omnidirectional antenna or an antenna pointing in a particular radial direction from the BS 270. In addition, each sector may include two or more antennas of various types. Each BS 270 may be configured to support a plurality of frequency assignments, and each of the plurality of frequency assignments may have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).

The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BSs 270 may also be referred to as Base Station Transceiver Subsystems (BTSs). In this case, one BSC 275 and at least one BS 270 may collectively be referred to as a 'base station'. The base station may also indicate a 'cell site'. Alternatively, each of the plurality of sectors for a particular BS 270 may be referred to as a plurality of cell sites.

As shown in FIG. 2A, a broadcasting transmitter (BT) 295 transmits a broadcasting signal to terminals 100 operating in the system. The broadcast receiving module 111 shown in FIG. 1 is provided in the terminal 100 to receive a broadcast signal transmitted by the BT 295.

In addition, FIG. 2A illustrates satellites 300 of a Global Positioning System (GPS). The satellites 300 aid in locating the mobile terminal 100. Although two satellites are shown in FIG. 2A, useful location information may be obtained with more or fewer satellites. The location information module 115 shown in FIG. 1 cooperates with the satellites 300 shown in FIG. 2A to obtain the desired location information. Here, the position of the mobile terminal 100 can be tracked using not only GPS tracking technology but also any technique capable of tracking location. In addition, at least one of the GPS satellites 300 may optionally or additionally be responsible for satellite DMB transmission.

In typical operation of the wireless communication system, the BS 270 receives reverse link signals from the mobile terminal 100. At this time, the mobile terminal 100 may be connecting a call, transmitting or receiving a message, or performing another communication operation. Each of the reverse link signals received by a particular base station 270 is processed by that base station 270. The data resulting from the processing is transmitted to the connected BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including the orchestration of soft handoffs between the base stations 270. The BSCs 275 also transmit the received data to the MSC 280, and the MSC 280 provides additional transport services for connection with the PSTN 290. Similarly, the PSTN 290 connects to the MSC 280, the MSC 280 connects to the BSCs 275, and the BSCs 275 control the BSs 270 so that forward link signals are transmitted to the mobile terminal 100.

Next, a method of acquiring location information of a mobile terminal using a WiFi (Wireless Fidelity) Positioning System (WPS) will be described with reference to FIG. 2B.

A WiFi Positioning System (WPS) 300 is a technology for tracking the position of the mobile terminal 100 using a WiFi module included in the mobile terminal 100 and a wireless access point (AP) 320 that transmits or receives wireless signals to or from the WiFi module, and refers to a WLAN (Wireless Local Area Network)-based positioning technology using WiFi.

The WiFi location tracking system 300 includes a WiFi location server 310, the mobile terminal 100, a wireless AP 320 connected to the mobile terminal 100, and a database 330 in which arbitrary wireless AP information is stored.

The WiFi location server 310 extracts information of the wireless AP 320 connected to the mobile terminal 100, based on a location information request message (or signal) from the mobile terminal 100. The information of the wireless AP 320 connected to the mobile terminal 100 may be transmitted to the WiFi location server 310 through the mobile terminal 100, or may be transmitted from the wireless AP 320 to the WiFi location server 310.

The information of the wireless AP extracted based on the location information request message of the mobile terminal 100 may be at least one of MAC address, SSID, RSSI, channel information, privacy, network type, signal strength, and noise strength.

The WiFi location server 310 receives the information of the wireless AP 320 connected to the mobile terminal 100, compares the received wireless AP 320 information with the information included in the pre-established database 330, and extracts (or analyzes) the location information of the mobile terminal 100.

In FIG. 2B, the wireless APs connected to the mobile terminal 100 are illustrated as first, second, and third wireless APs 320 as an example. However, the number of wireless APs connected to the mobile terminal 100 can be variously changed according to the wireless communication environment in which the mobile terminal 100 is located. The WiFi location tracking system 300 is capable of tracking the location of the mobile terminal 100 when the mobile terminal 100 is connected to at least one wireless AP.

Next, the database 330 in which arbitrary wireless AP information is stored will be described in more detail. The database 330 may store various kinds of information about arbitrary wireless APs located at different locations.

The information of arbitrary wireless APs stored in the database 330 may include the MAC address, SSID, RSSI, channel information, privacy, and network type of the wireless AP, the latitude and longitude coordinates of the wireless AP (available as GPS coordinates), the building, floor, and detailed indoor location where the wireless AP is situated, the AP owner's address, telephone number, and the like.

Since the database 330 stores arbitrary wireless AP information together with the location information corresponding to each wireless AP, the WiFi location server 310 can search the database 330 for wireless AP information corresponding to the information of the wireless AP 320 connected to the mobile terminal 100, and extract the location information matched to the retrieved wireless AP information, thereby extracting the location information of the mobile terminal 100.
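The lookup described above can be sketched minimally as follows; this is not the patent's implementation, and the database structure, field names, MAC addresses, and coordinates are all hypothetical. The sketch simply returns the stored location of the strongest reported AP that the database knows.

```python
# Illustrative sketch (not from the patent): matching AP information
# reported by the terminal against a pre-established AP database and
# returning the location matched to the strongest known AP.
# All entries below are made-up example data.

AP_DATABASE = {
    "00:11:22:33:44:55": {"ssid": "cafe_ap", "lat": 37.5665, "lon": 126.9780},
    "66:77:88:99:aa:bb": {"ssid": "lobby_ap", "lat": 37.5651, "lon": 126.9895},
}

def locate_terminal(reported_aps):
    """reported_aps: list of (mac, rssi_dbm) pairs sent by the terminal.
    Returns the stored (lat, lon) of the strongest known AP, or None."""
    known = [(mac, rssi) for mac, rssi in reported_aps if mac in AP_DATABASE]
    if not known:
        return None
    best_mac, _ = max(known, key=lambda pair: pair[1])  # highest RSSI wins
    entry = AP_DATABASE[best_mac]
    return (entry["lat"], entry["lon"])

print(locate_terminal([("00:11:22:33:44:55", -60),
                       ("66:77:88:99:aa:bb", -75)]))
```

A production WPS would typically combine several APs' signal strengths (e.g., fingerprinting or trilateration) rather than picking a single strongest match.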

The extracted location information of the mobile terminal 100 is transmitted from the WiFi location server 310 to the mobile terminal 100, so that the mobile terminal 100 can acquire its location information.

The present invention provides a method of storing (backing up) various information included in a screen together with an image of the screen at the time of screen capture, and of allowing the user to restore the screen later.

Advanced Screen Capture

The screen capture according to the present invention is a capture method that stores, together with the screen image, information related to at least one object included in the screen, unlike the conventional method of capturing only the screen image. Therefore, the screen capture method according to the present invention is referred to as advanced screen capture to distinguish it from conventional screen capture. Hereinafter, advanced screen capture is referred to simply as screen capture for convenience of explanation.

Activating the Screen Capture Function

A typical screen capture is activated (initiated) when the home key and power button are pressed simultaneously. In contrast, the advanced screen capture according to the present invention is activated (or started) when a predetermined screen capture input (or command) is detected.

The predetermined screen capture input may be a predetermined touch pattern input on the touch screen 151 or an input of a predetermined hard key. For example, the touch pattern may be a long touch input for a predetermined area of the touch screen 151 or a touch input for a predetermined soft key. Also, the hard key may be an input of, for example, a volume down key + a power key.

As another embodiment, the advanced screen capture according to the present invention can be activated by changing the touch time of a home key and a power button used in a general screen capturing. For example, assuming that the conventional screen capture is the home key + power button (1 second), the advanced screen capture according to the present invention can be activated when the home key + power button (2 seconds) is input.
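The key-hold activation scheme above can be sketched as follows; this is an illustration rather than the patent's implementation, and the exact key set and the 1-second/2-second thresholds follow the example in the text but are otherwise assumptions.

```python
# Illustrative sketch (not from the patent): selecting between normal and
# advanced screen capture by how long the home key + power button
# combination is held, per the example timings above.

def capture_mode(keys_pressed, hold_seconds):
    """Return which capture, if any, a key combination should trigger."""
    if keys_pressed != {"home", "power"}:
        return None
    if hold_seconds >= 2.0:
        return "advanced_capture"  # e.g., held for 2 seconds or more
    if hold_seconds >= 1.0:
        return "normal_capture"    # e.g., held for about 1 second
    return None

print(capture_mode({"home", "power"}, 1.0))  # normal_capture
print(capture_mode({"home", "power"}, 2.5))  # advanced_capture
print(capture_mode({"volume_down"}, 2.5))    # None
```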

Performing Screen Capture and Saving (Backing Up) the Captured Image

The advanced screen capture according to the present invention includes the meaning of backing up a screen that is currently being displayed (running). That is, the present invention stores not only an image of a screen but also information related to at least one object included in the screen, so that the current screen state can be restored later.

FIG. 3 is a flowchart illustrating a screen capturing method of a mobile terminal according to an embodiment of the present invention.

As shown in FIG. 3, the control unit 180 may display a predetermined screen on the display unit 151 (S110).

The screen may include a menu screen, a music and video playback screen, a game execution screen, an application execution screen, and a web browser screen.

If a predetermined screen capture command is input from the user (S120), the controller 180 stores the current screen image together with configuration information of the screen (S130). The screen image is generated as a thumbnail image.

The configuration information of the screen is context information of at least one object included in the screen, and includes not only the type, position, version, and link information of the object but also user setting information, that is, information set by the user.

The object includes at least one of an icon, text, an image, and a combination thereof included in the screen.

The user information includes screen setting information (zoom magnification, resolution, character size) and input information of the user (e.g., the number of items purchased, color, etc.).

The screen image and the configuration information of the screen may be stored in a backup folder of the gallery. In this case, the configuration information of the screen may be stored as attribute information in the captured image file, or may be stored in a separate file that matches the image file.
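The backup step above — an image plus its configuration information, stored either inline as attribute metadata or in a separate matching file — can be sketched as follows. This is not the patent's implementation; the backup-folder structure, file naming, and field names are all assumptions, and a real terminal would write actual JPEG and XML files rather than dictionary entries.

```python
# Illustrative sketch (not from the patent): backing up a captured screen
# as an image entry plus the screen's configuration information, kept
# either inline (as attribute metadata) or as a matching sidecar entry.

import json

def back_up_capture(backup_folder, image_name, screen_config, inline=True):
    """screen_config holds per-object context info (type, position, link,
    ...) plus user setting information, as described above."""
    if inline:
        # configuration kept as attribute metadata of the image entry
        backup_folder[image_name] = {"config": screen_config}
    else:
        # configuration kept in a separate file matched to the image by name
        backup_folder[image_name] = {}
        backup_folder[image_name + ".meta.json"] = json.dumps(screen_config)
    return backup_folder

gallery_backup = {}
config = {
    "objects": [{"type": "icon", "name": "browser", "pos": [0, 1]}],
    "user_settings": {"zoom": 1.5, "font_size": "large"},
}
back_up_capture(gallery_backup, "capture_001.jpg", config)
print("capture_001.jpg" in gallery_backup)  # True
```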

Example - Full Screen Capture (Home Screen)

FIG. 4 is a flowchart illustrating an operation of performing screen capture on the home screen, and FIG. 5 is a view for explaining the embodiment shown in FIG. 4.

As shown in FIG. 4, when a screen capture command is input while a home screen 50 is displayed (S210), the controller 180 captures the currently displayed home screen 50 and converts it into a JPEG file (S220), collects the information included in the home screen 50, that is, application icon information/position, widget icon information/position, user setting information settable on the home screen 50 (screen brightness, sound, and the like), and theme information of the home screen 50, and converts the collected information into an XML file (S230).

The controller 180 stores the home screen information converted into the XML file in the memory 160 as attribute information (metadata) of the home screen image converted into the JPEG file (S240), and when the storage is completed, displays a capture completion message, for example, the notification message "Screen capture completed", on the display unit 151 (S250).

Example - Full screen capture (browser screen)

FIG. 6 is a flowchart for explaining the embodiment shown in FIG. 3 in more detail, showing an operation of performing screen capture on a predetermined browser screen, and FIG. 7 is a view for explaining the embodiment shown in FIG. 6.

As shown in FIG. 6, the controller 180 may display a predetermined browser screen (e.g., G market) 60 on the display unit 151 (S310).

When a screen capture command is input from the user while the browser screen 60 is displayed, the controller 180 captures the currently running browser screen 60 and converts it into a JPEG file (S320), and acquires the URL link address of the browser screen, user input information (e.g., text, the number of purchased items, check boxes, color), and user setting information (e.g., screen zoom ratio) (S330).

Then, the controller 180 converts the acquired URL link address, user input information (e.g., quantity of purchased goods, color), and user setting information (e.g., screen zoom ratio) into an XML file and stores it as attribute information (metadata) of the captured image (S340). The acquired information may be stored in the header of the JPEG file.

Upon completion of the storage, the controller 180 displays a capture completion message (e.g., "Screen capture completed") on the display unit 151, as shown in FIG. 7 (S350).

Example - Capture a part of the screen (browser screen)

In the present invention, the controller 180 basically stores the entire screen as one image (thumbnail image). However, when capturing a browser screen including a plurality of objects (text, images), as in the G market, the entire screen is generated as a single thumbnail image; therefore, as shown in FIG. 8, it is difficult to recognize exactly what products the thumbnail includes.

Accordingly, in the present invention, when a thumbnail of a browser screen is generated, a separate pop-up window 70 for selecting a product image may be displayed, and a thumbnail image may be generated from the product selected by the user. In this case, the controller 180 assigns a text tag to the generated thumbnail image automatically or according to user input. The text tag may be the browser URL or a product name.

This method stores the URL link address of the browser in the form of a thumbnail image rather than as text. The reason is that, in the case of shopping malls, searching the same term again often returns products different from those of the previous search. In particular, a "best" or specially promoted product is exposed on the main screen only for a short time, so it is difficult to find it again. Even if the page is saved as a favorite, the URL link address is conventionally stored as text, so it is difficult to recognize which product it refers to.

FIG. 9 is a flowchart showing an operation of generating a captured image from a part of a screen at the time of capturing a browser screen, and FIG. 10 is a view for explaining the embodiment shown in FIG. 9.

As shown in FIGS. 9 and 10, when a screen capture command is input while a browser screen (e.g., G market) is displayed, the controller 180 extracts product images from the G market screen and displays them in the pop-up window 70 in the form of a list (S410, S420). In this case, the controller 180 may extract image tags from the HTML tags of the screen, and then extract product images corresponding to JPEG or PNG from the extracted image tags.
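The image-tag extraction step above can be sketched with the standard HTML parser; this is an illustration rather than the patent's implementation, and the sample HTML and class name are made up.

```python
# Illustrative sketch (not from the patent): collecting product-image
# candidates from a page's HTML by gathering <img> tags whose source
# ends in .jpg/.jpeg/.png, as the capture step above describes.

from html.parser import HTMLParser

class ProductImageFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src", "")
            if src.lower().endswith((".jpg", ".jpeg", ".png")):
                self.images.append(src)

sample_html = """
<div><img src="/img/shoe.jpg"><img src="/img/spinner.gif">
<img src="/img/bag.png"></div>
"""
finder = ProductImageFinder()
finder.feed(sample_html)
print(finder.images)  # ['/img/shoe.jpg', '/img/bag.png']
```

The GIF is skipped here purely because of the JPEG/PNG filter described in the text; a real implementation might also filter by image size, as the later embodiment suggests.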

The user can select among the plurality of product images displayed in the list in the pop-up window 70 to view them enlarged. When the user selects an enlarged product image (S430), the controller 180 generates the selected product image as a captured image, that is, a thumbnail image (S440), and displays a tag input window so that the user can input a tag name; a tag is then set on the captured image with the tag name input by the user in the tag input window (S450). The tag input window includes a text tag input window and a color tag input window.

As another example, the controller 180 may automatically set the text tag to the browser URL or the corresponding product name when generating a thumbnail image from the selected product image.

FIG. 11 is a diagram illustrating an example of generating a captured image from a part of a screen and setting a tag according to the embodiment of FIG. 9.

As shown in FIG. 11, when a screen capture command is input while the browser screen is displayed, the controller 180 extracts image files A to E of at least a predetermined size from the browser screen and displays them in list form in the pop-up window 70. The predetermined size is the size below which an image would not be recognizable when a captured image is generated, and it can be set by the user.

When the image B is selected from the displayed images A to E, the controller 180 displays a text tag input window 71 and a color tag input window 72 at the lower end of the selected image B. The user can set a text tag (e.g., gmarket) and a color tag (e.g., yellow) for the image B in the text tag input window 71 and the color tag input window 72, respectively.

If the user selects OK after setting the text tag (e.g., gmarket) and the color tag (e.g., yellow), the controller 180 stores the selected image B as a captured image 73 in the backup folder of the gallery. At this time, the controller 180 displays the text tag 74 and a visual cue 75 in the upper left corner of the touch screen. When the cue is dragged, the controller 180 displays the stored captured image so that the user can edit it as needed.

The above operations can be performed in the same manner in other application screens.

Example - Capture a part of the screen (Home screen)

The advanced screen capture according to the present invention can be divided into a normal mode and a smart mode. All the screen captures described above are performed in the normal capture mode.

In the present invention, the smart mode screen capture is a mode in which the capture range or area can be adjusted/selected, and it can be activated (initiated) through a setting of a capture menu or by a separate screen capture command. For example, the normal mode screen capture according to the present invention may be activated when the home key + power button is input for 2 seconds, and the smart mode screen capture may be activated when the home key + power button is input for 3 seconds. In addition, the smart mode screen capture may be activated by a predetermined touch pattern input on the touch screen 151 or by an input of a predetermined hard key. For example, the touch pattern may be a long touch input on a predetermined area of the touch screen 151 or a touch input on a predetermined soft key.

FIG. 12 is a flowchart illustrating an operation of performing a smart mode screen capture in the present invention, and FIG. 13 is a view for explaining the embodiment shown in FIG. 12.

Referring to FIGS. 12 and 13, when a smart mode screen capture command is input (S510) while the home screen 50 is displayed, the controller 180 does not immediately capture the screen image but instead displays a rectangular capture range adjustment guide 80 directly on the home screen 50. The user can set the capture range by adjusting the size of the capture range adjustment guide 80 (S520). If the region within the set capture range is touched, the controller 180 generates a capture image of the corresponding region and stores the captured image together with information related to the objects.

In addition, once the capture range is set, the user can selectively capture only a desired portion within the range (S530). If the user selects specific capture areas A to C within the adjusted or unadjusted range 80 and then touches an empty area (S540), the controller 180 captures the selected areas A to C and stores the captured image together with the information of the objects (S550, S560).
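The region-selection step can be sketched as cropping rectangles out of a screen bitmap. This is a toy model (a nested-list bitmap and `(left, top, width, height)` tuples), not the terminal's actual frame-buffer API:

```python
def capture_regions(screen, regions):
    """Crop the selected rectangular regions from a screen bitmap.

    `screen` is a row-major list of pixel rows; each region is a
    (left, top, width, height) tuple. Representation is illustrative.
    """
    captures = []
    for (x, y, w, h) in regions:
        # Slice the selected rows, then the selected columns of each row.
        crop = [row[x:x + w] for row in screen[y:y + h]]
        captures.append(crop)
    return captures

# An 8x6 synthetic bitmap: pixel value encodes (column + 10 * row).
screen = [[c + 10 * r for c in range(8)] for r in range(6)]
parts = capture_regions(screen, [(0, 0, 2, 2), (4, 3, 3, 2)])
```

Each cropped part would then be stored together with the object information of whatever icons or widgets fall inside it.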

When the storage of the captured image and the related information is completed, the control unit 180 displays a capture completion message ("screen capture completed").

The above operations can be performed in the same manner in other application screens.

Screen return (Restore) - Full screen return

In the present invention, restoration of the screen means constructing and displaying the screen at the time of screen capture using the captured image and information. The screen return is performed by displaying the captured image stored in the gallery on the display unit 151 again.

The control unit 180 displays the captured image stored in the gallery on the display unit 151 again when a screen return input (command) is detected. The screen return input may be a predetermined touch pattern input on the touch screen 151 or an input of a predetermined hard key.

For example, the touch pattern may be a long touch input for a predetermined area of the touch screen 151 or a touch input for a predetermined soft key. Also, the hard key input may be an input of a predetermined key combination, for example, a volume up key + a power key.

When a screen return input (command) is detected, the control unit 180 may return to the most recently captured screen or, when there are a plurality of captured screens, to the screen selected by the user. This return behavior can be set in the menu. That is, when the user sets the quick mode in the menu, the display returns to the most recently captured screen. When the user selects the normal mode, a list of captured images is displayed, and the display returns to the screen of the image selected by the user from the list.
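The quick/normal selection logic can be sketched as follows. The function and mode names are illustrative; in the normal mode the UI would show the list and pass back the user's choice:

```python
def pick_restore_image(captures, mode, selected_index=None):
    """Select which backup screen to restore.

    `captures` is ordered oldest-to-newest. In quick mode the most recent
    capture is returned; in normal mode the caller shows the list and
    passes the user's selection. Illustrative sketch only.
    """
    if not captures:
        return None  # nothing has been captured yet
    if mode == "quick":
        return captures[-1]
    if mode == "normal":
        return captures[selected_index] if selected_index is not None else None
    raise ValueError(f"unknown restore mode: {mode}")

history = ["home_0211", "home_0212", "browser_0212"]  # hypothetical capture ids
```

In quick mode `pick_restore_image(history, "quick")` yields the newest capture without any further interaction.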

FIG. 14 is a flowchart illustrating a screen return method according to the present invention.

The user can input a predetermined screen return input (command) (S610). The predetermined screen return input may be a predetermined touch pattern input on the touch screen 151 or an input of a predetermined hard key. For example, the touch pattern may be a double touch input on a predetermined area of the touch screen 151 or a touch input on a predetermined soft key. Also, the hard key input may be, for example, a volume up key + a power key.

If the screen return input is detected, the controller 180 checks the currently running application and reads the stored screen capture image and object information (S620, S630). The execution screen of that application is then configured and displayed using the read captured image and object information (S640). For example, if the currently running application is the home menu, the screen returns, according to the return mode (quick mode or normal mode), to the most recently captured home screen or to the screen of the image selected by the user from the plurality of captured images. In the case of a browser, the screen likewise returns, according to the return mode, to the most recently captured browser screen or to the browser screen of the image selected by the user.
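The per-application dispatch (S620-S640) can be sketched as a lookup from the running application to its saved capture. All names, the `store` layout, and the returned dictionary shape are hypothetical:

```python
def restore_screen(running_app, store):
    """Rebuild the execution screen for the currently running application.

    `store` maps an app name to its saved (captured_image, object_info)
    pair, mirroring steps S620-S640. Illustrative only.
    """
    saved = store.get(running_app)
    if saved is None:
        return None  # nothing was captured for this application
    image, object_info = saved
    # Compose the execution screen from the capture plus the object info.
    return {"app": running_app, "image": image, "objects": object_info}

# Hypothetical backup store with one home-screen and one browser capture.
store = {
    "home": ("home.png", {"icons": ["gallery", "youtube"]}),
    "browser": ("page.png", {"url": "http://example.com"}),
}
screen = restore_screen("browser", store)
```

A browser restore would additionally follow the stored URL link, as described in the browser-specific flow below in the text.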

FIG. 15 is a flowchart illustrating a method of performing a screen return on the home screen, and FIG. 16 is a view for explaining the embodiment shown in FIG. 15.

As shown in FIG. 15, when a screen return input is detected in the home screen (S710, S720), the controller 180 reads the stored captured image and object information and confirms the return setting mode (S730, S740). The return setting mode can be set by the user in the capture menu.

As a result, if the return setting mode is set to the quick mode, the controller 180 returns to the home screen before capturing by displaying the latest captured image and object information on the home execution screen (S750).

On the other hand, when the return setting mode is the normal mode, the controller 180 displays a thumbnail list of the home screens to be restored, that is, a list of a plurality of captured images with their dates, in a pop-up window 81 (S760). The user can long-touch a specific captured image to view detailed information of the corresponding image.

When the user touches a predetermined image in the pop-up window 81 to select an image to restore (S770), the controller 180 constructs the home execution screen using the object information corresponding to the selected captured image and returns to the home screen (S780).

As another embodiment, the present invention performs the same operation as the normal mode even when the user selects the restore menu from the home menu.

FIG. 17 is a flowchart illustrating a method of performing a screen return in a browser, and FIG. 18 is a view for explaining the embodiment shown in FIG. 17.

Referring to FIG. 17, when a screen return input is detected during execution of the browser (S810, S820), the control unit 180 reads the stored screen capture image, URL link information, and user input information, and confirms the return setting mode (S830, S840). The return setting mode can be set by the user in the capture menu.

As a result, if the return setting mode is set to the quick mode, the control unit 180 checks the URL link of the most recently captured image and moves to the corresponding URL link (S850, S860). The browser execution screen is then displayed using the user input information to return to the browser screen before capturing (S870).

If the confirmed URL link cannot be connected, the control unit 180 provides additional information together with the error message "Connection is impossible".

On the other hand, when the return setting mode is the normal mode, as shown in FIG. 18, the controller 180 displays a thumbnail list of the web pages to be restored, that is, a list of a plurality of captured images, in the pop-up window 81 (S880). The user can long-touch a specific captured image to view detailed information of the corresponding image, and can select an image to restore by touching it (S870).

When the user selects the image to restore in the pop-up window 81, the control unit 180 moves to the URL link of the image (S860) and displays the browser execution screen using the captured image and the user input information to return to the browser screen before capturing (S870).

As another embodiment, the present invention performs the same operation as the normal mode even when the user selects the restore menu from the home menu.

Figs. 19A and 19B are examples of providing additional information when the screen returns to the page of interest.

When a screen return input is detected by the web browser, the control unit 180 accesses the screen before capturing, that is, the URL link of the page of interest according to the return setting mode, and returns to the screen of the page of interest.

However, when the URL link of the page of interest cannot be connected, the control unit 180 outputs an error message (an HTTP 404 error message) indicating that connection is impossible, displays the most frequently used keywords (name, quantity, color, and so on) of the previous page in the pop-up window, and displays a message asking whether to search with those keywords. In other words, when a link connection error occurs, the controller 180 displays, in the pop-up window 81, the search term (e.g., dress) and the user input information (purchase quantity and color of the dress) entered by the user at the time of capture, together with a message asking whether to search again with the search term. If the user selects OK, the search term (dress) is automatically entered into the search window and the search is performed.
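The connect-or-fall-back behavior above can be sketched as follows. The `fetch` callable stands in for the actual HTTP request, and the capture fields (`url`, `keyword`, `user_input`) are hypothetical names for the stored link and user input information:

```python
def restore_browser_page(capture, fetch):
    """Return to a captured web page, falling back to a keyword search.

    `fetch(url)` returns an HTTP status code; on a 404 the stored search
    term and user input are offered for a fresh search, mirroring the
    pop-up 81 flow. `fetch` and the capture fields are illustrative.
    """
    status = fetch(capture["url"])
    if status == 200:
        return {"action": "restore", "url": capture["url"]}
    # Link is gone: propose re-searching with the captured keyword.
    return {
        "action": "search",
        "keyword": capture["keyword"],
        "user_input": capture["user_input"],
    }

# Hypothetical capture of a shopping page for a dress.
capture = {"url": "http://shop.example/dress", "keyword": "dress",
           "user_input": {"quantity": 1, "color": "yellow"}}
result = restore_browser_page(capture, fetch=lambda url: 404)
```

With a reachable link (`fetch` returning 200) the same call restores the page directly instead of proposing a search.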

Screen return (Restore) - Partial screen return

The screen return according to the present invention can be divided into a full screen return and a partial screen return. All the screen returns described above correspond to the full screen return.

In the present invention, restoration of the partial screen means restoring only a part of the screen selected by the user. The partial screen return may be activated (initiated) by a setting of the capture menu or by a separate screen return command. For example, the full screen return according to the present invention may be activated when the volume up key + power key is pressed for 2 seconds, and the partial screen return may be activated when the volume up key + power key is pressed for 3 seconds. In addition, the partial screen return may be activated by a predetermined touch pattern input on the touch screen 151.

FIG. 20 shows an example of performing partial screen return in the present invention.

As shown in FIG. 20, when a partial screen return command is detected while the home menu is displayed, the controller 180 displays the entire captured image, that is, the return (restored) image. The user can select only a desired portion of the returned image and restore only the corresponding portion.

When the user selects a specific part (object) from the previously captured whole image, the control unit 180 displays the home menu using only the information corresponding to the selected object from the previously stored entire object information (application icon information/position, widget icon information/position, setting value information, and home theme information).

In addition, if a specific icon (e.g., video player, gallery, or YouTube) is selected from the previously captured entire image, the control unit 180 directly enters the page of the corresponding application.
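The partial restore amounts to filtering the stored object information down to the user's selection. The object ids and attribute fields below are hypothetical:

```python
def partial_restore(object_info, selected_ids):
    """Keep only the object entries the user selected from the capture.

    `object_info` maps an object id to its stored attributes (icon
    position, settings, theme, ...). Field names are illustrative.
    """
    return {oid: info for oid, info in object_info.items()
            if oid in selected_ids}

# Hypothetical stored home-screen object info.
stored = {
    "video_player": {"pos": (0, 0)},
    "gallery": {"pos": (1, 0)},
    "youtube": {"pos": (2, 0)},
    "clock_widget": {"pos": (0, 1)},
}
subset = partial_restore(stored, {"gallery", "youtube"})
```

Only the filtered subset is then applied when rebuilding the home menu, leaving the unselected objects untouched.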

Share screen capture

Meanwhile, according to various embodiments of the present invention, a captured image (backup screen), in particular a product image of a shopping mall, can be transmitted to another terminal (hereinafter referred to as a receiver) or uploaded to an SNS site and shared with other users. Transmission of the backup screen may be performed by selecting a separate transmission means after screen capture or by selecting the backup screen in the gallery. When the backup screen is transmitted, the related information is also transmitted.

In the present invention, the captured image is referred to as a backup screen, and may be referred to as a shared screen when it is transmitted to another user or shared.

FIG. 21 is an exemplary diagram for explaining a method of sharing the captured screen image (backup screen).

As shown in FIG. 21, the captured image (backup screen or shared screen) and related information transmitted to the receiver are stored in the gallery of the receiving terminal. In particular, the icons of the captured images may be displayed as separate icons so that they can be distinguished from one another according to the type of application.

However, if the application required to recover the received captured image is not installed in the recipient's terminal, the received captured image is displayed as a non-installed icon 21.

When the recipient selects the non-installed icon 21, the control unit 180 moves to "Google Play", for example, to induce installation of the corresponding application. When the installation of the corresponding application is completed, the received captured image is restored and displayed.
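The receiver-side decision can be sketched as follows. The app identifiers and the returned action dictionary are hypothetical; "Google Play" is the store named in the text:

```python
def open_shared_capture(capture_app, installed_apps):
    """Decide how to handle a received shared screen.

    If the application needed to restore the capture is missing on the
    receiving terminal, show the not-installed icon and route the user
    to the app store first. Names are illustrative.
    """
    if capture_app in installed_apps:
        return {"action": "restore", "app": capture_app}
    # App is missing: induce installation before the capture can be restored.
    return {"action": "install", "store": "Google Play", "app": capture_app}
```

Once the install completes, the same capture can be reopened and will take the `"restore"` branch.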

On the other hand, if the user discards unwanted icons into the trash bin on the restored screen before saving, the control unit 180 restores and displays the shared screen excluding the discarded icons, that is, a partial shared screen.

FIGS. 22A and 22B show an example in which only a specific portion is selected and restored in the received shared screen.

The user may select at least one object (for example, an application icon) included in the shared screen before storing the received shared screen to configure a shared screen composed only of the selected objects.

For example, as shown in FIG. 22A, when a new shared screen 91 including a plurality of objects 90 is displayed, the user can select a predetermined number of objects (e.g., five objects). The object selection can be performed using the pen 92 as shown in FIG. 22B.

Once the objects are selected, the control unit 180 constructs a new shared screen including only the five selected objects and displays it in pop-up form 91. When the user confirms the storage, the shared screen is stored and displayed.

Cloud interworking

The present invention can provide update information related to a backup screen stored in association with a cloud. The update information includes notification of the update, the newly updated screen, and information related to the objects included in the screen.

FIG. 23 shows an example of providing update information related to a backup screen stored in association with a cloud.

As shown in FIG. 23, the screen (backup screen) captured by the user can be stored in a separate folder of the gallery. When the user selects a stored backup screen, the controller 180 restores and displays the selected backup screen.

A home screen changes little over time, but a web page screen is updated periodically, for example by rearranging or reshaping objects and adding new objects. Accordingly, when the stored backup screen is executed after a predetermined time elapses, or when a specific object, text, or link is selected on the backup screen, the version of the corresponding backup screen may no longer be provided or maintained at that time.

In this case, the control unit 180 may provide update information related to the selected backup screen (or object, text, link).

That is, when the backup screen (or an object, text, or link on it) is selected, the current screen is maintained if the web page is unchanged. If the web page has been updated, the user is notified of the update through the pop-up window. When the user requests the update, the link moves to the newly updated latest screen, and the newly updated screen is displayed.
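The cloud version check above can be sketched as a comparison between the backup's stored version and the current cloud version. The version fields, page keys, and link strings are hypothetical:

```python
def check_backup_version(backup, cloud_versions):
    """Compare a backup screen's version with the current cloud version.

    Returns "current" if the page is unchanged, or an update notice
    carrying the link to the newest screen. Fields are illustrative.
    """
    latest = cloud_versions.get(backup["page"])
    if latest is None or latest["version"] == backup["version"]:
        return {"status": "current"}
    # Page was updated since capture: offer the link to the latest screen.
    return {"status": "updated", "latest_link": latest["link"]}

# Hypothetical cloud record: the shop front page is now at version 7.
cloud = {"shop_front": {"version": 7, "link": "http://shop.example/v7"}}
notice = check_backup_version({"page": "shop_front", "version": 5}, cloud)
```

An `"updated"` notice would drive the pop-up asking whether to move to the newest screen.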

FIG. 24 is a flowchart illustrating a screen restoring method of a mobile terminal according to an embodiment of the present invention.

As shown in FIG. 24, the control unit 180 displays a predetermined screen on the display unit 151 (S910). The screen may include a menu screen, a music and video playback screen, a game execution screen, at least one application execution screen, and a web browser screen.

If a predetermined screen capture command is input from the user (S920), the controller 180 captures the currently displayed screen and stores the captured screen together with the object information included in the captured screen (S930). The capture screen image is a thumbnail image.

The information related to the screen is context information of at least one object included in the screen and includes not only the type, location, version, and link information of the object but also user setting information and user input information.

The object includes at least one of icons, texts, images, and combinations thereof included in the screen. The user information includes at least one of setting information (zoom magnification, resolution, letter size) and user input information (the number of items purchased, color, and the like).

The screen image and information related to the screen may be stored in a backup folder of the gallery. In this case, information related to the screen may be stored as attribute information in the captured image file, or may be stored in a separate file that matches the image file.
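The "separate file that matches the image file" option can be sketched as a JSON sidecar named after the image. The function name, file layout, and metadata fields are hypothetical; the patent equally allows embedding the info as attribute information inside the image file itself:

```python
import json
import os
import tempfile

def save_capture(folder, name, image_bytes, object_info):
    """Store a capture with its object info in a matching sidecar file.

    This sketch writes the image and a JSON sidecar sharing the same
    base name, so the pair can be matched on restore. Illustrative only.
    """
    image_path = os.path.join(folder, name + ".png")
    meta_path = os.path.join(folder, name + ".json")
    with open(image_path, "wb") as f:
        f.write(image_bytes)
    with open(meta_path, "w") as f:
        json.dump(object_info, f)  # object info matched to the image by name
    return image_path, meta_path

# Hypothetical backup folder and capture payload.
backup = tempfile.mkdtemp()
img, meta = save_capture(backup, "home_0212", b"\x89PNG", {"zoom": 1.5})
```

On restore, loading the sidecar by the image's base name recovers the object information without touching the image data.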

When a predetermined screen return input (command) is input (S940), the controller 180 checks the currently running application and reads the stored screen capture image and object information (S950, S960). The execution screen of that application is then configured and displayed using the read captured image and object information (S970). For example, if the currently running application is the home menu, the screen returns, according to the return mode (quick mode or normal mode), to the most recently captured home screen or to the screen of the image selected by the user from the plurality of captured images. In the case of a browser, the screen likewise returns to the most recently captured browser screen or to the browser screen of the image selected by the user.

Although the present invention has been described with respect to a screen backup and a screen restoration operation by screen capturing using a home screen and a browser screen as an example, the present invention is not limited to this, and is applicable to music and movie playback screens, game execution screens, and various application execution screens.

As described above, according to the present invention, when a screen is captured, the information of the objects constituting the screen is stored together with the current screen image as a backup screen, and a previously stored backup screen can be restored and displayed by a simple operation. Thus, a desired application can be executed, or an object included in the screen can be searched, regardless of the screen configuration and updates to the screen.

The present invention can increase the recognition rate of an object during screen restoration by generating and storing the screen as a thumbnail image at the time of screen capture, and a desired object can be found more easily by performing a search using the URL link of the thumbnail image.

According to an embodiment of the present invention, the above-described method can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). Further, the computer may include the control unit of the terminal.

The above-described screen restoring method of the mobile terminal is not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

50: home screen 60: browser screen
70, 81: pop-up window 71: text tag input window
72: color tag input window 73: captured image
74: visual cue 80: capture range adjustment guide
90: object 91: shared screen
151: display unit 160: memory
180: control unit

Claims (31)

Displaying a screen including at least one object;
Storing a capture image of a screen and configuration information of the object included in the screen when a screen capture command is input; And
And restoring and displaying a screen at the time of screen capture based on the stored captured image and the configuration information of the object when the screen recovery command is input.
The method of claim 1, wherein the at least one object
An icon, a text, an image, and a combination thereof.
The method of claim 1,
A menu screen, a music or video playback screen, a game execution screen, an application execution screen, and a web browser screen.
The method of claim 1, wherein the screen capture command
A predetermined touch pattern, a combination of a soft key and a hard key,
Wherein the combination of the hard keys is a volume up key + a power key.
The method of claim 1, further comprising: performing a general capture mode for capturing a full screen according to a pattern of the screen capture command and a smart capture mode for capturing a portion of a screen selected by the screen editing,
Wherein the pattern of the screen capture command is determined by a touch time of a hard key for inputting a screen capture command.
The method of claim 1,
Wherein the captured image is generated and stored as a thumbnail image of the screen, and the configuration information of the object is stored as attribute information of the screen capture image.
The method of claim 1, wherein the configuration information of the object
The context information of the object includes the type, location, version, and link information of the object as well as user setting information,
Wherein the user setting information includes system setting information such as zoom magnification and resolution, and user input information such as the number and color of the object.
The method of claim 1,
The entire screen or part of the screen is the captured image,
Wherein a part of the screen includes a single or at least one object selected by the user.
The method of claim 1, wherein the step of storing the capture image and the configuration information of the object
Extracting an image file from a browser screen;
Displaying an image of a predetermined size or more among the extracted image files;
Receiving a specific image; And
Generating a thumbnail image of the selected image, and setting and storing a tag in the generated thumbnail image,
Wherein the tag is input by a user as a text tag and a color tag or is automatically set.
The method of claim 1,
A predetermined touch pattern, a combination of a soft key and a hard key,
Wherein the combination of the hard keys is a volume down key + a power key.
The method of claim 1,
Wherein the captured image is an entirety of the captured image or a part of a captured image edited by a user.
The method of claim 1, wherein the step of restoring and displaying the screen comprises:
Determining a type of an application currently being executed;
Reading the stored captured image and configuration information corresponding to the determined application;
Confirming the setting of the restoration mode; And
And restoring the screen using the read captured image and the configuration information according to the determined restoration mode setting.
13. The method of claim 12, wherein restoring the screen comprises:
If the setting of the restoration mode is the quick mode, reconstructing the screen with the latest captured image and configuration information;
Displaying a thumbnail image list when the restoration mode is set to the normal mode; And
Selecting a thumbnail image of a desired restoration position from the thumbnail image list and restoring the screen,
And displaying a restore screen by accessing a link address of a thumbnail image on the browser screen.
The method as claimed in claim 13, further comprising: outputting an error message when the connection to the link address of the thumbnail image fails, and analyzing and displaying a main keyword of the link of the thumbnail image to induce a search using the keyword.
The method of claim 1, wherein the step of restoring and displaying the screen comprises:
And outputting a part of the captured image, which has been previously stored or edited by the user, to the restoration screen in accordance with the pattern of the screen restoration command,
Wherein the pattern of the screen recovery command is determined by a touch time of a hard key for inputting a screen recovery command.
The method of claim 1, further comprising: transmitting the stored captured image and configuration information of the object to another user to share a restored screen image.
A mobile terminal comprising: a display unit displaying a screen including at least one object;
A memory for storing a captured image of the screen and configuration information of the object included in the screen when a screen capture command is input; And
And a controller for restoring and displaying the screen at the time of capturing a screen based on the stored captured image and the configuration information of the object when the screen restoration command is input,
Wherein the captured image is generated and stored as a thumbnail image of the screen.
18. The method of claim 17, wherein the at least one object
Icons, text, images, and combinations thereof.
18. The method of claim 17,
A menu screen, a music or video playback screen, a game execution screen, an application execution screen, and a web browser screen.
18. The method of claim 17,
A predetermined touch pattern, a combination of a soft key and a hard key,
Wherein the combination of the hard keys is a volume up key + power key.
18. The apparatus of claim 17, wherein the control unit
Performing a general capture mode for capturing a full screen according to a pattern of a screen capture command and a smart capture mode for capturing a portion of a screen selected by the screen editing,
Wherein the controller determines the pattern of the screen capture command according to the touch time of the hard key for inputting the screen capture command, and the touch time for the smart capture mode is longer than the touch time for the general capture mode.
18. The method of claim 17, wherein the configuration information of the object
Information stored as attribute information of the screen capture image, and includes context information of the object, including the type, location, version, and link information of the object as well as user setting information,
Wherein the user setting information includes system setting information such as a zoom magnification and a resolution, and user input information such as a quantity and a color of the object.
18. The system of claim 17, wherein the capture image
The entire screen or part of the screen is the captured image,
Wherein the part of the screen includes a single or at least one object selected by the user.
18. The apparatus of claim 17, wherein the control unit
Displays a thumbnail image of an image selected by the user from among the images extracted from the browser screen, and sets and stores a tag in the thumbnail image, wherein the tag is set as a text tag and a color tag by the user or is automatically set.
18. The method of claim 17,
A predetermined touch pattern, a combination of a soft key and a hard key,
Wherein the combination of the hard keys is a volume down key + a power key.
18. The method of claim 17,
The captured image being a part of the entire captured image or the captured image edited by the user.
18. The apparatus of claim 17, wherein the control unit
A screen restoration command is inputted, the type of the currently executed application is determined, and the captured image and configuration information corresponding to the determined application are read, and the setting of the predetermined restoration mode is confirmed, And restores the screen using the read captured image and the configuration information.
28. The apparatus of claim 27, wherein the control unit
If the setting of the restoration mode is the quick mode, the screen is restored with the latest captured image and configuration information,
If the setting of the restoration mode is the normal mode, a thumbnail image list is displayed,
A thumbnail image of a desired restoration position is selected from the thumbnail image list, and the screen is restored.
And displays a restoration screen by accessing a link address of a thumbnail image on the browser screen.
29. The apparatus of claim 28, wherein the control unit
Wherein when the connection to the link address of the thumbnail image fails, an error message is output and the main keyword is analyzed and displayed on the link of the thumbnail image to induce the user to search using the keyword.
18. The apparatus of claim 17, wherein the control unit
A part of the captured image that has been previously stored or a part of the user-edited captured image is output to the restoration screen according to the pattern of the screen recovery command,
Wherein the pattern of the restoration command is determined by a touch time of a hard key for inputting a restoration command, and the touch time for restoring a part of the captured image is longer than the touch time for restoring the entire captured image.
18. The apparatus of claim 17, wherein the control unit
And transmits the stored captured image and the configuration information of the object to another user to share the restore screen.
KR1020130014983A 2013-02-12 2013-02-12 Mobile terminal and screen restoring method thereof KR20140101616A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130014983A KR20140101616A (en) 2013-02-12 2013-02-12 Mobile terminal and screen restoring method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130014983A KR20140101616A (en) 2013-02-12 2013-02-12 Mobile terminal and screen restoring method thereof

Publications (1)

Publication Number Publication Date
KR20140101616A true KR20140101616A (en) 2014-08-20

Family

ID=51746910

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130014983A KR20140101616A (en) 2013-02-12 2013-02-12 Mobile terminal and screen restoring method thereof

Country Status (1)

Country Link
KR (1) KR20140101616A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101525025B1 (en) * 2014-12-10 2015-06-03 유흥권 Live capturing method in smartphone
WO2020060241A1 (en) * 2018-09-21 2020-03-26 Samsung Electronics Co., Ltd. Electronic device and method for capturing multimedia content

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101525025B1 (en) * 2014-12-10 2015-06-03 유흥권 Live capturing method in smartphone
WO2020060241A1 (en) * 2018-09-21 2020-03-26 Samsung Electronics Co., Ltd. Electronic device and method for capturing multimedia content
US10908701B2 (en) 2018-09-21 2021-02-02 Samsung Electronics Co., Ltd. Electronic device and method for capturing multimedia content

Similar Documents

Publication Publication Date Title
US11095808B2 (en) Terminal and method for controlling the same
KR102080746B1 (en) Mobile terminal and control method thereof
KR102099358B1 (en) Mobile terminal and control method thereof
KR102043148B1 (en) Mobile terminal and touch coordinate predicting method thereof
KR20180020386A (en) Mobile terminal and operating method thereof
US10719197B2 (en) Mobile terminal extracting contents with a calendar for generating and displaying an electronic note and method thereof
KR20150056353A (en) The mobile terminal and the control method thereof
KR20140128724A (en) Mobile terminal and control method thereof
KR20150048529A (en) Method for generating receipe information in a mobile terminal
KR20150047032A (en) Mobile terminal and method for controlling the same
KR102045893B1 (en) Mobile terminal and control method thereof
KR101763227B1 (en) Mobile terminal and method for controlling the same
KR20140133081A (en) Mobile terminal and sharing contents displaying method thereof
KR20150059473A (en) Mobile terminal and multi-device interworking method using finger print thereof
KR102106873B1 (en) Mobile terminal and method of controlling the same
KR20150055446A (en) Mobile terminal and control method thereof
KR20140101616A (en) Mobile terminal and screen restoring method thereof
KR20140122559A (en) Mobile terminal and control method thereof
KR102025774B1 (en) Mobile terminal having operation control function via drawing input and operation control method thereof
KR20150017955A (en) Control apparatus of mobile terminal and method thereof
KR20150008971A (en) Mobile terminal and memo management method thereof
KR102135363B1 (en) Mobile terminal and control method thereof
KR20150015801A (en) Mobile terminal and control method thereof
KR20140133078A (en) Control apparatus of mobile terminal and method thereof
KR20150052649A (en) Mobile terminal having screen control function using hovering and control method thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination