WO2014208783A1 - Mobile terminal and method for controlling mobile terminal - Google Patents
Mobile terminal and method for controlling mobile terminal
- Publication number
- WO2014208783A1 (PCT/KR2013/005596)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- mobile terminal
- line
- touch screen
- controller
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/235—Image preprocessing by selection of a specific region containing or referencing a pattern, based on user input or interaction
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/28—Character recognition specially adapted to the type of the alphabet, e.g. Latin alphabet
- G06V30/287—Character recognition specially adapted to the type of the alphabet, e.g. of Kanji, Hiragana or Katakana characters
Definitions
- the present invention relates to a mobile terminal and a method of controlling the mobile terminal for recognizing a character using an optical character recognition technology and recommending an application.
- a mobile terminal such as a smart phone provides not only a voice call function but also various multimedia services such as data communication, camera, DMB, video playback, and text message service.
- the camera of the mobile terminal can be used to acquire images including text, such as business cards, magazines, logos, signs, and prints, and text can be extracted from the acquired images using optical character recognition (OCR) technology.
- the present invention relates to a mobile terminal and a method of controlling a mobile terminal that designate a specific region or specific characters of an image, extract only the characters intersecting the designated region using optical character recognition technology, and recommend an application related to the extracted characters.
- a mobile terminal for achieving the above object may include a touch screen that displays an image including characters, and a controller that receives a continuous touch input through the touch screen, displays a line over the area of the image corresponding to the continuous touch input, and, upon receiving a specific input through the touch screen, extracts characters at least partially intersecting the line and displays the extracted characters in a pop-up window.
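The intersection test described above can be illustrated with a small sketch. This is not the patent's implementation: the character bounding boxes (as a generic OCR engine might report them) and the drawn line are modeled as plain coordinates, and a character counts as extracted when any segment of the touch polyline passes through its box, here via Liang-Barsky segment clipping.

```python
def segment_intersects_box(p, q, box):
    """Liang-Barsky clipping: does segment p-q pass through an axis-aligned box?"""
    x0, y0, x1, y1 = box
    dx, dy = q[0] - p[0], q[1] - p[1]
    t0, t1 = 0.0, 1.0
    for d, lo, hi in ((dx, x0 - p[0], x1 - p[0]), (dy, y0 - p[1], y1 - p[1])):
        if d == 0:
            if lo > 0 or hi < 0:   # segment parallel to and outside this slab
                return False
        else:
            ta, tb = sorted((lo / d, hi / d))
            t0, t1 = max(t0, ta), min(t1, tb)
            if t0 > t1:
                return False
    return True

def extract_intersecting(chars, stroke):
    """chars: list of (character, bounding box); stroke: touch points of the line.
    Returns, in order, the characters whose boxes the stroke crosses."""
    return "".join(
        ch for ch, box in chars
        if any(segment_intersects_box(stroke[i], stroke[i + 1], box)
               for i in range(len(stroke) - 1)))

# hypothetical OCR output: six characters in 8x10 boxes spaced 10 px apart
chars = [(c, (i * 10, 0, i * 10 + 8, 10)) for i, c in enumerate("ABCDEF")]
print(extract_intersecting(chars, [(0, 5), (28, 5)]))  # a line drawn through A-C
```

Only characters whose boxes the stroke actually crosses are returned, which mirrors the claim that only characters intersecting the user-designated region are extracted.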
- the controller may display the line in real time according to the continuous touch input.
- the controller may display at least one new line on an area of the image corresponding to the at least one new continuous touch input.
- when the controller receives a specific input, the controller may extract the characters at least partially crossing each line, display them in at least one pop-up window, and display in each pop-up window at least one icon representing an application related to the content of the displayed text.
- the controller may also display at least one icon representing a related application on a popup window according to the extracted text.
- when the controller receives an input for one of the at least one icon, the controller may execute the application corresponding to the icon, separate the characters displayed in the pop-up window according to their content, and display them in at least one input item of the application execution screen.
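Separating the extracted text by content can be sketched as follows. The field names and regular expressions here are illustrative assumptions, not the patent's method; a real terminal would presumably use locale-aware recognizers to route each piece of text to an input item such as a contact's phone number or e-mail address.

```python
import re

# Hypothetical field patterns; illustrative only, not locale-aware.
FIELD_PATTERNS = [
    ("email", re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")),
    ("phone", re.compile(r"\+?\d[\d\s-]{7,}\d")),
    ("url",   re.compile(r"https?://\S+|www\.\S+")),
]

def split_into_fields(text):
    """Assign pieces of OCR-extracted text to input items of a target
    application (e.g. a contacts app) according to their content."""
    fields, rest = {}, text
    for name, pat in FIELD_PATTERNS:
        m = pat.search(rest)
        if m:
            fields[name] = m.group().strip()
            rest = rest.replace(m.group(), " ", 1)  # consume the matched piece
    leftover = " ".join(rest.split())
    if leftover:
        fields["name"] = leftover  # whatever remains, e.g. the person's name
    return fields
```

With text extracted from a business-card image, the result maps each recognized piece to a separate input item of the application execution screen.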
- when the controller receives an input of long-touching an area of the line, the controller may enter a mode for editing the line and, depending on the position of the long-touched point, display an icon for adjusting the length of the line or an icon for moving the line.
- the controller may enlarge the displayed image when receiving a specific input through the touch screen, and may fix the enlarged image when receiving an input of long-touching a region of the enlarged image.
- when the controller receives a touch input in the text area displayed in the pop-up window, the controller may enter a mode for editing the text.
- the controller may delete the displayed line and the popup window.
- the line may be displayed at a preset thickness.
- the controller may display the line in a transparent or translucent manner and in a preset color or contrast.
- the controller may set a virtual area in which the thickness of the line is enlarged in the vertical direction, and extract a character included in the virtual area.
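The virtual area can be pictured as the line's bounding box padded vertically. The geometry below is one plausible interpretation, an assumption for illustration rather than the patent's definition:

```python
def virtual_band(stroke, half_height):
    """Expand a drawn line vertically into a virtual rectangular band.
    stroke: touch points of the line; half_height: vertical padding in pixels."""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    return (min(xs), min(ys) - half_height, max(xs), max(ys) + half_height)

def chars_in_band(chars, band):
    """chars: list of (character, (x0, y0, x1, y1)); keep boxes overlapping the band."""
    bx0, by0, bx1, by1 = band
    return "".join(ch for ch, (x0, y0, x1, y1) in chars
                   if x0 <= bx1 and x1 >= bx0 and y0 <= by1 and y1 >= by0)
```

Thickening the line this way lets a stroke drawn slightly above or below a word still capture it, which is the stated purpose of the virtual area.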
- a mobile terminal for achieving the above object may include a touch screen, a camera, and a controller that displays an image captured by the camera on the touch screen as a preview screen, receives a continuous touch input through the touch screen, displays a line over the area of the image corresponding to the continuous touch input, and, upon receiving a specific input through the touch screen, extracts characters at least partially crossing the line and displays them in a pop-up window.
- a control method of a mobile terminal for achieving the above object may include displaying an image including characters on a touch screen; receiving a continuous touch input through the touch screen while the image is displayed; displaying a line over the area of the image corresponding to the received continuous touch input; receiving a specific input through the touch screen; extracting characters at least partially intersecting the line according to the specific input; and displaying the extracted characters in a pop-up window.
- a control method of a mobile terminal for achieving the above object may include driving a camera to capture an image including characters; displaying the captured image on a touch screen as a preview screen; receiving a continuous touch input through the touch screen while the image is displayed; displaying a line over the area of the image corresponding to the received continuous touch input; receiving a specific input through the touch screen; extracting characters at least partially crossing the line according to the specific input; and displaying the extracted characters in a pop-up window.
- the mobile terminal and the control method of the mobile terminal according to the present invention have the following effects.
- a specific region of an image displayed on the touch screen of the mobile terminal is directly selected by a continuous touch input, so that only characters intersecting with the specific region desired by the user may be extracted.
- according to a specific input received through the touch screen of the mobile terminal, the size of the specific area can be finely adjusted so that the beginning and end of the characters the user wants to extract intersect the specific area, thereby improving the character recognition rate.
- a plurality of specific areas may be selected in one image, and characters including at least a part of the separated plurality of specific areas may be extracted at a time.
- the text may be directly extracted from the image photographed by driving the camera.
- FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
- FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.
- FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.
- FIGS. 3 and 4 are flowcharts of a control method of a mobile terminal according to the first embodiment of the present invention.
- FIGS. 5A and 5B are diagrams for describing a method of displaying an image of a mobile terminal according to a first embodiment of the present invention.
- FIGS. 6A to 6C are diagrams for describing a method in which a mobile terminal according to a first embodiment of the present invention receives a continuous touch input and displays a line.
- FIG. 7 is a diagram for describing a method for receiving a specific input by a mobile terminal according to a first embodiment of the present invention.
- FIGS. 8A and 8B are diagrams for describing a method of extracting and displaying a character of a mobile terminal according to a first embodiment of the present invention.
- FIGS. 9A to 9C are diagrams for describing a method of inputting a text extracted as an input item of a related application of a mobile terminal according to a first embodiment of the present invention.
- FIG. 10 is a diagram for describing a method of controlling a mobile terminal according to a second embodiment of the present invention.
- FIGS. 11A and 11B are diagrams for describing a method of obtaining an image of a mobile terminal according to a second embodiment of the present invention.
- FIG. 12 is a view for explaining a method of displaying an image of a mobile terminal according to a second embodiment of the present invention.
- FIGS. 13A to 13C are diagrams for describing a method in which a mobile terminal according to a second embodiment of the present invention receives a continuous touch input and displays a line.
- FIG. 14 is a diagram for describing a method for receiving a specific input by a mobile terminal according to a second embodiment of the present invention.
- FIG. 15 is a diagram for describing a method of extracting and displaying a character of a mobile terminal according to a second embodiment of the present invention.
- FIGS. 16A and 16B are diagrams for describing a method of inputting a text extracted as an input item of a related application of a mobile terminal according to a second embodiment of the present invention.
- the mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), navigation, and the like.
- FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
- the mobile terminal 100 may include a wireless communication unit 110, an A / V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory unit 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
- the components shown in FIG. 1 are not essential, so that a mobile terminal having more or fewer components may be implemented.
- the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
- the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.
- the broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
- the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
- the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
- the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
- the broadcast related information may exist in various forms. For example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
- the broadcast receiving module 111 receives broadcast signals using various broadcasting systems; in particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T).
- the broadcast receiving module 111 may be configured to be suitable for not only the above-described digital broadcast system but also other broadcast system for providing a broadcast signal.
- the broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory unit 160.
- the mobile communication module 112 transmits and receives a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
- the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.
- the wireless internet module 113 refers to a module for wireless internet access, and the wireless internet module 113 may be internal or external to the mobile terminal 100.
- Wireless internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
- the short range communication module 114 refers to a module for short range communication.
- Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used as short range communication technologies.
- the location information module 115 is a module for checking or obtaining the location of the mobile terminal.
- a representative example of the location information module is a Global Positioning System (GPS) module.
- the GPS module 115 calculates information about the distances from three or more satellites to one point (object) and the time at which the distance information was measured, and then applies trigonometry to the calculated distance information to obtain three-dimensional position information of the point (object) in latitude, longitude, and altitude at that time.
- a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using another satellite is also used.
- the GPS module 115 may continuously calculate the current position in real time and use the same to calculate speed information.
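The position and speed calculations described above can be illustrated in two dimensions. This is a textbook trilateration sketch, not the GPS module's actual algorithm: subtracting pairs of circle equations cancels the quadratic terms and leaves a linear system in the unknown position, and speed follows from consecutive position fixes.

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """2-D position from distances r1..r3 to three known points p1..p3."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # subtracting circle equations pairwise yields two linear equations
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # non-zero when the three points are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def speed(p_prev, p_now, dt):
    """Speed estimated from two consecutive position fixes dt seconds apart."""
    return math.hypot(p_now[0] - p_prev[0], p_now[1] - p_prev[1]) / dt
```

Real GPS solves the analogous 3-D system with a fourth satellite to eliminate receiver clock error, as the passage notes.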
- the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122.
- the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
- the processed image frame may be displayed on the display unit 151.
- the image frame processed by the camera 121 may be stored in the memory unit 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration aspect of the terminal.
- the microphone 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data.
- the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode.
- the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
- the user input unit 130 generates input data for the user to control the operation of the terminal.
- the user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.
- the sensing unit 140 detects the current state of the mobile terminal 100, such as the open / closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal, and the acceleration / deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, it may sense whether the slide phone is opened or closed. It may also be responsible for sensing functions related to whether the power supply unit 190 is supplied with power, whether the interface unit 170 is coupled to an external device, and the like. Meanwhile, the sensing unit 140 may include an attitude sensor 141 and / or a proximity sensor.
- the output unit 150 is used to generate output related to sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.
- the display unit 151 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the mobile terminal displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, and a GUI.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
- Some of these displays can be configured to be transparent or light transmissive so that they can be seen from the outside. This may be referred to as a transparent display.
- a representative example of the transparent display is a transparent LCD.
- the rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.
- a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.
- when the display unit 151 and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, abbreviated as "touch screen"), the display unit 151 can also be used as an input device in addition to an output device.
- the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
- the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal.
- the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
- when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 has been touched.
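How a touched area might be located from raw sensor readings can be sketched as follows. The grid-of-capacitances model and the threshold parameter are illustrative assumptions for this sketch, not the panel's real signal chain.

```python
def locate_touch(baseline, reading, threshold):
    """Find the grid cell whose capacitance changed most and report it as the
    touched position, mimicking how a change at a specific portion of the
    panel is converted into an electrical input signal.
    baseline, reading: 2-D lists of sensor values; threshold: minimum change."""
    best, best_delta = None, threshold
    for r, (brow, rrow) in enumerate(zip(baseline, reading)):
        for c, (b, v) in enumerate(zip(brow, rrow)):
            if abs(v - b) > best_delta:
                best, best_delta = (r, c), abs(v - b)
    return best  # None if no cell changed beyond the threshold
```

The reported cell index stands in for the data the touch controller would hand to the controller 180 to identify the touched area of the display.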
- a proximity sensor may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen.
- the proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using mechanical force by using electromagnetic force or infrared rays.
- Proximity sensors have a longer life and higher utilization than touch sensors.
- the proximity sensor examples include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- when the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
- the act of positioning the pointer over the touch screen so that it is recognized without contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen during the proximity touch.
- the proximity sensor detects a proximity touch and a proximity touch pattern (eg, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
- Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
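Deriving the listed proximity-touch pattern values from sampled hover readings might look like the sketch below. The (time, x, y, height) sample format is an assumption for illustration, not a real sensor interface.

```python
import math

def proximity_pattern(samples):
    """samples: list of (t, x, y, z) hover readings, where z is the pointer's
    height above the screen. Summarises the proximity-touch pattern."""
    (t0, x0, y0, z0), (t1, x1, y1, z1) = samples[0], samples[-1]
    dt = t1 - t0
    return {
        "distance": z1,                                          # current hover height
        "direction": math.degrees(math.atan2(y1 - y0, x1 - x0)), # movement heading
        "speed": math.hypot(x1 - x0, y1 - y0) / dt if dt else 0.0,
        "duration": dt,                                          # proximity touch time
        "position": (x1, y1),                                    # proximity touch position
    }
```

A controller could then output information corresponding to such a pattern on the touch screen, as the passage describes.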
- the sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory unit 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output module 152 outputs a sound signal related to a function (for example, a call signal reception sound and a message reception sound) performed in the mobile terminal 100.
- the sound output module 152 may include a receiver, a speaker, a buzzer, and the like.
- the alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input.
- the alarm unit 153 may output a signal for notifying occurrence of an event in a form other than a video signal or an audio signal, for example, vibration.
- the video signal or the audio signal may also be output through the display unit 151 or the audio output module 152.
- the haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154.
- the intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.
- in addition to vibration, the haptic module 154 may generate various tactile effects, such as stimulation by an arrangement of pins moving vertically against the contacted skin surface, the spraying or suction force of air through a nozzle or inlet, grazing of the skin surface, contact of an electrode, electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
- the haptic module 154 may not only deliver the haptic effect through direct contact, but also implement the haptic effect through the muscle sense of the user's finger or arm. Two or more haptic modules 154 may be provided according to a configuration aspect of the mobile terminal 100.
- the memory unit 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
- the memory unit 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
- the memory unit 160 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, and optical disk.
- the mobile terminal 100 may operate in association with a web storage that performs a storage function of the memory unit 160 on the Internet.
- the interface unit 170 serves as a path with all external devices connected to the mobile terminal 100.
- the interface unit 170 receives data from an external device or receives power and transmits the data to each component inside the mobile terminal 100 or transmits the data inside the mobile terminal 100 to an external device.
- the interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting devices with identification modules, audio input/output (I/O) ports, video input/output (I/O) ports, earphone ports, and the like.
- the identification module is a chip that stores various types of information for authenticating the usage rights of the mobile terminal 100, and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
- a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.
- the interface unit may serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or through which various command signals input from the cradle by a user are delivered to the mobile terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.
- the controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communication, video calls, and the like.
- the controller 180 may include a multimedia module 181 for playing multimedia.
- the multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
- the controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.
- the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
- Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
- the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electrical units for performing the functions. In some cases, such embodiments may be implemented by the controller 180.
- embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed.
- the software code may be implemented by a software application written in a suitable programming language.
- the software code may be stored in the memory unit 160 and executed by the controller 180.
- FIG. 2A is a front perspective view of an example of a mobile terminal or a portable terminal according to the present invention.
- the disclosed portable terminal 100 has a terminal body in the form of a bar.
- the present invention is not limited thereto and may be applied to various structures, such as a slide type, folder type, swing type, or swivel type, in which two or more bodies are coupled so as to be movable relative to each other.
- the body includes a case (casing, housing, cover, etc.) that forms the exterior.
- the case may be divided into a front case 101 and a rear case 102.
- Various electronic components are built in the space formed between the front case 101 and the rear case 102.
- At least one intermediate case may be further disposed between the front case 101 and the rear case 102.
- the cases may be formed by injection molding of synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).
- the display unit 151, the audio output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, and the interface 170 may be disposed in the terminal body, mainly in the front case 101.
- the display unit 151 occupies most of the main surface of the front case 101.
- the sound output unit 152 and the camera 121 are disposed in regions adjacent to one end of both ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in regions adjacent to the other end.
- the user input unit 132, the interface 170, and the like are disposed on side surfaces of the front case 101 and the rear case 102.
- the user input unit 130 is manipulated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132.
- the manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and may be employed in any manner as long as they are operated in a tactile manner.
- the contents input by the manipulation units 131 and 132 may be variously set.
- the first manipulation unit 131 receives commands such as start, end, and scroll, and the second manipulation unit 132 receives commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.
- FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
- a camera 121 ′ may be additionally mounted on the rear of the terminal body, that is, the rear case 102.
- the camera 121 ′ has a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may be a camera having different pixels from the camera 121.
- the camera 121 preferably has a low pixel count, since it often photographs the user's face and transmits the image to a counterpart during a video call, whereas the camera 121' preferably has a high pixel count, since it often photographs a general subject that is not transmitted immediately.
- the cameras 121 and 121 ' may be installed in the terminal body to be rotatable or popup.
- a flash 123 and a mirror 124 are further disposed adjacent to the camera 121 '.
- the flash 123 shines light toward the subject when the subject is photographed by the camera 121 '.
- the mirror 124 allows the user to see his / her own face or the like when photographing (self-photographing) the user using the camera 121 '.
- the sound output unit 152 ' may be further disposed on the rear surface of the terminal body.
- the sound output unit 152 ′ may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.
- the antenna 124 for receiving a broadcast signal may be additionally disposed on the side of the terminal body.
- the antenna 124 constituting a part of the broadcast receiving module 111 (refer to FIG. 1) may be installed to be pulled out of the terminal body.
- the terminal body is equipped with a power supply unit 190 for supplying power to the portable terminal 100.
- the power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.
- the rear case 102 may be further equipped with a touch pad 135 for sensing a touch.
- the touch pad 135 may also be configured to have a light transmission type.
- if the display unit 151 is configured to output visual information from both sides, the visual information may also be recognized through the touch pad 135.
- the information output on both surfaces may be controlled by the touch pad 135.
- a display may be additionally mounted on the touch pad 135, and a touch screen may be disposed on the rear case 102.
- the touch pad 135 operates in association with the display unit 151 of the front case 101.
- the touch pad 135 may be disposed in parallel to the rear of the display unit 151.
- the touch pad 135 may have the same size as or a smaller size than the display unit 151.
- the controller (180 of FIG. 1) of the mobile terminal according to the present invention may display a line on an area of an image corresponding to a continuous touch input when the continuous touch input is received while the image is displayed on the touch screen 151.
- controller (180 of FIG. 1) may extract, by an optical character recognition method, a character at least partially crossing the line when receiving a specific input through the touch screen while the line is displayed.
- the controller (180 of FIG. 1) may extract the character by reading pixel information of the area covered by the line and comparing it with feature points of stored characters while gradually expanding the thickness of the line.
- the controller 180 of FIG. 1 may display the extracted text together with the related application as a popup window, and directly input the extracted text as an input item of the executed application by executing the related application.
- a user can capture a screen displaying a character or take an image, extract only a desired character from the image, and use it directly in a related application. Therefore, the user can store or retrieve specific information without directly inputting a character using a keypad.
- since the optical character recognition method is applied only to the part of the image around the area where the line is displayed, the processing time may be shorter than when performing optical character recognition on the entire image.
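The region-limited recognition described above can be sketched as follows; a minimal illustration assuming the image is a 2D pixel array and `run_ocr` is a hypothetical stand-in for whatever OCR engine is used:

```python
# Sketch of region-limited recognition: crop the image to a band around the
# drawn line before running OCR, instead of scanning the whole image.

def crop_to_line(image, line_points, pad=10):
    """Return the sub-image bounding the drawn line, expanded by `pad` pixels.

    `image` is a list of pixel rows; `line_points` is the list of (x, y)
    touch coordinates that make up the line.
    """
    xs = [x for x, _ in line_points]
    ys = [y for _, y in line_points]
    top = max(min(ys) - pad, 0)
    bottom = min(max(ys) + pad, len(image) - 1)
    left = max(min(xs) - pad, 0)
    right = min(max(xs) + pad, len(image[0]) - 1)
    return [row[left:right + 1] for row in image[top:bottom + 1]]

def extract_text_near_line(image, line_points, run_ocr):
    # OCR only the cropped band, which is cheaper than the full image.
    return run_ocr(crop_to_line(image, line_points))
```

Because the recognizer only sees the cropped band, the cost scales with the size of the marked region rather than the full screen capture.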
- FIGS. 5A to 9C are diagrams for describing a method for controlling a mobile terminal according to the first embodiment of the present invention.
- FIG. 3 is a flowchart of a control method of a mobile terminal for extracting characters from an image displayed on a touch screen according to a first embodiment of the present invention.
- the controller (180 of FIG. 1) may display an image including a character on a touch screen (S110).
- the image may include a photo stored in the memory of the mobile terminal, a capture screen of a web page displayed on the touch screen of the mobile terminal, a preview screen displayed after the camera is driven, and the like.
- the controller (180 of FIG. 1) may receive a continuous touch input through a touch screen on which an image is displayed (S120).
- the continuous touch input refers to an input for continuously touching adjacent points without removing a finger, such as a drag input, a flicking input, or a sliding input.
- when a user drags an area displaying a character to be extracted from the image content displayed on the touch screen, the controller (180 of FIG. 1) receives a continuous touch input.
- the controller (180 of FIG. 1) may display a line on an area of the image corresponding to the received touch input (S130). That is, whenever a touch input for a point is received following a touch input for an adjacent point, the controller (180 of FIG. 1) draws the line up to the previously touched point, so that the line is displayed in real time as the continuous touch input progresses.
- the controller (180 of FIG. 1) may display the line with a predetermined thickness, or display it thicker or thinner than a set reference value in proportion to the area of the received touch input or the amount of capacitance change. The line may be displayed on a layer different from the image, overlapped so as to be transparent or translucent and to have a specific color or contrast. Thus, the user can see the image content displayed on the lower layer through the line.
- the controller (180 of FIG. 1) displays one line on the corresponding area of the image when receiving a continuous touch input in a specific area of the touch screen, and, when receiving at least one new continuous touch input, displays at least one new line while maintaining the previously displayed line. Accordingly, the controller (180 of FIG. 1) may display a plurality of separate lines overlapping the image displayed on the touch screen.
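The behavior described above, where previously drawn lines are kept while new continuous touch inputs add further lines, can be sketched as a simple stroke store. This is an illustrative assumption about how a controller might track strokes, not the patent's actual implementation:

```python
# Minimal sketch: each continuous touch becomes one line, and earlier
# lines remain until extraction is triggered.

class LineOverlay:
    def __init__(self):
        self.lines = []        # finished strokes, each a list of (x, y)
        self.current = None    # stroke in progress

    def touch_down(self, x, y):
        self.current = [(x, y)]

    def touch_move(self, x, y):
        # Each new adjacent point extends the line in real time.
        self.current.append((x, y))

    def touch_up(self):
        # The finished stroke is kept alongside previously drawn lines.
        self.lines.append(self.current)
        self.current = None
```

On a real platform these three methods would be driven by the touch-event stream (down, move, up) delivered by the windowing system.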
- the controller (180 of FIG. 1) may receive a specific input through a touch screen on which at least one line overlaps the image (S140).
- the specific input may be a touch input of a specific softkey that may input a control signal for extracting a character of an image intersecting at least one line, or an operation input of a function key or an operation key.
- the controller (180 of FIG. 1) may extract a character that at least partially intersects the at least one displayed line (S150).
- the controller (180 of FIG. 1) may extract the intersecting characters by searching for characters covered by the line with an optical character recognition method while gradually expanding the thickness of the line in the vertical direction, based on the coordinates of the line displayed on the touch screen.
- the controller (180 of FIG. 1) scans the image region overlapping the displayed line and extracts characters by recognizing the scanned content; if a character cannot be recognized from the image region overlapping the displayed line alone, it repeats the same process while gradually expanding the thickness of the line in the vertical direction.
- the controller (180 of FIG. 1) may gradually increase the thickness of the line in the vertical direction until a margin is detected.
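The vertical expansion just described can be sketched as follows; a hedged illustration assuming the image is a 2D list in which nonzero pixels are ink, with the expansion stopping when a blank (margin) row is reached on each side:

```python
# Starting from the row the line was drawn on, grow the band up and down
# one row at a time and stop when a blank (margin) row is reached.

def blank_row(image, y):
    # A row is a margin if it contains no ink pixels.
    return all(p == 0 for p in image[y])

def expand_to_margins(image, line_row):
    top = bottom = line_row
    while top > 0 and not blank_row(image, top - 1):
        top -= 1
    while bottom < len(image) - 1 and not blank_row(image, bottom + 1):
        bottom += 1
    return top, bottom   # the band of rows to feed to the recognizer
```

The returned band bounds the full height of the text line that the drawn stroke touched, even when the stroke itself only crossed part of the glyphs.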
- the controller (180 of FIG. 1) may display the extracted text in a popup window and provide the extracted text (S160).
- the controller (180 of FIG. 1) may switch to a mode in which the text displayed in the popup window can be edited. Therefore, the user can correct any wrongly recognized characters among the extracted characters, or add other characters to them, and then store, copy, share, or transmit the result.
- the controller (180 of FIG. 1) may display the extracted characters for each line by dividing the lines in one pop-up window, or display each of them as a separate pop-up window.
- the controller (180 of FIG. 1) may change the shape of the popup window so as to show a connection relationship with the displayed line.
- FIG. 4 is a flowchart illustrating a control method of a mobile terminal for recommending a related application according to an extracted text associated with a first embodiment of the present invention.
- the displaying of the extracted text in the popup window may include searching, by the controller (180 of FIG. 1), for a related application according to the content of the extracted text (S161), and displaying an icon representing the related application along with the text in the popup window (S162).
- the controller (180 of FIG. 1) recommends applications such as schedule, search, e-mail, or memo as related applications when the content of the text indicates a date, and recommends applications such as map, schedule, memo, or transportation when it indicates a place.
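The recommendation rule above can be sketched as a lookup from a text category to a list of related applications. The category test below is a rough illustrative heuristic (real date and place detection would be more robust), and the application names are taken from the examples in the text:

```python
import re

# Mapping from text category to related applications, per the rule above.
RELATED_APPS = {
    "date":  ["schedule", "search", "email", "memo"],
    "place": ["map", "schedule", "memo", "transportation"],
}

def classify(text):
    # Very rough heuristics, for illustration only: treat numeric dates
    # or clock times as "date", and everything else as "place".
    if re.search(r"\d{4}[./-]\s?\d{1,2}[./-]\s?\d{1,2}", text) or "pm" in text.lower():
        return "date"
    return "place"

def recommend(text):
    return RELATED_APPS[classify(text)]
```

With this rule, the extracted string '2 pm on March 29, 2013 (Friday)' would yield schedule-oriented applications, while 'LG Electronics R&D Research Institute' would yield place-oriented ones.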
- the controller (180 of FIG. 1) receives an input for one of the related applications displayed in the popup window (S170), executes the related application corresponding to the icon (S180), and inputs the characters displayed in the popup window into the input items of the application execution screen.
- the controller (180 of FIG. 1) may separate the text according to its content (S190) and display the separated text in the input items of the related application execution screen (S200).
- 5A and 5B are diagrams for describing a method of displaying an image of a mobile terminal according to a first embodiment of the present invention.
- the controller (180 of FIG. 1) may display a plurality of icons a1 to a5 in the first area R1 of the touch screen 151, and may receive a touch input for the icon a3 representing a gallery application among the plurality of icons.
- the controller (180 of FIG. 1) switches to the gallery application when receiving a touch input for the gallery icon, and when receiving a touch input for a specific photo ('Photo 10') in the photo gallery, displays the image of 'Photo 10' in the second region R2 of the touch screen 151 (see FIG. 6A).
- the image in 'Photo 10' is a captured and stored image of an e-mail.
- 6A to 6C are diagrams for describing a method in which a mobile terminal according to a first embodiment of the present invention receives a continuous touch input and displays a line.
- the controller (180 of FIG. 1) receives a touch input for the touch pen icon a4 among the icons displayed in the first area R1 of the touch screen 151 and, referring to FIG. 6B, may receive a continuous touch input on a specific area of the e-mail image displayed in the second area, '2 pm on March 29, 2013 (Friday)'.
- when continuous touch inputs are received in the second area R2 of the touch screen 151, the controller (180 of FIG. 1) may display lines l1 and l2 on '2 pm on March 29, 2013 (Friday)' and 'LG Electronics R&D Research Institute', respectively. In this case, each line may be displayed in real time as the continuous touch input is received.
- FIG. 7 is a diagram for describing a method for receiving a specific input by a mobile terminal according to a first embodiment of the present invention.
- when the controller (180 of FIG. 1) receives a touch input for the execution icon a5 displayed in the first area R1 of the touch screen 151 while the two lines l1 and l2 are displayed, it extracts the characters intersecting the lines and searches for related applications.
- the execution icon a5 is an icon for inputting a control signal commanding text extraction and related application search.
- FIGS. 8A and 8B are diagrams for describing a method of extracting and displaying a character of a mobile terminal according to a first embodiment of the present invention.
- the controller (180 of FIG. 1) extracts the text '2 pm on March 29, 2013 (Friday)' intersecting line l1 and the text 'LG Electronics R&D Research Institute' intersecting line l2, and may display them together in the popup window W1.
- icons b1 to b4 representing applications related to the contents of the characters extracted from the two lines l1 and l2 may be displayed together in the popup window W1.
- the controller (180 of FIG. 1) extracts the text '2 pm on March 29, 2013 (Friday)' crossing line l1 and 'LG Electronics R&D Research Institute' crossing line l2, and may display each of them in a separate popup window.
- the characters extracted from line l1 and the icons b1 to b4 representing the related applications are displayed in the first popup window W1, and the characters extracted from line l2 and the icons b1' to b4' representing the related applications may be displayed in the second popup window W2.
- the related applications b1 to b4 and b1 'to b4' may be recommended in the order of the most frequently used applications regardless of the content of the text, or a preprogrammed application may be recommended.
- 9A to 9C are diagrams for describing a method of inputting a text extracted as an input item of a related application of a mobile terminal according to a first embodiment of the present invention.
- when the controller (180 of FIG. 1) receives a touch input on any one of the icons b1 to b4 representing the related applications displayed in the popup window W1, as illustrated in FIG. 9B, the screen of the touch screen 151 may be switched to the application execution screen corresponding to the icon b1 for which the touch input was received.
- the controller (180 of FIG. 1) may automatically display, in the input items A2 and A3 among the input items A1 to A3 of the application execution screen, the characters corresponding to the contents of the characters displayed in the popup window W1 of FIG. 9A.
- when the text content indicates a date and a place, the text content may be separated and displayed in the corresponding input items.
- the controller may display the selected application execution screen with the input items filled in, as shown in FIG. 9C.
- FIG. 10 is a diagram for describing a method of controlling a mobile terminal according to a second embodiment of the present invention.
- the second embodiment differs from the first embodiment in that the camera is driven to photograph an image and the photographed image is used as a preview; since all other configurations are the same as in the first embodiment, only the differing configurations will be described.
- the controller (180 of FIG. 1) may drive a camera to capture an image (S210), and display the captured image on a touch screen (S220).
- the controller (180 of FIG. 1) may extract characters by an optical character recognition method using the image displayed as a preview screen on the camera application execution screen, without the image being stored in the gallery application.
- 11A and 11B are diagrams for describing a method of obtaining an image of a mobile terminal according to a second embodiment of the present invention.
- the controller (180 of FIG. 1) may display a plurality of icons a1 to a5 in the first area R1 of the touch screen 151, and may receive a touch input for the icon a1 representing a camera application among the plurality of icons a1 to a5.
- when receiving a touch input for the camera icon a1, the controller (180 of FIG. 1) may switch to the camera application and photograph a business card image.
- when the controller (180 of FIG. 1) receives a touch input for the photographing icon c1 while the camera application is executed, it may display the photographed business card image in the second area R2 of the touch screen 151 (see FIG. 12).
- FIG. 12 is a view for explaining a method of displaying an image of a mobile terminal according to a second embodiment of the present invention.
- the controller 180 of FIG. 1 may display a preview screen of a business card image photographed on the second area R2 of the touch screen 151. That is, the business card image is an image displayed on the preview screen after shooting, and is not stored in the memory of the mobile terminal.
- FIGS. 13A to 13C are diagrams for describing a method in which a mobile terminal according to a second embodiment of the present invention receives a continuous touch input and displays a line.
- the controller (180 of FIG. 1) receives a touch input for the touch pen icon a4 among the icons displayed in the first area R1 of the touch screen 151, and then, as shown in FIG. 13B, may receive a continuous touch input on each of the specific areas of the business card image displayed in the second area: 'Kim o', 'Yangjae-dong, Seocho-gu, Seoul, 137-130', '010-1234-5678', and 'abcd@lge.com'.
- when continuous touch inputs are received in the second area R2 of the touch screen 151, the controller (180 of FIG. 1) may display lines l1, l2, l3, and l4 on 'Kim o', 'Yangjae-dong, Seocho-gu, Seoul, 137-130', '010-1234-5678', and 'abcd@lge.com', respectively. In this case, each line may be displayed in real time as the continuous touch input is received.
- FIG. 14 is a diagram for describing a method for receiving a specific input by a mobile terminal according to a second embodiment of the present invention.
- when the controller (180 of FIG. 1) receives a touch input for the execution icon a5 displayed in the first area R1 of the touch screen 151 while the four lines l1 to l4 are displayed, it may switch to a mode of extracting the characters intersecting the lines and searching for related applications.
- FIG. 15 is a diagram for describing a method of extracting and displaying a character of a mobile terminal according to a second embodiment of the present invention.
- the controller (180 of FIG. 1) may extract the characters intersecting each line, search for related applications, and display them in the popup window W1.
- each extracted character string may be displayed in a separate paragraph.
- the controller may display the characters extracted for each line and the icons representing the related applications in separate popup windows, in the same manner as in FIG. 8B; in this case, the shape of each popup window may be changed to show which line its contents were extracted from.
- 16A and 16B are diagrams for describing a method of inputting a text extracted as an input item of a related application of a mobile terminal according to a second embodiment of the present invention.
- when the controller (180 of FIG. 1) receives a touch input on any one of the icons b1 to b4 representing the related applications displayed in the popup window W1, as illustrated in FIG. 16B, the screen of the touch screen 151 may be switched to the application execution screen corresponding to the icon b1 for which the touch input was received.
- the controller (180 of FIG. 1) may automatically display, in the input items A1 to A4 of the application execution screen, the characters corresponding to the contents of the characters displayed in the popup window W1 of FIG. 16A.
- when the text content indicates a name, a mobile phone number, an e-mail address, or an address, the text content may be separated and displayed in the corresponding input items.
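The separation into input items described above can be sketched with simple illustrative patterns (assumptions, not the patent's actual rules) applied to the business-card strings used in this embodiment:

```python
import re

def separate_fields(lines):
    """Assign each extracted string to an input item: name, phone,
    email, or address. The tests are deliberately simple heuristics."""
    fields = {}
    for line in lines:
        if re.fullmatch(r"[\d-]+", line):
            fields["phone"] = line          # only digits and hyphens
        elif "@" in line:
            fields["email"] = line          # contains an at-sign
        elif any(ch.isdigit() for ch in line):
            fields["address"] = line        # e.g. contains a postal code
        else:
            fields["name"] = line           # plain text with no digits
    return fields
```

Applied to the four lined strings of FIG. 13B, this yields one value per contact input item, which the controller can then place into the corresponding fields of the application execution screen.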
- the method for controlling a mobile terminal according to the present invention described above may be recorded on a computer-readable recording medium and provided as a program to be executed on a computer.
- the control method of the mobile terminal according to the present invention can be executed through software.
- the constituent means of the present invention are code segments that perform the necessary work.
- the program or code segments may be stored in a processor-readable medium or transmitted as a computer data signal combined with a carrier wave over a transmission medium or a communication network.
- Computer-readable recording media include all kinds of recording devices that store data that can be read by a computer system. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, DVD-ROM, DVD-RAM, magnetic tape, floppy disks, hard disks, optical data storage devices, and the like.
- the computer readable recording medium can also be distributed over network coupled computer devices so that the computer readable code is stored and executed in a distributed fashion.
- the present invention can be applied to a mobile terminal including a recording medium recording an application program using an optical character recognition technology, a device for executing the application program, a smartphone, a PDA, a notebook computer, an IPTV, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (15)
- A mobile terminal comprising: a touch screen; and a controller configured to display an image including characters on the touch screen, receive a continuous touch input through the touch screen, display a line over a region of the image corresponding to the continuous touch input, and, upon receiving a specific input through the touch screen, extract characters at least partially intersecting the line and display the characters in a pop-up window.
- The mobile terminal of claim 1, wherein the controller displays the line in real time as the continuous touch input is received.
- The mobile terminal of claim 1, wherein, when at least one new continuous touch input is received through the touch screen before the specific input is received, the controller displays at least one new line over a region of the image corresponding to the at least one new continuous touch input.
- The mobile terminal of claim 3, wherein, upon receiving the specific input, the controller extracts the characters at least partially intersecting each line, displays them in at least one respective pop-up window, and displays at least one icon representing a related application, selected according to the content of the characters, in each of the at least one pop-up window.
- The mobile terminal of claim 1, wherein the controller further displays, in the pop-up window, at least one icon representing a related application selected according to the content of the extracted characters.
- The mobile terminal of claim 5, wherein, upon receiving an input on one of the at least one icon, the controller executes the application corresponding to the icon, divides the characters displayed in the pop-up window according to their content, and displays them in at least one input field of the application's execution screen.
- The mobile terminal of claim 1, wherein, upon receiving a long-touch input on a region of the line, the controller enters a line-editing mode and, depending on the position of the long-touched point, displays an icon for adjusting the length of the line or an icon for moving the line.
- The mobile terminal of claim 1, wherein the controller enlarges the displayed image upon receiving a specific input through the touch screen, and fixes the enlarged image upon receiving a long-touch input on a region of the enlarged image.
- The mobile terminal of claim 1, wherein the controller enters a character-editing mode upon receiving a touch input on the character region displayed in the pop-up window.
- The mobile terminal of claim 1, wherein the controller deletes the line and the pop-up window upon receiving an input on a reset icon displayed on the touch screen.
- The mobile terminal of claim 1, wherein the line is displayed with a preset thickness, and the controller displays the line transparently or semi-transparently in a preset color or shade.
- The mobile terminal of claim 1, wherein the controller sets a virtual region in which the thickness of the line is expanded in the vertical direction, and extracts the characters included in the virtual region.
- A mobile terminal comprising: a touch screen; a camera; and a controller configured to display an image captured through the camera on the touch screen as a preview screen, receive a continuous touch input through the touch screen, display a line over a region of the image corresponding to the continuous touch input, and, upon receiving a specific input through the touch screen, extract characters at least partially intersecting the line and display the characters in a pop-up window.
- A method for controlling a mobile terminal, the method comprising: displaying an image including characters on a touch screen; receiving a continuous touch input through the touch screen while the image is displayed; displaying a line over a region of the image corresponding to the received continuous touch input; receiving a specific input through the touch screen; and extracting, in response to the specific input, characters at least partially intersecting the line and displaying the extracted characters in a pop-up window.
- A method for controlling a mobile terminal, the method comprising: driving a camera to capture an image including characters; displaying the captured image on a touch screen as a preview screen; receiving a continuous touch input through the touch screen while the image is displayed; displaying a line over a region of the image corresponding to the received continuous touch input; receiving a specific input through the touch screen; and extracting, in response to the specific input, characters at least partially intersecting the line and displaying the extracted characters in a pop-up window.
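Claims 1 and 12 together describe the core extraction step: the drawn line's thickness is expanded vertically into a virtual region, and recognized characters whose bounding boxes at least partially intersect that region are collected and shown. A minimal sketch of that selection geometry, assuming an OCR engine has already produced word-level bounding boxes; the `WordBox` type, the coordinate convention, and the `scale` expansion factor are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class WordBox:
    """A recognized word and its bounding box in image coordinates (hypothetical OCR output)."""
    text: str
    x0: float
    y0: float
    x1: float
    y1: float

def virtual_region(x0, x1, y, thickness, scale):
    """Expand the stroke's thickness vertically into a virtual band (claim 12)."""
    half = thickness * scale / 2.0
    return (x0, y - half, x1, y + half)

def boxes_intersect(box, region):
    """Axis-aligned overlap test: true if the word box at least partially crosses the band."""
    rx0, ry0, rx1, ry1 = region
    return not (box.x1 < rx0 or box.x0 > rx1 or box.y1 < ry0 or box.y0 > ry1)

def extract_underlined_text(words, x0, x1, y, thickness=4.0, scale=3.0):
    """Collect words intersecting the drawn line's virtual region, ordered left to right."""
    region = virtual_region(x0, x1, y, thickness, scale)
    hits = sorted((w for w in words if boxes_intersect(w, region)), key=lambda w: w.x0)
    return " ".join(w.text for w in hits)
```

A real implementation would obtain the boxes from the terminal's OCR module and map touch coordinates into image space; only the geometry test is shown here. For example, a horizontal stroke drawn just under a line of text selects that line's words but not the line below it.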
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2013/005596 WO2014208783A1 (ko) | 2013-06-25 | 2013-06-25 | Mobile terminal and method for controlling mobile terminal |
US14/899,086 US10078444B2 (en) | 2013-06-25 | 2013-06-25 | Mobile terminal and method for controlling mobile terminal |
KR1020157033599A KR102135262B1 (ko) | 2013-06-25 | 2013-06-25 | Mobile terminal and method for controlling mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2013/005596 WO2014208783A1 (ko) | 2013-06-25 | 2013-06-25 | Mobile terminal and method for controlling mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014208783A1 (ko) | 2014-12-31 |
Family
ID=52142100
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/005596 WO2014208783A1 (ko) | 2013-06-25 | 2013-06-25 | Mobile terminal and method for controlling mobile terminal |
Country Status (3)
Country | Link |
---|---|
US (1) | US10078444B2 (ko) |
KR (1) | KR102135262B1 (ko) |
WO (1) | WO2014208783A1 (ko) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3113005A1 (en) * | 2015-07-02 | 2017-01-04 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
EP3133801A1 (en) * | 2015-08-20 | 2017-02-22 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170083207A1 (en) * | 2014-03-20 | 2017-03-23 | Nec Corporation | Information processing apparatus, information processing method, and information processing program |
KR102314274B1 (ko) | 2014-08-18 | 2021-10-20 | Samsung Electronics Co., Ltd. | Content processing method and electronic device therefor |
WO2017209564A1 (ko) * | 2016-06-02 | 2017-12-07 | Flunty Korea Co., Ltd. | Method for providing app list and apparatus therefor |
KR102567003B1 (ko) * | 2018-05-08 | 2023-08-16 | Samsung Electronics Co., Ltd. | Electronic device and operation method thereof |
CN111338540B (zh) * | 2020-02-11 | 2022-02-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Picture text processing method and apparatus, electronic device, and storage medium |
KR20220102302A (ko) * | 2021-01-13 | 2022-07-20 | Samsung Electronics Co., Ltd. | Method for providing clipboard function and electronic device supporting same |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060142054A1 (en) * | 2004-12-27 | 2006-06-29 | Kongqiao Wang | Mobile communications terminal and method therefor |
US20090182548A1 (en) * | 2008-01-16 | 2009-07-16 | Jan Scott Zwolinski | Handheld dictionary and translation apparatus |
US20090247219A1 (en) * | 2008-03-25 | 2009-10-01 | Jian-Liang Lin | Method of generating a function output from a photographed image and related mobile computing device |
KR20100056028A (ko) * | 2008-11-19 | 2010-05-27 | Samsung Electronics Co., Ltd. | Method and apparatus for extracting information using touch signal of mobile terminal |
US20120131520A1 (en) * | 2009-05-14 | 2012-05-24 | Tang ding-yuan | Gesture-based Text Identification and Selection in Images |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090108943A (ko) * | 2008-04-14 | 2009-10-19 | Electronics and Telecommunications Research Institute | Method and apparatus for extracting text from internet mail attachment file |
CN102890692A (zh) * | 2011-07-22 | 2013-01-23 | Alibaba Group Holding Ltd. | Web page information extraction method and extraction system |
KR102031142B1 (ko) * | 2013-07-12 | 2019-10-11 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling image display |
KR102216246B1 (ko) * | 2014-08-07 | 2021-02-17 | LG Electronics Inc. | Mobile terminal and control method thereof |
2013
- 2013-06-25 US US14/899,086 patent/US10078444B2/en active Active
- 2013-06-25 WO PCT/KR2013/005596 patent/WO2014208783A1/ko active Application Filing
- 2013-06-25 KR KR1020157033599A patent/KR102135262B1/ko active IP Right Grant
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3113005A1 (en) * | 2015-07-02 | 2017-01-04 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
CN106331311A (zh) * | 2015-07-02 | 2017-01-11 | LG Electronics Inc. | Mobile terminal and control method thereof |
EP3133801A1 (en) * | 2015-08-20 | 2017-02-22 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US10049094B2 (en) | 2015-08-20 | 2018-08-14 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
Also Published As
Publication number | Publication date |
---|---|
KR102135262B1 (ko) | 2020-07-17 |
US10078444B2 (en) | 2018-09-18 |
US20160196055A1 (en) | 2016-07-07 |
KR20160023661A (ko) | 2016-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014208783A1 (ko) | Mobile terminal and method for controlling mobile terminal | |
WO2015056844A1 (en) | Mobile terminal and control method thereof | |
WO2015190666A1 (en) | Mobile terminal and method for controlling the same | |
WO2016010221A1 (ko) | Mobile terminal and control method therefor | |
WO2012050248A1 (ko) | Mobile terminal and control method therefor | |
WO2015088123A1 (en) | Electronic device and method of controlling the same | |
WO2014137074A1 (en) | Mobile terminal and method of controlling the mobile terminal | |
WO2015137580A1 (en) | Mobile terminal | |
WO2016114444A1 (ko) | Mobile terminal and control method thereof | |
WO2018066782A1 (en) | Mobile terminal | |
WO2016076474A1 (ko) | Mobile terminal and control method therefor | |
WO2015088166A1 (ko) | Mobile terminal and method for operating rear input unit thereof | |
WO2012046891A1 (ko) | Mobile terminal, display device, and control method therefor | |
WO2017014394A1 (en) | Mobile terminal and control method for the same | |
WO2019160198A1 (ko) | Mobile terminal and control method therefor | |
WO2017175942A1 (en) | Mobile terminal and method for controlling the same | |
WO2015105257A1 (ko) | Mobile terminal and control method thereof | |
WO2016056723A1 (en) | Mobile terminal and controlling method thereof | |
WO2015178520A1 (ko) | Mobile terminal and control method therefor | |
WO2016039509A1 (ko) | Terminal and operating method therefor | |
WO2016129781A1 (ko) | Mobile terminal and control method therefor | |
WO2016200005A1 (en) | Mobile terminal and display operating method thereof | |
WO2018101508A1 (ko) | Mobile terminal | |
WO2016085139A1 (ko) | Mobile terminal and control method therefor | |
WO2015108287A1 (ko) | Mobile terminal | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13887773 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20157033599 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14899086 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13887773 Country of ref document: EP Kind code of ref document: A1 |