US8644881B2 - Mobile terminal and control method thereof - Google Patents

Mobile terminal and control method thereof Download PDF

Info

Publication number
US8644881B2
Authority
US
United States
Prior art keywords
preview image
information
camera
controller
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US12/797,505
Other versions
US20110111806A1 (en)
Inventor
Yoon-Ho Kim
Hye-Jin Oh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc
Publication of US20110111806A1
Assigned to LG ELECTRONICS INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KIM, YOON-HO; OH, HYE-JIN
Application granted
Publication of US8644881B2
Status: Expired - Fee Related

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40Circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/26Devices for calling a subscriber
    • H04M1/27Devices whereby a plurality of signals may be stored simultaneously
    • H04M1/274Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc
    • H04M1/2745Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc using static electronic memories, e.g. chips
    • H04M1/2753Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc using static electronic memories, e.g. chips providing data content
    • H04M1/2755Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc using static electronic memories, e.g. chips providing data content by optical scanning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00281Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
    • H04N1/00307Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a mobile telephone apparatus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/0044Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3204Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N2201/3205Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of identification information, e.g. name or ID code
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3204Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N2201/3207Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of an address
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3204Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N2201/3209Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of a telephone number
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3214Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3215Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration

Definitions

  • the present invention relates to a mobile terminal and corresponding method for recognizing character information in a preview image and performing a preset operation based on the recognized character information.
  • Mobile terminals now provide many additional services besides the basic call service. For example, users can now access the Internet, play games, watch videos, listen to music, capture images and videos, record audio files, etc. Mobile terminals also now provide broadcasting programs such that users can watch television shows, sporting events, videos, etc.
  • one object of the present invention is to provide a novel mobile terminal and corresponding method for capturing a preview image, recognizing information in the preview image, and performing a specific operation on the recognized information based on the selection of a predetermined button among a plurality of buttons on the mobile terminal.
  • the present invention provides in one aspect a method for controlling a mobile terminal, which includes receiving, via an input unit, a selection signal indicating a selection of a predetermined button among multiple predetermined buttons on the mobile terminal, in which the multiple predetermined buttons correspond to different preset functions executed on the mobile terminal; capturing, via a camera included on the mobile terminal, a preview image of an object upon receiving the selection signal; recognizing, via a controller included on the mobile terminal, a character string included in the captured preview image; and performing, via the controller, a preset function that uses the recognized character string and corresponds to the selection of the predetermined button.
  • the present invention provides a mobile terminal including an input unit configured to receive a selection signal indicating a selection of a predetermined button among multiple predetermined buttons on the mobile terminal, in which the multiple predetermined buttons correspond to different preset functions executed on the mobile terminal; a camera configured to capture a preview image of an object upon receiving the selection signal; and a controller configured to recognize a character string included in the captured preview image, and to perform a preset function that uses the recognized character string and corresponds to the selection of the predetermined button.
  • FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention
  • FIG. 2 is a flow chart illustrating a method for controlling a mobile terminal according to a first embodiment of the present invention
  • FIGS. 3A to 3D are overviews of display screens of a display unit according to the first embodiment of the present invention.
  • FIG. 4 is a flow chart illustrating a method for controlling a mobile terminal according to a second embodiment of the present invention
  • FIG. 5 is an overview of a display screen of a display unit according to the second embodiment of the present invention.
  • FIG. 6 is a flow chart illustrating a method for controlling a mobile terminal according to a third embodiment of the present invention.
  • FIG. 7 is an overview of a display screen of a display unit according to the third embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a mobile terminal 100 according to an embodiment of the present invention.
  • the mobile terminal 100 may be implemented in various forms.
  • the mobile terminal 100 may be a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP) terminal, a notebook computer, a WiBro terminal, an Internet protocol television (IPTV) terminal, a navigation terminal, an audio video navigation (AVN) terminal, an audio/video (A/V) system, and the like.
  • the mobile terminal 100 includes an antenna 110 , a call processing circuit 120 , a keypad 130 , a camera 140 , a display unit 150 , a speaker 160 , a microphone 170 , a storage unit 180 , and a controller 190 .
  • the mobile terminal 100 may be implemented with more elements than those illustrated in FIG. 1 or may be implemented with fewer elements than those illustrated in FIG. 1 .
  • the call processing circuit 120 codes a voice signal or a data signal with respect to an origination call according to a preset scheme, modulates the coded signal, and transmits the modulated signal to another terminal via the antenna 110 . Also, the call processing circuit 120 receives a signal transmitted from another terminal with respect to a reception call via the antenna 110 , demodulates the received signal, decodes the demodulated signal, and provides the decoded signal to the controller 190 . The function of the call processing circuit 120 can also be performed by the controller 190 .
  • the keypad 130 may include an array of actual physical keys or be configured as a touch type keypad implemented by software.
  • the camera 140 processes image frames such as a still image, video, and the like, obtained by an image sensor (e.g., a CCD sensor, a CMOS sensor, etc.) in a video call mode, an image capture mode, a video conference mode, and the like. Namely, the corresponding video data obtained by the image sensor is decoded according to a CODEC scheme to fit each standard.
  • the processed image frames can then be displayed on the display unit 150 under the control of the controller 190 .
  • the image frames processed by the camera 140 may be stored in the storage unit 180 .
  • the display unit 150 also displays an operational state of each element included in the mobile terminal 100 under the control of the controller 190 . Further, the display unit 150 displays executed results of application programs (e.g., a calculator program, a search program, and the like) stored in the storage unit 180 under the control of the controller 190 .
  • the display unit 150 may be configured to receive an input from the user by using a touch screen scheme.
  • the display unit 150 can display various contents such as various menu screen images by using a user interface and/or graphic user interface included in the storage unit 180 . Further, the contents displayed on the display unit 150 may include menu screen images including various text or image data (including map data or various information data) and data such as icons, a list menu, combo box, and the like.
  • the display unit 150 also displays image information captured by the camera 140 under the control of the controller 190 , and may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, a field emission display (FED), and a three-dimensional (3D) display.
  • the display unit 150 may include a haptic module that generates various tactile effects the user may feel.
  • a typical example of the tactile effects generated by the haptic module is vibration.
  • the strength, pattern, and the like of the vibration generated by the haptic module can also be controlled. For example, different vibrations may be combined to be output or sequentially output.
  • the haptic module can generate various other tactile effects, such as stimulation effects produced by a pin arrangement moving vertically against the contacted skin, a spray or suction force of air through a jet orifice or a suction opening, a brush against the skin, contact with an electrode, or an electrostatic force, as well as an effect that reproduces the sensation of cold or warmth using an element that can absorb or generate heat.
  • the haptic module may also be implemented to allow the user to feel a tactile effect through a muscle sensation of the user's fingers or arm, as well as by transferring the tactile effect through direct contact.
  • One or more haptic modules may also be provided according to an implementation form of the mobile terminal 100 .
  • the speaker 160 outputs, for example, voice information included in a signal processed by the controller 190 . Also, the speaker 160 outputs voice information included in results obtained by executing an application program under the control of the controller 190 .
  • the microphone 170 can receive an external audio signal in a phone call mode, a recording mode, a voice recognition mode, and the like, and process such a signal into electrical audio (voice) data. The processed audio (voice) data can then be converted for output into a format transmittable to a mobile communication base station via a communication unit in the phone call mode.
  • the microphone 170 may also use various types of noise canceling (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
  • the keypad 130 and the microphone 170 may be implemented as a single input unit in the mobile terminal 100 .
  • the input unit can receive a button manipulation by the user, or receive a command or a control signal according to a manipulation such as touching or scrolling a displayed screen image.
  • the input unit can select a function desired by the user or receive information, and include various devices such as a touch screen, a jog shuttle, a stylus, a touch pen, and the like, besides the keypad 130 and the microphone 170 .
  • the storage unit 180 includes, for example, a flash memory, a non-volatile memory, a DRAM, a volatile memory, etc.
  • the flash memory stores an operating system for operating (or driving) the mobile terminal 100 and one or more application programs.
  • the application programs may include programs for a calculator, voice call origination, text message origination, Internet access, a Web browser, a WAP browser, Internet data search, and the like.
  • the DRAM temporarily stores data generated in the process of operating the controller 190 .
  • the storage unit 180 stores various user interfaces (UIs) and/or graphic user interfaces (GUIs).
  • the storage unit 180 may include a storage medium such as a hard disk, a multimedia card micro type, a card-type memory (e.g., SD or DX memory, etc.), read-only memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the controller 190 controls a general operation of the mobile terminal 100 .
  • the controller 190 operates or drives the camera 140 .
  • the first button may be one of: a calculation function button for selecting a calculator function; a call connection button (including a call connection function button using a phone number, a call connection function button using a speed number, and a call connection function button using a URL) for selecting a call connection function; a search function button for selecting a search function; and a camera operating button for operating the camera 140 .
  • the controller 190 then displays an image captured by (input to/received by) the camera 140 on the display unit 150 in the preview mode.
  • the preview mode refers to an image capture standby state in which a certain image is received by the camera 140 , rather than a state in which a still image or video is captured by the camera 140 .
  • the controller 190 also recognizes a character string including characters, numbers, symbols, etc. included in an image captured by the camera 140 in the preview mode.
  • the controller 190 then performs the function of a calculator program executed or operated in a background state or in a state of not being displayed on the screen based on the recognized character string, and displays the performing results on the display unit 150 .
  • the controller 190 classifies the recognized character string into general text, including characters, symbols, and the like, and cost information, including numbers and fee-related information.
  • the controller 190 classifies the recognized text ‘Beat 2.5 Kg’ as general text and ‘13,200 WON’ as cost information in the character string ‘Beat 2.5 Kg 13,200 WON.’
  • the controller 190 can determine the text relates to cost information based on cost characteristics previously stored in the storage unit 180 .
  • the previously stored price characteristics information may be the characteristics representing a monetary unit of each country.
  • the previously stored price characteristics information may be ‘Won’ for Korean currency and ‘dollar’ or ‘$’ for U.S. currency.
  • the controller 190 can classify the characters or numbers in the character string that match the cost characteristics information as cost information, and classify the other character strings as general text.
  • the controller 190 also performs a preset function based on the recognized character string such as a call function, a search function, a calculation function, a particular display function, a particular haptic output function, an Internet function, a browser function, a translation function for translating a first language into a second language, etc.
  • the controller 190 can also display results or information regarding the preset function on the display unit 150 .
  • the controller 190 executes at least one of the calculator program, Internet connection program, Web browser, WAP browser, search program, etc. previously stored in the storage unit 180 in a background state.
  • the mobile terminal 100 can perform a communication function with another terminal via a wireline/wireless communication network.
  • the wireless Internet technique may include a wireless LAN (WLAN), Wi-Fi, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMax), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), wireless mobile broadcast service (WMBS), and the like.
  • a short-range communication technique may also be used and includes Bluetooth™, radio frequency identification, infrared data association (IrDA), ultra-wideband (UWB), ZigBee™, and the like.
  • FIG. 2 is a flow chart illustrating a method for controlling a mobile terminal according to a first embodiment of the present invention.
  • the controller 190 operates the camera 140 (S 110 ).
  • the preset first button may be a button for selecting the calculator function or a button for operating the camera 140 .
  • the controller 190 displays an image captured by the camera 140 in the preview mode on the display unit 150 .
  • the preview mode refers to an image capture standby state in which a certain image is received by the camera 140 , rather than a state in which a still image or video is captured by the camera 140 .
  • when the camera 140 is operated, the controller 190 preferably sets the camera 140 to a preset zoom magnification and then adjusts the focus according to the object in front of the camera 140 , to thereby shorten the time required for focus adjustment.
  • the controller 190 can also adjust the focus of the camera 140 using an auto-focus function. For example, in the present embodiment, when the calculator program is performed through image recognition, the initial zoom magnification applied after the camera 140 is operated can correspond to a close-up mode.
  • the storage unit 180 stores the image captured by the camera 140 in the preview mode (S 120 ). Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S 130 ). The controller 190 then displays the recognized character string on a region of the display unit 150 (S 140 ). For example, as shown in FIG. 3A , when an image of a detergent including the character string ‘Beat 2.5 Kg 13,200 WON’ is captured by the camera 140 in the preview mode, the controller 190 recognizes the character string, displays the image captured by the camera 140 on one region 301 of the display unit 150 , and displays the recognized character string on another region 302 of the display unit 150 .
  • the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button).
  • the controller 190 performs the function of a calculator program executed in a background state or a state in which information regarding the calculator program is not displayed on the screen by using the recognized character string (S 150 ).
  • the controller 190 classifies the text ‘Beat 2.5 Kg 13,200 WON’ displayed on the display unit 150 into ‘Beat 2.5 Kg’ representing general text (e.g., including information such as the name of a product, a packing unit, and the like) and ‘13,200 WON’ representing price information, calculates a total amount based on the classified general text information and the price information, and displays the calculated information on the display unit 150 .
  • the controller 190 displays general text information 311 , price information 312 , and the total amount 313 on the display unit 150 , which are the calculator program execution results. Further, the controller 190 may perform the calculation function of the calculator program using only the price information.
  • the controller 190 recognizes a character string included in another image captured by the camera 140 in the preview mode. For example, when the recognized character string is ‘Salt 500 g ⁇ 1,500’ as shown in FIG. 3C , the controller 190 displays an image captured by the camera 140 on a region 321 of the display unit 150 , displays the recognized character string on a region 322 of the display unit 150 , and displays the calculator program execution result in the previous step (S 150 ) on a region 323 of the display unit 150 .
  • the controller 190 classifies the recognized character string ‘Salt 500 g ⁇ 1,500’ into ‘Salt 500 g’ representing general text and ‘ ⁇ 1,500’ representing price information, and calculates a total amount based on the classified general text information and the price information. Then, as shown in FIG. 3D , the controller 190 displays the general text information 331 , the price information 332 , and the total amount 333 on the display unit 150 , which are the calculator program execution results.
  • when an image of a certain new object including a certain character string is captured by the camera 140 in a state in which the preset operation key or end key has not been selected using the keypad 130 , the controller 190 repeatedly performs the character string recognition and calculation process by performing the foregoing steps S 120 to S 160 .
  • the controller 190 displays the calculation result on the display unit 150 and terminates the calculation process (S 180 ).
  • the mobile terminal 100 can recognize the character string including characters, numbers, symbols, and the like included in the image captured by the camera, perform the calculation function based on the recognized character string according to the calculator program being executed in a background state, and display the calculation results.
  • FIG. 4 is a flow chart illustrating a method for controlling a mobile terminal according to a second embodiment of the present invention.
  • the controller 190 operates the camera 140 (S 210 ).
  • the preset first button may be a button for selecting a call connection function (for example, a button for providing a call connection function using a phone number, a button for providing a call connection function using a speed number, a button for providing a call connection function using a URL, and the like), or a button for operating the camera 140 .
  • the controller 190 displays a preview image captured by the camera 140 in the preview mode on the display unit 150 .
  • the preview mode refers to an image capture standby state in which a certain image is received by the camera 140 , rather than a state in which a still image or video is captured by the camera 140 .
  • the controller 190 may set the camera 140 to a preset zoom magnification and then adjust the focus according to the object in front of the camera 140 , to thereby shorten the time required for focus adjustment.
  • the controller 190 may adjust the focus of the camera 140 using an auto-focus function. For example, the controller 190 may set an initial zoom magnification immediately after the camera 140 is operated such that it corresponds to a close-up mode.
  • the storage unit 180 stores the image captured by the camera 140 in the preview mode (S 220 ). Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S 230 ). The controller 190 then displays the recognized character string on a region of the display unit 150 (S 240 ).
  • the controller 190 displays the image captured by the camera 140 on a region 501 of the display unit 150 and displays the recognized character string on a region 502 of the display unit 150 .
  • the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button) for acquiring the character string. Thereafter, when the preset second button is selected, the controller 190 classifies the recognized character string into information corresponding to numbers and general text including characters and symbols, and performs a call function based on the classified information corresponding to the numbers (S 250 ).
  • the controller 190 classifies the ‘contact number 012-345-6789’ displayed on the display unit 150 into ‘contact number, -’ representing general text including characters, symbols, etc. and ‘0123456789’ representing numbers (a simple illustrative sketch of this digit extraction is given after this list).
  • the controller 190 then performs a call function with a terminal corresponding to the recognized numbers using the classified number information (e.g., ‘0123456789’) (S 250 ).
  • the mobile terminal 100 can recognize the character string including characters, numbers, symbols, etc. included in the image captured by the camera and perform a call communication function with another terminal based on the recognized character string.
  • FIG. 6 is a flow chart illustrating a method for controlling a mobile terminal according to a third embodiment of the present invention.
  • the controller 190 operates the camera 140 (S 310 ).
  • the preset first button may be a button for selecting a search function or a button for operating the camera 140 .
  • the controller 190 displays an image captured by the camera 140 in the preview mode on the display unit 150 (S 320 ).
  • the preview mode refers to an image capture standby state in which a certain image is received by the camera 140 , rather than a state in which a still image or video is substantially captured by the camera 140 .
  • the controller 190 may set the camera 140 to a preset zoom magnification and then adjust the focus according to the object in front of the camera 140 , to thereby shorten the time required for focus adjustment.
  • the controller 190 may adjust the focus of the camera 140 using an auto-focus function. For example, the controller 190 may set an initial zoom magnification immediately after the camera 140 is operated such that it corresponds to a close-up mode.
  • the storage unit 180 then stores the image captured by the camera 140 in the preview mode. Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S 330 ), and displays the recognized character string on a region of the display unit 150 (S 340 ). For example, as shown in FIG. 7 , when an image of women's clothing is captured by the camera 140 in the preview mode, a character string included in the captured image is recognized (i.e., the recognized character string is ‘DAKS’). The controller 190 then displays the image captured by the camera 140 on a region 701 of the display unit 150 and displays the recognized character string on a region 702 of the display unit 150 as shown in FIG. 7 .
  • the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button) for acquiring a character string. Thereafter, when the user selects the preset second button, the controller 190 performs the function of a search program or one of an Internet access program, a Web browser, and a WAP browser in a background state or in a state of not being displayed on the screen using the recognized character string (S 350 ).
  • the controller 190 searches for information associated with the recognized character string ‘DAKS’ using Microsoft Internet Explorer, for example, and displays the search result on the display unit 150 . Also, when an image of a certain object including a certain character string is captured by the camera 140 for more than a preset time, the controller 190 can automatically perform the search function based on the recognized character string.
  • when a plurality of character strings are recognized, the controller 190 can perform a search function on each of the character strings and display each search result on the display unit 150 . Also, the controller 190 can receive a certain character string selected according to a user input, perform a search function on the selected character string, and display the search result on the display unit 150 .
  • the mobile terminal 100 can recognize the character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 in the preview mode, perform a search function based on the recognized character string according to a search program being executed in a background state, and display the search results.
  • the mobile terminal 100 can recognize a character string including characters, numbers, symbols, etc. included in an image captured by the camera in the preview mode state, perform a certain function based on the recognized character string according to a certain program being executed in a background state, and display the results of performing the function, whereby the sequential process of acquiring the image, recognizing the character string, and executing the application program is simplified, and the time required for executing the application program is shortened.
  • the present invention can capture a preview image and then determine whether any character or text information is included in the preview image. If there is no character or text information in the preview image, the controller 190 can notify the user accordingly. Further, in the embodiments of the present invention, the user can use the camera provided on the terminal to perform additional functions such as a calculating function, a search function, etc. This significantly increases the capabilities provided by the mobile terminal.
  • the user can then select one of a plurality of different buttons to instruct the controller 190 to detect or find character strings in the preview image and then perform a particular function defined by the selected button.
  • the user can be viewing an image using the camera and then decide they want to perform a particular function.
  • the user can select the particular button they want to perform the particular function.
  • the user may be viewing an image that has a phone number in it, and the user can select the phone number call button to instruct the controller to search the image for the phone number, display the phone number and call the phone number.
  • the function button or particular button in the embodiments can be one of a physical hardware button, a proximity touch input, a soft touch button, a predetermined gesture, a voice command, etc.
  • the user can view an image of a flyer for a music festival, for example, and capture the web page purchasing information.
  • the controller 190 can then access the web page so the user can purchase tickets for the concert. If the user already has an account on the web page (e.g., Ticketmaster), the controller 190 can transmit the login information for the website so the tickets can be easily purchased.
  • the user can capture an image of a first language (e.g., Spanish) and instruct the controller 190 to translate the first language into a second language (e.g., English).
  • a translation dictionary can be stored in the memory or storage unit 180 .
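As referenced in the call-connection embodiment above, the digit-extraction step could be sketched in code as follows. This is a minimal sketch under stated assumptions, not the patented implementation: the class, method, and pattern below are hypothetical stand-ins, and the actual placement of the call is left out because the patent describes only the behavior.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Hypothetical sketch of extracting dialable digits from a recognized character string. */
public class CallNumberExtractor {
    // Assumed pattern: a run of at least seven digits, optionally separated by '-' or spaces.
    private static final Pattern NUMBER = Pattern.compile("(?:\\d[\\s-]?){7,15}");

    /** Returns only the digits of the first number found, or null if none is found. */
    public static String extractNumber(String recognized) {
        Matcher m = NUMBER.matcher(recognized);
        return m.find() ? m.group().replaceAll("[^0-9]", "") : null;
    }

    public static void main(String[] args) {
        // Example from the second embodiment: 'contact number 012-345-6789'
        System.out.println(extractNumber("contact number 012-345-6789")); // 0123456789
        // A real terminal would now perform the call function with these digits.
    }
}
```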

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

A method for controlling a mobile terminal, which includes receiving, via an input unit, a selection signal indicating a selection of a predetermined button among multiple predetermined buttons on the mobile terminal, in which the multiple predetermined buttons correspond to different preset functions executed on the mobile terminal; capturing, via a camera included on the mobile terminal, a preview image of an object upon receiving the selection signal; recognizing, via a controller included on the mobile terminal, a character string included in the captured preview image; and performing, via the controller, a preset function that uses the recognized character string and corresponds to the selection of the predetermined button.

Description

CROSS-REFERENCE TO A RELATED APPLICATION
This application claims priority to Korean Patent Application No. 10-2009-0107729 filed on Nov. 9, 2009 in Korea, the entire contents of which are hereby incorporated by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a mobile terminal and corresponding method for recognizing character information in a preview image and performing a preset operation based on the recognized character information.
2. Description of the Related Art
Mobile terminals now provide many additional services besides the basic call service. For example, users can now access the Internet, play games, watch videos, listen to music, capture images and videos, record audio files, etc. Mobile terminals also now provide broadcasting programs such that users can watch television shows, sporting events, videos, etc.
SUMMARY OF THE INVENTION
Accordingly, one object of the present invention is to provide a novel mobile terminal and corresponding method for capturing a preview image, recognizing information in the preview image, and performing a specific operation on the recognized information based on the selection of a predetermined button among a plurality of buttons on the mobile terminal.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, the present invention provides in one aspect a method for controlling a mobile terminal, which includes receiving, via an input unit, a selection signal indicating a selection of a predetermined button among multiple predetermined buttons on the mobile terminal, in which the multiple predetermined buttons correspond to different preset functions executed on the mobile terminal; capturing, via a camera included on the mobile terminal, a preview image of an object upon receiving the selection signal; recognizing, via a controller included on the mobile terminal, a character string included in the captured preview image; and performing, via the controller, a preset function using the recognized character string that corresponds to the selection of the predetermined button.
In another aspect, the present invention provides a mobile terminal including an input unit configured to receive a selection signal indicating a selection of a predetermined button among multiple predetermined buttons on the mobile terminal, in which the multiple predetermined buttons correspond to different preset functions executed on the mobile terminal; a camera configured to capture a preview image of an object upon receiving the selection signal; and a controller configured to recognize a character string included in the captured preview image, and to perform a preset function that uses the recognized character string and corresponds to the selection of the predetermined button.
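As a rough illustration only, the claimed control flow (button selection, preview capture, character recognition, then the preset function tied to the selected button) could be outlined as follows. This is a minimal sketch, not the patented implementation; the class, enum, and method names, as well as the stubbed camera and recognition calls, are hypothetical stand-ins for the input unit, camera, and controller described herein.

```java
/** Minimal sketch of the claimed control flow; all names are hypothetical. */
public class PreviewFunctionController {

    /** Each predetermined button corresponds to a different preset function. */
    enum PresetButton { CALCULATOR, CALL_CONNECTION, SEARCH, CAMERA_ONLY }

    // Stub standing in for the camera: captures a preview image of the object.
    private byte[] capturePreviewImage() { return new byte[0]; }

    // Stub standing in for the controller's recognition of characters, numbers, and symbols.
    private String recognizeCharacterString(byte[] previewImage) { return ""; }

    /** Called when the input unit reports a selection signal for a predetermined button. */
    public String onButtonSelected(PresetButton button) {
        byte[] preview = capturePreviewImage();                 // capture upon receiving the selection
        String recognized = recognizeCharacterString(preview);  // recognize the character string
        switch (button) {                                       // perform the function for that button
            case CALCULATOR:      return "calculate: " + recognized;
            case CALL_CONNECTION: return "call: "      + recognized;
            case SEARCH:          return "search: "    + recognized;
            default:              return "preview: "   + recognized;
        }
    }
}
```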
Further scope of applicability of the present invention will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will become more fully understood from the detailed description given hereinbelow and the accompanying drawings, which are given by illustration only, and thus are not limitative of the present invention, and wherein:
FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating a method for controlling a mobile terminal according to a first embodiment of the present invention;
FIGS. 3A to 3D are overviews of display screens of a display unit according to the first embodiment of the present invention;
FIG. 4 is a flow chart illustrating a method for controlling a mobile terminal according to a second embodiment of the present invention;
FIG. 5 is an overview of a display screen of a display unit according to the second embodiment of the present invention;
FIG. 6 is a flow chart illustrating a method for controlling a mobile terminal according to a third embodiment of the present invention; and
FIG. 7 is an overview of a display screen of a display unit according to the third embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The same or equivalent elements are denoted by the same reference numerals throughout, and a repeated detailed description thereof will be omitted for the sake of brevity.
FIG. 1 is a block diagram illustrating a mobile terminal 100 according to an embodiment of the present invention. Further, the mobile terminal 100 may be implemented in various forms. For example, the mobile terminal 100 may be a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP) terminal, a notebook computer, a WiBro terminal, an Internet protocol television (IPTV) terminal, a navigation terminal, an audio video navigation (AVN) terminal, an audio/video (A/V) system, and the like.
As shown in FIG. 1, the mobile terminal 100 includes an antenna 110, a call processing circuit 120, a keypad 130, a camera 140, a display unit 150, a speaker 160, a microphone 170, a storage unit 180, and a controller 190. The mobile terminal 100 may be implemented with more elements than those illustrated in FIG. 1 or may be implemented with fewer elements than those illustrated in FIG. 1.
The call processing circuit 120 codes a voice signal or a data signal with respect to an origination call according to a preset scheme, modulates the coded signal, and transmits the modulated signal to another terminal via the antenna 110. Also, the call processing circuit 120 receives a signal transmitted from another terminal with respect to a reception call via the antenna 110, demodulates the received signal, decodes the demodulated signal, and provides the decoded signal to the controller 190. The function of the call processing circuit 120 can also be performed by the controller 190.
In addition, the keypad 130, allowing a key input, may include an array of actual physical keys or be configured as a touch type keypad implemented by software. Further, the camera 140 processes image frames such as a still image, video, and the like, obtained by an image sensor (e.g., a CCD sensor, a CMOS sensor, etc.) in a video call mode, an image capture mode, a video conference mode, and the like. Namely, the corresponding video data obtained by the image sensor is decoded according to a CODEC scheme to fit each standard. The processed image frames can then be displayed on the display unit 150 under the control of the controller 190.
In addition, the image frames processed by the camera 140 may be stored in the storage unit 180. The display unit 150 also displays an operational state of each element included in the mobile terminal 100 under the control of the controller 190. Further, the display unit 150 displays executed results of application programs (e.g., a calculator program, a search program, and the like) stored in the storage unit 180 under the control of the controller 190. Here, the display unit 150 may be configured to receive an input from the user by using a touch screen scheme.
Also, the display unit 150 can display various contents such as various menu screen images by using a user interface and/or graphic user interface included in the storage unit 180. Further, the contents displayed on the display unit 150 may include menu screen images including various text or image data (including map data or various information data) and data such as icons, a list menu, combo box, and the like. The display unit 150 also displays image information captured by the camera 140 under the control of the controller 190, and may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, a field emission display (FED), and a three-dimensional (3D) display.
In addition, the display unit 150 may include a haptic module that generates various tactile effects the user may feel. A typical example of the tactile effects generated by the haptic module is vibration. The strength, pattern, and the like of the vibration generated by the haptic module can also be controlled. For example, different vibrations may be combined to be output or sequentially output.
Besides vibration, the haptic module can generate various other tactile effects, such as stimulation effects produced by a pin arrangement moving vertically against the contacted skin, a spray or suction force of air through a jet orifice or a suction opening, a brush against the skin, contact with an electrode, or an electrostatic force, as well as an effect that reproduces the sensation of cold or warmth using an element that can absorb or generate heat. The haptic module may also be implemented to allow the user to feel a tactile effect through a muscle sensation of the user's fingers or arm, as well as by transferring the tactile effect through direct contact. One or more haptic modules may also be provided according to an implementation form of the mobile terminal 100.
In addition, the speaker 160 outputs, for example, voice information included in a signal processed by the controller 190. Also, the speaker 160 outputs voice information included in results obtained by executing an application program under the control of the controller 190. Further, the microphone 170 can receive an external audio signal in a phone call mode, a recording mode, a voice recognition mode, and the like, and process such a signal into electrical audio (voice) data. The processed audio (voice) data can then be converted for output into a format transmittable to a mobile communication base station via a communication unit in the phone call mode. The microphone 170 may also use various types of noise canceling (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
Further, the keypad 130 and the microphone 170 may be implemented as a single input unit in the mobile terminal 100. Also, the input unit can receive a button manipulation by the user, or receive a command or a control signal according to a manipulation such as touching or scrolling a displayed screen image. In addition, the input unit can select a function desired by the user or receive information, and include various devices such as a touch screen, a jog shuttle, a stylus, a touch pen, and the like, besides the keypad 130 and the microphone 170.
In addition, the storage unit 180 includes, for example, a flash memory, a non-volatile memory, a DRAM, a volatile memory, etc. The flash memory stores an operating system for operating (or driving) the mobile terminal 100 and one or more application programs. In this instance, the application programs may include programs for a calculator, voice call origination, text message origination, Internet access, a Web browser, a WAP browser, Internet data search, and the like.
Further, the DRAM temporarily stores data generated in the process of operating the controller 190. Also, the storage unit 180 stores various user interfaces (UIs) and/or graphic user interfaces (GUIs). Besides the flash memory and the DRAM, the storage unit 180 may include a storage medium such as a hard disk, a multimedia card micro type, a card-type memory (e.g., SD or DX memory, etc.), read-only memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a magnetic memory, a magnetic disk, an optical disk, and the like.
In addition, the controller 190 controls a general operation of the mobile terminal 100. For example, when a preset first button is selected (pressed/touched/clicked, etc.), the controller 190 operates or drives the camera 140. In this instance, the first button may be one of: a calculation function button for selecting a calculator function; a call connection button (including a call connection function button using a phone number, a call connection function button using a speed number, and a call connection function button using a URL) for selecting a call connection function; a search function button for selecting a search function; and a camera operating button for operating the camera 140.
The controller 190 then displays an image captured by (input to/received by) the camera 140 on the display unit 150 in the preview mode. Further, the preview mode refers to an image capture standby state in which a certain image is received by the camera 140, rather than a state in which a still image or video is captured by the camera 140. The controller 190 also recognizes a character string including characters, numbers, symbols, etc. included in an image captured by the camera 140 in the preview mode.
The controller 190 then performs the function of a calculator program executed or operated in a background state, or in a state of not being displayed on the screen, based on the recognized character string, and displays the results on the display unit 150. In particular, the controller 190 classifies the recognized character string into general text, including characters, symbols, and the like, and cost information, including numbers and fee-related information.
For example, the controller 190 classifies the recognized text ‘Beat 2.5 Kg’ as general text and ‘13,200 WON’ as cost information in the character string ‘Beat 2.5 Kg 13,200 WON.’ The controller 190 can determine that text relates to cost information based on cost characteristics previously stored in the storage unit 180. For example, the previously stored price characteristics information may be the characteristics representing the monetary unit of each country. In more detail, the previously stored price characteristics information may be ‘Won’ for Korean currency and ‘dollar’ or ‘$’ for U.S. currency. In this manner, the controller 190 can classify the characters or numbers in the character string that match the cost characteristics information as cost information, and classify the other character strings as general text.
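As one possible sketch of this classification step, assume the stored cost characteristics are simple currency markers such as ‘WON’, ‘₩’, or ‘$’. The regular expression and method names below are illustrative assumptions, not the routine the patent actually uses:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CostClassifier {
    // Assumed cost characteristics: digits adjacent to a currency marker (WON, ₩, $, dollar).
    private static final Pattern COST = Pattern.compile(
            "(?:₩\\s*[\\d.,]+|\\$\\s*[\\d.,]+|[\\d.,]+\\s*(?:WON|won|dollars?))");

    /** Splits a recognized character string into { general text, cost information }. */
    public static String[] classify(String recognized) {
        Matcher m = COST.matcher(recognized);
        if (!m.find()) {
            return new String[] { recognized.trim(), "" };   // no cost information found
        }
        String cost = m.group();
        String general = (recognized.substring(0, m.start())
                + recognized.substring(m.end())).trim();
        return new String[] { general, cost };
    }

    public static void main(String[] args) {
        String[] parts = classify("Beat 2.5 Kg 13,200 WON"); // example from the description
        System.out.println("general text: " + parts[0]);      // Beat 2.5 Kg
        System.out.println("cost info:    " + parts[1]);      // 13,200 WON
    }
}
```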
The controller 190 also performs a preset function based on the recognized character string such as a call function, a search function, a calculation function, a particular display function, a particular haptic output function, an Internet function, a browser function, a translation function for translating a first language into a second language, etc. The controller 190 can also display results or information regarding the preset function on the display unit 150. Also, when a preset first button is selected, the controller 190 executes at least one of the calculator program, Internet connection program, Web browser, WAP browser, search program, etc. previously stored in the storage unit 180 in a background state. These features will be discussed in more detail later.
In addition, the mobile terminal 100 can perform a communication function with another terminal via a wireline/wireless communication network. The wireless Internet technique may include a wireless LAN (WLAN), Wi-Fi, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMax), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), wireless mobile broadcast service (WMBS), and the like. A short-range communication technique may also be used and includes Bluetooth™, radio frequency identification, infrared data association (IrDA), ultra-wideband (UWB), ZigBee™, and the like.
Next, FIG. 2 is a flow chart illustrating a method for controlling a mobile terminal according to a first embodiment of the present invention. As shown, when the user selects a preset first button or key using the keypad 130, the controller 190 operates the camera 140 (S110). In addition, the preset first button may be a button for selecting the calculator function or a button for operating the camera 140. Thereafter, the controller 190 displays an image captured by the camera 140 in the preview mode on the display unit 150.
Further, the preview mode refers to an image capture standby state in which a certain image is received by the camera 140, rather than a state in which a still image or video is captured by the camera 140. In addition, when the camera 140 is operated, the controller 190 preferably sets the camera 140 to a preset zoom magnification and adjusts the focus according to an object in front of the camera 140, thereby shortening the time required for focus adjustment. The controller 190 can also adjust the focus of the camera 140 using an auto-focus function. For example, in the present embodiment, when the calculator program using image recognition is performed, the initial zoom magnification applied after the camera 140 is operated can correspond to a close-up mode.
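By way of a non-limiting illustration only, this start-up behavior (a preset zoom magnification followed by auto-focus) may be sketched as follows; the camera methods used here (power_on, set_zoom, autofocus) and the magnification value are assumptions for this sketch rather than an actual camera driver interface.
    CLOSE_UP_ZOOM = 2.0  # assumed preset zoom magnification for the close-up mode

    def start_camera_for_recognition(camera):
        """Operate the camera so that focus adjustment for nearby text is short."""
        camera.power_on()
        camera.set_zoom(CLOSE_UP_ZOOM)  # jump directly to the preset zoom magnification
        camera.autofocus()              # then adjust focus on the object in front of the lens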
In addition, as shown in FIG. 2, the storage unit 180 stores the image captured by the camera 140 in the preview mode (S120). Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S130). The controller 190 then displays the recognized character string on a region of the display unit 150 (S140). For example, as shown in FIG. 3A, when an image of a detergent including the character string ‘Beat 2.5 Kg 13,200 WON’ is captured by the camera 140 in the preview mode, the controller 190 recognizes the character string, displays the image captured by the camera 140 on one region 301 of the display unit 150, and displays the recognized character string on another region 302 of the display unit 150.
Further, in one example, the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button). Thus, as shown in FIG. 2, when the user selects the preset second button, the controller 190 performs the function of a calculator program executed in a background state, or in a state in which information regarding the calculator program is not displayed on the screen, by using the recognized character string (S150).
For example, when the user selects the ‘OK’ button, the controller 190 classifies the text ‘Beat 2.5 Kg 13,200 WON’ displayed on the display unit 150 into ‘Beat 2.5 Kg’ representing general text (e.g., including information such as the name of a product, a packing unit, and the like) and ‘13,200 WON’ representing price information, calculates a total amount based on the classified general text information and the price information, and displays the calculated information on the display unit 150. Namely, as shown in FIG. 3B, the controller 190 displays general text information 311, price information 312, and the total amount 313 on the display unit 150, which are the calculator program execution results. Further, the controller 190 may perform the calculation function of the calculator program using only the price information.
The displayed calculated information can also include a price per unit (e.g., 13,200 Won/2.5 Kg=5280 Won/Kg) so the user can easily compare the price per unit against another product. Also, when a certain object including a certain character string is captured for more than a preset time via the camera 140, the controller 190 can automatically perform the function of the calculator program based on the recognized character string without the user pressing the second button.
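By way of a non-limiting illustration only, the total amount and the price-per-unit comparison described above reduce to simple arithmetic once the price and quantity have been classified; the following sketch assumes that parsing has already produced numeric values.
    def price_per_unit(price, quantity, unit="Kg"):
        """E.g. 13,200 Won for 2.5 Kg -> '5280 Won/Kg'."""
        return f"{price / quantity:g} Won/{unit}"

    def total_amount(prices):
        """Total amount over the price information recognized so far."""
        return sum(prices)

    # price_per_unit(13200, 2.5) -> '5280 Won/Kg'; total_amount([13200, 1500]) -> 14700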
Thereafter, the controller 190 checks whether or not a preset operation key or an end key is selected by the keypad 130 (S160). Also, when the user selects a preset operation key (e.g., +, -, *, /, etc.) using the keypad 130 (Yes in S160), the controller 190 performs the above-described steps S120 to S150 to repeatedly perform the character string recognition and calculation process (S170).
That is, when the user selects an operation sign ‘+’ using the keypad 130, the controller 190 recognizes a character string included in another image captured by the camera 140 in the preview mode. For example, when the recognized character string is ‘Salt 500 g ₩1,500’ as shown in FIG. 3C, the controller 190 displays an image captured by the camera 140 on a region 321 of the display unit 150, displays the recognized character string on a region 322 of the display unit 150, and displays the calculator program execution result of the previous step (S150) on a region 323 of the display unit 150.
Thereafter, when the user selects the preset second button, the controller 190 classifies the recognized character string ‘Salt 500 g ₩1,500’ into ‘Salt 500 g’ representing general text and ‘₩1,500’ representing price information, and calculates a total amount based on the classified general text information and the price information. Then, as shown in FIG. 3D, the controller 190 displays the general text information 331, the price information 332, and the total amount 333 on the display unit 150, which are the calculator program execution results.
Therefore, when an image of a new object including a certain character string is captured by the camera 140 while the preset operation key or end key has not been selected using the keypad 130, the controller 190 repeatedly performs the character string recognition and calculation process by performing the foregoing steps S120 to S160. When the user selects the preset end key (e.g., ‘=’, the ‘OK’ key, the end key, etc.) using the keypad 130, the controller 190 displays the calculation result on the display unit 150 and terminates the calculation process (S180).
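By way of a non-limiting illustration only, the loop over steps S120 to S180 may be sketched as follows; the key values and the helper objects (keypad, camera, ocr, display) and their methods are assumptions for this sketch.
    import operator

    OPERATIONS = {"+": operator.add, "-": operator.sub,
                  "*": operator.mul, "/": operator.truediv}
    END_KEYS = {"=", "OK", "END"}

    def calculator_loop(keypad, camera, ocr, display):
        """Repeat recognition and calculation until a preset end key is selected."""
        total, op = 0.0, "+"
        while True:
            frame = camera.get_preview_frame()                 # S120: image in preview mode
            general_text, price = ocr.recognize_price(frame)   # S130: recognize the string
            display.show(general_text, price)                  # S140: display the string
            total = OPERATIONS[op](total, price)               # S150: background calculator
            key = keypad.read_key()                            # S160: check the keypad
            if key in END_KEYS:
                display.show_total(total)                      # S180: display result and end
                return total
            if key in OPERATIONS:
                op = key                                       # S170: repeat with new operation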
In this manner, the mobile terminal 100 can recognize the character string including characters, numbers, symbols, and the like included in the image captured by the camera, perform the calculation function based on the recognized character string according to the calculator program being executed in a background state, and display the calculation results.
Next, FIG. 4 is a flow chart illustrating a method for controlling a mobile terminal according to a second embodiment of the present invention. As shown, when the user selects a preset first button or a key using the keypad 130, the controller 190 operates the camera 140 (S210). Further, the preset first button may be a button for selecting a call connection function (for example, a button for providing a call connection function using a phone number, a button for providing a call connection function using a speed number, a button for providing a call connection function using a URL, and the like), or a button for operating the camera 140.
Thereafter, the controller 190 displays a preview image captured by the camera 140 in the preview mode on the display unit 150. As discussed above, the preview mode refers to an image capture standby state in which a certain image is received by the camera 140, rather than a state in which a still image or video is captured by the camera 140. Also, when the camera 140 is operated, the controller 190 may set the camera 140 to a preset zoom magnification and adjust the focus according to an object in front of the camera 140, thereby shortening the time required for focus adjustment. In addition, the controller 190 may adjust the focus of the camera 140 using an auto-focus function. For example, the controller 190 may set the initial zoom magnification immediately after the camera 140 is operated such that it corresponds to a close-up mode.
As shown in FIG. 4, the storage unit 180 stores the image captured by the camera 140 in the preview mode (S220). Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S230). The controller 190 then displays the recognized character string on a region of the display unit 150 (S240).
For example, as shown in FIG. 5, when an image of a contact number attached to a vehicle is captured by the camera 140 in the preview mode, a character string included in the captured image is recognized; in this example, the recognized character string is ‘contact number 012-345-6789’. The controller 190 then displays the image captured by the camera 140 on a region 501 of the display unit 150 and displays the recognized character string on a region 502 of the display unit 150.
Further, the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button) for acquiring the character string. Thereafter, when the preset second button is selected, the controller 190 classifies the recognized character string into information corresponding to numbers and general text including characters and symbols, and performs a call function based on the classified information corresponding to the numbers (S250).
For example, when the user selects the ‘OK’ button, the controller 190 classifies the ‘contact number 012-345-6789’ displayed on the display unit 150 into ‘contact number’ and ‘-’ representing general text including characters, symbols, etc., and ‘0123456789’ representing numbers. The controller 190 then performs a call function with a terminal corresponding to the recognized numbers using the classified number information (e.g., ‘0123456789’) (S250). In this manner, in the preview mode, the mobile terminal 100 can recognize the character string including characters, numbers, symbols, etc. included in the image captured by the camera and perform a call communication function with another terminal based on the recognized character string.
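By way of a non-limiting illustration only, this classification of the number information and the subsequent call may be sketched as follows; the dial callable stands in for whatever call interface the terminal actually provides.
    def extract_number_and_call(recognized, dial):
        """E.g. 'contact number 012-345-6789' -> dial('0123456789')."""
        digits = "".join(ch for ch in recognized if ch.isdigit())  # drop general text and symbols
        if digits:
            dial(digits)  # perform the call function with the classified number information
        return digits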
Next, FIG. 6 is a flow chart illustrating a method for controlling a mobile terminal according to a third embodiment of the present invention. As shown, when the user selects a preset first button or a key using the keypad 130, the controller 190 operates the camera 140 (S310). In addition, in this embodiment, the preset first button may be a button for selecting a search function or a button for operating the camera 140.
Thereafter, the controller 190 displays an image captured by the camera 140 in the preview mode on the display unit 150 (S320). As discussed above, the preview mode refers to an image capture standby state in which a certain image is received by the camera 140, rather than a state in which a still image or video is captured by the camera 140. In addition, when the camera 140 is operated, the controller 190 may set the camera 140 to a preset zoom magnification and adjust the focus according to an object in front of the camera 140, thereby shortening the time required for focus adjustment. Also, the controller 190 may adjust the focus of the camera 140 using an auto-focus function. For example, the controller 190 may set the initial zoom magnification immediately after the camera 140 is operated such that it corresponds to a close-up mode.
The storage unit 180 then stores the image captured by the camera 140 in the preview mode. Thereafter, the controller 190 recognizes a character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 (S330), and displays the recognized character string on a region of the display unit 150 (S340). For example, as shown in FIG. 7, when an image of women's clothing is captured by the camera 140 in the preview mode, a character string included in the captured image is recognized (i.e., the recognized character string is ‘DAKS’). The controller 190 then displays the image captured by the camera 140 on a region 701 of the display unit 150 and displays the recognized character string on a region 702 of the display unit 150 as shown in FIG. 7.
As discussed previously, the controller 190 repeatedly performs the character string recognition and display process until the user selects a preset second button (e.g., an ‘OK’ button or a ‘capture’ button) for acquiring a character string. Thereafter, when the user selects the preset second button, the controller 190 performs the function of a search program or one of an Internet access program, a Web browser, and a WAP browser in a background state or in a state of not being displayed on the screen using the recognized character string (S350).
For example, when the user selects the ‘OK’ button, the controller 190 searches for information associated with the recognized character string ‘DAKS’ using, for example, Microsoft Internet Explorer, and displays the search result on the display unit 150. Also, when an image of a certain object including a certain character string is captured by the camera 140 for more than a preset time, the controller 190 can automatically perform the search function based on the recognized character string.
In addition, when a plurality of character strings are recognized, the controller 190 can perform a search function on each of the character strings and display each search result on the display unit 150. Also, when a plurality of character strings are recognized, the controller 190 can receive a selection of a certain character string according to a user input, perform a search function on the selected character string, and display the result on the display unit 150.
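By way of a non-limiting illustration only, the handling of one or several recognized character strings may be sketched as follows; the open_url callable and the example search URL are assumptions standing in for the search program, Web browser, or WAP browser running in the background.
    from urllib.parse import quote_plus

    def search_recognized_strings(recognized_strings, open_url, selected=None):
        """Search every recognized string, or only the one selected by the user."""
        targets = [selected] if selected is not None else list(recognized_strings)
        results = []
        for text in targets:
            # Assumed example URL; the terminal would use its own search program or browser.
            results.append(open_url("https://www.example.com/search?q=" + quote_plus(text)))
        return results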
Thus, according to this embodiment of the present invention, the mobile terminal 100 can recognize the character string including characters, numbers, symbols, etc. included in the image captured by the camera 140 in the preview mode, perform a search function based on the recognized character string according to a search program being executed in a background state, and display the search results.
Therefore, according to the various embodiments of the present invention, the mobile terminal 100 can recognize a character string including characters, numbers, symbols, etc. included in an image captured by the camera in the preview mode state, perform a certain function based on the recognized character string according to a certain program being executed in a background state, and display the results, whereby the sequential process of acquiring the image, recognizing the character string, and executing the application program is simplified, and the time required for executing the application program is shortened.
In addition, in an alternative embodiment, the present invention can capture a preview image and then determine whether any character or text information is included in the preview image. If there is no character or text information in the preview image, the controller 190 can notify the user of the same. Further, in the embodiments of the present invention, the user can use the camera provided on the terminal to perform additional functions such as a calculating function, a search function, etc. This significantly increases the capabilities provided by the mobile terminal.
In addition, while the user is previewing an image to be captured, the user can select one of a plurality of different buttons to instruct the controller 190 to detect or find character strings in the preview image and then perform a particular function defined by the selected button. For example, the user can be viewing an image using the camera, decide to perform a particular function, and select the button corresponding to that function. For instance, the user may be viewing an image that includes a phone number, and can select the phone number call button to instruct the controller to search the image for the phone number, display the phone number, and call the phone number.
The function button or particular button in the embodiments can be one of a physical hardware button, a proximity touch input, a soft touch button, a predetermined gesture, a voice command, etc. In still another embodiment, the user can view an image of a flyer for a music festival, for example, and capture the web page purchasing information. The controller 190 can then access the web page so the user can purchase tickets for the concert. If the user already has an account on the web page (e.g., Ticketmaster), the controller 190 can transmit the login information for the website so the tickets can be easily purchased.
In another embodiment, the user can capture an image of a first language (e.g., Spanish) and instruct the controller 190 to translate the first language into a second language (e.g., English). A translation dictionary can be stored in the memory or storage unit 180. Thus, if a user was eating at a Mexican restaurant, for example, the user could easily see an English translation of a particular menu item.
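By way of a non-limiting illustration only, a word-for-word lookup in a stored translation dictionary may be sketched as follows; the sample dictionary entries are invented for this sketch and stand in for whatever translation dictionary is stored in the memory or storage unit 180.
    # Hypothetical first-language -> second-language dictionary stored in the storage unit 180.
    TRANSLATION_DICT = {"pollo": "chicken", "arroz": "rice", "ensalada": "salad"}

    def translate(recognized, dictionary=TRANSLATION_DICT):
        """Translate each recognized word that has an entry; keep the others unchanged."""
        return " ".join(dictionary.get(word.lower(), word) for word in recognized.split())

    # translate("Pollo con arroz") -> 'chicken con rice'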
The present invention encompasses various modifications to each of the examples and embodiments discussed herein. According to the invention, one or more features described above in one embodiment or example can be equally applied to another embodiment or example described above. The features of one or more embodiments or examples described above can be combined into each of the embodiments or examples described above. Any full or partial combination of one or more embodiments or examples of the invention is also part of the invention.
As the present invention may be embodied in several forms without departing from the characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within its scope as defined in the appended claims, and therefore all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds are therefore intended to be embraced by the appended claims.

Claims (12)

What is claimed is:
1. A method of controlling a mobile terminal, the method comprising:
receiving a first selecting input indicating a selection of an icon among a plurality of icons displayed on a display unit of the mobile terminal, wherein the display unit is touch sensitive;
operating a camera when the icon is selected;
displaying a preview image received by the camera on the display unit in a preview mode, wherein the preview mode is displaying the preview image without storing the preview image to a memory and recognizing, via a controller of the mobile terminal, data included in the preview image;
converting, via the controller, first information corresponding to the recognized data into second information; and
displaying the converted second information with at least part of the preview image on the display unit without the preview image being stored,
wherein the displaying the preview image, recognizing the data, converting the first information into the second information, and displaying the converted second information with at least part of the preview image is continuously performed until a second selecting input is received,
wherein the first information is a first language information and the second information is a second language information,
wherein the plurality of icons correspond to different functions and the selected icon is matched with a translation function for translating the first language information into the second language information,
wherein when a focus of the camera is changed, the method further comprises:
displaying another preview image different from the preview image in the preview mode;
recognizing, via the controller, data included in the another preview image;
converting, via the controller, the first information corresponding to the recognized data included in the another preview image into the second information; and
displaying the converted second information with at least part of the another preview image on the display unit without the another preview image being stored, and
wherein when the second selecting input is received while the preview image is displayed, the method further comprises:
storing the preview image to the memory, and
displaying continuously the converted second information with the stored preview image.
2. The method of claim 1, wherein when the camera is operated, the method further comprises adjusting the focus of the camera to automatically recognize the data included in the preview image.
3. The method of claim 1, further comprising:
setting a zoom magnification of the camera to a preset zoom magnification immediately after the camera is operated such that a time required for recognizing the data included in the preview image is reduced.
4. The method of claim 1, wherein the display unit includes at least two display regions, and
wherein the preview image is displayed on a first display region and the converted second information is displayed on a second display region.
5. The method of claim 1, wherein the recognized data includes at least one of numbers, characters and symbols included in the preview image.
6. A mobile terminal, comprising:
a touch screen display unit configured to receive a touch selection signal indicating a selection of an icon among a plurality of icons displayed on the display unit;
a camera; and
a controller configured to:
display a preview image received by the camera on the display unit in a preview mode, wherein the preview mode is displaying the preview image without storing the preview image to a memory,
recognize data included in the preview image,
convert first information corresponding to the recognized data into second information,
display the converted second information with at least part of the preview image on the display unit without the preview image being stored, and
continuously perform the displaying of the preview image, the recognizing of the data, the converting of the first information into the second information, and displaying of the converted second information with at least part of the preview image until a second selecting input is received,
wherein the first information is a first language information and the second information is a second language information,
wherein the plurality of icons correspond to different functions and the selected icon is matched with a translation function for translating the first language information into the second language information,
wherein when a focus of the camera is changed, the controller is further configured to:
display another preview image different from the preview image in the preview mode,
recognize data included in the another preview image,
convert the first information corresponding to the recognized data included in the another preview image into the second information, and
display the converted second information with at least part of the another preview image on the display unit without another preview image being stored, and
wherein when the second selecting input is received, the controller is further configured to:
store the preview image to the memory, and
display continuously the converted second information with the stored preview image.
7. The mobile terminal of claim 6, wherein when the camera is operated, the controller is further configured to adjust the focus of the camera to automatically recognize the data included in the preview image.
8. The mobile terminal of claim 6, wherein the controller is further configured to set a zoom magnification of the camera to a preset zoom magnification immediately after the camera is operated such that a time required for recognizing the data included in the preview image is reduced.
9. The mobile terminal of claim 6, wherein the display unit includes at least two display regions, and
wherein the controller is further configured to display the preview image on a first display region and the converted second information on a second display region.
10. The mobile terminal of claim 6, wherein the recognized data includes at least one of numbers, characters and symbols included in the preview image.
11. The method of claim 1, wherein another one of the displayed icons corresponds to a calling function for calling a number included in the preview image, a calculating function for calculating prices included in price information in the preview image, and a searching function for searching for information included in the preview image.
12. The mobile terminal of claim 6, wherein another one of the displayed icons corresponds to a calling function for calling a number included in the preview image, a calculating function for calculating prices included in price information in the preview image, and a searching function for searching for information included in the preview image.
US12/797,505 2009-11-09 2010-06-09 Mobile terminal and control method thereof Expired - Fee Related US8644881B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2009-0107729 2009-11-09
KR1020090107729A KR20110051073A (en) 2009-11-09 2009-11-09 Method of executing application program in portable terminal

Publications (2)

Publication Number Publication Date
US20110111806A1 US20110111806A1 (en) 2011-05-12
US8644881B2 true US8644881B2 (en) 2014-02-04

Family

ID=43974546

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/797,505 Expired - Fee Related US8644881B2 (en) 2009-11-09 2010-06-09 Mobile terminal and control method thereof

Country Status (2)

Country Link
US (1) US8644881B2 (en)
KR (1) KR20110051073A (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012161706A1 (en) * 2011-05-24 2012-11-29 Hewlett-Packard Development Company, L.P. Region of interest of an image
US9319150B2 (en) 2012-10-29 2016-04-19 Dell Products, Lp Reduction of haptic noise feedback in system
CN103810083B (en) * 2012-11-06 2017-08-29 腾讯科技(深圳)有限公司 The interface adjustment method and device of a kind of application program
KR20150026338A (en) * 2013-09-02 2015-03-11 엘지전자 주식회사 Mobile terminal
KR102295655B1 (en) * 2014-07-24 2021-08-31 삼성전자주식회사 Apparatus for Providing Integrated Functions of Dial and Calculator and Method thereof
CN104199645B (en) * 2014-08-15 2017-08-22 苏州佳世达电通有限公司 The system and its based reminding method of reminder events
KR20160133781A (en) * 2015-05-13 2016-11-23 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR20200100918A (en) 2019-02-19 2020-08-27 삼성전자주식회사 Electronic device for providing various functions through application using a camera and operating method thereof

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6473523B1 (en) * 1998-05-06 2002-10-29 Xerox Corporation Portable text capturing method and device therefor
US20020065728A1 (en) * 1998-12-14 2002-05-30 Nobuo Ogasawara Electronic shopping system utilizing a program downloadable wireless videophone
US6937747B2 (en) * 2001-09-24 2005-08-30 Hewlett Packard Development Company, L.P. System and method for capturing non-audible information for processing
US20030200078A1 (en) * 2002-04-19 2003-10-23 Huitao Luo System and method for language translation of character strings occurring in captured image data
US20050057669A1 (en) * 2003-09-12 2005-03-17 Sony Ericsson Mobile Communications Ab Method and device for communication using an optical sensor
US20070035616A1 (en) * 2005-08-12 2007-02-15 Lg Electronics Inc. Mobile communication terminal with dual-display unit having function of editing captured image and method thereof
US20080174570A1 (en) * 2006-09-06 2008-07-24 Apple Inc. Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics
US20080115080A1 (en) * 2006-11-10 2008-05-15 Fabrice Matulic Device, method, and computer program product for information retrieval
US7787693B2 (en) * 2006-11-20 2010-08-31 Microsoft Corporation Text detection on mobile communications devices
US20080168405A1 (en) * 2007-01-07 2008-07-10 Francisco Ryan Tolmasky Portable Multifunction Device, Method, and Graphical User Interface for Translating Displayed Content
US20090148074A1 (en) * 2007-12-10 2009-06-11 Motorola, Inc. Method for automatically performing an image processing function on an electronic device
US8059897B2 (en) * 2007-12-10 2011-11-15 Motorola Mobility, Inc. Method for automatically performing an image processing function on an electronic device
US20090276690A1 (en) * 2008-05-02 2009-11-05 Reagan Inventions, Llc. System and method of embedding symbology in alphabetic letters and then linking the letters to a site or sites on the global computer network

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170118611A1 (en) * 2015-10-27 2017-04-27 Blackberry Limited Monitoring resource access
US10764860B2 (en) * 2015-10-27 2020-09-01 Blackberry Limited Monitoring resource access
US10952087B2 (en) 2015-10-27 2021-03-16 Blackberry Limited Detecting resource access
CN110971820A (en) * 2019-11-25 2020-04-07 Oppo广东移动通信有限公司 Photographing method, photographing device, mobile terminal and computer readable storage medium

Also Published As

Publication number Publication date
KR20110051073A (en) 2011-05-17
US20110111806A1 (en) 2011-05-12

Similar Documents

Publication Publication Date Title
US8644881B2 (en) Mobile terminal and control method thereof
US10708534B2 (en) Terminal executing mirror application of a peripheral device
US10395233B2 (en) Mobile terminal and method for controlling the same
US9116565B2 (en) Mobile terminal and method of controlling the mobile terminal
US10534460B2 (en) Terminal apparatus, display method and recording medium
US9900515B2 (en) Apparatus and method for transmitting information using information recognized in an image
US9066137B2 (en) Providing a search service convertible between a search window and an image display window
EP1465394A2 (en) Display of HTML documents on a mobile communication terminal
US20150082256A1 (en) Apparatus and method for display images
US9560224B2 (en) Remote control device, remote operation device, screen transmission control method, and non-transitory computer-readable recording medium encoded with screen display control program
US8731534B2 (en) Mobile terminal and method for displaying image according to call therein
US20150121286A1 (en) Display apparatus and user interface providing method thereof
CN110795007A (en) Method and device for acquiring screenshot information
EP2899986B1 (en) Display apparatus, mobile apparatus, system and setting controlling method for connection thereof
US11966447B2 (en) Details page processing method, apparatus, and system, electronic device, and storage medium
CN113422863A (en) Information display method, mobile terminal and readable storage medium
KR20180133138A (en) Mobile terminal and method for controlling the same
US20220129230A1 (en) Electronic apparatus, display apparatus and controlling method thereof
CN114647623A (en) Folder processing method, intelligent terminal and storage medium
CN113625921A (en) Method, device, storage medium and electronic equipment for displaying target message
CN108605074B (en) Method and equipment for triggering voice function
JP2014106769A (en) Electronic apparatus and control program and display control method
US20160077795A1 (en) Display apparatus and method of controlling thereof
CN106960022A (en) Application program recommends method and device
KR101875485B1 (en) Electronic apparatus and Method for providing service thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: LG ELECTRONICS INC., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, YOON-HO;OH, HYE-JIN;REEL/FRAME:031460/0429

Effective date: 20100601

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20220204