WO2014089669A1 - Handwriting-initiated search - Google Patents

Info

Publication number
WO2014089669A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
search
input
handwriting
characters
Application number
PCT/CA2012/050894
Other languages
French (fr)
Inventor
Koon Wah Yu
Ken Kwok Wai LO
Chun Ning TO
Original Assignee
Blackberry Limited
Application filed by Blackberry Limited
Priority to PCT/CA2012/050894
Publication of WO2014089669A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/903Querying
    • G06F16/9032Query formulation

Definitions

  • The present disclosure relates generally to electronic devices and, more particularly, to handwriting-initiated searches.
  • Electronic computing devices offer access to large amounts of information. Such information can be stored locally on the computing devices and/or at remotely located devices accessible via network communications. In addition, such information can be stored in any of a number of different formats (e.g., different file formats, different database structures, etc.). Some computing devices offer search applications to enable users to search for information based on different user-provided search strings.
  • FIG. 1 depicts an example electronic device that initiates and performs a search in response to or after detecting user-input handwriting in accordance with the teachings of this disclosure.
  • FIG. 2A depicts the example electronic device of FIG. 1 performing a search based on user-input English-language handwriting.
  • FIG. 2B depicts the example electronic device of FIG. 1 performing a search based on user-input Chinese-language handwriting.
  • FIG. 2C depicts the example electronic device of FIG. 1 performing a search based on user-input numeric handwriting.
  • FIG. 3 depicts an example electronic device that may be used to implement examples disclosed herein to initiate and perform searches in response to or after detecting user-input handwriting.
  • FIG. 4 depicts an example block diagram of a processor system that may be used to implement the electronic device of FIGS. 1 and 2A-2C and/or the electronic device of FIG. 3.
  • FIG. 5 depicts an example flow diagram representative of machine readable instructions that may be used to implement the electronic device of FIG. 3 to initiate and perform searches in response to or after detecting user-input handwriting.
  • Example methods, apparatus, and articles of manufacture are disclosed herein in connection with electronic devices, which may be any stationary device or mobile device.
  • Stationary devices include, for example, desktop computers, computer terminals, kiosks, etc.
  • Mobile devices include, for example, mobile communication devices, mobile computing devices, etc.
  • Mobile devices, also referred to as terminals, wireless terminals, mobile stations, communication stations, or user equipment (UE), may include mobile smart phones (e.g., BlackBerry® smart phones), wireless personal digital assistants (PDA), tablets, etc.
  • Examples disclosed herein may be used in connection with electronic devices with or without network communication capabilities (e.g., with or without wired and/or wireless communication adapters).
  • Electronic devices will also be referred to as user devices via which users can initiate searches using user-input handwriting to find information.
  • Example methods, apparatus, and articles of manufacture disclosed herein may be used to facilitate searching, finding, viewing, and/or interacting with information stored locally on a user device and/or stored at a remote location accessible through network communications via the user device.
  • Searching is initiated on a user device in response to or after detecting user-input handwriting of characters used as a search string for a search operation.
  • A user initiates a search operation by inputting on a touch-sensitive display (e.g., using a finger or stylus) handwritten text including characters and/or words for which to search using the search operation.
  • Examples disclosed herein enable initiating and performing a search without requiring a user to first initiate (e.g., launch) a search application before entering the handwritten text, or to press a search button or enter a search gesture to initiate the search.
  • An operating system process is configured to monitor for handwriting input on a screen (e.g., a main screen (sometimes referred to as a home screen), or search screen) of the operating system displayed on the user device.
  • In response to or after detecting user-input handwriting (e.g., on the operating system main screen of the user device), the user device initiates and performs a search of information accessible via the user device based on the user-input handwriting.
  • Characters recognized from the user-input handwriting are used to form a search string for the search.
  • The user device displays search results of the search.
  • The user device displays the search results while further user-input handwriting is input by a user.
  • The user device refines (e.g., filters) the search results based on the further user-input handwriting.
  • The user device initiates and performs the search of information by receiving a touch event corresponding to a touch-sensitive interface (e.g., a touch screen display) of the user device.
  • The touch event corresponds to a character of the user-input handwriting used in a search string of the search operation.
  • The touch event is generated by the operating system of the user device (e.g., based on an interrupt at a processor) in response to or after detecting a touch on a touch screen display of the user device.
  • A user device analyzes user-input handwriting using a multiple-language recognition method to determine that the user-input handwriting includes one or more characters from one or more languages supported by a multiple-language dictionary of the multiple-language recognition process.
  • Different displayed search results may correspond to different languages having the same character recognized in the user-input handwriting.
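The detect-recognize-refine flow described in the bullets above can be sketched in a few lines. This is a hypothetical Python illustration, not code from the disclosure; the class and method names (`HandwritingSearch`, `on_character_recognized`) are invented for the example.

```python
# Hypothetical sketch: recognized characters extend a search string, and
# the results are re-filtered on every newly recognized character.

class HandwritingSearch:
    def __init__(self, corpus):
        self.corpus = corpus        # searchable items (local and/or remote)
        self.search_string = ""     # grows as characters are recognized

    def on_character_recognized(self, character):
        """Called each time the recognizer emits a new character."""
        self.search_string += character
        return self.results()

    def results(self):
        """Items containing the current search string, case-insensitively."""
        needle = self.search_string.lower()
        return [item for item in self.corpus if needle in item.lower()]


search = HandwritingSearch(["Chicago Meeting", "Chicago Weather", "Call Bob"])
print(search.on_character_recognized("c"))   # all three items contain "c"
print(search.on_character_recognized("h"))   # refined to the two "Ch..." entries
```

Displaying results after the very first recognized character, then refining them as more strokes arrive, is what lets the user see matches without ever pressing a search button.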
  • FIG. 1 depicts an example user device 100 (e.g., an electronic device) that initiates and performs a search in response to or after detecting user-input handwriting 102, 104, 106 in accordance with the teachings of this disclosure.
  • The user device 100 is configured to monitor for handwriting input (e.g., the user-input handwriting 102, 104, 106) on a touch screen display 108.
  • The user device 100 is configured to monitor for the user-input handwriting 102, 104, 106 on a main screen 110 (e.g., an operating system main screen) of an operating system (e.g., an operating system 434 of FIG. 4).
  • Other screens or screen areas may be used to monitor for and receive user-input handwriting.
  • The user device 100 of the illustrated example initiates a search process for information based on characters recognized in the handwriting input.
  • The user device 100 is configured to initiate and perform a search for information in response to or after detecting handwriting input (e.g., the user-input handwriting 102, 104, 106) on the touch screen display 108 without requiring a user to first initiate (e.g., launch) a search application before entering the handwriting and without requiring a user to press a search button or enter a search gesture (e.g., a dedicated gesture that does not form part of a character search string but that signals a request to perform a search based on a separately provided search string) to initiate the search.
  • The user device 100 initiates a search method in response to or after detecting a user-input stroke drawing the letter "C" of "Chica" in the user-input handwriting 102.
  • By not requiring a user to launch a separate search application, press a search button, or enter a search gesture, the user can more quickly access search results shortly after inputting the first few handwritten strokes of a character (e.g., a character of the user-input handwriting 102, 104, 106).
  • Techniques disclosed herein enable a user to perform a search using one-handed operation of the user device 100 by holding the user device 100 in one hand and using the thumb of that hand to input the user-input handwriting 102, 104, 106.
  • Such one-handed handwriting can be input with less accuracy and requires fewer fine motor skills than other text input techniques (e.g., typing on a keyboard).
  • Handwriting-initiated search operations disclosed herein are useful when a user is walking (e.g., going to a meeting, rushing to an airport terminal, etc.) and/or performing other tasks (e.g., writing notes on paper, holding other objects such as airline tickets, luggage, briefcases, beverages, etc.) that keep the user from dedicating all of their efforts to inputting search parameters on the user device 100.
  • Examples disclosed herein enable initiating and performing relatively quicker and more user-friendly searches than prior techniques by initiating and performing searches of information in response to or after detecting that a user has started inputting handwriting (e.g., the user-input handwriting 102, 104, 106), rather than requiring a user to launch a separate application, press a search button, or enter a search gesture to initiate and perform a search.
  • A user can pick up the user device 100 and readily input handwritten search parameters (e.g., the user-input handwriting 102, 104, 106) on, for example, the main screen 110 without needing to expend effort to, for example, launch a search application, press a search button, and/or enter a search gesture to initiate a search.
  • The user device 100 is configured to recognize characters of user-input handwriting 102, 104, 106 using a multiple-language recognition method that enables a user to input written language characters of any language supported by the user device 100 for performing searches.
  • An example multiple-language recognition method of the user device 100 is configured to analyze user-input handwriting 102, 104, 106 based on a multiple-language dictionary (e.g., a multiple-language dictionary 310 of FIG. 3) to identify characters from any language supported in the multiple-language dictionary as corresponding to the user-input handwriting 102, 104, 106.
  • Any number of languages in the multiple-language dictionary can be simultaneously supported and used for character recognition by a multiple-language recognition method without needing a user to pre-select a language prior to entering handwriting. That is, while a user is inputting handwriting 102, 104, 106, the user device 100 dynamically analyzes the user-input handwriting 102, 104, 106 against all possible languages supported for searches by the user device 100, and dynamically determines the characters and language(s) of the characters in the user-input handwriting 102, 104, 106.
  • Such example techniques are useful for bilingual or multilingual users that store information locally (e.g., information stored on the user device 100) and/or that access remote information (e.g., information accessible from the user device 100 via a network connection and/or the Internet) using multiple languages.
  • The multiple-language recognition method supports recognizing different types of alphabetic and numeric characters including Latin-based alphanumeric characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), etc.
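The multiple-language recognition idea can be illustrated with a toy dictionary. This is an invented Python sketch: the stroke "signatures" stand in for real stroke-feature data, and all names are hypothetical. The point is only that one lookup spans every supported language at once, so no pre-selection of a language is needed.

```python
# One dictionary holds reference entries for every supported language;
# an input is matched against all of them simultaneously.

MULTI_LANGUAGE_DICTIONARY = [
    # (stroke signature, recognized character, language)
    ("sig-latin-C",   "C",  "English"),
    ("sig-hanzi-zhi", "芝", "Chinese"),
    ("sig-digit-5",   "5",  "Numeric"),
]

def recognize(signature):
    """Return (character, language) matches from any supported language."""
    return [(char, lang)
            for sig, char, lang in MULTI_LANGUAGE_DICTIONARY
            if sig == signature]

print(recognize("sig-hanzi-zhi"))   # [('芝', 'Chinese')], no pre-selection needed
```

Because a single signature may legitimately match reference characters in more than one language, the function returns a list, mirroring the idea that different displayed search results can correspond to different languages for the same recognized shape.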
  • The illustrated example of FIG. 1 shows the user device 100 performing a search based on handwritten Latin-based characters of the user-input handwriting 102 comprising English-language characters.
  • The user device 100 of the illustrated example is also shown performing a search based on handwritten logographic characters of the user-input handwriting 104 comprising a Chinese-language character.
  • The user device 100 of the illustrated example is also shown performing a search based on handwritten numeric characters of the user-input handwriting 106 comprising numeric characters.
  • The user device 100 begins recognizing the illustrated characters of FIG. 1 as a user inputs them.
  • FIG. 2A depicts the example user device 100 of FIG. 1 performing a search based on the user-input handwriting 102 comprising English-language characters.
  • The user device 100 monitors for user-input handwriting (e.g., the user-input handwriting 102) on the main screen 110.
  • When the user device 100 detects a start of user-input handwriting, the user device 100 provides a visual cue to a user that the user device 100 has entered a search mode.
  • The user device 100 may provide such a visual cue by dimming a background image (e.g., a wallpaper image) displayed on, for example, the main screen 110.
  • A search method initiated in response to or after detecting the user-input handwriting is a function or process of an operating system of the user device 100.
  • The search method may be implemented using a background application (e.g., a software program 436 of FIG. 4) that is started during a boot process of the operating system of the user device 100 and that runs as a background process during operation of the user device 100.
  • The background application does not require a user to manually launch it to perform a search.
  • A search capability of the user device 100 remains in standby mode as a background process and is automatically activated to perform a search when the user device 100 detects user-input handwriting on the main screen 110. In this manner, a user need only input handwritten characters on the main screen 110 to initiate and perform a search for information matching the handwritten characters.
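One plausible shape for the background search process described above is an event handler registered once at boot that stays idle until handwriting arrives on the main screen. The sketch below is an assumption-laden Python analogy, not the actual operating-system mechanism; all names are invented.

```python
# Hypothetical background search service: registered at "boot", idle in
# standby, activated only when a handwriting event arrives.

class BackgroundSearchService:
    def __init__(self):
        self.active = False   # standby until handwriting is detected

    def on_main_screen_touch(self, is_handwriting):
        """Event handler invoked by the (hypothetical) operating system."""
        if is_handwriting and not self.active:
            self.active = True   # enter search mode, e.g., dim the wallpaper
        return self.active

service = BackgroundSearchService()                  # started during boot
service.on_main_screen_touch(is_handwriting=False)   # a plain tap: stays idle
service.on_main_screen_touch(is_handwriting=True)    # handwriting: activates
```

The design point is that activation is driven by the input event itself, so the user never launches anything manually.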
  • The user device 100 is provided with a search string display area 202 to display characters recognized from the user-input handwriting 102.
  • The user device 100 recognizes the user-input handwriting 102 as including English-language characters (or Latin-based characters) and displays the recognized English-language characters in the search string display area 202.
  • As the user device 100 receives further handwritten characters on the main screen 110, the user device 100 updates the search string in the search string display area 202 with the additional recognized characters.
  • The user device 100 of the illustrated example is also provided with a search results display area 204 to display search results based on the search string shown in the search string display area 202.
  • The search results shown in the search results display area 204 are refined or filtered based on the additional recognized characters shown in the search string of the search string display area 202.
  • The search results in the search results display area 204 show information that matches the search string of the search string display area 202.
  • The search method of the user device 100 guesses or infers a complete word based on a partial input and performs a search based on the guessed or inferred complete word. For example, in FIG. 2A, the search method may infer that the user intends to find information for the word "Chicago" because the characters of the user-input handwriting 102 include the letters "chica," which is a partial spelling of the word "Chicago."
  • In other examples, the search method of the user device 100 does not automatically guess or infer an intended search string (e.g., "Chicago") from a partial word entry (e.g., "chica"), but instead shows search results that contain the character string (e.g., "chica") of any partial handwritten entry.
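The word-inference behavior can be approximated by a prefix match against a word list, with the literal partial entry as the fallback. The word list and function below are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch: complete a partial handwritten entry against a
# (hypothetical) word list by prefix match; fall back to the literal
# partial string when nothing matches, per the "other examples" behavior.

KNOWN_WORDS = ["Chicago", "Chicken", "Boston"]

def infer_search_terms(partial):
    """Words starting with the partial entry, or the partial itself."""
    matches = [w for w in KNOWN_WORDS if w.lower().startswith(partial.lower())]
    return matches or [partial]

print(infer_search_terms("chica"))   # ['Chicago']
print(infer_search_terms("xyz"))     # ['xyz']: no guess; search the literal string
```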
  • The search results display area 204 shows search results in different languages (e.g., English, German, French, Spanish, etc.) simultaneously in instances in which such different-language results contain the characters of the search string.
  • The search results display area 204 shows results of information locally stored on the user device 100 and results of information accessible through network communications (e.g., via an Internet connection) via the user device 100.
  • The search performed by the user device 100 is referred to as a universal search because the areas searched by the search method are not limited to particular applications or files or to local data (as opposed to remote data); instead, the search areas extend to any locally and/or remotely searchable information.
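The "universal search" described above amounts to fanning one search string out to several result providers and concatenating what they return. The providers below are invented stand-ins for a local calendar database and an Internet-search action.

```python
# Sketch of universal search: the searcher is not tied to one application
# or to local data only; every registered provider gets the same query.

def calendar_provider(query):
    """Stand-in for a local calendar/appointments database."""
    events = ["Chicago Meeting", "Dentist"]
    return [e for e in events if query.lower() in e.lower()]

def internet_provider(query):
    """Stand-in for remote search: returns a user-selectable action,
    rather than issuing a real network request."""
    return [f"{query} Internet Search"]

def universal_search(query, providers):
    results = []
    for provider in providers:
        results.extend(provider(query))
    return results

print(universal_search("chica", [calendar_provider, internet_provider]))
# ['Chicago Meeting', 'chica Internet Search']
```

New searchable areas (contacts, cached weather, RSS feeds, etc.) would simply be additional providers in this scheme.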
  • Locally stored information shown in the search results display area 204 includes a "Chicago Meeting" calendar event icon 206 that is stored in a local calendar or appointments database of the user device 100.
  • A user selection of the "Chicago Meeting" calendar event icon 206 causes the user device 100 to display a calendar appointment or event corresponding to the "Chicago Meeting" calendar event icon 206.
  • Remotely stored information accessible through network communications via the user device 100 is represented in the search results display area 204 as a "Chicago Internet Search" icon 208.
  • User selection of the "Chicago Internet Search" icon 208 initiates a search on the Internet for the search string parameter "Chicago."
  • An Internet search icon similar to the "Chicago Internet Search" icon 208 may be shown in the search results display area 204 for the search string parameter "chica" as shown in the search string display area 202 so that an option is available for a user to perform an Internet search on partial words (e.g., "chica") rather than on complete guessed or inferred words (e.g., "Chicago").
  • A "Chicago Weather" icon 210 is shown in the search results display area 204.
  • The "Chicago Weather" icon 210 corresponds to information that was previously retrieved by the user device 100 via the Internet and that is locally cached in a memory of the user device 100. Additionally or alternatively, user selection of the "Chicago Weather" icon 210 may cause the user device 100 to retrieve current weather conditions via the Internet to present to a user.
  • Other search results that may correspond to similarly cacheable and/or instantly retrievable information include news icons, financial data icons, rich site summary (RSS) feed icons, etc.
  • Such icons may correspond to previously retrieved and locally cached information and/or may cause immediate retrieval of news via the Internet when selected by a user.
  • The "Chicago Weather" icon 210 is a live icon that displays current weather conditions (e.g., 50°F and partially cloudy conditions) when displayed in the search results display area 204.
  • Any information displayed in the search results display area 204 may be a live icon if live or updated information is supported or available for that search result information.
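A "live icon" backed by a local cache with optional refresh might look like the following sketch. The weather values and the `fetch_current` callback are invented for illustration; a real device would retrieve current conditions via the Internet.

```python
# Hypothetical live icon: renders immediately from a locally cached value,
# and an optional refresh callback can update the cache so the icon shows
# current conditions when redisplayed.

class LiveIcon:
    def __init__(self, label, cached_value, fetch_current=None):
        self.label = label
        self.value = cached_value        # previously retrieved, locally cached
        self.fetch_current = fetch_current

    def render(self):
        return f"{self.label}: {self.value}"

    def refresh(self):
        if self.fetch_current is not None:
            self.value = self.fetch_current()   # e.g., retrieve via the Internet
        return self.render()

weather = LiveIcon("Chicago Weather", "50°F, partly cloudy",
                   fetch_current=lambda: "48°F, cloudy")
print(weather.render())    # cached conditions shown immediately
print(weather.refresh())   # updated conditions after retrieval
```

Rendering from the cache first keeps the search results responsive even when the network retrieval is slow or unavailable.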
  • FIG. 2B depicts the example user device 100 of FIGS. 1 and 2A performing a search based on the user-input handwriting 104 including a Chinese-language character.
  • A search method of the user device 100 recognizes the user-input handwriting 104 as being a Chinese-language character, and shows the recognized Chinese-language character in the search string display area 202.
  • The recognized Chinese-language character forms part of the Chinese-language word for "Chicago."
  • The search method of the user device 100 provides search results in the search results display area 204 containing the recognized Chinese-language character.
  • The search results display area 204 displays a calendar event icon 222 having the recognized Chinese-language character of the user-input handwriting 104, an Internet search icon 224 for performing an Internet search on the Chinese-language word for "Chicago," which includes the recognized Chinese-language character of the user-input handwriting 104, and a weather icon 226 for displaying weather conditions for the city of Chicago.
  • FIG. 2C depicts the example user device 100 of FIGS. 1, 2A, and 2B performing a search based on the user-input handwriting 106 including numeric characters.
  • The user device 100 recognizes the user-input handwriting 106 as including numeric characters and displays the recognized numeric characters in the search string display area 202.
  • The search string display area 202 displays the recognized numeric characters in a North American telephone number format. However, any other format for displaying the recognized numeric characters may be used.
  • The search method of the user device 100 finds contact information (e.g., in an address or contacts database) having a telephone number including the recognized numeric characters of the user-input handwriting 106.
  • The search results display area 204 of the illustrated example shows user-selectable options to initiate different communications with the found contact.
  • The search results display area 204 includes a telephone dial icon 232 to initiate a call (e.g., dial a telephone number) to the found contact, and an instant text message icon 234 to send a text message to the found contact.
  • User-selectable action icons to, for example, dial telephone numbers of contacts, send messages to contacts, show meeting events with contacts, etc., can be displayed in the search results display area 204 for contacts matching user-input handwriting containing Latin-based characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), and/or any other type of character supported by a multiple-language dictionary of a handwritten character recognition method of the user device 100.
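The numeric-handwriting case (match recognized digits against contacts' telephone numbers, then offer communication actions) can be sketched as follows. The contact data and action names are hypothetical, invented only to illustrate the flow.

```python
# Sketch: recognized digits are matched as a substring of stored contacts'
# telephone numbers, and each hit yields user-selectable actions.

CONTACTS = [
    {"name": "Alice", "phone": "4165550123"},
    {"name": "Bob",   "phone": "9055559876"},
]

def contact_actions(recognized_digits):
    """Contacts whose number contains the digits, with call/text actions."""
    results = []
    for contact in CONTACTS:
        if recognized_digits in contact["phone"]:
            results.append((contact["name"], ["dial", "send text message"]))
    return results

print(contact_actions("555012"))   # [('Alice', ['dial', 'send text message'])]
```

Substring matching (rather than prefix matching) lets a user write any memorable portion of a number, not just its first digits.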
  • FIG. 3 depicts an example electronic device 300 that may be used to implement examples disclosed herein to perform handwriting-initiated searches.
  • The electronic device 300 is provided with example subsystems including: an example processor (or controller) 302, an example input device 304, an example handwriting detector 306, an example character recognizer 308, an example multiple-language dictionary 310, an example searcher 312, an example display 314, and an example memory 316.
  • The subsystems 302, 304, 306, 308, 310, 312, 314, and 316 may be implemented exclusively in hardware, firmware, or software, or in any desired combination of hardware, firmware, and/or software. For example, one or more integrated circuits, discrete semiconductor components, and/or passive electronic components may be used.
  • The subsystems 302, 304, 306, 308, 310, 312, 314, and 316, or parts thereof, could be implemented using one or more circuit(s), programmable processor(s), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), etc.
  • The subsystems 302, 304, 306, 308, 310, 312, 314, and 316, or parts thereof, may be implemented using instructions, code, and/or other software and/or firmware, etc., stored on a machine accessible medium or computer readable medium (e.g., the memory 316) and executable by, for example, a processor (e.g., the example processor 302).
  • At least one of the subsystems 302, 304, 306, 308, 310, 312, 314, and 316 is hereby expressly defined to include a tangible medium such as a solid state memory, a magnetic memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc.
  • The example electronic device 300 of FIG. 3 may include one or more elements, methods, and/or devices in addition to, or instead of, those illustrated in FIG. 3, and/or may include more than one of any or all of the illustrated elements, methods, and devices.
  • The electronic device 300 of the illustrated example is provided with the example processor 302 to control and/or manage operations of the user device 100 of FIGS. 1 and 2A-2C.
  • The processor 302 executes an operating system (e.g., an operating system 434 of FIG. 4) and other software (e.g., software programs 436 of FIG. 4) of the user device 100.
  • The processor 302 makes decisions and facilitates/arbitrates information exchanges between elements of the electronic device 300.
  • The processor 302 receives software and/or hardware interrupts from different subsystems of the user device 100 related to, for example, user input (e.g., touchdown events on a touch screen display of the user device 100, user-selected icons, etc.), operating system services and/or background services (e.g., example operations of the search method disclosed herein), etc.
  • The electronic device 300 is provided with the input device 304 to receive user-input handwriting (e.g., the user-input handwriting 102, 104, 106 of FIGS. 1 and 2A-2C).
  • The input device 304 is implemented using a touch screen display (e.g., the touch screen display 108 of FIG. 1).
  • The input device 304 may additionally or alternatively be implemented using other types of input device interfaces (e.g., a mouse interface, a touch pad interface, a pen tablet interface, etc.) that enable entering handwriting input.
  • The electronic device 300 is provided with the handwriting detector 306 to determine when a user has started inputting handwritten characters (e.g., the user-input handwriting 102, 104, 106) via the input device 304.
  • The input device 304 triggers a touchdown event interrupt at the processor 302 in response to or after a user touching the touch screen display 108 (FIG. 1).
  • The processor 302 processes the touchdown event interrupt by causing the handwriting detector 306 to analyze input touches or strokes on the input device 304 to determine whether the input touches or strokes are handwriting.
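The handwriting detector's decision (is this touch a handwriting stroke or an ordinary tap?) could rest on a simple heuristic such as the number of traced points. The disclosure does not specify how the determination is made; the threshold and function below are illustrative assumptions only.

```python
# Rough sketch of the handwriting-detector step: after a touchdown event,
# the recorded touch points are examined to decide whether they look like
# a handwriting stroke rather than a tap.

MIN_HANDWRITING_POINTS = 8   # hypothetical: taps produce very few points

def is_handwriting(stroke_points):
    """Treat a touch as handwriting if it traces enough distinct points."""
    return len(stroke_points) >= MIN_HANDWRITING_POINTS

tap = [(100, 200)] * 2
stroke = [(x, 100 + (x % 7)) for x in range(30)]   # a wandering stroke
print(is_handwriting(tap))      # False: likely an icon tap
print(is_handwriting(stroke))   # True: hand over to the character recognizer
```

A production detector would likely also consider stroke length, curvature, and timing, but the division of labor matches the text: the input device raises the interrupt, and the detector classifies the touch before any character recognition runs.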
  • the electronic device 300 is provided with the character recognizer 308 to analyze user-input handwriting (e.g., the user-input handwriting 102, 104, 106) and recognize characters and/or words of the user-input handwriting.
  • the electronic device 300 of the illustrated example is provided with the multiple-language dictionary 310 to enable the character recognizer 308 to recognizing characters and/or words from multiple different languages using a multiple-language recognition process.
  • the character recognizer 308 may compare stroke features of the user-input handwriting 102, 104, 106 to reference characters in the multiple- language dictionary 310 to find matches or substantially close matches (e.g., within a threshold) between reference characters in the multiple-language dictionary 310 and the user-input handwriting 102, 104, 106.
  • the multiple-language dictionary 310 of the illustrated example is configurable to support any quantity of and types of languages including languages with Latin- based alphanumeric characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), and/or any other types of characters.
  • the character recognizer 308 of the illustrated example is configured to automatically detect the language of user-input handwriting and to dynamically switch between different detected languages supported in the multiple-language dictionary 310 without requiring a user to pre-select or pre-inform the user device 100 of a language in which the user is providing handwriting input.
  • a user may provide the user-input handwriting 102 containing English-language characters at a first time (e.g., 10:08 AM), provide the user-input handwriting 104 containing Chinese-language characters at a second time (e.g., 10:09 AM), and provide the user-input handwriting 106 containing numeric characters at a third time (e.g., 10:10 AM) without needing to pre-select the English language, the Chinese language, or numeric-type entry before providing such user-input handwriting 102, 104, 106.
  • the character recognizer 308 automatically detects languages of user-input handwriting using the multiple-language dictionary 310.
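The recognition step described above can be sketched as follows. This is an illustrative sketch only: the dictionary contents, the `similarity` stand-in, and the 0.8 threshold are assumptions rather than details taken from the disclosure, and a real recognizer would compare stroke features rather than strings.

```python
# A toy "multiple-language dictionary": reference entries keyed by language.
# Languages can be mixed freely without the user pre-selecting one.
MULTI_LANG_DICTIONARY = {
    "english": ["C", "h", "i", "c", "a"],
    "chinese": ["\u4e2d", "\u6587"],
    "numeric": ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"],
}

def similarity(stroke_features, reference):
    # Stand-in for real stroke-feature matching: exact match only.
    return 1.0 if stroke_features == reference else 0.0

def recognize_character(stroke_features, threshold=0.8):
    """Return (language, character) for the dictionary entry that best
    matches the input stroke features, or None if no match clears the
    threshold (i.e., no substantially close match was found)."""
    best = None
    best_score = 0.0
    for language, characters in MULTI_LANG_DICTIONARY.items():
        for reference in characters:
            score = similarity(stroke_features, reference)
            if score > best_score:
                best, best_score = (language, reference), score
    return best if best_score >= threshold else None
```

Because every supported language is scanned on each call, the language of the input is detected as a by-product of recognition, mirroring the automatic language switching described above.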
  • the electronic device 300 is provided with the searcher 312 to perform search operations for information stored locally on the user device 100 and/or stored at a remote location accessible through network communications via the user device 100.
  • the searcher 312 uses the recognized characters displayed in the search string display area 202 (FIGS. 2A-2C) as a search string parameter to find matching information.
  • the searcher 312 then causes the user device 100 to display search results (e.g., the search results 206, 208, 210 of FIG. 2A, the search results 222, 224, 226 of FIG. 2B, and/or the search results 232, 234 of FIG. 2C) in the search results display area 204 of FIGS. 2A-2C.
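One way the searcher's matching step could look is sketched below. The item list and the substring-matching rule are assumptions for illustration; the disclosure does not specify how matches are computed.

```python
# Hypothetical locally stored items the searcher might scan.
LOCAL_ITEMS = [
    "Chicago flight confirmation",
    "Chicago Deep Dish Pizzeria",
    "Chichen Itza photos",
    "Grocery list",
]

def search(search_string, items=LOCAL_ITEMS):
    """Return every item containing the search string (case-insensitive),
    mimicking use of the recognized characters as a search string parameter."""
    needle = search_string.lower()
    return [item for item in items if needle in item.lower()]
```

For example, once the strokes "Chic" have been recognized, `search("Chic")` would already surface the Chicago-related items, and each additional recognized character narrows the matches further.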
  • the electronic device 300 is provided with the display 314 (e.g., the touch screen display 108 of FIG. 1) that is configured to present graphical user interfaces (e.g., the main screen 110 of FIG. 1, the search string display area 202 of FIGS. 2A- 2C, the search results display area 204 of FIGS. 2A-2C, etc.).
  • the display 314 is a liquid crystal display. Additionally or alternatively, the display 314 may be made of any other suitable type of display technology such as e-paper displays, cathode ray tube (CRT) displays, light-emitting diode (LED) displays, etc.
  • the electronic device 300 is provided with the memory 316 to locally store information, software, firmware, etc. at the user device 100.
  • the memory 316 may be a mass storage memory, a magnetic or optical memory, a non-volatile integrated circuit memory, or a volatile memory. That is, the memory 316 may be any tangible medium such as a solid state memory, a magnetic memory, a DVD, a CD, a BluRay disk, etc.
  • FIG. 4 depicts a block diagram of an example implementation of a processor system that may be used to implement the user device 100 of FIGS. 1 and 2A-2C and/or the electronic device 300 of FIG. 3.
  • the user device 100 is a two-way communication device with advanced data communication capabilities including the capability to communicate with other wireless-enabled devices or computer systems through a network of transceiver stations.
  • the user device 100 may also have the capability to allow voice communication.
  • although FIG. 4 depicts an example implementation of the user device 100 as having a number of components, in some example implementations some of the components shown in FIG. 4 may be omitted and/or may be externally connected to the user device 100 (e.g., via interface port(s) and/or via wireless interface(s)). To aid the reader in understanding the structure of the user device 100 and how it communicates with other devices and host systems, FIG. 4 will now be described in detail.
  • the user device 100 includes a number of components such as a main processor 402 (e.g., similar or identical to the processor 302 of FIG. 3) that is directly or indirectly connected to the other components and controls the overall operation of the user device 100.
  • Communication functions, including data and voice communications, are performed through a communication subsystem 404.
  • the communication subsystem 404 receives messages from and sends messages to a wireless network 405.
  • the communication subsystem 404 may be configured in accordance with the Global System for Mobile Communication (GSM) and General Packet Radio Services (GPRS) standards, the Enhanced Data GSM Environment (EDGE) and Universal Mobile Telecommunications Service (UMTS) standards, Long Term Evolution (LTE) standards, Wi-Fi standards, MOBITEX® and DATATAC® communication standards, Personal Communication Systems (PCS) standards, Time Division Multiple Access (TDMA) standards, and/or Code Division Multiple Access (CDMA) standards.
  • the main processor 402 also interacts with additional subsystems such as a Random Access Memory (RAM) 406, a persistent memory 408 (e.g., a non-volatile memory), a display 410, an auxiliary input/output (I/O) subsystem 412, a data port 414, a keyboard 416, a speaker 418, a microphone 420, short-range communications 422, and other device subsystems 424.
  • the user device 100 is provided with a SIM/RUIM card 426 (i.e., a Subscriber Identity Module or a Removable User Identity Module) to be inserted into a SIM/RUIM interface 428 in order to communicate with a network.
  • the SIM card or RUIM 426 is a type of smart card that can be used to identify a subscriber of the user device 100 and to personalize the user device 100, among other things.
  • search operations disclosed herein may be used to search information (e.g., address book entries, calendar entries, notes, etc.) stored on the SIM card or RUIM 426.
  • the user device 100 is a battery-powered device and includes a battery interface 432 for receiving one or more rechargeable batteries 430.
  • future technologies such as micro fuel cells or other suitable power sources may provide power to the user device 100.
  • the user device 100 also includes an operating system 434 and software programs 436.
  • the operating system 434 and the software components 436 that are executed by the main processor 402 are typically stored in a persistent store such as the persistent memory 408, which may alternatively be a read-only memory (ROM) or similar storage element (not shown).
  • portions of the operating system 434 and/or the software components 436 may be temporarily loaded into and executed from a volatile store such as the RAM 406.
  • the software programs 436 may include software applications that control basic device operations, including data and voice communication applications.
  • Other software applications include, for example, message applications, personal information manager (PIM) applications (e.g., applications to view, create, and/or manage e-mail, contacts, calendar events, voice mails, appointments, task items, etc.), and other suitable software applications.
  • Other types of software applications may include third party applications, which are added after the manufacture of the user device 100. Examples of third party applications include games, calculators, utilities, productivity applications, etc.
  • the data port 414 may be any suitable port that enables data communication between the user device 100 and another computing device for transferring data therebetween.
  • the data port 414 can be a serial or a parallel port.
  • the data port 414 may be a universal serial bus (USB) port that includes data lines for data transfer and a supply line that can provide a charging current to charge the battery 430 of the user device 100.
  • the short-range communications subsystem 422 provides for communication between the user device 100 and different systems or devices, without the use of the wireless network 405.
  • the subsystem 422 may include an infrared device and associated circuits and components for short-range communication.
  • a received signal such as a text message, an e-mail message, web page download, media content, etc. will be processed by the communication subsystem 404 and input to the main processor 402.
  • the main processor 402 will then process the received signal for output to the display 410 or alternatively to the auxiliary I/O subsystem 412.
  • a subscriber may also compose data items, such as e-mail messages, for example, using the keyboard 416 in conjunction with the display 410 and possibly the auxiliary I/O subsystem 412.
  • the auxiliary subsystem 412 may include input devices such as a touch screen display, a mouse, a track ball, a track pad, an infrared fingerprint detector, or a roller wheel with dynamic button pressing capability.
  • the keyboard 416 is preferably an alphanumeric keyboard and/or telephone-type keypad. However, other types of keyboards may also be used.
  • a composed item may be transmitted over the wireless network 405 through the communication subsystem 404.
  • a voice message recording subsystem can also be implemented on the user device 100.
  • although voice or audio signal output is accomplished primarily through the speaker 418, the display 410 can also be used to provide additional information such as the identity of a calling party, the duration of a voice call, or other voice call related information.
  • FIG. 5 depicts an example flow diagram representative of machine readable instructions that may be used to implement the electronic device 300 of FIG. 3 and/or the user device 100 of FIGS. 1 and 2A-2C to initiate and perform searches in response to or after detecting user-input handwriting.
  • the example method of FIG. 5 may be performed using one or more processors, controllers, and/or any other suitable processing devices (e.g., the example processor 302 of FIG. 3 and/or the example processor 402 of FIG. 4).
  • the example method of FIG. 5 may be implemented using coded instructions (e.g., computer readable instructions) stored on one or more tangible computer readable storage media (e.g., a storage device or storage disk) such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a compact disc-read only memory (CD-ROM), a floppy disk, a hard drive, a DVD, a BluRay disk, and/or a memory associated with the processor 302 and/or the processor 402.
  • the method and/or parts thereof could alternatively be executed by a device other than the processor 302 and/or the processor 402 and/or embodied in firmware or dedicated hardware.
  • the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage medium and to exclude propagating signals.
  • the example method of FIG. 5 may be implemented using coded instructions (e.g., computer readable instructions) stored on one or more non-transitory computer readable storage media such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory, or any other storage media in which information is stored for any duration (e.g., for extended time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information).
  • as used herein, the term non-transitory computer readable storage medium is expressly defined to include any type of computer readable medium and to exclude propagating signals.
  • the example method of FIG. 5 may be implemented using an application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, software, etc.
  • the example method of FIG. 5 may be implemented manually or as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware.
  • the example method of FIG. 5 is described with reference to the flow diagram of FIG. 5, other methods of implementing the method of FIG. 5 may be employed. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, sub-divided, or combined.
  • example method of FIG. 5 may be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.
  • when the phrase "at least" is used as the transition term in a preamble of a claim, it is open-ended in the same manner as the term "comprising" is open-ended.
  • a claim using "at least" as the transition term in its preamble may include elements in addition to those expressly recited in the claim.
  • the handwriting detector 306 (FIG. 3) detects an initiation or start of handwriting (block 502). For example, the handwriting detector 306 detects when a user begins entering user-input handwriting on the input device 304 (FIG. 3). For example, in the illustrated examples of FIGS. 1 and 2A-2C, the handwriting detector 306 may detect a user-input stroke drawing the letter "C" of "Chica" in the user-input handwriting 102 or a user-input stroke beginning to draw the Chinese-language character of the user-input handwriting 104.
  • the start of user-input handwriting generates a touch event corresponding to the touch screen display 108 (FIG. 1) of the user device 100.
  • the touch event is generated by the operating system 434 (FIG. 4) of the user device 100 in response to or after detecting a touch on the touch screen display 108 that triggers an interrupt (e.g., a touch event interrupt) in the processor 302, causing the processor 302 to respond to the user-input handwriting.
  • the touch event is generated from a stroke of the user-input handwriting that forms a character to be included in a search string for the search operation.
  • the processor 302 initiates a search method in response to or after detecting the handwriting (block 504).
  • the handwriting detector 306 can send the processor 302 a confirmation that handwriting is being entered on the input device 304, and the processor 302 may respond by automatically activating the character recognizer 308 (FIG. 3) to recognize input characters and by causing the searcher 312 (FIG. 3) to perform a search method for information containing the recognized characters.
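The control flow at blocks 502 through 508 can be sketched as the event-driven pipeline below. All class and method names here are hypothetical, and the handwriting heuristic is a placeholder; the point is only that a touch event alone activates recognition and search, with no search button or gesture involved.

```python
class HandwritingSearchController:
    """Sketch of a controller that turns a touchdown event into a search.

    `recognizer` maps one stroke to a character; `searcher` maps a search
    string to a list of results. Both are injected so the flow is testable.
    """

    def __init__(self, recognizer, searcher):
        self.recognizer = recognizer
        self.searcher = searcher
        self.search_string = ""

    def on_touch_event(self, strokes):
        """Entry point for a touchdown event interrupt."""
        if self.is_handwriting(strokes):          # block 502: detect handwriting
            return self.initiate_search(strokes)  # block 504: initiate search
        return None

    def is_handwriting(self, strokes):
        # Placeholder heuristic: any non-empty stroke sequence counts.
        return bool(strokes)

    def initiate_search(self, strokes):
        for stroke in strokes:                    # block 506: recognize characters
            self.search_string += self.recognizer(stroke)
        return self.searcher(self.search_string)  # block 508: perform the search
```

With this shape, the search string accumulates across touch events, so each new stroke implicitly refines the query.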
  • the character recognizer 308 recognizes one or more character(s) input via the handwriting (block 506).
  • the character recognizer 308 uses the multiple-language dictionary 310 (FIG. 3) to perform a multiple-language recognition method that supports recognizing characters in the handwriting from any supported language, alphabet and/or numeric system.
  • the multiple-language recognition method may be used to detect English-language characters as in the user-input handwriting 102, Chinese-language characters as in the user-input handwriting 104, and/or numeric characters as in the user-input handwriting 106.
  • the searcher 312 performs a search operation using the recognized characters as a search string (block 508).
  • the searcher 312 automatically begins performing the search as soon as the character recognizer 308 recognizes one or more characters to form the search string parameter, without needing a user to press a search button or enter a search gesture to start the search process.
  • the display 314 (FIG. 3) displays the search results (block 510).
  • the display 314 may display the search results (e.g., the search results 206, 208, 210 of FIG. 2A, the search results 222, 224, 226 of FIG. 2B, or the search results 232, 234 of FIG. 2C) in the search results display area 204 of FIGS. 2A-2C.
  • the handwriting detector 306 determines whether additional handwriting has been received (block 512) via, for example, the input device 304 of FIG. 3. If additional handwriting is detected at block 512, the processor 302 initiates a refined search method (block 514). In the illustrated example, the processor 302 responds to the additional handwriting by returning control to block 506 so that the character recognizer 308 and the searcher 312 can implement the refined search method by performing the operations of blocks 506 and 508 on the additional handwriting. For example, the processor 302 causes the character recognizer 308 to recognize the additional input characters and causes the searcher 312 to refine the search results based on the additional recognized characters (e.g., as appended to the previous search string formed during a previous iteration of block 506). In some examples, the searcher 312 may refine the search results by applying the additional recognized characters as a filter to the search results already found during a previous iteration of block 508, filtering out any search results that do not contain the additional recognized characters.
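The filtering style of refinement described above, where newly recognized characters prune results already found rather than triggering a fresh search, can be sketched as follows. The result format (plain strings) and case-insensitive containment test are assumptions for illustration.

```python
def refine_results(previous_results, additional_characters):
    """Keep only previously found search results that also contain the
    additional recognized characters (case-insensitive), filtering out
    any results that do not contain them."""
    needle = additional_characters.lower()
    return [result for result in previous_results if needle in result.lower()]
```

For example, if a prior iteration found results for "Chic", recognizing the additional characters "ago" narrows those same results without rescanning the full information store.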
  • the processor 302 awaits a user selection of a search result (block 516). If the input device 304 receives a user selection of a search result at block 516, the processor 302 performs a method corresponding to the user selection (block 520), and the example method of FIG. 5 ends. If the input device 304 does not receive a user selection of a search result at block 516 (e.g., within a threshold duration), the processor 302 determines whether it should continue to wait for a user selection (block 522). In some examples, the processor 302 may be configured to clear search results after a particular duration if a user selection is not received.
  • Such clearing of search results may be implemented for security purposes, to return the user device 100 to a ready state to receive other handwriting for a different search, to return to a main screen (e.g., the main screen 110 of FIG. 1), and/or for any other suitable reason. If the processor 302 determines that it should continue to wait for a user selection, control returns to block 516. Otherwise, the example method of FIG. 5 ends.
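The wait-for-selection logic at blocks 516 through 522 amounts to polling with a deadline. A minimal sketch, assuming a hypothetical `poll_selection` callable and arbitrary timeout values not taken from the disclosure:

```python
import time

def await_selection(poll_selection, timeout_s=5.0, poll_interval_s=0.05):
    """Poll for a user selection of a search result; return the selection,
    or None (signaling that results should be cleared) on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:            # block 522: keep waiting?
        selection = poll_selection()              # block 516: selection received?
        if selection is not None:
            return selection                      # block 520 would act on it
        time.sleep(poll_interval_s)
    return None                                   # timeout: clear search results
```

A `None` return here corresponds to the security-motivated clearing of stale search results and the return to a ready state described above.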

Abstract

A disclosed example method involves, in response to or after detecting user-input handwriting on an operating system main screen of a device, performing a search operation of information accessible via the device based on the user-input handwriting. The example method also involves displaying the search results of the search operation.

Description

HANDWRITING-INITIATED SEARCH
FIELD OF THE DISCLOSURE
[0001] The present disclosure relates generally to electronic devices and, more particularly, to handwriting-initiated searches.
BACKGROUND
[0002] Electronic computing devices offer access to large amounts of information. Such information can be stored locally on the computing devices and/or at remotely located devices accessible via network communications. In addition, such information can be stored in any of a number of different formats (e.g., different file formats, different database structures, etc.). Some computing devices offer search applications to enable users to search for information based on different user-provided search strings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] FIG. 1 depicts an example electronic device that initiates and performs a search in response to or after detecting user-input handwriting in accordance with the teachings of this disclosure.
[0004] FIG. 2A depicts the example electronic device of FIG. 1 performing a search based on user-input English-language handwriting.
[0005] FIG. 2B depicts the example electronic device of FIG. 1 performing a search based on user-input Chinese-language handwriting.
[0006] FIG. 2C depicts the example electronic device of FIG. 1 performing a search based on user-input numeric handwriting.
[0007] FIG. 3 depicts an example electronic device that may be used to implement examples disclosed herein to initiate and perform searches in response to or after detecting user-input handwriting.
[0008] FIG. 4 depicts an example block diagram of a processor system that may be used to implement the electronic device of FIGS. 1 and 2A-2C and/or the electronic device of FIG. 3.
[0009] FIG. 5 depicts an example flow diagram representative of machine readable instructions that may be used to implement the electronic device of FIG. 3 to initiate and perform searches in response to or after detecting user-input handwriting.
DETAILED DESCRIPTION
[0010] Although the following discloses example methods, apparatus, and articles of manufacture including, among other components, software executed on hardware, it should be noted that such methods, apparatus, and articles of manufacture are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, while the following discloses example methods, apparatus, and articles of manufacture, persons having ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such methods, apparatus, and articles of manufacture.
[0011] It will be appreciated that, for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of examples disclosed herein. However, it will be understood by those of ordinary skill in the art that examples disclosed herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure examples disclosed herein. Also, the description is not to be considered as limiting the scope of examples disclosed herein.
[0012] Example methods, apparatus, and articles of manufacture are disclosed herein in connection with electronic devices, which may be any stationary device or mobile device.
Stationary devices include, for example, desktop computers, computer terminals, kiosks, etc. Mobile devices include, for example, mobile communication devices, mobile computing devices, etc. Mobile devices, also referred to as terminals, wireless terminals, mobile stations, communication stations, or user equipment (UE), may include mobile smart phones (e.g., BlackBerry® smart phones), wireless personal digital assistants (PDA), tablets (e.g., the BlackBerry® Playbook tablet device), laptop/notebook/netbook computers, etc. Examples disclosed herein may be used in connection with electronic devices with or without network communication capabilities (e.g., with or without wired and/or wireless communication adapters). In examples disclosed herein, such electronic devices will also be referred to as user devices via which users can initiate searches using user-input handwriting to find information.
[0013] Example methods, apparatus, and articles of manufacture disclosed herein may be used to facilitate searching, finding, viewing, and/or interacting with information stored locally on a user device and/or stored at a remote location accessible through network communications via the user device. In examples disclosed herein, searching is initiated on a user device in response to or after detecting user-input handwriting of characters used as a search string for a search operation. In disclosed examples, a user initiates a search operation by inputting on a touch-sensitive display (e.g., using a finger or stylus) handwritten text including characters and/or words for which to search using the search operation. Examples disclosed herein enable initiating and performing a search without needing a user to first initiate (e.g., launch) a search application before entering the handwritten text or to press a search button or enter a search gesture to initiate the search. In some disclosed examples, an operating system process is configured to monitor for handwriting input on a screen (e.g., a main screen (sometimes referred to as a home screen), or search screen) of the operating system displayed on the user device. In some such examples, in response to or after detecting user-input handwriting (e.g., on the operating system main screen of the user device), the user device initiates and performs a search of information accessible via the user device based on the user-input handwriting. In such examples, characters recognized from the user-input handwriting are used to form a search string for the search. The user device then displays search results of the search. In some examples, the user device displays the search results while further user-input handwriting is input by a user. In some such examples, the user device refines (e.g., filters) the search results based on the further user-input handwriting.
[0014] In some examples, the user device initiates and performs the search of information by receiving a touch event corresponding to a touch-sensitive interface (e.g., a touch screen display) of the user device. In some such examples, the touch event corresponds to a character of the user-input handwriting used in a search string of the search operation. In some such examples, the touch event is generated by the operating system of the user device (e.g., based on an interrupt at a processor) in response to or after detecting a touch on a touch screen display of the user device.
[0015] In some disclosed examples, a user device analyzes user-input handwriting using a multiple-language recognition method to determine that the user-input handwriting includes one or more characters from one or more languages supported by a multiple-language dictionary of the multiple-language recognition process. In some examples, different displayed search results correspond to different languages having a same character recognized in the user-input handwriting.
[0016] FIG. 1 depicts an example user device 100 (e.g., an electronic device) that initiates and performs a search in response to or after detecting user-input handwriting 102, 104, 106 in accordance with the teachings of this disclosure. In the illustrated example, the user device 100 is configured to monitor for handwriting input (e.g., the user-input handwriting 102, 104, 106) on a touch screen display 108. In the illustrated example, the user device 100 is configured to monitor for the user-input handwriting 102, 104, 106 on a main screen 110 (e.g., an operating system main screen) of an operating system (e.g., an operating system 434 of FIG. 4).
Alternatively or additionally, other screens or screen areas (e.g., a search screen or a search input area) may be used to monitor for and receive user-input handwriting. In response to or after detecting a start of handwriting input (e.g., the user-input handwriting 102, 104, 106), the user device 100 of the illustrated example initiates a search process for information based on characters recognized in the handwriting input.
[0017] In the illustrated example, the user device 100 is configured to initiate and perform a search for information in response to or after detecting handwriting input (e.g., the user-input handwriting 102, 104, 106) on the touch screen display 108 without requiring a user to first initiate (e.g., launch) a search application before entering the handwriting and without requiring a user to press a search button or enter a search gesture (e.g., a dedicated gesture that does not form part of a character search string but that signals a request to perform a search based on a separately provided search string) to initiate the search. For example, in the example user-input handwriting 102, the user device 100 initiates a search method in response to or after detecting a user-input stroke drawing the letter "C" of "Chica" in the user-input handwriting 102. Initiating and performing a search based on handwriting input as disclosed herein, without requiring a separate application to be launched by a user, and without requiring a user to press a search button or input a separate search gesture, enables relatively quicker searches using more user-friendly input techniques. Because a user need not launch a separate search application, press a search button, or enter a search gesture, the user can more quickly access search results shortly after inputting the first few handwritten strokes of a character (e.g., a character of the user-input handwriting 102, 104, 106). For example, techniques disclosed herein enable a user to perform a search using one-handed operation of the user device 100 by holding the user device 100 in one hand and using the thumb of that hand to input the user-input handwriting 102, 104, 106. Such one-handed handwriting can be input with less accuracy and requires less fine motor skill than other text input techniques (e.g., typing on a keyboard).
For example, handwriting-initiated search operations disclosed herein are useful when a user is walking (e.g., going to a meeting, rushing to an airport terminal, etc.) and/or performing other tasks (e.g., writing notes on paper, holding other objects such as airline tickets, luggage, briefcases, beverages, etc.) that keep the user from dedicating all of their efforts to inputting search parameters on the user device 100. Regardless of whether a user uses one-handed operation or two-handed operation of the user device 100, examples disclosed herein enable initiating and performing relatively quicker and more user-friendly searches than prior techniques by initiating and performing searches of information in response to or after detecting that a user has started inputting handwriting (e.g., the user-input handwriting 102, 104, 106) rather than requiring a user to launch a separate application, press a search button, or enter a search gesture to initiate and perform a search. In this manner, a user can pick up the user device 100 and readily input handwritten search parameters (e.g., the user-input handwriting 102, 104, 106) on, for example, the main screen 110 without needing to expend efforts to, for example, launch a search application, press a search button and/or enter a search gesture to initiate a search.
[0018] In examples disclosed herein, the user device 100 is configured to recognize characters of user-input handwriting 102, 104, 106 using a multiple-language recognition method that enables a user to input written language characters of any language supported by the user device 100 for performing searches. An example multiple-language recognition method of the user device 100 is configured to analyze user-input handwriting 102, 104, 106 based on a multiple-language dictionary (e.g., a multiple-language dictionary 310 of FIG. 3) to identify characters from any language supported in the multiple-language dictionary as corresponding to the user-input handwriting 102, 104, 106. In the illustrated examples, any number of languages in the multiple-language dictionary can be simultaneously supported and used for character recognition by a multiple-language recognition method without needing a user to pre-select a language prior to entering handwriting. That is, while a user is inputting handwriting 102, 104, 106, the user device 100 dynamically analyzes the user-input handwriting 102, 104, 106 against all possible languages supported for searches by the user device 100, and dynamically determines the characters and language(s) of the characters in the user-input handwriting 102, 104, 106. Such example techniques are useful for bilingual or multilingual users that store information locally (e.g., information stored on the user device 100) and/or that access remote information (e.g., information accessible from the user device 100 via a network connection and/or the Internet) using multiple languages. 
In some examples, the multiple-language recognition method supports recognizing different types of alphabetic and numeric characters including Latin-based alphanumeric characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), etc.
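The multiple-language recognition described above can be sketched as a lookup that consults every supported language at once, so the user never pre-selects a language. The class name, method names, and the stroke-signature keys below are hypothetical stand-ins; a real recognizer would compare extracted stroke features rather than exact strings:

```python
# Sketch of a multiple-language dictionary that recognizes a character
# from any supported language without a language being pre-selected.
# Signatures here are toy stand-ins for stroke-feature data.

from dataclasses import dataclass

@dataclass(frozen=True)
class Match:
    character: str
    language: str

class MultiLanguageDictionary:
    def __init__(self):
        # language -> {stroke signature: character}
        self.entries = {}

    def add(self, language, signature, character):
        self.entries.setdefault(language, {})[signature] = character

    def recognize(self, signature):
        # Every supported language is consulted simultaneously.
        for language, chars in self.entries.items():
            if signature in chars:
                return Match(chars[signature], language)
        return None  # no supported language matches

d = MultiLanguageDictionary()
d.add("en", "|_-", "C")       # a Latin-based character
d.add("zh", "一丿丨", "芝")    # a logographic character
d.add("num", "s-", "5")       # a numeric character

print(d.recognize("一丿丨"))  # matches the Chinese entry with no language pre-selection
```

Any number of languages can be registered this way; adding a language is a data change, not a code change, which mirrors the configurable dictionary described above.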
[0019] The illustrated example of FIG. 1 shows the user device 100 performing a search based on handwritten Latin-based characters of the user-input handwriting 102 comprising English-language characters. The user device 100 of the illustrated example is also shown performing a search based on handwritten logographic characters of the user-input handwriting 104 comprising a Chinese-language character. The user device 100 of the illustrated example is also shown performing a search based on handwritten numeric characters of the user-input handwriting 106 comprising numeric characters. In the illustrated example, the user device 100 begins recognizing the illustrated characters of FIG. 1 in any supported language in response to or after detecting the user-input handwriting 102, 104, 106 without needing a user to pre-select or specify the language used by the user to enter the user-input handwriting 102, 104, 106.
[0020] FIG. 2A depicts the example user device 100 of FIG. 1 performing a search based on the user-input handwriting 102 comprising English-language characters. In the illustrated example, the user device 100 monitors for user-input handwriting (e.g., the user-input handwriting 102) on the main screen 110. In some examples, when the user device 100 detects a start of user-input handwriting, the user device 100 provides a visual cue to a user that the user device 100 has entered a search mode. In some such examples, the user device 100 may provide such a visual cue by dimming a background image (e.g., a wallpaper image) displayed on, for example, the main screen 110. In the illustrated example, a search method initiated in response to or after detecting the user-input handwriting is a function or process of an operating system of the user device 100. In other examples, the search method may be implemented using a background application (e.g., a software program 436 of FIG. 4) that is started up during a boot process of the operating system of the user device 100 and that runs as a background process during operation of the user device 100. However, the background application does not require a user to manually launch it to perform a search. Instead, a search capability of the user device 100 remains in standby mode as a background process and is automatically activated to perform a search when the user device 100 detects user-input handwriting on the main screen 110. In this manner, a user need only input handwritten characters on the main screen 110 to initiate and perform a search for information matching the handwritten characters.
[0021] In the illustrated example, the user device 100 is provided with a search string display area 202 to display characters recognized from the user-input handwriting 102. In the illustrated example, the user device 100 recognizes the user-input handwriting 102 as including English-language characters (or Latin-based characters) and displays the recognized English-language characters in the search string display area 202. As the user device 100 receives further handwritten characters on the main screen 110, the user device 100 updates the search string in the search string display area 202 with the additional recognized characters. The user device 100 of the illustrated example is also provided with a search results display area 204 to display search results based on the search string shown in the search string display area 202. As the user device 100 receives further handwritten characters on the main screen 110, the search results shown in the search results display area 204 are refined or filtered based on the additional recognized characters shown in the search string of the search string display area 202.

[0022] In the illustrated example, the search results in the search results display area 204 show information that matches the search string of the search string display area 202. In some examples, the search method of the user device 100 guesses or infers a complete word based on a partial input and performs a search based on the guessed or inferred complete word. For example, in FIG. 2A, the search method may infer that the user intends to find information for the word "Chicago" because the characters of the user-input handwriting 102 include the letters "chica," which is a partial spelling of the word "Chicago."
In other examples, the search method of the user device 100 does not automatically guess or infer an intended search string (e.g., "Chicago") from a partial word entry (e.g., "chica"), but instead shows search results that contain the character string (e.g., "chica") of any partial handwritten entry. In some examples, the search results display area 204 shows search results in different languages (e.g., English, German, French, Spanish, etc.) simultaneously in instances in which such different-language results contain the characters of the search string.
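The incremental refinement and partial-word inference described in paragraphs [0021] and [0022] can be sketched as follows. The item list, word list, and function names below are illustrative stand-ins, not part of the disclosure:

```python
# Sketch of incremental search refinement: each newly recognized
# character narrows the result set, and a partial entry such as
# "chica" can be completed to a known word such as "Chicago".
# All data here is illustrative.

known_items = ["Chicago Meeting", "Chicago Weather", "Chicane playlist", "Berlin Trip"]
known_words = ["Chicago", "Chicane", "Berlin"]

def infer_word(partial, words=known_words):
    # Complete a partial entry to the first known word it prefixes.
    for word in words:
        if word.lower().startswith(partial.lower()):
            return word
    return partial  # no inference possible; search the literal string

def refine(search_string, items=known_items):
    # Show every item containing the (case-insensitive) search string.
    return [item for item in items if search_string.lower() in item.lower()]

search = ""
for ch in "chica":            # characters arrive one stroke at a time
    search += ch
    results = refine(search)  # the result set shrinks as the string grows

print(infer_word("chica"))    # → "Chicago"
print(refine("chica"))        # → ['Chicago Meeting', 'Chicago Weather', 'Chicane playlist']
```

Either behavior described above (searching the inferred word or the literal partial string) can be built from these two functions.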
[0023] In the illustrated example, the search results display area 204 shows results of information locally stored on the user device 100 and results of information accessible through network communications (e.g., via an Internet connection) via the user device 100. In some examples, the search performed by the user device 100 is referred to as a universal search because the areas searched by the search method are not limited to particular applications or files or to local data (as opposed to remote data), but instead the search areas extend to any locally and/or remotely searchable information.
[0024] In the illustrated example, locally stored information shown in the search results display area 204 includes a "Chicago Meeting" calendar event icon 206 that is stored in a local calendar or appointments database of the user device 100. A user selection of the "Chicago Meeting" calendar event icon 206 causes the user device 100 to display a calendar appointment or event corresponding to the "Chicago Meeting" calendar event icon 206.
[0025] In the illustrated example, remotely stored information accessible through network communications via the user device 100 is represented in the search results display area 204 as a "Chicago Internet Search" icon 208. In the illustrated example, user-selection of the "Chicago Internet Search" icon 208 initiates a search on the Internet for the search string parameter "Chicago." Additionally or alternatively, an Internet search icon similar to the "Chicago Internet Search" icon 208 may be shown in the search results display area 204 for the search string parameter "chica" as shown in the search string display area 202 so that an option is available for a user to perform an Internet search on partial words (e.g., "chica") rather than on complete guessed or inferred words (e.g., "Chicago").
[0026] In the illustrated example, a "Chicago Weather" icon 210 is shown in the search results display area 204. In some examples, the "Chicago Weather" icon 210 corresponds to information that was previously retrieved by the user device 100 via the Internet and that is locally cached at a memory of the user device 100. Additionally or alternatively, user-selection of the "Chicago Weather" icon 210 may cause the user device 100 to retrieve current weather conditions via the Internet to present to a user. Other search results which may correspond to similarly cacheable and/or instantly retrievable information may be news icons, financial data icons, rich site summary (RSS) feed icons, etc. For example, such icons (e.g., a Chicago news icon, a ticker symbol icon for stock quotes, etc.) may correspond to previously retrieved and locally cached information and/or may cause immediate retrieval of news via the Internet when selected by a user.

[0027] In some examples, the "Chicago Weather" icon 210 is a live icon that displays current weather conditions (e.g., 50°F and partially cloudy conditions) when displayed in the search results display area 204. In some examples, any information displayed in the search results display area 204 may be a live icon if live or updated information is supported or available for that search result information.
[0028] FIG. 2B depicts the example user device 100 of FIGS. 1 and 2A performing a search based on the user-input handwriting 104 including a Chinese-language character. In the illustrated example, a search method of the user device 100 recognizes the user-input
handwriting 104 as being a Chinese-language character, and shows the recognized Chinese-language character in the search string display area 202. In the illustrated example, the recognized Chinese-language character forms part of the Chinese-language word for "Chicago." The search method of the user device 100 provides search results in the search results display area 204 containing the recognized Chinese-language character. For example, the search results display area 204 displays a calendar event icon 222 having the recognized Chinese-language character of the user-input handwriting 104, an Internet search icon 224 for performing an Internet search on the Chinese-language word for "Chicago," which includes the recognized Chinese-language character of the user-input handwriting 104, and a weather icon 226 for displaying weather conditions for the city of Chicago.
[0029] FIG. 2C depicts the example user device 100 of FIGS. 1, 2A, and 2B performing a search based on the user-input handwriting 106 including numeric characters. In the illustrated example of FIG. 2C, the user device 100 recognizes the user-input handwriting 106 as including numeric characters and displays the recognized numeric characters in the search string display area 202. In the illustrated example, the search string display area 202 displays the recognized numeric characters in a North American telephone number format. However, any other format for displaying the recognized numeric characters may be used. In the example of FIG. 2C, the search method of the user device 100 finds contact information (e.g., in an address or contacts database) having a telephone number including the recognized numeric characters of the user-input handwriting 106. The search results display area 204 of the illustrated example shows user-selectable options to initiate different communications with the found contact. For example, the search results display area 204 includes a telephone dial icon 232 to initiate a call (e.g., dial a telephone number) to the found contact, and an instant text message icon 234 to send a text message to the found contact.
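A minimal sketch of the numeric-search behavior of FIG. 2C follows, assuming a toy contact store and a simplified North American formatting rule (real telephone formats vary by length and region, and all names here are hypothetical):

```python
# Sketch of matching handwritten digits against stored contacts.
# format_na renders digits in a simplified North American style for
# the search string display area; any other format could be used.

contacts = {
    "555-7123": "Alice Chicago",
    "555-7456": "Bob",
}

def format_na(digits):
    # e.g. "5557123" -> "555-7123"; shorter partial input is left as-is.
    return f"{digits[:3]}-{digits[3:]}" if len(digits) > 3 else digits

def match_contacts(digits):
    # Find contacts whose telephone number contains the recognized digits.
    return [name for number, name in contacts.items()
            if digits in number.replace("-", "")]

print(format_na("5557123"))   # → "555-7123"
print(match_contacts("5557"))  # → ['Alice Chicago', 'Bob']
```

Each matched contact would then be paired with user-selectable action icons (dial, text message) as in the search results display area 204.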
[0030] In other examples, user-selectable action icons to, for example, dial telephone numbers of contacts, send messages to contacts, show meeting events with contacts, etc. can be displayed in the search results display area 204 for contacts matching user-input handwriting containing Latin-based characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), and/or any other type of character supported by a multiple-language dictionary of a handwritten character recognition method of the user device 100.
[0031] FIG. 3 depicts an example electronic device 300 that may be used to implement examples disclosed herein to perform handwriting-initiated searches. In the illustrated example of FIG. 3, the electronic device 300 is provided with example subsystems including: an example processor (or controller) 302, an example input device 304, an example handwriting detector 306, an example character recognizer 308, an example multiple-language dictionary 310, an example searcher 312, an example display 314, and an example memory 316. The subsystems 302, 304, 306, 308, 310, 312, 314, and 316 may be implemented using exclusively hardware, firmware, or software, or any desired combination of hardware, firmware, and/or software. For example, one or more integrated circuits, discrete semiconductor components, and/or passive electronic components may be used. Thus, for example, the subsystems 302, 304, 306, 308, 310, 312, 314, and 316, or parts thereof, could be implemented using one or more circuit(s), programmable processor(s), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), etc. The subsystems 302, 304, 306, 308, 310, 312, 314, and 316, or parts thereof, may be implemented using instructions, code, and/or other software and/or firmware, etc. stored on a machine accessible medium or computer readable medium (e.g., the memory 316) and executable by, for example, a processor (e.g., the example processor 302). When any of the apparatus or system claims of this patent is read to cover a purely software implementation, at least one of the subsystems 302, 304, 306, 308, 310, 312, 314, and 316 is hereby expressly defined to include a tangible medium such as a solid state memory, a magnetic memory, a digital versatile disk (DVD), a compact disk (CD), a BluRay disk, etc. Further still, the example electronic device 300 of FIG.
3 may include one or more elements, methods and/or devices in addition to, or instead of, those illustrated in FIG. 3, and/or may include more than one of any or all of the illustrated elements, methods and devices.
[0032] Turning in detail to FIG. 3, the electronic device 300 of the illustrated example is provided with the example processor 302 to control and/or manage operations of the user device 100 of FIGS. 1 and 2A-2C. In the illustrated example, the processor 302 executes an operating system (e.g., an operating system 434 of FIG. 4) and other software (e.g., software programs 436 of FIG. 4) of the user device 100. In addition, the processor 302 makes decisions and facilitates/arbitrates information exchanges between elements of the electronic device 300. In some examples, the processor 302 receives software and/or hardware interrupts from different subsystems of the user device 100 related to, for example, user input (e.g., touchdown events on a touch screen display of the user device 100, user selected icons, etc.), operating system services and/or background services (e.g., example operations of the search method disclosed herein), etc.
[0033] In the illustrated example, the electronic device 300 is provided with the input device 304 to receive user-input handwriting (e.g., the user-input handwriting 102, 104, 106 of FIGS. 1 and 2A-2C). In the illustrated example, the input device 304 is implemented using a touch screen display (e.g., the touch screen display 108 of FIG. 1). However, in other examples, the input device 304 may additionally or alternatively be implemented using other types of input device interfaces (e.g., a mouse interface, a touch pad interface, a pen tablet interface, etc.) that enable entering handwriting input.
[0034] In the illustrated example, the electronic device 300 is provided with the handwriting detector 306 to determine when a user has started inputting handwritten characters (e.g., the user- input handwriting 102, 104, 106) via the input device 304. In the illustrated example, the input device 304 triggers a touchdown event interrupt at the processor 302 in response to or after a user touching the touch screen display 108 (FIG. 1). The processor 302 processes the touchdown event interrupt by causing the handwriting detector 306 to analyze input touches or strokes on the input device 304 to determine whether the input touches or strokes are handwriting.
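The handwriting detector's decision, whether input touches or strokes are handwriting rather than a tap or a simple swipe, might be sketched with a crude heuristic like the one below. The direction-change test is purely illustrative and is not the patented method:

```python
# Sketch of a handwriting detector deciding whether a touch stroke is
# handwriting, as opposed to a tap or a straight swipe. The heuristic
# (point count plus x-direction changes) is purely illustrative.

def is_handwriting(stroke):
    if len(stroke) < 4:
        return False  # too few points: treat as a tap
    # Count sign changes in the x direction as a crude proxy for the
    # curved, back-and-forth motion of written characters.
    turns = 0
    prev_dx = 0
    for (x0, _), (x1, _) in zip(stroke, stroke[1:]):
        dx = x1 - x0
        if dx and prev_dx and (dx > 0) != (prev_dx > 0):
            turns += 1
        if dx:
            prev_dx = dx
    return turns >= 1  # a straight swipe never changes direction

tap = [(10, 10)]
swipe = [(0, 0), (10, 0), (20, 0), (30, 0)]
letter_c = [(30, 0), (10, 5), (0, 20), (10, 35), (30, 40)]  # a "C"-like arc

print(is_handwriting(tap), is_handwriting(swipe), is_handwriting(letter_c))
```

In the described device, a positive decision like this is what causes the processor to activate the character recognizer and the searcher.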
[0035] In the illustrated example, the electronic device 300 is provided with the character recognizer 308 to analyze user-input handwriting (e.g., the user-input handwriting 102, 104, 106) and recognize characters and/or words of the user-input handwriting. The electronic device 300 of the illustrated example is provided with the multiple-language dictionary 310 to enable the character recognizer 308 to recognize characters and/or words from multiple different languages using a multiple-language recognition process. For example, to recognize characters of the user-input handwriting 102, 104, 106, the character recognizer 308 may compare stroke features of the user-input handwriting 102, 104, 106 to reference characters in the multiple-language dictionary 310 to find matches or substantially close matches (e.g., within a threshold) between reference characters in the multiple-language dictionary 310 and the user-input handwriting 102, 104, 106. The multiple-language dictionary 310 of the illustrated example is configurable to support any quantity of and types of languages including languages with Latin-based alphanumeric characters (e.g., English characters, German characters, Spanish characters, French characters, etc.), logographic characters (e.g., Chinese Hanzi characters, Japanese Kanji characters, etc.), script characters (e.g., Hebrew script characters, Arabic script characters, etc.), and/or any other types of characters. The character recognizer 308 of the illustrated example is configured to automatically detect the language of user-input handwriting and to dynamically switch between different detected languages supported in the multiple-language dictionary 310 without requiring a user to pre-select or pre-inform the user device 100 of a language in which the user is providing handwriting input. For example, in the illustrated example of FIG.
1, a user may provide the user-input handwriting 102 containing English-language characters at a first time (e.g., 10:08 AM), provide the user-input handwriting 104 containing Chinese-language characters at a second time (e.g., 10:09 AM), and provide the user-input handwriting 106 containing numeric characters at a third time (e.g., 10:10 AM) without needing to pre-select the English language, the Chinese language, or numeric-type entry before providing such user-input handwriting 102, 104, 106. Instead, the character recognizer 308 automatically detects languages of user-input handwriting using the multiple-language dictionary 310.

[0036] In the illustrated example, the electronic device 300 is provided with the searcher 312 to perform search operations for information stored locally on the user device 100 and/or stored at a remote location accessible through network communications via the user device 100. For example, the searcher 312 uses the recognized characters displayed in the search string display area 202 (FIGS. 2A-2C) as a search string parameter to find matching information. The searcher 312 then causes the user device 100 to display search results (e.g., the search results 206, 208, 210 of FIG. 2A, the search results 222, 224, 226 of FIG. 2B, and/or the search results 232, 234 of FIG. 2C) in the search results display area 204 of FIGS. 2A-2C.
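The searcher's combination of local results, cached information, and an always-offered Internet-search option (as in paragraphs [0024] through [0026] and [0036]) can be sketched as below; the data sources and labels are illustrative only:

```python
# Sketch of a searcher that runs the recognized search string against
# both local sources and a remote-search option, then assembles the
# entries for a results display area. Source contents are toy data.

local_calendar = ["Chicago Meeting", "Dentist"]
cached_weather = {"Chicago": "50°F, partly cloudy"}  # previously retrieved, locally cached

def search(search_string):
    # Local calendar entries containing the search string.
    results = [f"calendar: {e}" for e in local_calendar
               if search_string.lower() in e.lower()]
    # Locally cached remote information, shown as a "live" entry.
    if search_string in cached_weather:
        results.append(f"weather: {cached_weather[search_string]}")
    # A remote Internet search is always offered as a selectable option.
    results.append(f"internet search: {search_string}")
    return results

print(search("Chicago"))
```

A real searcher would also query contacts, messages, and remote endpoints, but the ordering of local, cached, and remote-option results follows the same pattern.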
[0037] In the illustrated example, the electronic device 300 is provided with the display 314 (e.g., the touch screen display 108 of FIG. 1) that is configured to present graphical user interfaces (e.g., the main screen 110 of FIG. 1, the search string display area 202 of FIGS. 2A-2C, the search results display area 204 of FIGS. 2A-2C, etc.). In the illustrated example, the display 314 is a liquid crystal display. Additionally or alternatively, the display 314 may be made of any other suitable type of display technology such as e-paper displays, cathode ray tube (CRT) displays, light-emitting diode (LED) displays, etc.
[0038] In the illustrated example, to store data and/or machine-readable or computer-readable instructions, the electronic device 300 is provided with the memory 316 to locally store information, software, firmware, etc. at the user device 100. The memory 316 may be a mass storage memory, a magnetic or optical memory, a non-volatile integrated circuit memory, or a volatile memory. That is, the memory 316 may be any tangible medium such as a solid state memory, a magnetic memory, a DVD, a CD, a BluRay disk, etc.
[0039] FIG. 4 depicts a block diagram of an example implementation of a processor system that may be used to implement the user device 100 of FIGS. 1 and 2A-2C and/or the electronic device 300 of FIG. 3. In the illustrated example, the user device 100 is a two-way communication device with advanced data communication capabilities including the capability to communicate with other wireless-enabled devices or computer systems through a network of transceiver stations. The user device 100 may also have the capability to allow voice
communication. Depending on the functionality provided by the user device 100, it may be referred to as a data messaging device, a two-way pager, a cellular telephone with data messaging capabilities, a smart phone, a wireless Internet appliance, a tablet device, or a data communication device (with or without telephony capabilities). Although FIG. 4 depicts an example implementation of the user device 100 as having a number of components, in some example implementations, some of the components shown in FIG. 4 may be omitted and/or may be externally connected to the user device 100 (e.g., via interface port(s) and/or via wireless interface(s)). To aid the reader in understanding the structure of the user device 100 and how it communicates with other devices and host systems, FIG. 4 will now be described in detail.
[0040] Referring to FIG. 4, the user device 100 includes a number of components such as a main processor 402 (e.g., similar or identical to the processor 302 of FIG. 3), that is directly or indirectly connected to the other components, and controls the overall operation of the user device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 404. The communication subsystem 404 receives messages from and sends messages to a wireless network 405. The communication subsystem 404 may be configured in accordance with the Global System for Mobile Communication (GSM) and General Packet Radio Services (GPRS) standards, Enhanced Data GSM Environment (EDGE) and Universal Mobile Telecommunications Service (UMTS) standards, 3rd Generation
Partnership Project (3GPP) Long Term Evolution (LTE) standards, IEEE 802.11 (Wi-Fi) standards, MOBITEX® communication standards, DATATAC® communication standards, Personal Communication Systems (PCS) standards, Time Division Multiple Access (TDMA) standards, Code Division Multiple Access (CDMA) standards, and/or any other suitable communication standard.
[0041] The main processor 402 also interacts with additional subsystems such as a Random Access Memory (RAM) 406, a persistent memory 408 (e.g., a non-volatile memory), a display 410, an auxiliary input/output (I/O) subsystem 412, a data port 414, a keyboard 416, a speaker 418, a microphone 420, short-range communications 422, and other device subsystems 424. In the illustrated example, the display 410 is used to implement the display 314 of FIG. 3.
[0042] To identify a subscriber, the user device 100 is provided with a SIM/RUIM card 426 (i.e., a Subscriber Identity Module or a Removable User Identity Module) to be inserted into a SIM/RUIM interface 428 in order to communicate with a network. The SIM card or RUIM 426 is a type of smart card that can be used to identify a subscriber of the user device 100 and to personalize the user device 100, among other things. In some examples, search operations disclosed herein may be used to search information (e.g., address book entries, calendar entries, notes, etc.) stored on the SIM card or RUIM 426.
[0043] The user device 100 is a battery-powered device and includes a battery interface 432 for receiving one or more rechargeable batteries 430. Although current technology makes use of a battery, future technologies such as micro fuel cells or other suitable power sources may provide power to the user device 100.
[0044] The user device 100 also includes an operating system 434 and software programs 436. The operating system 434 and the software components 436 that are executed by the main processor 402 are typically stored in a persistent store such as the persistent memory 408, which may alternatively be a read-only memory (ROM) or similar storage element (not shown). In some examples, portions of the operating system 434 and/or the software components 436 may be temporarily loaded into and executed from a volatile store such as the RAM 406.
[0045] The software programs 436 may include software applications that control basic device operations, including data and voice communication applications. Other software applications include, for example, message applications, personal information manager (PIM) applications (e.g., applications to view, create, and/or manage e-mail, contacts, calendar events, voice mails, appointments, task items, etc.), and other suitable software applications. Other types of software applications may include third party applications, which are added after the manufacture of the user device 100. Examples of third party applications include games, calculators, utilities, productivity applications, etc.
[0046] The data port 414 may be any suitable port that enables data communication between the user device 100 and another computing device for transferring data therebetween. The data port 414 can be a serial or a parallel port. For example, the data port 414 may be a universal serial bus (USB) port that includes data lines for data transfer and a supply line that can provide a charging current to charge the battery 430 of the user device 100.
[0047] The short-range communications subsystem 422 provides for communication between the user device 100 and different systems or devices, without the use of the wireless network 405. For example, the subsystem 422 may include an infrared device and associated circuits and components for short-range communication. Examples of short-range
communication standards include standards developed by the Infrared Data Association (IrDA), a Bluetooth® communication standard, and the 802.11 family of standards developed by IEEE.

[0048] In use, a received signal such as a text message, an e-mail message, a web page download, media content, etc. will be processed by the communication subsystem 404 and input to the main processor 402. The main processor 402 will then process the received signal for output to the display 410 or alternatively to the auxiliary I/O subsystem 412. A subscriber may also compose data items, such as e-mail messages, for example, using the keyboard 416 in conjunction with the display 410 and possibly the auxiliary I/O subsystem 412. The auxiliary subsystem 412 may include input devices such as a touch screen display, a mouse, a track ball, a track pad, an infrared fingerprint detector, or a roller wheel with dynamic button pressing capability. The keyboard 416 is preferably an alphanumeric keyboard and/or telephone-type keypad. However, other types of keyboards may also be used. A composed item may be transmitted over the wireless network 405 through the communication subsystem 404.
[0049] For voice communications, the overall operation of the user device 100 is
substantially similar, except that the received signals are output to the speaker 418, and signals for transmission are generated by the microphone 420. Alternative voice or audio I/O
subsystems, such as a voice message recording subsystem, can also be implemented on the user device 100. Although voice or audio signal output is accomplished primarily through the speaker 418, the display 410 can also be used to provide additional information such as the identity of a calling party, duration of a voice call, or other voice call related information.
[0050] FIG. 5 depicts an example flow diagram representative of machine readable instructions that may be used to implement the electronic device 300 of FIG. 3 and/or the user device 100 of FIGS. 1 and 2A-2C to initiate and perform searches in response to or after detecting user-input handwriting. The example method of FIG. 5 may be performed using one or more processors, controllers, and/or any other suitable processing devices (e.g., the example processor 302 of FIG. 3 and/or the example processor 402 of FIG. 4). For example, the example method of FIG. 5 may be implemented using coded instructions (e.g., computer readable instructions) stored on one or more tangible computer readable storage media (e.g., a storage device or storage disk) such as flash memory, read-only memory (ROM), and/or random-access memory (RAM), a CD-ROM, a floppy disk, a hard drive, a DVD, a BluRay disk, or a memory associated with the processor 302 and/or the processor 402. In some examples, the method and/or parts thereof could alternatively be executed by a device other than the processor 302 and/or the processor 402 and/or embodied in firmware or dedicated hardware. As used herein, the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage medium and to exclude propagating signals. Additionally or alternatively, the example method of FIG. 5 may be implemented using coded instructions (e.g., computer readable instructions) stored on one or more non-transitory computer readable storage media such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory, or any other storage media in which information is stored for any duration (e.g., for extended time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information). 
As used herein, the term non-transitory computer readable storage medium is expressly defined to include any type of computer readable storage medium and to exclude propagating signals.
[0051] Alternatively, the example method of FIG. 5 may be implemented using an application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, software, etc. Also, the example method of FIG. 5 may be implemented manually or as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware. Further, although the example method of FIG. 5 is described with reference to the flow diagram of FIG. 5, other methods of implementing the method of FIG. 5 may be employed. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, sub-divided, or combined.
Additionally, the example method of FIG. 5 may be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc. As used herein, when the phrase "at least" is used as the transition term in a preamble of a claim, it is open-ended in the same manner as the term "comprising" is open ended. Thus, a claim using "at least" as the transition term in its preamble may include elements in addition to those expressly recited in the claim.
[0052] To begin the example method of FIG. 5, the handwriting detector 306 (FIG. 3) detects an initiation or start of handwriting (block 502). For example, the handwriting detector 306 detects when a user begins entering user-input handwriting on the input device 304 (FIG. 3). In the illustrated examples of FIGS. 1 and 2A-2C, for instance, the handwriting detector 306 may detect a user-input stroke drawing the letter "C" of "Chica" in the user-input handwriting 102, a user-input stroke beginning to draw the Chinese-language character of the user-input handwriting 104, or a user-input stroke beginning to draw the number "5" of "5557" in the user-input handwriting 106. In some examples, the start of user-input handwriting generates a touch event corresponding to the touch screen display 108 (FIG. 1) of the user device 100. In some examples, the touch event is generated by the operating system 434 (FIG. 4) of the user device 100 in response to or after detecting a touch on the touch screen display 108 that triggers an interrupt (e.g., a touch event interrupt) in the processor 302, causing the processor 302 to respond to the user-input handwriting. In such examples, the touch event is generated from a stroke of the user-input handwriting that forms a character to be included in a search string for the search operation.
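The detection step described above (block 502) can be illustrated with a minimal sketch. This is purely illustrative and not part of the claimed method; the class and method names are hypothetical stand-ins for the handwriting detector 306 and the operating-system touch-event path.

```python
# Illustrative sketch (hypothetical names): the first stroke on the main
# screen both opens a search session and contributes to the first
# character, so no separate search button or gesture is needed.

class HandwritingDetector:
    def __init__(self):
        self.session_active = False
        self.strokes = []

    def on_touch_event(self, stroke_points):
        # Called once per completed stroke delivered via the OS touch-event
        # interrupt; the first stroke activates the search session.
        if not self.session_active:
            self.session_active = True
        self.strokes.append(stroke_points)

detector = HandwritingDetector()
detector.on_touch_event([(10, 12), (11, 30)])  # stroke beginning the letter "C"
```

In this sketch the triggering stroke is not discarded: it is recorded alongside later strokes so that it can contribute a character to the search string, mirroring the behavior described for the touch event above.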
[0053] The processor 302 initiates a search method in response to or after detecting the handwriting (block 504). For example, the handwriting detector 306 can send the processor 302 a confirmation that handwriting is being entered on the input device 304, and the processor 302 may respond by automatically activating the character recognizer 308 (FIG. 3) to recognize input characters and by causing the searcher 312 (FIG. 3) to perform a search method for information containing the recognized characters.
[0054] The character recognizer 308 recognizes one or more character(s) input via the handwriting (block 506). For example, the character recognizer 308 uses the multiple-language dictionary 310 (FIG. 3) to perform a multiple-language recognition method that supports recognizing characters in the handwriting from any supported language, alphabet and/or numeric system. In the illustrated examples of FIGS. 1 and 2A-2C, the multiple-language recognition method may be used to detect English-language characters as in the user-input handwriting 102, Chinese-language characters as in the user-input handwriting 104, and/or numeric characters as in the user-input handwriting 106.
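One way to picture the multiple-language recognition of block 506 is to score the stroke data against every supported language model and keep the best match, so the user never pre-selects a language. The sketch below is an assumption about one possible structure, not the patented implementation; the per-language recognizers are stand-in stubs, not real models.

```python
# Illustrative sketch: each "model" maps stroke data to a
# (character, confidence) pair; the recognizer keeps the language whose
# model is most confident. All names and values here are hypothetical.

def recognize(stroke_data, language_models):
    scored = {lang: model(stroke_data) for lang, model in language_models.items()}
    lang = max(scored, key=lambda l: scored[l][1])  # highest confidence wins
    char, confidence = scored[lang]
    return lang, char, confidence

models = {
    "en":  lambda s: ("C", 0.91),   # Latin-alphabet recognizer (stub)
    "zh":  lambda s: ("中", 0.40),  # logographic recognizer (stub)
    "num": lambda s: ("5", 0.12),   # numeric recognizer (stub)
}
lang, char, conf = recognize("<stroke data>", models)
```

Because every supported language is scored on each input, the same code path handles the English, Chinese, and numeric examples of FIGS. 1 and 2A-2C without any language switch by the user.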
[0055] The searcher 312 performs a search operation using the recognized characters as a search string (block 508). In the illustrated example, the searcher 312 automatically begins performing the search as soon as the character recognizer 308 recognizes one or more characters to form the search string parameter, without requiring the user to press a search button or enter a search gesture to start the search process. The display 314 (FIG. 3) displays the search results (block 510). For example, the display 314 may display the search results (e.g., the search results 206, 208, 210 of FIG. 2A, the search results 222, 224, 226 of FIG. 2B, or the search results 232, 234 of FIG. 2C) in the search results display area 204 of FIGS. 2A-2C.
[0056] The handwriting detector 306 determines whether additional handwriting has been received (block 512) via, for example, the input device 304 of FIG. 3. If additional handwriting is detected at block 512, the processor 302 initiates a refined search method (block 514). In the illustrated example, the processor 302 responds to the additional handwriting by returning control to block 506 so that the character recognizer 308 and the searcher 312 can implement the refined search method by performing the operations of blocks 506 and 508 on the additional handwriting. For example, the processor 302 causes the character recognizer 308 to recognize the additional input characters and causes the searcher 312 to refine the search results based on the additional recognized characters (e.g., as appended to the search string formed during a previous iteration of block 506). In some examples, the searcher 312 may refine the search results by applying the additional recognized characters as a filter to the search results already found during a previous iteration of block 508, filtering out any results that do not contain the additional recognized characters.
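The filtering variant of the refined search (blocks 512-514) can be sketched as follows. This is an illustrative assumption about the behavior described in the paragraph above, not the claimed implementation; the corpus and query strings are invented for the example.

```python
# Illustrative sketch: rather than re-running the full search, refine by
# filtering the already-found results against the extended search string.

def search(corpus, query):
    # Initial search over all accessible information (block 508).
    return [item for item in corpus if query.lower() in item.lower()]

def refine(previous_results, extended_query):
    # Refined search (block 514): keep only prior results that still
    # match after the additional recognized characters are appended.
    return [item for item in previous_results if extended_query.lower() in item.lower()]

corpus = ["Chicago Tickets", "Chimney Repair", "Call 5557 0101"]
first = search(corpus, "Chi")      # results after "C", "h", "i" are recognized
refined = refine(first, "Chic")    # next stroke appends "c" to the search string
```

Filtering previous results avoids touching the full corpus again, which is why the paragraph above distinguishes it from simply repeating the search of block 508.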
[0057] If the handwriting detector 306 does not detect additional handwriting at block 512, the processor 302 awaits a user selection of a search result (block 516). If the input device 304 receives a user selection of a search result at block 516, the processor 302 performs a method corresponding to the user selection (block 520), and the example method of FIG. 5 ends. If the input device 304 does not receive a user selection of a search result at block 516 (e.g., within a threshold duration), the processor 302 determines whether it should continue to wait for a user selection (block 522). In some examples, the processor 302 may be configured to clear search results after a particular duration if a user selection is not received. Such clearing of search results may be implemented for security purposes, to return the user device 100 to a ready state to receive other handwriting for a different search, to return to a main screen (e.g., the main screen 110 of FIG. 1), and/or for any other suitable reason. If the processor 302 determines that it should continue to wait for a user selection, control returns to block 516. Otherwise, the example method of FIG. 5 ends.
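The wait-for-selection logic of blocks 516-522 can be pictured as a poll-with-timeout loop. This sketch is an assumption for illustration only: the timeout value, the polling step, and the `poll` callback (standing in for checking the input device 304 for a selection) are all hypothetical.

```python
# Illustrative sketch: results stay available until the user selects one
# or a hypothetical timeout expires, at which point the caller clears the
# displayed search results (e.g., for security, or to return to a ready
# state for a different search).

def await_selection(poll, timeout_s, step_s=0.5):
    elapsed = 0.0
    while elapsed < timeout_s:
        choice = poll(elapsed)
        if choice is not None:
            return choice          # user picked a search result (block 520)
        elapsed += step_s
    return None                    # timeout: caller clears the results

# Stub input: the user selects result index 2 after one second of waiting.
selected = await_selection(lambda t: 2 if t >= 1.0 else None, timeout_s=5.0)
timed_out = await_selection(lambda t: None, timeout_s=1.0)
```

Returning `None` on timeout separates the "clear and end" path from the "perform the selected action" path, matching the two exits from block 522 described above.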
[0058] Although certain methods, apparatus, and articles of manufacture have been described herein, the scope of coverage of this patent is not limited thereto. To the contrary, this patent covers all methods, apparatus, and articles of manufacture fairly falling within the scope of the claims either literally or under the doctrine of equivalents.

Claims

What is Claimed is:
1. A method of operating an electronic device to perform a search, comprising:
after detecting user-input handwriting on an operating system main screen of the device, performing a search operation of information accessible via the device based on the user-input handwriting; and
displaying on a display of the device, search results of the search operation.
2. A method as defined in claim 1 further comprising detecting further user-input handwriting, and refining the search results based on the further user-input handwriting.
3. A method as defined in claim 1, wherein performing the search operation after detecting the user-input handwriting comprises performing the search operation after receiving a touch event corresponding to a touch screen display of the device, the touch event corresponding to a character of the user-input handwriting used in a search string of the search operation.
4. A method as defined in claim 3, wherein the touch event is generated by an operating system of the device after detecting a touch on a touch display interface of the device.
5. A method as defined in claim 1, wherein performing the search operation comprises performing the search operation without receiving, separate from the user-input handwriting, a user input to initiate the search operation.
6. A method as defined in claim 1, wherein performing the search operation comprises performing the search operation as an operating system process of the device without starting a separate user-initiated application to perform the search operation.
7. A method as defined in claim 1, further comprising determining that the user-input handwriting includes at least one character from one or more of a plurality of languages.
8. A method as defined in claim 1, wherein the information accessible via the device is at least one of locally stored on the device or remotely stored at another location accessible using network communications from the device.
9. An electronic device to perform a search, comprising:
a processor configured to:
detect user-input handwriting on a main screen of the device, and perform a search operation of information accessible via the device based on the user-input handwriting, in response to detecting the user-input handwriting; and
a display to display search results of the search operation.
10. An electronic device as defined in claim 9, wherein the processor is further configured to detect further user-input handwriting, and refine the search results based on the further user-input handwriting.
11. An electronic device as defined in claim 9, wherein the processor is further configured to receive an interrupt associated with a touch event of a touch screen display of the device, the touch event corresponding to a character of the user-input handwriting to be used in a search string of the search operation.
12. An electronic device as defined in claim 11, wherein the touch event is generated by an operating system of the device after detecting a touch on a touch display interface of the device.
13. An electronic device as defined in claim 9, wherein the processor is further configured to perform the search operation without receiving, separate from the user-input handwriting, a user input to initiate the search operation.
14. An electronic device as defined in claim 9, wherein the processor is further configured to perform the search operation as an operating system process of the device without starting a separate user-initiated application to perform the search operation.
15. An electronic device as defined in claim 9, wherein the processor is further configured to determine that the user-input handwriting includes at least one character from one or more of a plurality of languages supported by a multiple-language dictionary.
16. An electronic device as defined in claim 9, wherein the information accessible via the device is at least one of locally stored on the device or remotely stored at another location accessible using network communications from the device.
17. An electronic device to perform a search, comprising:
a processor configured to:
initiate a search based on a search string, after detecting user-input handwriting of the search string on a touch screen display of the device;
recognize characters of the search string as being from one of a plurality of different languages; and
display on a display of the device, search results of the search.
18. An electronic device as defined in claim 17, wherein the processor is configured to recognize the characters as being from one of the plurality of languages supported by a multiple-language recognition method without needing a user to pre-select the one of the plurality of languages.
19. An electronic device as defined in claim 17, wherein the characters recognized include Latin-based characters, logographic characters, and script characters.
20. An electronic device as defined in claim 17, wherein the processor is configured to initiate the search without receiving, separate from the user-input handwriting, a user input to initiate the search and without starting a separate user-initiated application to perform the search.
PCT/CA2012/050894 2012-12-13 2012-12-13 Handwriting-initiated search WO2014089669A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CA2012/050894 WO2014089669A1 (en) 2012-12-13 2012-12-13 Handwriting-initiated search

Publications (1)

Publication Number Publication Date
WO2014089669A1 true WO2014089669A1 (en) 2014-06-19

Family

ID=50933624

Country Status (1)

Country Link
WO (1) WO2014089669A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9563285B2 (en) 2014-10-07 2017-02-07 Lg Electronics Inc. Mobile terminal and control method thereof

Citations (5)

Publication number Priority date Publication date Assignee Title
US20070061317A1 (en) * 2005-09-14 2007-03-15 Jorey Ramer Mobile search substring query completion
US20110202874A1 (en) * 2005-09-14 2011-08-18 Jorey Ramer Mobile search service instant activation
US20110202493A1 (en) * 2010-02-17 2011-08-18 Google Inc. Translating User Interaction With A Touch Screen Into Text
US8094942B1 (en) * 2011-06-13 2012-01-10 Google Inc. Character recognition for overlapping textual user input
WO2012130156A1 (en) * 2011-03-30 2012-10-04 汉王科技股份有限公司 Handwriting input method and apparatus for touch device, and electronic device


Non-Patent Citations (1)

Title
Apple, iPhone User Guide for iOS 6 Software, September 2012 (2012-09-01), pages 1-156 *



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12890080; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 12890080; Country of ref document: EP; Kind code of ref document: A1)