WO2020238938A1 - Information input method and mobile terminal - Google Patents

Information input method and mobile terminal

Info

Publication number
WO2020238938A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
input
field
target
extracted
Prior art date
Application number
PCT/CN2020/092525
Other languages
English (en)
Chinese (zh)
Inventor
郝鹏飞
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Publication of WO2020238938A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/31 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 Querying
    • G06F16/3331 Query processing
    • G06F16/334 Query execution
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 Character input methods
    • G06F3/0237 Character input methods using prediction or retrieval techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition

Definitions

  • the embodiments of the present disclosure relate to the field of communication technology, and in particular, to an information input method and a mobile terminal.
  • the more commonly used information input method is for the user to manually enter the required text or characters through a soft keyboard, or to copy and paste to achieve rapid input.
  • the mobile terminal can support the user to manually shoot or scan the text information in the picture to extract the text information in the picture for further processing by the user.
  • the embodiments of the present disclosure provide an information input method and a mobile terminal to solve the problem that related information input methods are not convenient enough, resulting in low information input efficiency.
  • the embodiments of the present disclosure provide an information input method applied to a mobile terminal, and the method includes: receiving index information input by a user in a first information input interface, where the index information includes characters; extracting field information matching the index information from the text information of a target image; and inputting the field information to a target location.
  • a mobile terminal including:
  • a receiving module configured to receive index information input by a user in the first information input interface, where the index information includes characters;
  • an extraction module, configured to extract field information matching the index information from the text information of the target image; and
  • the input module is used to input the field information to the target location.
  • embodiments of the present disclosure provide a mobile terminal including a processor, a memory, and a computer program stored on the memory and capable of running on the processor.
  • when the computer program is executed by the processor, the steps in the above information input method are implemented.
  • an embodiment of the present disclosure provides a computer-readable storage medium with a computer program stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the above-mentioned information input method are implemented.
  • in the embodiments of the present disclosure, the mobile terminal can extract the field information matching the index information from the text information of the target image based on the index information input by the user, and can input the extracted field information to the target location. This ensures that field information meeting the user's expectations is accurately extracted from the target image according to the index information, without the user having to further process the extracted field information, so that information input can be completed quickly, improving the convenience of user operations and the efficiency of information input.
  • Fig. 1 is a flowchart of an information input method provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of an interface for quickly entering a WiFi password provided by an embodiment of the present disclosure
  • FIG. 3 is a flowchart of another information input method provided by an embodiment of the present disclosure.
  • FIG. 4a is a schematic diagram of an interface for generating a floating ball based on index characters input by a user according to an embodiment of the present disclosure
  • FIG. 4b is a schematic diagram of an interface for quickly inputting contact information with the help of a floating ball according to an embodiment of the present disclosure
  • FIG. 5 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of another mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram of an extraction module of a mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of another mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of another mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of the hardware structure of a mobile terminal provided by an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of an information input method provided by an embodiment of the present disclosure, which is applied to a mobile terminal. As shown in FIG. 1, the method includes the following steps:
  • Step 101 Receive index information input by a user in a first information input interface, where the index information includes characters.
  • the above-mentioned first information input interface may be any information input interface displayed by the mobile terminal, that is, an interface on which the user can input or edit related information, such as a wireless fidelity (WiFi) password input interface, a personal information editing interface, a contact information editing interface, etc.
  • the above-mentioned index information may be character information used to index matching fields, such as letters, numbers, words, or symbols.
  • the user may input index information in the first information input interface so that the mobile terminal can extract, from the text information of the target image, the field information that matches the index information.
  • Step 102 Extract field information matching the index information from the text information of the target image.
  • the above-mentioned target image may be an image collected in real time by the mobile terminal through a camera, or an image in the mobile terminal's gallery (such as the latest screenshot), and the target image may display the to-be-input text information that the user desires to extract; that is, this embodiment is suitable for a scenario in which a user quickly extracts desired text content from an image and inputs it to a corresponding position.
  • before the user inputs the index information, the mobile terminal may activate the camera to collect the target image or select the target image from a gallery; alternatively, the mobile terminal may enable the camera to collect the target image, or select the target image from a gallery, only after the user inputs the index information.
  • the above extraction of field information matching the index information from the text information of the target image may include two implementation schemes. The first is to pre-extract all text information from the target image using an image text recognition method (such as a binary clustering extraction method or optical character recognition (OCR)) and store it, then receive the index information input by the user and extract the field information matching the index information from the stored text information of the target image. The second is to first receive the index information input by the user, then use the image text recognition method to recognize the text information in the target image and extract the field information matching the index information from it; both schemes are sketched below.
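  • For illustration, the two schemes can be summarized in the following rough sketch; ocr_recognize_text, get_index_info and match_index are hypothetical placeholders for the image text recognition step, the step of receiving the user's index information, and the matching step detailed below; none of these names come from the disclosure.

```python
def scheme_one(target_image, ocr_recognize_text, get_index_info, match_index):
    """Scheme 1: recognize and store all text first, match when the index arrives."""
    stored_text = ocr_recognize_text(target_image)   # pre-extract and store all text
    index_info = get_index_info()                    # later: index info input by the user
    return match_index(stored_text, index_info)


def scheme_two(target_image, ocr_recognize_text, get_index_info, match_index):
    """Scheme 2: receive the index first, then recognize the image on demand."""
    index_info = get_index_info()                    # first: index info input by the user
    text_info = ocr_recognize_text(target_image)     # then: recognize text in the image
    return match_index(text_info, index_info)
```

Roughly speaking, the first scheme trades storage and up-front recognition work for lower latency once the index arrives, while the second avoids recognizing text that is never queried.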
  • if no matching field information is detected, corresponding prompt information can be output to remind the user that no matching field information was detected and that the index information can be re-entered.
  • if multiple pieces of field information matching the index information are retrieved from the text information of the target image, all of them can be extracted, and multiple field information options can be generated for the user to select the desired field information to input.
  • optionally, the step 102 includes: determining, according to the index information, the starting position of the field to be extracted in the text information of the target image, where the first character of the field to be extracted matches the index information; determining the end position of the field to be extracted; and extracting, based on the starting position and the end position of the field to be extracted, the information in the field to be extracted.
  • in this embodiment, the position information of the field to be extracted that matches the index information in the text information of the target image may be determined according to the index information. Specifically, the starting position of the field to be extracted in the text information of the target image may be determined according to the index information, where the first character of the field to be extracted matches the index information; for example, a target character in the text information of the target image that is the same as a character in the index information is determined and used as the first character of the field to be extracted, that is, the position of the target character is the starting position of the field to be extracted.
  • the end position of the field to be extracted can then be further determined, specifically by detecting spaces, punctuation marks, or segmentation characters; for example, the end position of the field to be extracted may be the position of the character before the first space, punctuation mark, or paragraph break encountered starting from the starting position of the field to be extracted.
  • in this way, the position information of the field to be extracted in the text information of the target image can be determined, so that, based on the position information, the information in the field to be extracted can be quickly and accurately extracted from the text information of the target image; this information is the field information that matches the index information.
  • optionally, the determining the starting position of the field to be extracted in the text information of the target image according to the index information includes: searching the text information of the target image for a character that is the same as at least one character in the index information, and determining the position of the character as the starting position of the field to be extracted; and the determining the end position of the field to be extracted includes: searching for a target symbol after the starting position of the field to be extracted, and determining the position of the character preceding the target symbol as the end position of the field to be extracted; wherein the target symbol includes a space character, a punctuation character or a paragraph character.
  • when determining the starting position of the field to be extracted, a character matching the index information may be searched for in the text information of the target image, and the position of that character is the starting position of the field to be extracted; specifically, matching the index information may mean being the same as at least one character in the index information.
  • when the index information includes multiple characters, characters that are the same as one or more characters in the index information may be searched for in the text information of the target image, and the position of the character with the largest number of identical characters may be used as the starting position of the field to be extracted. It should be noted that, in order to ensure index accuracy, a character position in the text information of the target image that is exactly the same as the characters in the index information can be set as the starting position of the field to be extracted.
  • when determining the end position of the field to be extracted, a target symbol such as a space character, a punctuation character, or a paragraph character may be searched for starting from the starting position of the field to be extracted. If the target symbol is found, the position of the character preceding the target symbol may be taken as the end position of the field to be extracted. For example, after determining the starting position of the field to be extracted, the subsequent characters can be continuously checked to determine whether they are words, letters, numbers, or symbols; if a space character is detected at a certain position, the character position before the space character is determined to be the end position of the field to be extracted.
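  • As an illustrative sketch of the start/end-position logic described above, the matching could be implemented as follows in Python; the function name, the fallback to the first index character, and the exact set of target symbols are illustrative assumptions rather than details taken from this disclosure.

```python
def extract_field(text, index_info, target_symbols=" \t\n,.;:!?"):
    """Sketch: extract the field in `text` that matches `index_info`.

    Start position: where the characters of `index_info` appear in `text`
    (an exact match of all index characters is preferred, otherwise the
    first index character is used as a fallback).
    End position: just before the first target symbol (space, punctuation
    or paragraph character) found after the start position.
    """
    start = text.find(index_info)              # exact match of the whole index
    if start == -1 and index_info:
        start = text.find(index_info[0])       # fall back to the first character
    if start == -1:
        return None                            # no matching field detected

    end = len(text)
    for i in range(start, len(text)):
        if text[i] in target_symbols:
            end = i                            # slice ends just before this symbol
            break
    return text[start:end]


# Example in the spirit of the WiFi scenario below (made-up recognized text):
print(extract_field("guest WiFi password: 5q3n53bs enjoy", "5"))  # -> 5q3n53bs
```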
  • optionally, before the extracting of the field information, the method further includes: in the case that a preset event is detected, activating the camera of the mobile terminal to collect the target image, where the preset event includes popping up a soft keyboard or displaying an information input interface.
  • in this embodiment, some specific events can be preset so that the camera of the mobile terminal is triggered to start through a preset event, making the mobile terminal more intelligent and user-friendly.
  • the preset events may include but are not limited to events such as popping up a soft keyboard or displaying an information input interface.
  • the camera of the mobile terminal can be activated to collect the target image.
  • for example, when the soft keyboard pops up or the information input interface is entered, a switch control for turning the smart input function on or off can be displayed, or a pop-up window can prompt the user whether to turn on the smart input function.
  • after the user turns on the smart input function, the camera of the mobile terminal is automatically activated to collect the target image.
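  • As a rough sketch of this trigger logic (the event names, the camera object with a capture_image() method, and the smart-input switch are illustrative assumptions, not names from the disclosure):

```python
# Preset events that trigger collection of the target image (illustrative names).
PRESET_EVENTS = {"soft_keyboard_popped_up", "information_input_interface_displayed"}


def handle_ui_event(event, camera, smart_input_enabled):
    """Collect a target image only when a preset event fires and the user has
    turned the smart input function on; otherwise do nothing."""
    if smart_input_enabled and event in PRESET_EVENTS:
        return camera.capture_image()   # hypothetical camera API
    return None
```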
  • Step 103 Input the field information to the target location.
  • the field information can be automatically input to the target location, or the field information can be input to the target location based on a trigger operation of the user.
  • the target position may be the input position of the index information in the first information input interface, or the target position may be the target input position in the second information input interface.
  • the above-mentioned target position may be the input position of the index information in the first information input interface; that is, when the user inputs index information in an information input column in the first information input interface, the mobile terminal may intelligently input, into that information input column, the field information in the target image that matches the index information, improving the user's information input efficiency.
  • the above-mentioned target position may also be a target input position in the second information input interface; the second information input interface and the first information input interface are different interfaces, and the target input position may be an input position selected by the user in the second information input interface, or may be a default input position preset by the mobile terminal in the second information input interface.
  • in this case, the field information extracted from the target image that matches the index information can be displayed floating on top of the interface, so that the user can switch to the second information input interface and choose to input the field information into the target input position in the second information input interface, thereby making the information input method more flexible.
  • the above-mentioned mobile terminal may be any device with a storage medium, such as a computer, a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or another terminal device.
  • in this way, the mobile terminal can extract the field information matching the index information from the text information of the target image based on the index information input by the user, and can input the extracted field information to the target location. It can thus be ensured that field information meeting the user's expectations is accurately extracted from the target image through the index information, without the user having to further process the extracted field information, so that information input can be completed quickly, which improves the convenience of user operations and the efficiency of information input.
  • for example, when it is detected that the soft keyboard 21 pops up, the camera of the mobile terminal can be started to collect a target image including WiFi password information, and the text information in the target image can be recognized;
  • the mobile terminal can retrieve the relevant field matching the character "5" from the text information in the target image;
  • the mobile terminal can input the remaining password characters "q3n53bs" into the password input field 22 to complete the quick input of the WiFi password.
  • in this way, the user does not need to switch back and forth between number and letter input on the soft keyboard 21.
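  • Building on the extract_field sketch above (the recognized text and password here are made-up values), the "remaining characters" behaviour of this example could be sketched as:

```python
def complete_field(typed_prefix, recognized_text):
    """Return the characters still to be entered after the user's typed prefix."""
    field = extract_field(recognized_text, typed_prefix)   # e.g. "5q3n53bs"
    if field is None or not field.startswith(typed_prefix):
        return ""
    # The prefix has already been typed by the user, so only the rest is filled in.
    return field[len(typed_prefix):]


print(complete_field("5", "guest WiFi password: 5q3n53bs"))   # -> q3n53bs
```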
  • FIG. 3 is a flowchart of another information input method provided by an embodiment of the present disclosure, which is applied to a mobile terminal.
  • compared with the embodiment shown in FIG. 1, this embodiment additionally refines a step of displaying a floating control that includes the matching field information, and refines how the field information is input into the target position, so that the information input method is more flexible and engaging.
  • the method includes the following steps:
  • Step 301 Receive index information input by a user in a first information input interface, where the index information includes characters.
  • for a specific implementation manner of this step, reference may be made to the implementation manner of step 101 in the method embodiment shown in FIG. 1. To avoid repetition, details are not described herein again.
  • Step 302 Extract field information matching the index information from the text information of the target image.
  • for a specific implementation manner of this step, reference may be made to the implementation manner of step 102 in the method embodiment shown in FIG. 1. To avoid repetition, details are not described herein again.
  • Step 303 Display a floating control, wherein the floating control includes the field information.
  • a floating control can be generated, and the floating control can be displayed on the current display interface.
  • the floating control can be in the form of a semi-transparent floating ball or floating window, and can be displayed on the top layer of the interface; that is, when the mobile terminal displays different interfaces, the floating control can always remain displayed at the top of the interface.
  • the camera can be started to collect a target image including contact information, and the text information in the target image can be recognized.
  • the mobile terminal can retrieve, from the text information in the target image, the related fields matching the index characters entered for the name, phone, company, and address, respectively;
  • when the relevant name field, phone field, company field, and address field are retrieved, a floating ball 42 can be generated and displayed on the information input interface 40. The floating ball 42 can include the retrieved name field, phone field, company field, and address field; it can be displayed on top of the information input interface 40 in a semi-transparent state and can be dragged to any position.
  • optionally, the method further includes: editing the field information in the floating control in response to an editing input from the user.
  • in this embodiment, the user can also edit the field information in the floating control; that is, after the mobile terminal extracts the field information from the target image, the user can add to, delete, or revise the field information in the floating control, so as to meet the diverse needs of users.
  • optionally, the method further includes: in the case of receiving a movement operation on the floating control, moving the floating control according to the movement track of the movement operation in response to the movement operation.
  • optionally, the method further includes: jumping from the first information input interface to the second information input interface, where the target location is located on the second information input interface.
  • the user can drag the floating ball 42 to the contact icon.
  • the mobile terminal can start the contacts application and jump, under the user's operation, to the contact information editing interface 43 shown in FIG. 4b.
  • the user can store the field information extracted from the target image in the floating control, and can switch to any desired information input interface, and realize rapid information input with the help of the floating control.
  • Step 304 In the case of receiving the user's first input to the floating control, in response to the first input, input the target field information in the floating control that matches the target position to the target position; wherein the target position is the information input position indicated by the input end position of the first input.
  • the user can input the target field information in the floating control to the target position by performing corresponding input to the floating control.
  • specifically, the user may perform a first input on the floating control, where the first input may be a drag input (such as dragging the floating control to a desired information input position), or a two-click input (for example, first clicking the floating control and then clicking the desired information input position), etc.
  • after receiving the first input, the mobile terminal may respond to the first input by inputting the field information in the floating control to the information input location indicated by the end position of the first input. More specifically, when the floating control includes multiple pieces of field information, the target field information in the floating control that matches the information input position can be input to that position; for example, according to the input indication information at the information input position, the field information in the floating control that matches the input indication information is input to the information input position.
  • for example, the user can drag the floating control to the contact name input position, where the name input indication information is displayed, and the mobile terminal can correspondingly input the contact name field in the floating control into that location.
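  • The matching between the fields held in the floating control and the indication information at the drop position could be sketched as follows; the labels, values and containment-based matching rule are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical contents of a floating control: label -> extracted field value.
FLOATING_CONTROL_FIELDS = {
    "name": "Zhang San",
    "phone": "13800000000",
    "company": "Example Co., Ltd.",
    "address": "No. 1 Example Road",
}


def field_for_input_position(indication_text, fields=FLOATING_CONTROL_FIELDS):
    """Return the field whose label appears in the input position's indication text."""
    hint = indication_text.lower()
    for label, value in fields.items():
        if label in hint:
            return value
    return None


print(field_for_input_position("Contact name"))    # -> Zhang San
print(field_for_input_position("Phone number"))    # -> 13800000000
```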
  • optionally, the step 304 can be replaced with any one of the following:
  • the user can also input the target field information in the floating control to the target location by performing corresponding input on a certain field information in the floating control.
  • specifically, the user can perform a second input on the target field information in the floating control, where the second input can be a drag input (such as dragging the target field information in the floating control to a desired information input position), or a copy-and-paste input (such as copying the target field information in the floating control and then pasting it to the desired information input position), etc.
  • after receiving the second input, the mobile terminal can respond to the second input by inputting the target field information to the information input position indicated by the input end position of the second input, where the target field information may be any field information in the floating control. For example, the user can drag the name field in the floating ball 42 to the name input position, drag the phone field in the floating ball 42 to the phone input position, and so on, to complete the input of the contact information.
  • alternatively, the user can perform an editing input on the target field information in the floating control to edit the target field information, where the editing input can be adding to, deleting, or modifying the target field information, etc.
  • in this case, the mobile terminal can edit the target field information in response to the editing input, and can input the edited target field information into the target location; the target location can be the information input location in the current information input interface that is associated with the target field information, or may be a preset default input location or an input location specified by the user.
  • in this way, the user can flexibly use a variety of different input methods to input a certain piece of field information in the floating control to a desired information input position, which improves information input efficiency and makes inputting information more engaging.
  • optionally, the method further includes: the user can instruct the mobile terminal to destroy the floating control through a preset operation, that is, to make the mobile terminal cancel the display of the floating control, where the preset operation may be a long-press operation or an operation of dragging the floating control to a specific position. In this way, through this embodiment, the user can cancel the display state of the floating control at any time.
  • the user can press and hold the floating ball 42 to exit the display of the floating ball 42.
  • in addition, the mobile terminal may also actively cancel the display of the floating control when the display duration of the floating control exceeds a preset duration, or when no operation on the floating control is detected within a preset duration, eliminating the need for the user to cancel it manually.
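  • A minimal sketch of this auto-dismissal behaviour (the timeout value, method names, and the use of a monotonic clock are illustrative assumptions):

```python
import time


class FloatingControl:
    """Toy model of a floating control that hides itself after inactivity."""

    def __init__(self, fields, timeout_s=30.0):
        self.fields = fields
        self.timeout_s = timeout_s                 # preset display duration
        self.last_interaction = time.monotonic()
        self.visible = True

    def touch(self):
        """Record a user operation on the floating control."""
        self.last_interaction = time.monotonic()

    def refresh(self):
        """Cancel the display when no operation is detected within the preset duration."""
        if self.visible and time.monotonic() - self.last_interaction > self.timeout_s:
            self.visible = False
        return self.visible
```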
  • in this way, the user can flexibly select the desired field information and input location according to their own needs, which can also make information input more engaging.
  • this embodiment adds a variety of optional implementation manners to the embodiment shown in FIG. 1. These optional implementation manners can be implemented in combination with each other or separately, and can achieve the technical effect of improving the convenience of user operations and the efficiency of information input.
  • FIG. 5 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present disclosure. As shown in FIG. 5, the mobile terminal 500 includes:
  • the receiving module 501 is configured to receive index information input by a user in the first information input interface, where the index information includes characters;
  • the extraction module 502 is configured to extract field information matching the index information from the text information of the target image
  • the input module 503 is used to input the field information to the target location.
  • the target position is an input position of the index information in the first information input interface, or the target position is a target input position in a second information input interface.
  • the mobile terminal 500 further includes:
  • the collection module 504 is configured to activate the camera of the mobile terminal 500 to collect the target image when a preset event is detected;
  • the preset event includes popping up a soft keyboard or displaying an information input interface.
  • the extraction module 502 includes:
  • the first determining unit 5021 is configured to determine the starting position of the field to be extracted in the text information of the target image according to the index information, wherein the first character of the field to be extracted matches the index information;
  • the second determining unit 5022 is configured to determine the end position of the field to be extracted
  • the extraction unit 5023 is configured to extract information in the field to be extracted based on the start position and the end position of the field to be extracted.
  • the first determining unit 5021 is configured to search for a character that is the same as at least one character in the index information from the text information of the target image, and determine the position of the character as the starting position of the field to be extracted ;
  • the second determining unit 5022 is configured to search for the target symbol after the start position of the field to be extracted, and determine the position of the character preceding the target symbol as the end position of the field to be extracted;
  • the target symbol includes a space character, a punctuation character or a paragraph character.
  • the mobile terminal 500 further includes:
  • the display module 505 is configured to display the floating control, wherein the floating control includes the field information;
  • the input module 503 is configured to input target field information matching the target position in the floating control to the target position in response to the first input in the case of receiving the user's first input to the floating control ;
  • the target position is the information input position indicated by the input end position of the first input;
  • the input module 503 is configured to input the target field information to the target location in response to the second input in the case of receiving the user's second input of the target field information in the floating control; wherein, The target position is the information input position indicated by the input end position of the second input;
  • or, the input module 503 is configured to, in the case of receiving the user's editing input of the target field information in the floating control, edit the target field information in response to the editing input, and input the edited target field information to the target location.
  • the mobile terminal 500 further includes:
  • the jump module 506 is configured to jump from the first information input interface to the second information input interface
  • the target location is located on the second information input interface.
  • the mobile terminal 500 can implement various processes implemented by the mobile terminal in the method embodiments of FIG. 1 and FIG. 3, and to avoid repetition, details are not described herein again.
  • the mobile terminal 500 of the embodiment of the present disclosure can extract field information matching the index information from the text information of the target image based on the index information input by the user, and can input the extracted field information to the target location. This ensures that field information meeting the user's expectations is accurately extracted from the target image according to the index information, without the user having to further process the extracted field information, so that information input can be completed quickly, improving the convenience of user operations and the efficiency of information input.
  • FIG. 10 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present disclosure.
  • the mobile terminal 1000 includes but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, and a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, a power supply 1011 and other components.
  • Those skilled in the art can understand that the structure of the mobile terminal shown in FIG. 10 does not constitute a limitation on the mobile terminal.
  • the mobile terminal may include more or fewer components than those shown in the figure, or combine certain components, or use a different component layout.
  • mobile terminals include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 1010 is configured to: control the user input unit 1007 to receive the index information input by the user in the first information input interface, where the index information includes characters; extract field information matching the index information from the text information of the target image; and input the field information to the target location, where the target position is an input position of the index information in the first information input interface, or the target position is a target input position in a second information input interface.
  • optionally, the processor 1010 is also used to: in the case that a preset event is detected, activate the camera of the mobile terminal 1000 to collect the target image, where the preset event includes popping up a soft keyboard or displaying an information input interface.
  • optionally, the processor 1010 is also used to: determine, according to the index information, the starting position of the field to be extracted in the text information of the target image, where the first character of the field to be extracted matches the index information; determine the end position of the field to be extracted; and extract, based on the starting position and the end position of the field to be extracted, the information in the field to be extracted.
  • optionally, the processor 1010 is also used to: search the text information of the target image for a character that is the same as at least one character in the index information and determine the position of the character as the starting position of the field to be extracted; and search for the target symbol after the starting position of the field to be extracted and determine the position of the character preceding the target symbol as the end position of the field to be extracted, where the target symbol includes a space character, a punctuation character or a paragraph character.
  • optionally, the processor 1010 is also used to: in the case of receiving the user's first input to the floating control, input, in response to the first input, the target field information in the floating control that matches the target position to the target position, wherein the target position is the information input position indicated by the input end position of the first input;
  • optionally, the processor 1010 is also used to: jump from the first information input interface to the second information input interface, where the target location is located on the second information input interface.
  • the mobile terminal 1000 can implement various processes implemented by the mobile terminal in the foregoing embodiments, and to avoid repetition, details are not described herein again.
  • the mobile terminal 1000 of the embodiment of the present disclosure can extract field information matching the index information from the text information of the target image based on the index information input by the user, and can input the extracted field information to the target location. This ensures that field information meeting the user's expectations is accurately extracted from the target image according to the index information, without the user having to further process the extracted field information, so that information input can be completed quickly, improving the convenience of user operations and the efficiency of information input.
  • the radio frequency unit 1001 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, after receiving downlink data from the base station, it sends the data to the processor 1010 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1001 can also communicate with the network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 1002, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 1003 can convert the audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into audio signals and output them as sounds. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1003 includes a speaker, a buzzer, and a receiver.
  • the input unit 1004 is used to receive audio or video signals.
  • the input unit 1004 may include a graphics processing unit (GPU for short) 10041 and a microphone 10042.
  • the graphics processor 10041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 1006.
  • the image frame processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or sent via the radio frequency unit 1001 or the network module 1002.
  • the microphone 10042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1001 for output.
  • the mobile terminal 1000 also includes at least one sensor 1005, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved to the ear.
  • as a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping), etc. The sensor 1005 can also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 1006 is used to display information input by the user or information provided to the user.
  • the display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD for short), Organic Light-Emitting Diode (OLED for short), etc.
  • the user input unit 1007 can be used to receive inputted numeric or character information and generate key signal input related to user settings and function control of the mobile terminal.
  • the user input unit 1007 includes a touch panel 10071 and other input devices 10072.
  • the touch panel 10071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 10071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010.
  • the touch panel 10071 can be realized by various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1007 may also include other input devices 10072.
  • other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 10071 can be overlaid on the display panel 10061.
  • when the touch panel 10071 detects a touch operation on or near it, it transmits the operation to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides a corresponding visual output on the display panel 10061 according to the type of the touch event.
  • although in FIG. 10 the touch panel 10071 and the display panel 10061 are used as two independent components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
  • the interface unit 1008 is an interface for connecting an external device with the mobile terminal 1000.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 1008 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the mobile terminal 1000, or can be used to transfer data between the mobile terminal 1000 and external devices.
  • the memory 1009 can be used to store software programs and various data.
  • the memory 1009 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • the memory 1009 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1010 is the control center of the mobile terminal. It uses various interfaces and lines to connect the various parts of the entire mobile terminal, runs or executes the software programs and/or modules stored in the memory 1009, and calls the data stored in the memory 1009 to perform various functions of the mobile terminal and process data, so as to monitor the mobile terminal as a whole.
  • the processor 1010 may include one or more processing units; optionally, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 1010.
  • the mobile terminal 1000 may also include a power supply 1011 (such as a battery) for supplying power to various components.
  • the power supply 1011 may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • in addition, the mobile terminal 1000 may include some functional modules that are not shown, which will not be repeated here.
  • optionally, an embodiment of the present disclosure further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010; when the computer program is executed by the processor 1010, each process of the above information input method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the above-mentioned information input method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM for short), random access memory (Random Access Memory, RAM for short), magnetic disk or optical disk, etc.
  • the technical solution of the present disclosure, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which can be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the various embodiments of the present disclosure.
  • each step of the above method or each of the above modules can be completed by an integrated logic circuit of hardware in the processor element or instructions in the form of software.
  • each module, unit, sub-unit or sub-module may be one or more integrated circuits configured to implement the above method, for example: one or more application-specific integrated circuits (ASICs), one or more microprocessors (digital signal processors, DSPs), or one or more field-programmable gate arrays (FPGAs), etc.
  • the processing element may be a general-purpose processor, such as a central processing unit (CPU) or other processors that can call program codes.
  • these modules can be integrated together and implemented in the form of a system-on-a-chip (SOC).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to an information input method and a mobile terminal. The method comprises: receiving index information input by a user in a first information input interface, the index information comprising characters; extracting field information matching the index information from the text information of a target image; and inputting the field information into a target position, the target position being an input position of the index information in the first information input interface or a target input position in a second information input interface.
PCT/CN2020/092525 2019-05-29 2020-05-27 Procédé d'entrée d'informations et terminal mobile WO2020238938A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910456074.9 2019-05-29
CN201910456074.9A CN110196646A (zh) 2019-05-29 2019-05-29 一种信息输入方法及移动终端

Publications (1)

Publication Number Publication Date
WO2020238938A1 true WO2020238938A1 (fr) 2020-12-03

Family

ID=67753352

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092525 WO2020238938A1 (fr) 2019-05-29 2020-05-27 Procédé d'entrée d'informations et terminal mobile

Country Status (2)

Country Link
CN (1) CN110196646A (fr)
WO (1) WO2020238938A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110196646A (zh) * 2019-05-29 2019-09-03 维沃移动通信有限公司 一种信息输入方法及移动终端
JP2021111157A (ja) * 2020-01-10 2021-08-02 富士フイルムビジネスイノベーション株式会社 情報処理装置、及び情報処理プログラム
CN113791860B (zh) * 2021-09-16 2023-09-22 金蝶软件(中国)有限公司 一种信息转换方法、装置和存储介质
CN114385048A (zh) * 2021-12-06 2022-04-22 深圳市亚略特科技股份有限公司 一种信息的录入方法及装置、存储介质、电子设备
CN114816633A (zh) * 2022-04-22 2022-07-29 维沃移动通信有限公司 信息显示方法、装置及电子设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101733539B1 (ko) * 2009-11-24 2017-05-10 삼성전자주식회사 문자인식장치 및 그 제어방법
CN106874443A (zh) * 2017-02-09 2017-06-20 北京百家互联科技有限公司 基于视频文本信息提取的信息查询方法以及装置
CN109739416B (zh) * 2018-04-19 2020-07-03 北京字节跳动网络技术有限公司 一种文本提取方法和装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030028522A1 (en) * 2001-07-30 2003-02-06 Microsoft Corporation System and method for improved string matching under noisy channel conditions
CN102169541A (zh) * 2011-04-02 2011-08-31 郝震龙 一种采用光学定位的字符识别输入系统及其方法
CN104933068A (zh) * 2014-03-19 2015-09-23 阿里巴巴集团控股有限公司 一种信息搜索的方法和装置
CN105404401A (zh) * 2015-11-23 2016-03-16 小米科技有限责任公司 输入处理方法、装置及设备
CN105739717A (zh) * 2016-01-27 2016-07-06 百度在线网络技术(北京)有限公司 信息输入方法和装置
CN107180039A (zh) * 2016-03-09 2017-09-19 腾讯科技(深圳)有限公司 一种基于图片的文字信息识别方法及装置
CN110196646A (zh) * 2019-05-29 2019-09-03 维沃移动通信有限公司 一种信息输入方法及移动终端

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113194024A (zh) * 2021-03-22 2021-07-30 维沃移动通信(杭州)有限公司 信息显示方法、装置和电子设备
WO2022222864A1 (fr) * 2021-04-20 2022-10-27 维沃移动通信(杭州)有限公司 Procédé et appareil de traitement de document et dispositif électronique

Also Published As

Publication number Publication date
CN110196646A (zh) 2019-09-03

Similar Documents

Publication Publication Date Title
WO2020238938A1 (fr) Procédé d'entrée d'informations et terminal mobile
WO2019120191A1 (fr) Procédé de copie de segments multiples de texte et terminal mobile
WO2021104365A1 (fr) Procédé de partage d'objets et dispositif électronique
US20220365641A1 (en) Method for displaying background application and mobile terminal
WO2020258929A1 (fr) Procédé de commutation d'interface de dossier et dispositif terminal
WO2021077897A1 (fr) Procédé et appareil d'envoi de fichier et dispositif électronique
WO2021136159A1 (fr) Procédé de capture d'écran et dispositif électronique
WO2020199934A1 (fr) Procédé de traitement d'informations et dispositif terminal
WO2020259024A1 (fr) Procédé de classification d'icônes, terminal mobile et support de stockage lisible par ordinateur
CN107943390B (zh) 一种文字复制方法及移动终端
WO2021104160A1 (fr) Procédé de révision et dispositif électronique
CN111274777B (zh) 思维导图显示方法及电子设备
WO2021004426A1 (fr) Procédé de sélection de contenu et terminal
WO2019196929A1 (fr) Terminal mobile et procédé de traitement de données vidéo
WO2020151525A1 (fr) Procédé d'envoi de message et dispositif terminal
WO2019120192A1 (fr) Procédé de modification de texte, et dispositif mobile
WO2021109961A1 (fr) Procédé de génération d'icône de raccourci, appareil électronique et support
US11250046B2 (en) Image viewing method and mobile terminal
WO2021036553A1 (fr) Procédé d'affichage d'icônes et dispositif électronique
WO2021169954A1 (fr) Procédé de recherche et dispositif électronique
WO2020181945A1 (fr) Procédé d'affichage d'identifiant et borne
WO2020199988A1 (fr) Procédé de copie de contenu et terminal
WO2021238719A1 (fr) Procédé d'affichage d'informations et dispositif électronique
WO2021057301A1 (fr) Procédé de commande de fichier et dispositif électronique
WO2021208889A1 (fr) Procédé de démarrage d'application et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20815232

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20815232

Country of ref document: EP

Kind code of ref document: A1