CN105975554B - Big data searching method and device based on mobile terminal - Google Patents


Info

Publication number
CN105975554B
CN105975554B · CN201610285039.1A · CN201610285039A
Authority
CN
China
Prior art keywords
cursor
keyword text
mobile terminal
camera
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610285039.1A
Other languages
Chinese (zh)
Other versions
CN105975554A (en)
Inventor
唐磊
刘小兵
黄东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TCL China Star Optoelectronics Technology Co Ltd
Original Assignee
Shenzhen China Star Optoelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen China Star Optoelectronics Technology Co Ltd filed Critical Shenzhen China Star Optoelectronics Technology Co Ltd
Priority to CN201610285039.1A priority Critical patent/CN105975554B/en
Publication of CN105975554A publication Critical patent/CN105975554A/en
Application granted granted Critical
Publication of CN105975554B publication Critical patent/CN105975554B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 Retrieval characterised by using metadata automatically derived from the content
    • G06F 16/5846 Retrieval characterised by using metadata automatically derived from the content using extracted text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/14 Image acquisition
    • G06V 30/142 Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V 30/1423 Image acquisition using hand-held instruments, the instrument generating sequences of position coordinates corresponding to handwriting

Abstract

The invention relates to the field of information identification and provides a big data searching method and device based on a mobile terminal. The method comprises the following steps: when a cursor key arranged on the mobile terminal is pressed, starting a camera of the mobile terminal to capture data in front of the camera; recognizing a picture of the captured data in front of the camera as a keyword text, and searching a sentence library for content corresponding to the keyword text; and outputting the searched content corresponding to the keyword text. The method improves the searching speed.

Description

Big data searching method and device based on mobile terminal
Technical Field
The embodiment of the invention belongs to the field of information identification, and particularly relates to a big data searching method and device based on a mobile terminal.
Background
At present, there are many question-search APPs on smart devices that solve students' difficult homework problems, such as "Ape Search Question", "Scholar Master" and "Homework Group". These APPs all photograph the difficult problem and feed the photographed content into the question-search APP, which then searches for the corresponding solution according to the photographed content. However, using a question-search APP usually requires the following steps: turning on the screen, unlocking the terminal, finding the APP, opening the APP, finding the camera, photographing the test question, searching for the test question, and so on. The operating procedure of using a question-search APP is therefore too cumbersome, which reduces the question-search speed.
Disclosure of Invention
The embodiment of the invention provides a big data searching method and device based on a mobile terminal, aiming at solving the problem that question-search efficiency is too low because the operation steps of using existing question-search APPs are too cumbersome.
The embodiment of the invention is realized in such a way that a big data searching method based on a mobile terminal comprises the following steps:
when a cursor key is pressed, starting a camera of the mobile terminal to capture data in front of the camera, wherein the cursor key is arranged on the mobile terminal;
recognizing a picture comprising the captured data in front of the camera as a keyword text, and searching a sentence library for content corresponding to the keyword text;
and outputting the searched content corresponding to the keyword text.
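The three claimed steps can be summarized in a short Python sketch. This is not part of the disclosure: every name here (`Terminal`, `on_cursor_key_pressed`, the dictionary standing in for the sentence library) is a hypothetical placeholder for the camera, the OCR step and the library lookup.

```python
class Terminal:
    """Hypothetical stand-in for the mobile terminal: a camera, an
    output channel and a word-and-sentence library."""

    def __init__(self, scene_text, library):
        self._scene_text = scene_text  # what lies in front of the camera
        self.library = library         # keyword text -> content
        self.displayed = None

    def capture(self):
        # Step 1: the camera captures the data in front of it.
        return self._scene_text

    def output(self, content):
        # Step 3: display and/or broadcast the searched content.
        self.displayed = content


def on_cursor_key_pressed(terminal):
    """Sketch of the claimed flow triggered by one cursor-key press."""
    picture = terminal.capture()                   # step 1: capture
    keyword_text = picture.strip().lower()         # step 2a: OCR stand-in
    content = terminal.library.get(keyword_text)   # step 2b: search library
    terminal.output(content)                       # step 3: output
    return content
```

A single key press thus drives the whole capture-recognize-search-output chain, which is the point of the claim.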
Another object of an embodiment of the present invention is to provide a big data searching apparatus based on a mobile terminal, where the apparatus includes:
the data grabbing unit is used for starting a camera of the mobile terminal to grab data in front of the camera when a cursor key is pressed, and the cursor key is arranged on the mobile terminal;
the data identification unit is used for identifying the picture comprising the captured data in front of the camera as a keyword text and searching a sentence library for content corresponding to the keyword text;
and the data output unit is used for outputting the searched content corresponding to the keyword text.
In the embodiment of the invention, the user only needs to press the cursor key arranged on the mobile terminal once, and the mobile terminal can output the searched content corresponding to the keyword text, so that the searching steps are simplified, the searching speed is increased, and the user experience is improved.
Drawings
Fig. 1 is a flowchart of a big data searching method based on a mobile terminal according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of a cursor key arranged on a mobile terminal according to a first embodiment of the present invention;
FIG. 3 is a diagram illustrating Tiger and related paraphrases as the searched content corresponding to the keyword text according to the first embodiment of the present invention;
fig. 4 is a schematic diagram illustrating that the searched content corresponding to the keyword text is displayed on a black screen of the mobile terminal when the mobile terminal is in a screen-off state according to the first embodiment of the present invention;
FIG. 5 is a diagram illustrating a cursor graph according to a first embodiment of the present invention;
fig. 6 is a block diagram of a big data searching apparatus based on a mobile terminal according to a second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the embodiment of the invention, when a cursor key is pressed, a camera of the mobile terminal is started to capture data in front of the camera, the cursor key is arranged on the mobile terminal, a picture comprising the captured data in front of the camera is identified as a keyword text, contents corresponding to the keyword text are searched in a word and sentence library, and the searched contents corresponding to the keyword text are output.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
The first embodiment is as follows:
fig. 1 shows a flowchart of a big data searching method based on a mobile terminal according to a first embodiment of the present invention, which is detailed as follows:
and step S11, when the cursor key is pressed, starting a camera of the mobile terminal to capture data in front of the camera, wherein the cursor key is arranged on the mobile terminal.
As shown in fig. 2, the cursor key may be disposed on the left side of the mobile terminal, and of course, the cursor key may be disposed on the right side, upper side, lower side, or the like of the mobile terminal.
When the mobile terminal is in the screen-off state, in order to avoid other APPs being started along with the whole mobile terminal and disturbing the user, the mobile terminal remains in the screen-off state when the cursor key is pressed; only the camera of the mobile terminal is quickly started in the background, and an image is automatically captured through the auto-focus function.
Optionally, to improve the accuracy of capturing the data, after the cursor key is pressed and before starting the camera of the mobile terminal to capture the data in front of the camera, the method includes:
the cursor is started to send out cursor points through the started cursor and the cursor hole, the cursor points are projected on data in front of the camera, and the cursor hole is formed in the mobile terminal. The cursor comprises an LED lamp cursor, wherein the distance between the cursor hole and a camera of the mobile terminal is smaller than a preset installation distance. Because the distance between the cursor hole and the camera of the mobile terminal is smaller than the preset installation distance, the data projected by the cursor point and captured by the camera are basically the same, and the user can know the data captured by the camera through the cursor point projected on the data in front of the camera.
In this case, starting the camera of the mobile terminal to capture the data in front of the camera is specifically:
starting the camera of the mobile terminal to capture the data around the cursor point in front of the camera. Because the distance between the cursor hole and the camera of the mobile terminal is smaller than the preset installation distance, the data captured by the camera is the data around the cursor point.
Optionally, since starting the cursor consumes a certain amount of power, in order to save power of the mobile terminal, after the camera of the mobile terminal captures the data around the cursor point in front of the camera, the method includes the following step: closing the cursor.
Optionally, in order to capture clear data, before starting the camera of the mobile terminal to capture the data in front of the camera, the method includes:
detecting whether the distance between the data in front of the camera and the mobile terminal is smaller than a preset distance threshold, so that the camera of the mobile terminal is started to capture the data in front of the camera only when this distance is smaller than the preset distance threshold.
Specifically, a distance sensor is arranged on the mobile terminal (for example, near the cursor) to detect the distance between the data in front of the camera and the mobile terminal. When this distance is smaller than the preset distance threshold, the camera can take a clear shot; only then is the camera of the mobile terminal started to capture the data in front of it, which effectively saves resources.
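The distance gate just described amounts to a small predicate. The 30 cm threshold below is an invented example: the disclosure only speaks of a "preset distance threshold".

```python
PRESET_DISTANCE_THRESHOLD_CM = 30.0  # hypothetical value; the text only says "preset"


def should_start_camera(distance_cm):
    """Return True only when the distance sensor reports that the data
    in front of the camera is close enough for a clear shot, so the
    camera is never started on an out-of-range target."""
    return distance_cm < PRESET_DISTANCE_THRESHOLD_CM
```

The camera-start call would be wrapped in this check, so no capture (and no OCR or search) happens until the target is within range.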
Step S12, recognizing a picture including the captured data in front of the camera as a keyword text, and searching a sentence library for content corresponding to the keyword text.
Wherein the data comprises words or sentences. Specifically, after acquiring the picture corresponding to the captured data, the background performs local Optical Character Recognition (OCR) on the picture so as to recognize a word or a sentence as a keyword text. OCR refers to the process in which an electronic device (e.g., a scanner or a digital camera) examines characters printed on paper, determines their shapes by detecting dark and light patterns, and then translates the shapes into computer text through character-recognition methods. Since the captured data is recognized by local OCR, the recognition speed can be improved.
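A minimal sketch of this local-OCR step, with the OCR engine injected as a callable (in practice it could be any on-device engine, such as the open-source Tesseract); the whitespace normalization shown is an assumption, not something the disclosure specifies.

```python
def recognize_keyword_text(picture, ocr_engine):
    """Run a locally provided OCR engine on the captured picture and
    normalize its raw output into a keyword text (a word or sentence).

    `ocr_engine` is any callable mapping a picture to raw text; it is
    injected here so the recognition step can be shown on its own.
    """
    raw = ocr_engine(picture)
    # Collapse line breaks and stray whitespace left over from layout.
    return " ".join(raw.split())
```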
Optionally, before the step S12, the method includes:
automatically framing a picture including the captured data in front of the camera to obtain a small picture including a single word near a cursor point and a large picture including the entire test question.
After the small image including a single word near the cursor point and the large image including the entire test question are obtained, both may be recognized simultaneously as a small-image keyword text and a large-image keyword text, which are then both searched. However, sometimes the user only wants to search for the word near the cursor point, so in order to increase the search speed, the small-image keyword text may also be searched first. In this case, the step S12 specifically includes:
A1, recognizing the small image including the single word near the cursor point as a small-image keyword text. Specifically, since the small-image keyword text carries little information, the small image can be recognized as the small-image keyword text directly using locally provided OCR.
A2, judging whether the small-image keyword text consists of English characters; if so, searching an English word-and-sentence library for the content corresponding to the small-image keyword text; if not, recognizing the large image including the whole test question as a large-image keyword text, and searching a topic word-and-sentence library for the content corresponding to the large-image keyword text. Specifically, because the large-image keyword text carries more information, OCR deployed in the cloud can be used to recognize the large image, so as to improve the accuracy and speed of recognition.
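Steps A1 and A2 amount to a two-stage dispatch, sketched below with all OCR engines and libraries injected as placeholders. The `isascii`/`isalpha` test is only one plausible reading of "is an English character"; none of these names come from the disclosure.

```python
def search_by_pictures(small_pic, large_pic, local_ocr, cloud_ocr,
                       english_library, topic_library):
    """Sketch of steps A1/A2: try the single word near the cursor
    point first; fall back to the whole test question only when the
    word is not English. All engines and libraries are injected
    hypothetical placeholders.
    """
    small_text = local_ocr(small_pic)          # A1: local OCR on the small image
    if small_text.isascii() and small_text.isalpha():
        # A2, yes-branch: an English word -> English word-and-sentence library
        return english_library.get(small_text.lower(), "not found")
    # A2, no-branch: recognize the whole question with cloud OCR,
    # then search the topic word-and-sentence library.
    large_text = cloud_ocr(large_pic)
    return topic_library.get(large_text, "not found")
```

The design choice mirrors the text: the cheap local pass handles the common dictionary-lookup case, and the heavier cloud pass is only paid for when the small image cannot settle the query.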
Preferably, in order to improve the comprehensiveness and accuracy of the search, the word-and-sentence library is a cloud word-and-sentence library; for example, the English word-and-sentence library is a cloud English word-and-sentence library, and the topic word-and-sentence library is a cloud topic word-and-sentence library. Alternatively, in order to increase the searching speed, the word-and-sentence library can also be a local word-and-sentence library.
And step S13, outputting the searched content corresponding to the keyword text.
When content corresponding to the keyword text is searched out, it is output promptly: when the content corresponding to the small-image keyword text is found, that content is output, and when the content corresponding to the large-image keyword text is found, that content is output.
Optionally, the step S13 specifically includes:
and B1, displaying the searched content corresponding to the keyword text. For example, assuming that words and sentences, and corresponding pronunciations, paraphrases, example sentences, etc. are stored in the english word and sentence library, when the content matching the keyword text is searched, all the searched content is displayed in time.
and/or,
and B2, broadcasting the searched content corresponding to the keyword text.
Broadcasting the searched content gives the user an additional way to obtain it: the user can learn the pronunciation of the words or sentences, and can obtain the searched content through the broadcast without looking at the screen of the mobile terminal. As shown in fig. 3, when the searched content corresponding to the keyword text is Tiger and its related paraphrase, "Tiger" and the related paraphrase are broadcast. Of course, if the searched content is the answer steps of a topic, it is preferable that only the answer steps are displayed.
Optionally, the displaying the searched content corresponding to the keyword text specifically includes:
and B11, detecting whether the mobile terminal is in the screen-off state.
And B12, when the mobile terminal is in the screen-off state, displaying the searched content corresponding to the keyword text on a black screen of the mobile terminal in a non-black font.
When the mobile terminal is in the screen-off state, the screen has a black background; therefore, in order to let the user clearly view the displayed content, the searched content is displayed in a non-black font. Preferably, the font color differs strongly from black: for example, as shown in fig. 4, the font color is white, but it may also be red, yellow, or the like. Since the searched content can be displayed while the mobile terminal is in the screen-off state, the resource consumption caused by fully starting the mobile terminal is reduced. Of course, if the mobile terminal is not in the screen-off state, the searched content corresponding to the keyword text may be displayed in a font of any color on the screen of the mobile terminal.
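The non-black-font rule of step B12 can be captured in a tiny helper; the fallback to white is an assumption, since the text merely requires any color that differs from the black background.

```python
def result_font_color(screen_off, preferred="white"):
    """Pick the font color for displaying the searched content.

    In the screen-off state the background is black, so a black font
    would be invisible; any other color (white, red, yellow, ...)
    passes through unchanged. The white fallback is a hypothetical
    choice, not mandated by the text.
    """
    if screen_off and preferred == "black":
        return "white"  # never render black-on-black
    return preferred
```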
Optionally, in order to facilitate the user to start the camera of the mobile terminal again, after the outputting the searched content corresponding to the keyword text, the method includes:
and C1, displaying a cursor graph. And displaying a cursor graph on a screen of the mobile terminal after the user starts a cursor once by pressing a cursor key and outputs the searched content corresponding to the keyword text. Specifically, the cursor graphic may be displayed in a state of being off screen, and the color of the cursor graphic is non-black, for example, the color of the cursor graphic may be white as shown in fig. 5, and of course, may also be red or yellow, and the like, which is not limited herein. It should be noted that, if the current mobile terminal is not in the screen-off state, the mobile terminal is in the screen-off state and then displays the cursor graph.
And C2, receiving a click command of the cursor graph, and starting a camera of the mobile terminal to capture data in front of the camera according to the click command.
In the above C1 and C2, when the user wants to start a big data search with the mobile terminal again, the search can be started by clicking the cursor graphic displayed on the screen, which enriches the ways of starting a big data search on the mobile terminal. Of course, the user may still start the search by pressing the cursor key, which is not limited herein.
In the first embodiment of the invention, when the cursor key arranged on the mobile terminal is pressed, the camera of the mobile terminal is started to capture data in front of the camera; a picture of the captured data is recognized as a keyword text, content corresponding to the keyword text is searched in a sentence library, and the searched content is output. The user only needs to press the cursor key once, and the mobile terminal can output the searched content corresponding to the keyword text, so that the searching steps are simplified, the searching speed is increased, and the user experience is improved.
It should be understood that, in the embodiment of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present invention.
Example two:
Fig. 6 shows a block diagram of a mobile-terminal-based big data searching apparatus according to a second embodiment of the present invention. The apparatus is applicable to a mobile terminal, which may include user equipment communicating with one or more core networks via a radio access network (RAN). The user equipment may be a mobile phone (also referred to as a "cellular" phone) or a computer with mobile equipment; it may also be a portable, pocket, hand-held, computer-embedded or vehicle-mounted mobile apparatus that exchanges voice and/or data with the radio access network. For example, the mobile device may include a smartphone, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal or a vehicle-mounted computer. For convenience of explanation, only the portions related to the embodiments of the present invention are shown.
The big data searching device based on the mobile terminal comprises: a data capture unit 61, a data identification unit 62 and a data output unit 63. Wherein:
and the data grabbing unit 61 is used for starting a camera of the mobile terminal to grab data in front of the camera when a cursor key is pressed, wherein the cursor key is arranged on the mobile terminal.
The cursor key may be disposed on the left side of the mobile terminal, and of course, the cursor key may be disposed on the right side, upper side, lower side, or the like of the mobile terminal.
When the mobile terminal is in the screen-off state, in order to avoid other APPs being started along with the whole mobile terminal and disturbing the user's study, the mobile terminal remains in the screen-off state when the cursor key is pressed; only the camera of the mobile terminal is quickly started in the background, and images are automatically captured through the auto-focus function.
Optionally, in order to improve the accuracy of capturing data, the mobile terminal-based big data searching apparatus includes:
a cursor starting unit, used for starting a cursor so that the started cursor emits a cursor point through a cursor hole formed in the mobile terminal, the cursor point being projected onto the data in front of the camera. In this case, the data capture unit is specifically configured to start the camera of the mobile terminal to capture the data around the cursor point in front of the camera.
The cursor includes an LED lamp cursor, and the distance between the cursor hole and the camera of the mobile terminal is smaller than a preset installation distance. Because this distance is smaller than the preset installation distance, the region illuminated by the cursor point and the region captured by the camera are basically the same, so the user can tell from the cursor point projected onto the data in front of the camera which data the camera will capture.
Optionally, since a certain amount of power is consumed to start the cursor, in order to save power of the mobile terminal, the mobile terminal-based big data search apparatus includes: and the cursor closing unit is used for closing the cursor.
Optionally, in order to be able to capture clear data, the mobile terminal-based big data search apparatus includes:
a distance detection unit, used for detecting whether the distance between the data in front of the camera and the mobile terminal is smaller than a preset distance threshold, so that the camera of the mobile terminal is started to capture the data in front of the camera only when this distance is smaller than the preset distance threshold.
Specifically, a distance sensor is arranged on the mobile terminal (for example, near the cursor) to detect the distance between the data in front of the camera and the mobile terminal. When this distance is smaller than the preset distance threshold, the camera can take a clear shot; only then is the camera of the mobile terminal started to capture the data in front of it, which effectively saves resources.
And a data recognition unit 62 configured to recognize a picture including the captured data in front of the camera as a keyword text, and search a sentence library for content corresponding to the keyword text.
Wherein the data comprises words or sentences. Specifically, after the picture corresponding to the captured data is obtained, the background performs local OCR recognition on the picture corresponding to the data, so as to recognize words or sentences as keyword texts. Since the captured data is recognized by the local OCR, the recognition speed of the data can be improved.
Optionally, the mobile terminal-based big data searching apparatus includes:
and the size picture cutting unit is used for automatically framing the pictures including the captured data in front of the camera to obtain a small picture including a single word near the cursor point and a large picture including the whole test question.
After the small image including a single word near the cursor point and the large image including the entire test question are obtained, both may be recognized simultaneously as a small-image keyword text and a large-image keyword text, which are then both searched. However, sometimes the user only wants to search for the word near the cursor point, so in order to increase the search speed, the small-image keyword text may be searched first. In this case, optionally, the data recognition unit 62 includes:
and the small image keyword text acquisition module is used for identifying a small image comprising a single word near the cursor point as a small image keyword text. Specifically, since the small-image keyword text carries less information, a small image including a single word near a cursor point can be recognized as a small-image keyword text directly using locally-provided OCR.
a content searching module, used for judging whether the small-image keyword text consists of English characters; if so, searching an English word-and-sentence library for the content corresponding to the small-image keyword text; if not, recognizing the large image including the whole test question as a large-image keyword text, and searching a topic word-and-sentence library for the content corresponding to the large-image keyword text. Specifically, because the large-image keyword text carries more information, OCR deployed in the cloud can be used to recognize the large image, so as to improve the accuracy and speed of recognition.
Preferably, in order to improve the comprehensiveness and accuracy of the search, the word-and-sentence library is a cloud word-and-sentence library; for example, the English word-and-sentence library is a cloud English word-and-sentence library, and the topic word-and-sentence library is a cloud topic word-and-sentence library. Alternatively, in order to increase the searching speed, the word-and-sentence library can also be a local word-and-sentence library.
And a data output unit 63 for outputting the searched content corresponding to the keyword text.
Specifically, when the content corresponding to the small-figure keyword text is searched, the content corresponding to the small-figure keyword text is output, and when the content corresponding to the large-figure keyword text is searched, the content corresponding to the large-figure keyword text is output.
Optionally, the data output unit 63 includes:
and the data display module is used for displaying the searched content corresponding to the keyword text.
and/or,
and the data broadcasting module is used for broadcasting the searched content corresponding to the keyword text.
Broadcasting the searched content gives the user an additional way to obtain it: the user can learn the pronunciation of the words or sentences, and can obtain the searched content through the broadcast without looking at the screen of the mobile terminal. Of course, if the searched content is the answer steps of a topic, it is preferable that only the answer steps are displayed.
Optionally, the data display module includes:
and the screen-off state detection module is used for detecting whether the mobile terminal is in a screen-off state or not.
And the differentiation display module is used for displaying the searched content corresponding to the keyword text on a black screen of the mobile terminal in a non-black font when the mobile terminal is in a screen-off state.
When the mobile terminal is in the screen-off state, the screen has a black background; therefore, in order to let the user clearly view the displayed content, the searched content is displayed in a non-black font. Preferably, the font color differs strongly from black, for example white, red or yellow. Since the searched content can be displayed while the mobile terminal is in the screen-off state, the resource consumption caused by fully starting the mobile terminal is reduced. Of course, if the mobile terminal is not in the screen-off state, the searched content corresponding to the keyword text may be displayed in a font of any color on the screen of the mobile terminal.
Optionally, in order to facilitate the user to start the camera of the mobile terminal again, the mobile terminal-based big data search apparatus includes:
and the cursor graph display unit is used for displaying a cursor graph. And after the user starts a cursor once by pressing a cursor key and outputs the searched content corresponding to the keyword text, displaying a cursor graph on a screen of the mobile terminal. Specifically, the cursor graphic may be displayed in a state of being off screen, and the color of the cursor graphic is non-black, for example, the color of the cursor graphic may be white as shown in fig. 5, and of course, may also be red or yellow, and the like, which is not limited herein. It should be noted that, if the current mobile terminal is not in the screen-off state, the mobile terminal is in the screen-off state and then displays the cursor graph.
A click command receiving unit, configured to receive a click command on the cursor graphic and, according to the click command, start the camera of the mobile terminal to capture data in front of the camera.
When the user wants to start the camera of the mobile terminal again, he or she can do so by clicking the cursor graphic displayed on the screen, which enriches the ways in which the camera of the mobile terminal can be started. Of course, the user may still start the camera by pressing the cursor key, which is not limited herein.
A camera restarting unit, configured to detect the shaking frequency and/or the shaking path of the mobile terminal, and to start the camera of the mobile terminal to capture data in front of the camera when the shaking frequency meets a preset frequency requirement and/or the shaking path meets a preset shaking-path requirement. Because the mobile terminal is usually worn on the user's body, the user can start the camera again to capture data in front of the camera simply by shaking the mobile terminal, which improves the start-up speed of the camera.
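The shake-frequency check can be illustrated with a simple threshold-crossing counter over accelerometer magnitudes. The sampling window, the magnitude threshold, and the preset frequency below are invented for this sketch; the patent only requires that the detected frequency meet a preset requirement, without fixing any numbers.

```python
# Illustrative shake detection: count upward crossings of a magnitude
# threshold in a window of accelerometer readings, then compare the
# resulting shakes-per-second against a preset frequency requirement.
def shake_count(samples, threshold=12.0):
    """Count upward threshold crossings of accelerometer magnitude (m/s^2)."""
    crossings = 0
    above = False
    for magnitude in samples:
        if magnitude > threshold and not above:
            crossings += 1
            above = True
        elif magnitude <= threshold:
            above = False
    return crossings

def should_restart_camera(samples, window_s=1.0, preset_hz=3.0):
    """Start the camera again when shakes per second meet the preset."""
    return shake_count(samples) / window_s >= preset_hz

# Three pronounced shakes within one second satisfy a 3 Hz preset:
readings = [9.8, 15.0, 9.8, 16.0, 9.8, 14.5, 9.8]
assert should_restart_camera(readings)
```

A shaking-path check would work analogously, comparing the recorded motion trace against a preset path template instead of counting crossings.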
In the second embodiment of the invention, the user only needs to press the cursor key arranged on the mobile terminal once for the mobile terminal to output the searched content corresponding to the keyword text, which simplifies the search steps, increases the search speed, and improves the user experience.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A big data searching method based on a mobile terminal is characterized by comprising the following steps:
when a cursor key is pressed, first starting a cursor so as to emit a cursor point through the started cursor and a cursor hole, the cursor point being projected onto data in front of a camera, and then starting the camera of the mobile terminal to grab data around the cursor point in front of the camera, wherein the cursor key and the cursor hole are both arranged on the mobile terminal;
recognizing a picture comprising the captured data in front of the camera as a keyword text, and searching a sentence library for content corresponding to the keyword text;
outputting the searched content corresponding to the keyword text;
after the outputting of the searched content corresponding to the keyword text, the method comprises the following step:
displaying a cursor graphic in a screen-off state, wherein the cursor graphic is used for starting the camera of the mobile terminal to capture data in front of the camera when the cursor graphic is clicked.
2. The method of claim 1, wherein prior to identifying a picture including captured data in front of the camera as keyword text and searching a sentence library for content corresponding to the keyword text, the method comprises:
automatically framing a picture including the captured data in front of the camera to obtain a small picture including a single word near a cursor point and a large picture including the entire test question.
3. The method according to claim 2, wherein the recognizing the picture including the captured data in front of the camera as a keyword text and searching a sentence library for content corresponding to the keyword text specifically comprises:
recognizing the small picture including a single word near the cursor point as a small-picture keyword text;
and judging whether the small-picture keyword text is an English character; if so, searching an English word-and-sentence library for the content corresponding to the small-picture keyword text; otherwise, recognizing the large picture comprising the entire test question as a large-picture keyword text and searching a topic word-and-sentence library for the content corresponding to the large-picture keyword text.
4. The method according to any one of claims 1 to 3, wherein the outputting the searched content corresponding to the keyword text specifically includes:
displaying the searched content corresponding to the keyword text;
and/or,
and broadcasting the searched content corresponding to the keyword text.
5. A big data searching device based on a mobile terminal is characterized in that the device comprises:
a cursor starting unit, configured to start a cursor when a cursor key is pressed, so as to emit a cursor point through the started cursor and a cursor hole, the cursor point being projected onto data in front of the camera, wherein the cursor key and the cursor hole are both arranged on the mobile terminal;
a data grabbing unit, configured to start a camera of the mobile terminal to grab data around the cursor point in front of the camera when the cursor key is pressed, the cursor key being arranged on the mobile terminal;
a data identification unit, configured to identify the picture comprising the captured data in front of the camera as a keyword text and to search a sentence library for content corresponding to the keyword text;
a data output unit, configured to output the searched content corresponding to the keyword text;
a cursor graphic display unit, configured to display a cursor graphic in a screen-off state after the user has started the cursor once by pressing the cursor key and the searched content corresponding to the keyword text has been output;
and a click command receiving unit, configured to receive a click command on the cursor graphic and to start the camera of the mobile terminal to capture data in front of the camera according to the click command.
6. The apparatus of claim 5, wherein the apparatus comprises:
and the size picture cutting unit is used for automatically framing the pictures including the captured data in front of the camera to obtain a small picture including a single word near the cursor point and a large picture including the whole test question.
7. The apparatus of claim 6, wherein the data identification unit comprises:
a small-picture keyword text acquisition module, configured to recognize the small picture including a single word near the cursor point as a small-picture keyword text;
and a content searching module, configured to judge whether the small-picture keyword text is an English character; if so, to search an English word-and-sentence library for the content corresponding to the small-picture keyword text; otherwise, to identify the large picture comprising the entire test question as a large-picture keyword text and to search a topic word-and-sentence library for the content corresponding to the large-picture keyword text.
8. The apparatus according to any one of claims 5 to 7, wherein the data output unit comprises:
a data display module, configured to display the searched content corresponding to the keyword text;
and/or a data broadcasting module, configured to broadcast the searched content corresponding to the keyword text.
CN201610285039.1A 2016-04-29 2016-04-29 Big data searching method and device based on mobile terminal Active CN105975554B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610285039.1A CN105975554B (en) 2016-04-29 2016-04-29 Big data searching method and device based on mobile terminal


Publications (2)

Publication Number Publication Date
CN105975554A CN105975554A (en) 2016-09-28
CN105975554B true CN105975554B (en) 2020-05-29

Family

ID=56993621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610285039.1A Active CN105975554B (en) 2016-04-29 2016-04-29 Big data searching method and device based on mobile terminal

Country Status (1)

Country Link
CN (1) CN105975554B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106933960A (en) * 2017-01-23 2017-07-07 宇龙计算机通信科技(深圳)有限公司 A kind of picture recognition searching method and device
CN107391570A (en) * 2017-06-19 2017-11-24 广东小天才科技有限公司 Topic searching method and device on a kind of mobile terminal
CN107885534B (en) * 2017-10-23 2021-02-02 深圳市金立通信设备有限公司 Screen locking method, terminal and computer readable medium
CN109063076B (en) * 2018-07-24 2021-07-13 维沃移动通信有限公司 Picture generation method and mobile terminal
CN110232141A (en) * 2019-05-31 2019-09-13 三角兽(北京)科技有限公司 Resource acquiring method, resource acquisition device, storage medium and electronic equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102169541A (en) * 2011-04-02 2011-08-31 郝震龙 Character recognition input system using optical localization and method thereof
CN103108127A (en) * 2013-02-17 2013-05-15 华为终端有限公司 Method for shooting pictures through portable device and portable device
CN104380254A (en) * 2014-06-11 2015-02-25 华为技术有限公司 A method and a terminal for quick start of an application service
CN105100481A (en) * 2015-07-30 2015-11-25 努比亚技术有限公司 Shooting method and apparatus, and mobile terminal
CN105446612A (en) * 2015-11-06 2016-03-30 深圳市金立通信设备有限公司 Terminal control method and terminal

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
KR101834810B1 (en) * 2011-06-07 2018-03-06 엘지전자 주식회사 Mobile terminal and battery power saving mode switching method thereof
US9311750B2 (en) * 2012-06-05 2016-04-12 Apple Inc. Rotation operations in a mapping application
CN105072371A (en) * 2015-08-25 2015-11-18 广东欧珀移动通信有限公司 Recording method and recording device both applied to mobile terminal




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant