CN109359582B - Information searching method, information searching device and mobile terminal

Info

Publication number
CN109359582B
Authority
CN
China
Prior art keywords
information
target object
acquiring
identified
area
Prior art date
Legal status
Active
Application number
CN201811195199.2A
Other languages
Chinese (zh)
Other versions
CN109359582A (en)
Inventor
郭雄伟
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811195199.2A priority Critical patent/CN109359582B/en
Publication of CN109359582A publication Critical patent/CN109359582A/en
Application granted granted Critical
Publication of CN109359582B publication Critical patent/CN109359582B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application belongs to the technical field of mobile terminals and provides an information search method, an information search apparatus, a mobile terminal and a computer-readable storage medium. The method includes: acquiring a region to be identified in a preview picture; acquiring current position information of the mobile terminal; acquiring, according to the current position information, a target object corresponding to the region to be identified; and acquiring information of the target object and displaying the information of the target object. Through the method and the device, the problems in the prior art that the information search process is cumbersome and inefficient can be solved.

Description

Information searching method, information searching device and mobile terminal
Technical Field
The present application belongs to the technical field of mobile terminals, and in particular, to an information search method, an information search device, a mobile terminal, and a computer-readable storage medium.
Background
With the rapid development of communication technology, more and more application programs are installed on mobile terminals, integrating various functions such as photographing, video recording and information search, which brings great convenience to our lives.
At present, when information about a target object is searched for through a mobile terminal, the user is generally required to enter an identifier of the target object in a search box, and the search is performed according to that identifier. Multiple objects with the same identifier may be found, so the user has to screen the target object out of them before obtaining its information. The information search process is therefore cumbersome and inefficient.
Disclosure of Invention
In view of this, the present application provides an information search method, an information search apparatus, a mobile terminal and a computer-readable storage medium, so as to solve the problems of a cumbersome and inefficient information search process in the prior art.
A first aspect of the present application provides an information search method, including:
acquiring a region to be identified in a preview picture;
acquiring current position information of a mobile terminal;
acquiring a target object corresponding to the area to be identified according to the current position information;
and acquiring the information of the target object and displaying the information of the target object.
A second aspect of the present application provides an information search apparatus comprising:
the area acquisition module is used for acquiring an area to be identified in the preview picture;
the information acquisition module is used for acquiring the current position information of the mobile terminal;
the object acquisition module is used for acquiring a target object corresponding to the area to be identified according to the current position information;
and the information processing module is used for acquiring the information of the target object and displaying the information of the target object.
A third aspect of the present application provides a mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the information search method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer-readable storage medium, having stored thereon a computer program, which, when executed by a processor, performs the steps of the information search method according to the first aspect described above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the information search method as described in the first aspect above.
Therefore, by acquiring the area to be identified in the preview picture and the current position information of the mobile terminal, the target object corresponding to the area to be identified can be acquired according to the current position information, and the information of the target object can be acquired and displayed. According to this scheme, the target object corresponding to the area to be identified can be recognized rapidly according to the current position information of the mobile terminal and its information acquired, which avoids returning multiple objects with the same identifier, simplifies the search process for the information of the target object, and improves search efficiency.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of an information search method according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating an implementation of an information search method provided in the second embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of an information search method provided in the third embodiment of the present application;
FIG. 4 is a schematic diagram of an information search apparatus according to a fourth embodiment of the present application;
fig. 5 is a schematic diagram of a mobile terminal according to a fifth embodiment of the present application;
fig. 6 is a schematic diagram of a mobile terminal according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In particular implementations, the mobile terminals described in the embodiments of the present application include, but are not limited to, portable devices with touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads), such as mobile phones, laptop computers or tablet computers. It should also be understood that in some embodiments the device is not a portable communication device but a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in the embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic flowchart of an implementation of an information search method provided in the first embodiment of the present application, the information search method is applied to a mobile terminal and, as shown in the figure, may include the following steps:
step S101, acquiring a region to be identified in a preview picture.
In the embodiment of the application, when a picture is previewed through the mobile terminal, it is detected whether a preset operation on any region in the preview picture is received. If a preset operation on a certain region in the preview picture is received, that region is determined to be the region to be identified; if no preset operation on any region in the preview picture is received, it is determined that no region to be identified has been detected.
The preview picture may be a picture previewed by a camera device of the mobile terminal, and the camera may be a front camera or a rear camera, which is not limited herein. The preset operation may be an operation preset by the user for selecting the region to be identified, such as a click operation or a slide operation. The number of regions to be identified may be one or more, which is not limited herein. When there are multiple regions to be identified, the embodiment of the application can identify multiple target objects through a single picture preview and display the information of the multiple target objects. The region to be identified may be a region that contains an identifier of a target object and is used for recognizing that target object; the identifier of the target object may be any identifier by which the target object can be searched for, such as its name or trademark.
When there are multiple regions to be identified, they may be located in the same preview picture or in different preview pictures, which is not limited herein. When the multiple regions to be identified are located in different preview pictures, the region to be identified obtained from the first preview picture is stored; the user may then move the terminal so that a second preview picture, different from the first, is displayed on the mobile terminal, and the region to be identified obtained from the second preview picture is stored in turn, and so on, until it is detected that the preview picture displayed by the mobile terminal is closed.
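By way of illustration only, the following Kotlin sketch models this region-selection step under stated assumptions: RegionSelector, onPresetOperation and the fixed-size square trigger region are illustrative names and choices introduced here, not elements defined by this application.

```kotlin
// Collecting one or more regions to be identified from taps on the preview picture.
// All class and function names are illustrative assumptions.

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

data class RegionToIdentify(val previewFrameId: Long, val bounds: Rect)

class RegionSelector(private val halfSize: Int = 100) {
    private val regions = mutableListOf<RegionToIdentify>()

    /** Called when a preset operation (e.g. a tap) lands on the preview picture. */
    fun onPresetOperation(previewFrameId: Long, x: Int, y: Int) {
        // The trigger region is assumed to be a square of preset size centered on the touch point.
        val bounds = Rect(x - halfSize, y - halfSize, x + halfSize, y + halfSize)
        regions += RegionToIdentify(previewFrameId, bounds)
    }

    /** Regions collected so far; they may span several preview pictures. */
    fun collectedRegions(): List<RegionToIdentify> = regions.toList()
}

fun main() {
    val selector = RegionSelector()
    selector.onPresetOperation(previewFrameId = 1L, x = 320, y = 480) // first preview picture
    selector.onPresetOperation(previewFrameId = 2L, x = 150, y = 200) // second preview picture
    println(selector.collectedRegions())
}
```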
And step S102, acquiring the current position information of the mobile terminal.
In the embodiment of the application, when the area to be identified in the preview picture is obtained, it is detected whether the positioning function of the mobile terminal is enabled. If the positioning function of the mobile terminal is enabled, the current position of the mobile terminal is acquired; if not, the positioning function is started first and the current position information of the mobile terminal is then acquired through it. The positioning function includes, but is not limited to, positioning based on the Global Positioning System (GPS) and positioning based on mobile operator base stations, which is not limited herein.
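A minimal Kotlin sketch of this step is given below, assuming a hypothetical PositioningService interface that merely stands in for "check whether positioning is enabled, enable it if necessary, then read the current position"; it is not an API named by this application.

```kotlin
// Acquire the current position, enabling the positioning function first if needed.

data class Position(val latitude: Double, val longitude: Double)

interface PositioningService {
    fun isEnabled(): Boolean
    fun enable()
    fun currentPosition(): Position
}

fun acquireCurrentPosition(service: PositioningService): Position {
    if (!service.isEnabled()) {
        // If the positioning function is not started, start it first.
        service.enable()
    }
    return service.currentPosition()
}

fun main() {
    // Fake GPS provider used only to exercise the sketch; the coordinates are made-up sample values.
    val gps = object : PositioningService {
        private var enabled = false
        override fun isEnabled() = enabled
        override fun enable() { enabled = true }
        override fun currentPosition() = Position(22.54, 114.06)
    }
    println(acquireCurrentPosition(gps))
}
```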
Step S103, acquiring a target object corresponding to the area to be identified according to the current position information.
In the embodiment of the application, after the current position information of the mobile terminal is acquired, the target object corresponding to the area to be identified at the current position can be acquired. For example, if the area to be identified contains the name of shop A, shop A at the current location can be acquired according to the current position information of the mobile terminal.
And step S104, acquiring the information of the target object and displaying the information of the target object.
In the embodiment of the application, after the mobile terminal acquires the target object corresponding to the area to be identified, it can acquire the information of the target object from the Internet and display that information. For example, if the current position information of the mobile terminal is A and the target object corresponding to the area to be identified is shop B located at position A, the relevant information of shop B located at position A can be searched from the Internet and displayed. The information of the target object may be any information related to the target object; for example, when the target object is a shop, the information of the shop includes, but is not limited to, its business scope, commodity prices, discount information and the like, which is not limited herein.
It should be noted that, when there are multiple areas to be identified, the target object corresponding to each area and the information of that target object may be acquired in either of two ways. In the first way, whenever an area to be identified is obtained, the corresponding target object and its information are acquired according to the current position information of the mobile terminal, and once the target object and information for the last area have been acquired, the information of the target objects corresponding to all of the areas is displayed on the same page. In the second way, after all of the areas to be identified have been obtained, the target objects corresponding to the areas and their information are acquired together, and the information of the target object corresponding to each area is displayed on the same page. A sketch of both ways is given below.
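The following Kotlin sketch contrasts the two ways under stated assumptions; SearchSession and lookUpTargetObject are illustrative stand-ins for steps S102 to S104, not components defined by this application, and the sample data are made up.

```kotlin
data class TargetInfo(val name: String, val details: String)

// Stand-in for steps S102-S104: resolve one area to be identified against the current position.
fun lookUpTargetObject(region: String, position: String): TargetInfo =
    TargetInfo(name = region, details = "information of '$region' near $position")

class SearchSession(private val position: String) {
    private val resolved = mutableListOf<TargetInfo>()

    // First way: resolve each area as soon as it is acquired; display everything
    // on one page once the last area has been resolved.
    fun onRegionAcquired(region: String, isLast: Boolean) {
        resolved += lookUpTargetObject(region, position)
        if (isLast) displayOnOnePage(resolved)
    }

    // Second way: acquire all areas first, then resolve and display them together.
    fun onAllRegionsAcquired(regions: List<String>) {
        displayOnOnePage(regions.map { lookUpTargetObject(it, position) })
    }

    private fun displayOnOnePage(infos: List<TargetInfo>) = println(infos)
}

fun main() {
    SearchSession("location A").apply {
        onRegionAcquired("shop A sign", isLast = false)
        onRegionAcquired("shop B sign", isLast = true)
    }
    SearchSession("location A").onAllRegionsAcquired(listOf("shop A sign", "shop B sign"))
}
```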
According to the embodiment of the application, the target object corresponding to the area to be identified can be recognized rapidly according to the current position information of the mobile terminal and the information of the target object acquired, which avoids returning multiple objects with the same identifier, simplifies the search process for the information of the target object, and improves search efficiency.
Referring to fig. 2, which is a schematic flowchart of an implementation of an information search method provided in the second embodiment of the present application, the information search method is applied to a mobile terminal and, as shown in the figure, may include the following steps:
in step S201, a region to be recognized in the preview screen is acquired.
The step is the same as step S101, and reference may be made to the related description of step S101, which is not repeated herein.
Step S202, identifying the area to be identified, and acquiring at least one candidate object corresponding to the area to be identified.
In the embodiment of the application, when the area to be identified is acquired from the preview picture, first picture information matching the area to be identified may be obtained through the Internet, or from a template database pre-stored in the mobile terminal. After the first picture information matching the area to be identified is obtained, since objects corresponding to the first picture information may exist at different positions, at least one candidate object corresponding to the first picture information can be acquired. The at least one candidate object refers to objects located at different positions that all correspond to the area to be identified. For example, if the result of identifying the area to be identified is "KFC", multiple "KFC" stores located at different positions may be found. The first picture information may include text, a picture, or a combination of text and a picture, which is not limited herein.
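The following Kotlin sketch illustrates this step under stated assumptions: matchPictureInfo models matching against a pre-stored template database (an Internet lookup would play the same role), candidatesFor models finding objects at different positions that share the matched identifier, and both are hypothetical helpers with made-up sample data rather than APIs named by this application.

```kotlin
data class Candidate(val name: String, val position: String)

// Match the pixels of the area to be identified against a template database.
fun matchPictureInfo(regionPixels: ByteArray, templates: Map<String, ByteArray>): String? =
    templates.entries.firstOrNull { it.value.contentEquals(regionPixels) }?.key

// Objects at different positions that share the matched identifier.
fun candidatesFor(identifier: String, directory: Map<String, List<String>>): List<Candidate> =
    directory[identifier].orEmpty().map { Candidate(identifier, it) }

fun main() {
    val regionPixels = byteArrayOf(1, 2, 3)
    val templates = mapOf("KFC" to byteArrayOf(1, 2, 3))
    val directory = mapOf("KFC" to listOf("location A", "location B", "location C"))

    val identifier = matchPictureInfo(regionPixels, templates)
    val candidates = identifier?.let { candidatesFor(it, directory) } ?: emptyList()
    println(candidates) // several "KFC" stores at different positions
}
```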
Step S203, current location information of the mobile terminal is acquired.
The step is the same as step S102, and reference may be made to the related description of step S102, which is not repeated herein.
Step S204, selecting a target object from the at least one candidate object according to the current position information.
In this embodiment of the application, according to the current location information of the mobile terminal, an object located at the current location may be selected from the at least one candidate object, and the object is a target object.
Optionally, the selecting a target object from the at least one candidate object according to the current location information includes:
acquiring position information of each candidate object in the at least one candidate object;
and searching for an object with the position information being the current position information from the at least one candidate object, wherein the object is a target object.
In the embodiment of the application, the position information of each of the at least one candidate object corresponding to the area to be identified may be obtained, and the object whose position information is the current position information of the mobile terminal may be found from the at least one candidate object, so that the target object corresponding to the area to be identified can be searched for quickly and accurately without manual screening by the user.
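A sketch of this selection step is given below, assuming geographic coordinates; the distance threshold and the nearest-candidate choice are illustrative ways of expressing "the candidate whose position information is the current position information" with real coordinates, not parameters specified by this application, and the coordinates are made-up sample values.

```kotlin
import kotlin.math.cos
import kotlin.math.sqrt

data class GeoPoint(val lat: Double, val lon: Double)
data class StoreCandidate(val name: String, val position: GeoPoint)

// Rough equirectangular distance, good enough for "is this candidate at the current position?".
fun approxDistanceMeters(a: GeoPoint, b: GeoPoint): Double {
    val dLat = (a.lat - b.lat) * 111_000.0
    val dLon = (a.lon - b.lon) * 111_000.0 * cos(Math.toRadians(a.lat))
    return sqrt(dLat * dLat + dLon * dLon)
}

// Keep the candidate closest to the current position within the radius.
fun selectTarget(
    candidates: List<StoreCandidate>,
    current: GeoPoint,
    radiusMeters: Double = 100.0
): StoreCandidate? =
    candidates
        .filter { approxDistanceMeters(it.position, current) <= radiusMeters }
        .minByOrNull { approxDistanceMeters(it.position, current) }

fun main() {
    val current = GeoPoint(22.5405, 114.0601)
    val candidates = listOf(
        StoreCandidate("KFC, branch near the user", GeoPoint(22.5406, 114.0603)),
        StoreCandidate("KFC, branch across town", GeoPoint(22.6000, 114.2000))
    )
    println(selectTarget(candidates, current)) // the branch at the current position
}
```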
Step S205, acquiring information of the target object, and displaying the information of the target object.
The step is the same as step S104, and reference may be made to the related description of step S104, which is not repeated herein.
According to the embodiment of the application, the target object can be screened rapidly from the at least one candidate object according to the current position information of the mobile terminal, and the information of the target object acquired, which avoids returning multiple objects with the same identifier, simplifies the search process for the information of the target object, and improves search efficiency.
Referring to fig. 3, which is a schematic flowchart of an implementation of an information search method provided in the third embodiment of the present application, the information search method is applied to a mobile terminal and, as shown in the figure, may include the following steps:
in step S301, a preview mode of the camera is started.
In the embodiment of the present application, the preview mode may refer to a mode in which the picture currently to be taken by the camera can be previewed. After starting the camera, the mobile terminal enters the preview mode and can display the preview picture of the camera, so that the user can determine whether the current picture is the one to be shot. When a shooting instruction is received, the mobile terminal exits the preview mode of the camera and shoots the previewed picture.
Step S302, a preview screen in the preview mode is acquired.
In step S303, a target object recognition function is started.
The target object identification function is a function that, when a preset operation on any region in the preview picture is received, acquires the region to be identified and identifies it.
In the embodiment of the application, because an existing mobile terminal usually focuses on a region of the preview picture when an operation on that region is received, a target object identification function option may be provided in the preview picture to distinguish focusing from region identification. When it is detected that the target object identification function is enabled, if a preset operation on any region in the preview picture is received, the trigger region of the preset operation is obtained; this trigger region is the region to be identified selected by the user. When it is detected that the target object identification function is not enabled, if an operation on any region in the preview picture is received, the trigger region of that operation is focused instead. The target object identification function option may be located at any position of the preview picture, such as the upper right corner, the upper left corner, the lower right corner, the lower left corner, or a function selection area below the preview picture, which is not limited herein. The trigger region may be a region within a preset range centered on the touch point on the preview picture.
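The dispatch described above can be sketched as follows in Kotlin; PreviewTouchDispatcher and its callbacks are illustrative assumptions, not an interface defined by this application.

```kotlin
// When the target object identification function is off, a touch on the preview picture
// triggers focusing; when it is on, the same touch selects the trigger region as a
// region to be identified.

class PreviewTouchDispatcher(
    private val focus: (x: Int, y: Int) -> Unit,
    private val identifyRegion: (x: Int, y: Int) -> Unit
) {
    var recognitionEnabled: Boolean = false

    fun onTouch(x: Int, y: Int) {
        if (recognitionEnabled) identifyRegion(x, y) else focus(x, y)
    }
}

fun main() {
    val dispatcher = PreviewTouchDispatcher(
        focus = { x, y -> println("focus at ($x, $y)") },
        identifyRegion = { x, y -> println("select region around ($x, $y)") }
    )
    dispatcher.onTouch(100, 200)   // function off: focusing
    dispatcher.recognitionEnabled = true
    dispatcher.onTouch(100, 200)   // function on: region selection
}
```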
Step S304, acquiring the area to be identified in the preview picture.
The step is the same as step S101, and reference may be made to the related description of step S101, which is not repeated herein.
In the embodiment of the application, in order to avoid misoperation by the user, when a preset operation on any region in the preview picture is received, a selection frame is displayed, and the region where the selection frame is located is the trigger region of the preset operation. When a confirmation operation on the selection frame is received, the region inside the selection frame is determined to be the region to be identified and the selection frame is hidden.
In the embodiment of the application, when an adjustment operation on the selection frame is received, the selection frame may be adjusted; when a confirmation operation on the adjusted selection frame is received, the region inside the adjusted selection frame is determined to be the region to be identified and the selection frame is hidden.
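By way of illustration, the following Kotlin sketch models the show / adjust / confirm interaction of the selection frame under stated assumptions; the names and the fixed initial frame size are hypothetical.

```kotlin
data class Frame(var left: Int, var top: Int, var right: Int, var bottom: Int)

class SelectionFrameController {
    private var frame: Frame? = null

    // A preset operation on the preview picture shows the selection frame around the touch point.
    fun onPresetOperation(x: Int, y: Int, halfSize: Int = 100) {
        frame = Frame(x - halfSize, y - halfSize, x + halfSize, y + halfSize)
    }

    // The user may adjust the frame before confirming it.
    fun onAdjust(dLeft: Int, dTop: Int, dRight: Int, dBottom: Int) {
        frame?.apply { left += dLeft; top += dTop; right += dRight; bottom += dBottom }
    }

    // Confirmation turns the framed area into the region to be identified and hides the frame.
    fun onConfirm(): Frame? = frame.also { frame = null }
}

fun main() {
    val controller = SelectionFrameController()
    controller.onPresetOperation(x = 320, y = 480)
    controller.onAdjust(dLeft = -20, dTop = 0, dRight = 20, dBottom = 0)
    println(controller.onConfirm()) // Frame(left=200, top=380, right=440, bottom=580)
}
```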
Optionally, the acquiring the area to be identified in the preview picture includes:
acquiring at least one to-be-identified area selected from the preview picture.
In the embodiment of the application, the user can select at least one area to be identified from the preview picture. When there are multiple areas to be identified, multiple target objects can be recognized through a single picture preview, without previewing the picture multiple times.
Step S305, current location information of the mobile terminal is acquired.
The step is the same as step S102, and reference may be made to the related description of step S102, which is not repeated herein.
Step S306, acquiring the target object corresponding to the area to be identified according to the current position information.
The step is the same as step S103, and reference may be made to the related description of step S103, which is not described herein again.
Step S307, acquiring the information of the target object, and displaying the information of the target object on a screen locking interface.
In the embodiment of the application, since it takes time to identify the area to be identified and to acquire the information of the identified target object, in order to avoid making the user wait for the information of the target object to be displayed, the user can close the preview mode of the camera after the area to be identified in the preview picture has been acquired and use the mobile terminal for other operations. When the screen of the mobile terminal is detected to be off, the screen can be lit and the information of the target object displayed on the screen locking interface.
Displaying the information of the target object on the screen locking interface may mean generating a picture of the information of the target object and displaying that picture in place of the screen locking wallpaper of the mobile terminal. In addition, when the picture is displayed in place of the screen locking wallpaper, a timer may be started, and after a preset time the screen locking wallpaper display of the mobile terminal is restored, that is, the picture display is cancelled.
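A minimal Kotlin sketch of this lock-screen behavior is given below, assuming a hypothetical LockScreen interface that stands in for the terminal's wallpaper control; the 30-second default display time is an illustrative value, not one fixed by this application.

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

interface LockScreen {
    fun showPicture(picture: String)   // display a picture in place of the screen locking wallpaper
    fun restoreWallpaper()             // cancel the picture and restore the original wallpaper
}

fun showTargetInfoOnLockScreen(
    lockScreen: LockScreen,
    targetInfo: String,
    displayMillis: Long = 30_000
) {
    val picture = "rendered: $targetInfo"   // generate a picture of the target object's information
    lockScreen.showPicture(picture)
    // Start a timer; after the preset time the screen locking wallpaper is restored.
    Timer("restore-wallpaper", true).schedule(displayMillis) {
        lockScreen.restoreWallpaper()
    }
}

fun main() {
    val lockScreen = object : LockScreen {
        override fun showPicture(picture: String) = println("lock screen shows: $picture")
        override fun restoreWallpaper() = println("screen locking wallpaper restored")
    }
    showTargetInfoOnLockScreen(lockScreen, "shop B: business scope, prices, discounts", displayMillis = 100)
    Thread.sleep(300)  // keep the demo alive long enough for the timer to fire
}
```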
According to the embodiment of the application, after the area to be identified in the preview picture is acquired, the information of the target object obtained according to the current position information of the mobile terminal can be displayed on the screen locking interface, so that the user can conveniently perform other operations on the mobile terminal while the area to be identified is being recognized.
Fig. 4 is a schematic diagram of an information search apparatus according to the fourth embodiment of the present application; for convenience of description, only the portions relevant to the embodiment of the present application are shown.
The information search apparatus includes:
an area obtaining module 41, configured to obtain an area to be identified in the preview screen;
an information obtaining module 42, configured to obtain current location information of the mobile terminal;
an object obtaining module 43, configured to obtain, according to the current position information, a target object corresponding to the area to be identified;
and the information processing module 44 is configured to acquire the information of the target object and display the information of the target object.
Optionally, the information searching apparatus further includes:
the area identification module 45 is configured to identify the area to be identified, and acquire at least one candidate object corresponding to the area to be identified;
the object obtaining module 43 is specifically configured to:
and selecting a target object from the at least one candidate object according to the current position information.
Optionally, the object obtaining module 43 includes:
an obtaining unit, configured to obtain position information of each candidate object in the at least one candidate object;
and the searching unit is used for searching an object with the position information being the current position information from the at least one candidate object, wherein the object is a target object.
Optionally, the information searching apparatus further includes:
and a mode starting module 46, configured to start a preview mode of the camera, and obtain a preview screen in the preview mode.
Optionally, the information searching apparatus further includes:
a function starting module 47, configured to start a target object identification function, where the target object identification function is a function of acquiring a to-be-identified area and identifying the to-be-identified area when a preset operation on any area in the preview screen is received.
Optionally, the area obtaining module 41 is specifically configured to:
and acquiring at least one to-be-identified area selected from the preset picture.
Optionally, the information processing module 44 is specifically configured to:
and displaying the information of the target object on a screen locking interface.
The information search device provided in the embodiment of the present application may be applied to the first method embodiment, the second method embodiment, and the third method embodiment, and for details, reference is made to the description of the first method embodiment, the second method embodiment, and the third method embodiment, and details are not described herein again.
Fig. 5 is a schematic diagram of a mobile terminal according to the fifth embodiment of the present application. As shown in the figure, the mobile terminal may include: one or more processors 501 (only one shown), one or more input devices 502 (only one shown), one or more output devices 503 (only one shown), and a memory 504. The processor 501, the input device 502, the output device 503, and the memory 504 are connected by a bus 505. The memory 504 is used for storing instructions, and the processor 501 is used for executing the instructions stored by the memory 504. Wherein:
the processor 501 is configured to acquire a region to be identified in a preview image; acquiring current position information of the mobile terminal; acquiring a target object corresponding to the area to be identified according to the current position information; and acquiring the information of the target object and displaying the information of the target object.
Optionally, the processor 501 is further configured to:
identifying the area to be identified, and acquiring at least one alternative object corresponding to the area to be identified;
and selecting a target object from the at least one candidate object according to the current position information.
Optionally, the processor 501 is specifically configured to:
acquiring position information of each candidate object in the at least one candidate object;
and searching for an object with the position information being the current position information from the at least one candidate object, wherein the object is a target object.
Optionally, the processor 501 is further configured to:
and starting a preview mode of the camera, and acquiring a preview picture in the preview mode.
Optionally, the processor 501 is further configured to:
and starting a target object identification function, wherein the target object identification function is a function of acquiring a region to be identified and identifying the region to be identified when a preset operation on any region in the preview picture is received.
Optionally, the processor 501 is specifically configured to:
and acquiring at least one to-be-identified area selected from the preset picture.
Optionally, the processor 501 is specifically configured to:
and displaying the information of the target object on a screen locking interface.
It should be understood that, in the embodiment of the present application, the processor 501 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 502 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, a data receiving interface, and the like. The output device 503 may include a display (LCD, etc.), a speaker, a data transmission interface, and the like.
The memory 504 may include a read-only memory and a random access memory, and provides instructions and data to the processor 501. A portion of the memory 504 may also include non-volatile random access memory. For example, the memory 504 may also store device type information.
In a specific implementation, the processor 501, the input device 502, the output device 503, and the memory 504 described in this embodiment of the present application may execute the implementation described in the embodiment of the information search method provided in this embodiment of the present application, or may execute the implementation described in the information search apparatus described in the fourth embodiment of the present application, which is not described herein again.
Fig. 6 is a schematic diagram of a mobile terminal according to a sixth embodiment of the present application. As shown in fig. 6, the mobile terminal 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62 stored in said memory 61 and executable on said processor 60. The processor 60, when executing the computer program 62, implements the steps in the various information search method embodiments described above, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 60, when executing the computer program 62, implements the functions of the modules/units in the above-described device embodiments, such as the functions of the modules 41 to 47 shown in fig. 4.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the mobile terminal 6. For example, the computer program 62 may be divided into an area acquisition module, an information acquisition module, an object acquisition module, an information processing module, an area identification module, a mode activation module, and a function activation module, and the specific functions of each module are as follows:
the area acquisition module is used for acquiring an area to be identified in the preview picture;
the information acquisition module is used for acquiring the current position information of the mobile terminal;
the object acquisition module is used for acquiring a target object corresponding to the area to be identified according to the current position information;
and the information processing module is used for acquiring the information of the target object and displaying the information of the target object.
Optionally, the area identification module is configured to identify the area to be identified, and acquire at least one candidate object corresponding to the area to be identified;
the object acquisition module is specifically configured to:
and selecting a target object from the at least one candidate object according to the current position information.
Optionally, the object obtaining module includes:
an obtaining unit, configured to obtain position information of each candidate object in the at least one candidate object;
and the searching unit is used for searching an object with position information being the current position information from the at least one candidate object, wherein the object is a target object.
Optionally, the mode starting module is configured to start a preview mode of the camera, and acquire a preview screen in the preview mode.
Optionally, the function starting module is configured to start a target object identification function, where the target object identification function is a function of acquiring a to-be-identified area and identifying the to-be-identified area when a preset operation on any area in the preview screen is received.
Optionally, the area obtaining module is specifically configured to:
and acquiring at least one to-be-identified area selected from the preset picture.
Optionally, the information processing module is specifically configured to:
and displaying the information of the target object on a screen locking interface.
The mobile terminal 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The mobile terminal may include, but is not limited to, a processor 60, a memory 61. Those skilled in the art will appreciate that fig. 6 is merely an example of a mobile terminal 6 and is not intended to limit the mobile terminal 6 and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the mobile terminal may also include input-output devices, network access devices, buses, etc.
The processor 60 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 61 may be an internal storage unit of the mobile terminal 6, such as a hard disk or a memory of the mobile terminal 6. The memory 61 may also be an external storage device of the mobile terminal 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) and the like provided on the mobile terminal 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the mobile terminal 6. The memory 61 is used for storing the computer program and other programs and data required by the mobile terminal. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile terminal and method may be implemented in other ways. For example, the above-described apparatus/mobile terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. An information search method, comprising:
acquiring a region to be identified in a preview picture;
acquiring current position information of the mobile terminal;
acquiring a target object corresponding to the area to be identified according to the current position information;
acquiring the information of the target object and displaying the information of the target object;
after the area to be identified in the preview picture is acquired, the method further comprises the following steps:
identifying the area to be identified, and acquiring at least one alternative object corresponding to the area to be identified;
correspondingly, acquiring the target object corresponding to the area to be identified according to the current position information includes:
selecting a target object from the at least one candidate object according to the current position information;
the selecting a target object from the at least one candidate object according to the current location information includes:
acquiring position information of each candidate object in the at least one candidate object;
and searching for an object with the position information being the current position information from the at least one candidate object, wherein the object is a target object.
2. The information search method of claim 1, further comprising:
and starting a preview mode of the camera, and acquiring a preview picture in the preview mode.
3. The information search method of claim 1, further comprising:
and starting a target object identification function, wherein the target object identification function is a function of acquiring a region to be identified and identifying the region to be identified when a preset operation on any region in the preview picture is received.
4. The information search method according to claim 1, wherein the acquiring the area to be recognized in the preview screen includes:
and acquiring at least one to-be-identified area selected from the preview picture.
5. The information search method according to any one of claims 1 to 4, wherein the displaying of the information of the target object includes:
and displaying the information of the target object on a screen locking interface.
6. An information search apparatus, comprising:
the area acquisition module is used for acquiring an area to be identified in the preview picture;
the information acquisition module is used for acquiring the current position information of the mobile terminal;
the object acquisition module is used for acquiring a target object corresponding to the area to be identified according to the current position information;
the information processing module is used for acquiring the information of the target object and displaying the information of the target object;
the information search apparatus further includes:
the area identification module is used for identifying the area to be identified and acquiring at least one candidate object corresponding to the area to be identified;
the object acquisition module is specifically configured to:
selecting a target object from the at least one candidate object according to the current position information;
the object acquisition module includes:
an obtaining unit, configured to obtain position information of each candidate object in the at least one candidate object;
and the searching unit is used for searching an object with position information being the current position information from the at least one candidate object, wherein the object is a target object.
7. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the information search method according to any one of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the information search method according to any one of claims 1 to 5.
CN201811195199.2A 2018-10-15 2018-10-15 Information searching method, information searching device and mobile terminal Active CN109359582B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811195199.2A CN109359582B (en) 2018-10-15 2018-10-15 Information searching method, information searching device and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811195199.2A CN109359582B (en) 2018-10-15 2018-10-15 Information searching method, information searching device and mobile terminal

Publications (2)

Publication Number Publication Date
CN109359582A CN109359582A (en) 2019-02-19
CN109359582B (en) 2022-08-09

Family

ID=65349262

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811195199.2A Active CN109359582B (en) 2018-10-15 2018-10-15 Information searching method, information searching device and mobile terminal

Country Status (1)

Country Link
CN (1) CN109359582B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110141854B (en) * 2019-05-17 2023-03-28 网易(杭州)网络有限公司 Information processing method and device in game, storage medium and electronic equipment
CN110941987B (en) * 2019-10-10 2023-04-07 北京百度网讯科技有限公司 Target object identification method and device, electronic equipment and storage medium
CN111782065B (en) * 2020-06-30 2022-05-31 联想(北京)有限公司 Processing method, processing device and electronic equipment
CN115170661B (en) * 2022-07-05 2024-03-22 深圳新益昌科技股份有限公司 Method, device, terminal equipment and storage medium for generating search path

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2092449A1 (en) * 2006-11-14 2009-08-26 Koninklijke Philips Electronics N.V. Method and apparatus for identifying an object captured by a digital image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006059291A1 (en) * 2004-12-01 2006-06-08 Koninklijke Philips Electronics N.V. Adaptation of location similarity threshold in associative content retrieval
CN102984644A (en) * 2011-09-06 2013-03-20 上海博路信息技术有限公司 Location and information pushing method of terminal optical character reader (OCR)
CN104933068A (en) * 2014-03-19 2015-09-23 阿里巴巴集团控股有限公司 Method and device for information searching
CN106407977A (en) * 2016-09-05 2017-02-15 广东小天才科技有限公司 Target content positioning and search method and device
CN108520029A (en) * 2018-03-27 2018-09-11 四川斐讯信息技术有限公司 A kind of method scanned for based on picture and location information, server and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a global self-localization method for mobile robots based on house-number recognition; Li Lei et al.; High Technology Letters; 2003-08-28 (No. 08); pp. 73-78 *

Also Published As

Publication number Publication date
CN109359582A (en) 2019-02-19

Similar Documents

Publication Publication Date Title
CN109359582B (en) Information searching method, information searching device and mobile terminal
CN108319592B (en) Translation method and device and intelligent terminal
CN109189879B (en) Electronic book display method and device
CN108961157B (en) Picture processing method, picture processing device and terminal equipment
CN110119733B (en) Page identification method and device, terminal equipment and computer readable storage medium
CN108737739B (en) Preview picture acquisition method, preview picture acquisition device and electronic equipment
CN109215037B (en) Target image segmentation method and device and terminal equipment
WO2016095689A1 (en) Recognition and searching method and system based on repeated touch-control operations on terminal interface
CN107451244B (en) Folder naming method, mobile terminal and computer readable storage medium
CN106991179B (en) Data deleting method and device and mobile terminal
CN107909368B (en) Payment control method and device, terminal and readable storage medium
CN108898082B (en) Picture processing method, picture processing device and terminal equipment
CN110457963B (en) Display control method, display control device, mobile terminal and computer-readable storage medium
CN110266994B (en) Video call method, video call device and terminal
CN111818385B (en) Video processing method, video processing device and terminal equipment
CN111290684B (en) Image display method, image display device and terminal equipment
CN107679222B (en) Picture processing method, mobile terminal and computer readable storage medium
CN109358927B (en) Application program display method and device and terminal equipment
CN108133048B (en) File sorting method and device and mobile terminal
CN109492249B (en) Rapid generation method and device of design drawing and terminal equipment
CN110677586B (en) Image display method, image display device and mobile terminal
CN108521460B (en) Information pushing method and device, mobile terminal and computer readable storage medium
CN112217992A (en) Image blurring method, image blurring device, mobile terminal, and storage medium
CN108776959B (en) Image processing method and device and terminal equipment
CN108932704B (en) Picture processing method, picture processing device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant