WO2016101768A1 - Terminal and touch-operation-based search method and apparatus - Google Patents

Terminal and touch-operation-based search method and apparatus

Info

Publication number
WO2016101768A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
search
content
searched
text
Prior art date
Application number
PCT/CN2015/095863
Other languages
English (en)
French (fr)
Inventor
谢军样
吴帅
李豪
张倩倩
Original Assignee
北京奇虎科技有限公司
奇智软件(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201410826801.3A external-priority patent/CN104778194A/zh
Priority claimed from CN201410827079.5A external-priority patent/CN104778195A/zh
Priority claimed from CN201410826911.XA external-priority patent/CN104537051B/zh
Priority claimed from CN201410827065.3A external-priority patent/CN104536688A/zh
Application filed by 北京奇虎科技有限公司 and 奇智软件(北京)有限公司
Publication of WO2016101768A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor

Definitions

  • the present invention relates to the field of information search technology, and in particular to a terminal and a touch operation based search method and apparatus.
  • search services have been applied to mobile terminals.
  • various search apps (applications) are often installed on smartphones for searching.
  • the search services on existing mobile terminals are all based on the search bar.
  • the search experience is very poor and the efficiency is very low.
  • when the user has various instant search needs based on text, images, etc. on the screen while using the mobile phone, it is very inconvenient to have to open a search app and type into a pop-up search bar.
  • the present invention has been made in order to provide a terminal and a touch-based search method and apparatus that overcome the above problems or at least partially solve the above problems.
  • a touch operation based search method, including: when detecting that a user performs a screen capture operation, acquiring a screen capture image corresponding to the screen capture operation, and generating, according to the screen capture image, a touch search interface having the same content as the screen capture image; receiving a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched; determining the content to be searched according to the touch operation; and searching using the content to be searched.
  • a touch operation based search device, including: a touch search interface generating module configured to, when detecting that a user performs a screen capture operation, acquire a screen capture image corresponding to the screen capture operation and generate, according to the screen capture image, a touch search interface having the same content as the screen capture image; a touch operation receiving module configured to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched; a to-be-searched content determining module configured to determine the content to be searched according to the touch operation; and a search module configured to search using the content to be searched.
  • a terminal including:
  • a touch screen adapted to receive a user touch operation and provide a display function
  • a touch search interface generator configured to: when detecting a user performing a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image;
  • the touch screen is further adapted to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched;
  • a to-be-searched content determiner adapted to determine the content to be searched according to the touch operation;
  • a searcher adapted to search using the content to be searched.
  • a touch operation based search method comprising: when detecting that the system clipboard is injected with text content copied or cut by the user, extracting the text content copied or cut by the user, and generating a touch search interface for displaying the text content; receiving a touch operation performed by the user through the touch search interface to select at least a part of the text content as the text to be searched; determining the text to be searched according to the touch operation; and searching using the text to be searched.
  • a touch operation based search device comprising:
  • a touch search interface generating module configured to, when detecting that the system clipboard is injected with text content copied or cut by the user, extract the text content copied or cut by the user and generate a touch search interface for displaying the text content;
  • a touch operation receiving module configured to receive, by the user, a touch operation for selecting at least a part of the text content by the touch search interface
  • a to-be-searched text determining module configured to determine the to-be-searched text according to the touch operation
  • a search module for searching using the text to be searched.
  • a terminal including:
  • a touch screen adapted to receive a user touch operation and provide a display function
  • the touch search interface generator is adapted to extract text content copied or cut by the user when detecting that the system clipboard injects the text content copied or cut by the user, and generate a touch search interface for displaying the text content;
  • the touch screen is further adapted to receive a touch operation performed by the user through the touch search interface to select at least a part of the text content as the text to be searched;
  • the to-be-searched text determiner is adapted to determine the to-be-searched text according to the touch operation
  • a searcher adapted to search using the text to be searched.
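  • as a hedged illustration of the clipboard-triggered flow summarized above, the following sketch (assuming Android; the onTextCopied callback is a hypothetical hook into the touch search interface generator, not part of the disclosure) shows one way the injection of copied or cut text into the system clipboard might be detected:

```java
// Illustrative sketch only: detect that text has been copied or cut to the
// system clipboard, then hand it to a (hypothetical) touch search interface hook.
import android.content.ClipData;
import android.content.ClipboardManager;
import android.content.Context;

public class ClipboardWatcher {
    public interface Callback {
        void onTextCopied(CharSequence text); // hypothetical hook into the touch search interface
    }

    public static void watch(Context context, Callback callback) {
        ClipboardManager clipboard =
                (ClipboardManager) context.getSystemService(Context.CLIPBOARD_SERVICE);
        clipboard.addPrimaryClipChangedListener(() -> {
            ClipData clip = clipboard.getPrimaryClip();
            if (clip != null && clip.getItemCount() > 0) {
                CharSequence text = clip.getItemAt(0).coerceToText(context);
                if (text != null && text.length() > 0) {
                    callback.onTextCopied(text); // e.g. generate the touch search interface
                }
            }
        });
    }
}
```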
  • a computer program comprising computer readable code that, when executed on a computing device, causes the computing device to perform the touch operation based search method described above.
  • a computer readable medium wherein the computer program described above is stored.
  • when a user performs operations such as copying and cutting on a mobile terminal such as a mobile phone, it is usually in order to use the copied or cut text, image, etc. as the content to be searched, paste it into the search bar of a search app interface, and then perform a search operation. According to the technical solution of the present invention, when it is detected that the user performs a screenshot operation, a touch search interface is automatically provided to the user according to the captured picture, so that the user can quickly and accurately select the content to be searched and search; the user does not need to open the search app, paste the cut or copied content into the search bar of the search app, or perform other cumbersome operations. The touch search interface retains the content of the screenshot, so that the user's selection of the content to be searched in the touch search interface is fast and accurate.
  • FIG. 1 shows a flow chart of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 2A shows a schematic diagram of the operation of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 2B is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 2C shows a schematic diagram of the operation of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 2D shows a schematic diagram of the operation of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 3 illustrates a flow chart of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 4 is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 5 illustrates a flow chart of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 6 shows a block diagram of a touch operation based search device in accordance with one embodiment of the present invention
  • Figure 7 shows a block diagram of a terminal in accordance with one embodiment of the present invention.
  • FIG. 8 is a diagram showing a terminal interacting with a search server according to an embodiment of the present invention.
  • FIG. 9 shows a flow chart of a touch operation based search method in accordance with one embodiment of the present invention.
  • FIG. 10A illustrates a working diagram of a touch operation based search method according to an embodiment of the present invention
  • FIG. 10B illustrates a working diagram of a touch operation based search method according to an embodiment of the present invention
  • FIG. 10C is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 10D illustrates a working diagram of a touch operation based search method according to an embodiment of the present invention
  • FIG. 10C' is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 10D' is a schematic diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 11 shows a flow chart of a touch operation based search method in accordance with one embodiment of the present invention
  • FIG. 12A is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 12B is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 12C is a diagram showing the operation of a touch operation based search method according to an embodiment of the present invention.
  • FIG. 13 shows a flowchart of a touch operation based search method according to an embodiment of the present invention
  • FIG. 14 shows a block diagram of a touch operation based search device in accordance with one embodiment of the present invention
  • Figure 15 shows a block diagram of a touch operation based search device in accordance with one embodiment of the present invention.
  • Figure 16 shows a block diagram of a terminal in accordance with one embodiment of the present invention.
  • Figure 17 shows a block diagram of a terminal in accordance with one embodiment of the present invention.
  • FIG. 18 is a schematic diagram showing interaction between a terminal and a search server according to an embodiment of the present invention.
  • Figure 19 is a schematic block diagram showing a computing device for performing the method according to the present invention.
  • Fig. 20 schematically shows a storage unit for holding or carrying program code implementing the method according to the invention.
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 110 When detecting that the user performs a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
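  • as an illustrative sketch of the FileObserver-based screenshot detection mentioned above (the screenshot directory, the file-name check and the onScreenshotCaptured hook are assumptions, since the actual screenshot location varies by device):

```java
// Sketch: watch the assumed "Screenshots" directory and react when a new image is written.
import android.os.Environment;
import android.os.FileObserver;
import java.io.File;

public class ScreenshotObserver extends FileObserver {
    private final File dir;

    public ScreenshotObserver() {
        this(new File(Environment.getExternalStoragePublicDirectory(
                Environment.DIRECTORY_PICTURES), "Screenshots")); // assumed location
    }

    private ScreenshotObserver(File dir) {
        super(dir.getPath(), FileObserver.CLOSE_WRITE); // fire once a new file is fully written
        this.dir = dir;
    }

    @Override
    public void onEvent(int event, String path) {
        if (path != null && (path.endsWith(".png") || path.endsWith(".jpg"))) {
            onScreenshotCaptured(new File(dir, path));
        }
    }

    protected void onScreenshotCaptured(File screenshot) {
        // hypothetical hook: hand the new screenshot to the touch search interface generator
    }
}
```

  • calling startWatching() on such an observer begins monitoring; when a new image file has been written, the touch search interface can be generated from it.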
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • Step 120 Receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • Step 130 Determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • the selected area picture is generated based on the touch area obtained based on the touch operation, and then the selected area picture is subjected to element recognition such as text/graphic.
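  • a minimal sketch of producing the selected area picture, assuming the full screenshot is available as a Bitmap and the touch area has already been reduced to a bounding Rect:

```java
// Crop the user's touch area out of the full screenshot so that only the
// selected region is passed to text/graphic recognition.
import android.graphics.Bitmap;
import android.graphics.Rect;

public final class SelectedAreaExtractor {
    private SelectedAreaExtractor() {}

    /** touchArea is the bounding box of the user's touch strokes, in screenshot pixels. */
    public static Bitmap crop(Bitmap screenshot, Rect touchArea) {
        // Clamp the requested rectangle to the bitmap bounds before cropping.
        int left = Math.max(0, touchArea.left);
        int top = Math.max(0, touchArea.top);
        int right = Math.min(screenshot.getWidth(), touchArea.right);
        int bottom = Math.min(screenshot.getHeight(), touchArea.bottom);
        if (right <= left || bottom <= top) {
            return screenshot; // illustrative fallback when the touch area is empty
        }
        return Bitmap.createBitmap(screenshot, left, top, right - left, bottom - top);
    }
}
```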
  • the method of recognizing elements such as text/graphics from a picture is not limited; for example, OCR (Optical Character Recognition) technology may be used, i.e., the background extracts text information from the picture through OCR recognition; UiAutomator automated testing technology may also be used: UiAutomator is an automated testing tool bundled with Android that can extract the text information of the current page, and this technology can obtain 100% correct text.
  • each recognition technology has its own application scenarios; if UiAutomator and OCR recognize the same content, that content is very likely accurate, so the accuracy of the determined content can be improved by calculating the intersection of the UiAutomator and OCR recognition results.
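  • the intersection step could look roughly like the following sketch; the OCR and UiAutomator recognizers themselves are assumed to exist elsewhere, and token-level matching is only one possible way of computing the intersection:

```java
// Keep only the tokens that both an OCR result and a UiAutomator page dump agree on.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public final class RecognitionMerger {
    private RecognitionMerger() {}

    public static List<String> intersect(String ocrText, String uiAutomatorText) {
        Set<String> uiTokens = new LinkedHashSet<>(Arrays.asList(uiAutomatorText.split("\\s+")));
        List<String> agreed = new ArrayList<>();
        for (String token : ocrText.split("\\s+")) {
            if (!token.isEmpty() && uiTokens.contains(token)) {
                agreed.add(token); // both technologies recognized this token, so treat it as accurate
            }
        }
        return agreed;
    }
}
```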
  • Step 140 Search using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • the search request may be implemented by sending the content to be searched to an installed application for searching (such as the 360 search app) or to a search module in an application (such as the search module in 360 Guard);
  • that application then searches for the content to be searched through its search module or function, and displays the search result to the user.
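  • a hedged sketch of handing the recognized content to an installed search application; the generic Android ACTION_WEB_SEARCH intent is used purely for illustration, not any app-specific interface of the 360 products:

```java
// Send the content to be searched to whichever installed app handles web search.
import android.app.SearchManager;
import android.content.Context;
import android.content.Intent;

public final class SearchDispatcher {
    private SearchDispatcher() {}

    public static void search(Context context, String contentToSearch) {
        Intent intent = new Intent(Intent.ACTION_WEB_SEARCH);
        intent.putExtra(SearchManager.QUERY, contentToSearch);
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK); // may be started from a non-activity context
        if (intent.resolveActivity(context.getPackageManager()) != null) {
            context.startActivity(intent); // the installed search app renders the results
        }
    }
}
```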
  • in the prior art, when performing a search the user needs to open a search app, input the text to be searched in the search bar or paste it into the search bar, trigger the search, and so on; in this embodiment,
  • when it is detected that the user performs a screenshot operation, a touch search interface is automatically provided to the user, so that the user can quickly and accurately select the content to be searched and search, which saves the above-mentioned cumbersome operations and is fast and convenient; the content of the screenshot is retained on the touch search interface, which makes it easy for the user to select the content to be searched quickly and accurately, and solves the drawback that some existing search apps can only copy the entire content and cannot precisely search for a single word or a few discrete words, thereby improving the accuracy of selecting the text to be searched.
  • the user uses the mobile phone to take a screenshot of the desktop; when the user's screenshot operation is detected in the background, the touch operation interface is generated according to the screenshot; the touch operation interface is in the form of a floating layer or a mask, and its display state is translucent or grayscale.
  • the content in the corresponding screenshot can be displayed, such as each icon and name on the desktop, as shown in FIG. 2A; the user performs a touch operation on the touch operation interface, and the area swiped by the touch operation is as shown by the curved line frame in FIG. 2B.
  • the user's intention selects the touched text as the content to be searched; based on the foregoing graphic recognition technology, the content to be searched is identified as “360 film and television Daquan”; then the “360 film and television Daquan” is used for searching, and the search result is shown in FIG. 2C. Show.
  • the above-mentioned touch interaction mode realizes quick selection and search of content, which completely improves the overall interactive experience of user search. This embodiment is applied to a mobile terminal with a touch screen.
  • a touch-based search method is further provided.
  • step 110 further includes:
  • the touch search interface is set to a translucent mode, and a mask is formed on the current display area so that the content of the current display area is displayed through the touch search interface.
  • the translucent touch operation interface serves as a mask, which enables the user to clearly see the changes of the current display area as well as the content of the touch operation interface, so that a touch operation conforming to the user's intention can be performed on the touch operation interface.
  • the mask can also be displayed in a grayscale manner, in order to distinguish the touch operation interface from the intercepted display interface, to indicate to the user that the touch operation interface is currently touched.
  • the touch operation interface displayed in FIG. 2A can be set to a translucent form.
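  • one possible sketch of such a translucent floating layer, assuming the screenshot bitmap is already available and that overlay permission handling (e.g. SYSTEM_ALERT_WINDOW) is done elsewhere:

```java
// Show the screenshot as a translucent full-screen "mask" above the current display area.
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.WindowManager;
import android.widget.ImageView;

public final class TouchSearchOverlay {
    public static ImageView show(Context context, Bitmap screenshot) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        ImageView mask = new ImageView(context);
        mask.setImageBitmap(screenshot);
        mask.setAlpha(0.6f); // translucent, so the current display area shows through

        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // floating layer above other content
                0,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;
        wm.addView(mask, lp);
        return mask; // the caller later removes it with wm.removeView(mask)
    }
}
```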
  • a touch-based search method is further provided.
  • before step 130, the method further includes:
  • the display manner of the touch area and/or the untouched area corresponding to the touch operation is changed to distinguish the touch area from the untouched area.
  • the display mode is not limited, and includes changes in brightness, color, and contrast.
  • the touch area is distinguished from the untouched area, which is convenient for the user to determine which content is touched by the touch operation.
  • an effect as shown in FIG. 2D can be achieved: the area touched by the user is highlighted, while the untouched area is significantly darker, presented by the oblique-line effect in FIG. 2D.
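  • a minimal sketch of this display-mode change, dimming everything outside a single touched rectangle (the rectangle itself is supplied by the touch-tracking code, which is not shown):

```java
// Darken the untouched area so the region the user has swiped stands out.
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Rect;
import android.view.View;

public class HighlightView extends View {
    private final Paint dim = new Paint();
    private Rect touchedArea = new Rect();

    public HighlightView(Context context) {
        super(context);
        dim.setColor(Color.argb(160, 0, 0, 0)); // semi-transparent black for untouched regions
    }

    public void setTouchedArea(Rect area) {
        touchedArea = area;
        invalidate(); // redraw with the new highlight
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        // Dim everything except the touched rectangle (drawn as four surrounding strips).
        canvas.drawRect(0, 0, getWidth(), touchedArea.top, dim);                              // above
        canvas.drawRect(0, touchedArea.bottom, getWidth(), getHeight(), dim);                 // below
        canvas.drawRect(0, touchedArea.top, touchedArea.left, touchedArea.bottom, dim);       // left
        canvas.drawRect(touchedArea.right, touchedArea.top, getWidth(), touchedArea.bottom, dim); // right
    }
}
```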
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 310 When detecting that the user performs a screen capture operation, acquire a picture corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture picture according to the screen capture picture.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • Step 320 Receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • Step 330 Select an area including the content to be searched on the touch search interface according to the touch area corresponding to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • a limited area is extracted from the touch operation interface, and a selected area picture can be obtained.
  • recognition of the selected area picture is then performed based on the foregoing graphic recognition technology, which can improve the recognition effect. Further, the selected area containing the content to be searched should not be smaller than the user's touch area: for example, if the part touched by the user's finger is inaccurate, generating the selected area picture strictly according to the touch area may result in the text not being completely captured; by extending the boundary outward by a certain threshold based on the user's touch area, the newly selected area picture can contain text that the user did not completely touch, which solves the problem of incomplete text capture in the user's touch area and guarantees the integrity of the content to be searched.
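  • the boundary extension could be sketched as follows; the threshold value and the clamping to the screen bounds are assumptions made for illustration:

```java
// Grow the raw touch rectangle by a threshold before cropping, so that
// partially-touched characters are still included in the selected area picture.
import android.graphics.Rect;

public final class TouchAreaExpander {
    private TouchAreaExpander() {}

    public static Rect expand(Rect touchArea, int thresholdPx, int screenWidth, int screenHeight) {
        Rect expanded = new Rect(touchArea);
        expanded.inset(-thresholdPx, -thresholdPx);           // extend every edge outward
        expanded.intersect(0, 0, screenWidth, screenHeight);  // keep the region on screen
        return expanded;
    }
}
```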
  • Step 340 Identify content in the area that includes the content to be searched, and determine the content to be searched according to the recognition result.
  • Step 350 searching using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • the search request may be implemented by sending the content to be searched to an installed application for searching (such as the 360 search app) or to a search module in an application (such as the search module in 360 Guard);
  • that application then searches for the content to be searched through its search module or function, and displays the search result to the user.
  • the curved line frame in FIG. 2B shows the user's touch area; based on the OCR recognition technology, the area containing the content to be searched selected from the touch operation interface is the rectangle shown in FIG. 4, and a screenshot and graphic recognition are performed on this rectangular area to obtain the text to be searched, "360 film and television Daquan". It should be noted that the user's touch area does not completely cover the "360 film and television Daquan" text, so in this embodiment the length and width of the rectangular area to be recognized are not determined strictly according to the touch area but are extended to a certain extent, resulting in a rectangular area that completely covers "360 film and television Daquan".
  • optionally, the text in the screenshot corresponding to the area containing the content to be searched is identified according to a predetermined line-break recognition strategy; this strategy mainly applies to the case where the text the user wants to touch spans two lines.
  • text recognition is mainly based on the boundary pixel points (the left/right/top/bottom (x, y) extremes) of the roughly rectangular highlighted area generated by the user's finger sliding on the mask; however, if the text spans two lines and the finger swipes twice, recognition based on the boundary points of the merged area would cover a range much larger than the two areas actually highlighted by the finger.
  • the solution is to treat the two swipes as a possible line change and compute their degree of overlap: if the degree of overlap is very low (for example, less than 30%, the threshold being adjustable), the two areas are identified separately as two screenshots, which increases the accuracy.
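  • a hedged sketch of the overlap check between two successive swipe rectangles; the 30% figure from the text is passed in as the adjustable threshold:

```java
// Decide whether two swipe rectangles belong to different text lines by
// comparing their vertical overlap against a configurable ratio.
import android.graphics.Rect;

public final class LineBreakDetector {
    private LineBreakDetector() {}

    /** Returns true when the two strokes should be recognized separately as two lines. */
    public static boolean isLineBreak(Rect firstStroke, Rect secondStroke, float minOverlapRatio) {
        int overlap = Math.min(firstStroke.bottom, secondStroke.bottom)
                - Math.max(firstStroke.top, secondStroke.top);
        if (overlap <= 0) {
            return true; // no vertical overlap at all: clearly different lines
        }
        int smallerHeight = Math.min(firstStroke.height(), secondStroke.height());
        float ratio = (float) overlap / smallerHeight;
        return ratio < minOverlapRatio; // e.g. minOverlapRatio = 0.30f, adjustable threshold
    }
}
```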
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 510 When detecting that the user performs a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • Step 520 Receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • Step 530 Determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • the selected area picture is generated based on the touch area obtained based on the touch operation, and then the selected area picture is subjected to element recognition such as text/graphic.
  • the method of recognizing elements such as text/graphics from a picture is not limited; for example, OCR (Optical Character Recognition) technology may be used, i.e., the background extracts text information from the picture through OCR recognition; UiAutomator automated testing technology may also be used: UiAutomator is an automated testing tool bundled with Android that can extract the text information of the current page, and this technology can obtain 100% correct text.
  • each recognition technology has its own application scenarios; if UiAutomator and OCR recognize the same content, that content is very likely accurate, so the accuracy of the determined content can be improved by calculating the intersection of the UiAutomator and OCR recognition results.
  • Step 540 The installed application for searching is called to perform the search, and the search result of the application is displayed.
  • in this way, an automatic call to the installed search app is implemented, and the user does not need to perform cumbersome operations such as opening the search app, entering the text to be searched in the search bar, pasting the text to be searched into the search bar, or triggering the search; in another embodiment, instead of calling an installed app, the multiple processes including the search are designed to be implemented by a single app.
  • for example, the search results shown in FIG. 2C are obtained by calling the "360 search" app already installed in the phone.
  • a touch-based search method is further provided.
  • the step 540 specifically includes: clearing the touch search interface from the current display area, and displaying the interface of the application, on which the content to be searched and the search results are shown.
  • the touch search interface is cleared in time, which is beneficial to save resource consumption and avoid interference of the touch search interface on other operations of the user.
  • an embodiment of the present invention provides a touch operation based search device, including:
  • the touch search interface generating module 610 is configured to: when detecting a user performing a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch operation receiving module 620 is configured to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • the to-be-searched content determining module 630 is configured to determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • the selected area picture is generated based on the touch area obtained based on the touch operation, and then the selected area picture is subjected to element recognition such as text/graphic.
  • the method of recognizing elements such as text/graphics from a picture is not limited; for example, OCR (Optical Character Recognition) technology may be used, i.e., the background extracts text information from the picture through OCR recognition; UiAutomator automated testing technology may also be used: UiAutomator is an automated testing tool bundled with Android that can extract the text information of the current page, and this technology can obtain 100% correct text.
  • each recognition technology has its own application scenarios; if UiAutomator and OCR recognize the same content, that content is very likely accurate, so the accuracy of the determined content can be improved by calculating the intersection of the UiAutomator and OCR recognition results.
  • the search module 640 is configured to search using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • the search request may be implemented by sending the content to be searched to an installed application for searching (such as the 360 search app) or to a search module in an application (such as the search module in 360 Guard);
  • that application then searches for the content to be searched through its search module or function, and displays the search result to the user.
  • in the prior art, when performing a search the user needs to open a search app, input the text to be searched in the search bar or paste it into the search bar, trigger the search, and so on; in this embodiment,
  • when it is detected that the user performs a screenshot operation, a touch search interface is automatically provided to the user, so that the user can quickly and accurately select the content to be searched and search, which saves the above-mentioned cumbersome operations and is fast and convenient; the content of the screenshot is retained on the touch search interface.
  • the user uses the mobile phone to take a screenshot of the desktop; when the user's screenshot operation is detected in the background, the touch operation interface is generated according to the screenshot; the touch operation interface is in the form of a floating layer or a mask, and its display state is translucent or grayscale.
  • the content in the corresponding screenshot can be displayed, such as each icon and name on the desktop, as shown in FIG. 2A; the user performs a touch operation on the touch operation interface, and the area swiped by the touch operation is as shown by the curved line frame in FIG. 2B.
  • the user's intention selects the touched text as the content to be searched; based on the foregoing graphic recognition technology, the content to be searched is identified as “360 film and television Daquan”; then the “360 film and television Daquan” is used for searching, and the search result is shown in FIG. 2C. Show.
  • the above-mentioned touch interaction mode realizes quick selection and search of content, which completely improves the overall interactive experience of user search. This embodiment is applied to a mobile terminal with a touch screen.
  • a touch operation-based search device is further provided.
  • the touch search interface generation module 610 sets the touch search interface to a translucent mode and covers the current display area to form a mask, so that the content of the current display area is displayed through the touch search interface.
  • the translucent touch operation interface serves as a mask, which enables the user to clearly see the changes of the current display area as well as the content of the touch operation interface, so that a touch operation conforming to the user's intention can be performed on the touch operation interface.
  • the mask can also be displayed in a grayscale manner, in order to distinguish the touch operation interface from the intercepted display interface, to indicate to the user that the touch operation interface is currently touched.
  • the touch operation interface displayed in FIG. 2A can be set to a translucent form.
  • a search device based on a touch operation is further provided.
  • the search device of the embodiment further includes:
  • the display mode changing module is configured to change a display manner of the touch area and/or the untouched area corresponding to the touch operation to distinguish the touch area from the untouched area.
  • the display mode is not limited, and includes changes in brightness, color, and contrast.
  • the touch area is distinguished from the untouched area, which is convenient for the user to determine which content is touched by the touch operation. For example, based on the user touch area shown in FIG. 2B, an effect as shown in FIG. 2D can be achieved: the area touched by the user is highlighted, while the untouched area is significantly darker, presented by the oblique-line effect in FIG. 2D.
  • An embodiment of the present invention provides a touch operation based search device, including:
  • the touch search interface generating module 610 is configured to: when detecting a user performing a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on. Further, the content to be searched includes at least one of the following: text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch operation receiving module 620 is configured to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • the to-be-searched content determining module 630 is configured to select an area containing the content to be searched on the touch search interface according to the touch area corresponding to the touch operation, identify the content in the area containing the content to be searched, and determine the content to be searched according to the recognition result.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • a limited area is extracted from the touch operation interface, and a selected area picture can be obtained.
  • recognition of the selected area picture is then performed based on the foregoing graphic recognition technology, which can improve the recognition effect. Further, the selected area containing the content to be searched should not be smaller than the user's touch area: for example, if the part touched by the user's finger is inaccurate, generating the selected area picture strictly according to the touch area may result in the text not being completely captured; by extending the boundary outward by a certain threshold based on the user's touch area, the newly selected area picture can contain text that the user did not completely touch, which solves the problem of incomplete text capture in the user's touch area and guarantees the integrity of the content to be searched.
  • the search module 640 is configured to search using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • the search request may be implemented by sending the content to be searched to an installed application for searching (such as the 360 search app) or to a search module in an application (such as the search module in 360 Guard);
  • that application then searches for the content to be searched through its search module or function, and displays the search result to the user.
  • the curved line frame in FIG. 2B shows the user's touch area; based on the OCR recognition technology, the area containing the content to be searched selected from the touch operation interface is the rectangle shown in FIG. 4, and a screenshot and graphic recognition are performed on this rectangular area to obtain the text to be searched, "360 film and television Daquan". It should be noted that the user's touch area does not completely cover the "360 film and television Daquan" text, so this embodiment does not determine the length and width of the rectangular area to be recognized strictly according to the touch area, but extends it to a certain extent, obtaining a rectangular area that completely covers "360 film and television Daquan".
  • optionally, the text in the screenshot corresponding to the area containing the content to be searched is identified according to a predetermined line-break recognition strategy; this strategy mainly applies to the case where the text the user wants to touch spans two lines.
  • text recognition is mainly based on the boundary pixel points (the left/right/top/bottom (x, y) extremes) of the roughly rectangular highlighted area generated by the user's finger sliding on the mask; however, if the text spans two lines and the finger swipes twice, recognition based on the boundary points of the merged area would cover a range much larger than the two areas actually highlighted by the finger.
  • the solution is to treat the two swipes as a possible line change and compute their degree of overlap: if the degree of overlap is very low (for example, less than 30%, the threshold being adjustable), the two areas are identified separately as two screenshots, which increases the accuracy.
  • An embodiment of the present invention provides a touch operation based search device, including:
  • the touch search interface generating module 610 is configured to: when detecting a user performing a screen capture operation, acquire a screen capture image corresponding to the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch operation receiving module 620 is configured to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • the to-be-searched content determining module 630 is configured to determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • the selected area picture is generated according to the touch area obtained from the touch operation, and element recognition such as text/graphics is then performed on the selected area picture.
  • the method of recognizing elements such as text/graphics from a picture is not limited; for example, OCR (Optical Character Recognition) technology may be used, i.e., the background extracts text information from the picture through OCR recognition; UiAutomator automated testing technology may also be used: UiAutomator is an automated testing tool bundled with Android that can extract the text information of the current page, and this technology can obtain 100% correct text.
  • each recognition technology has its own application scenarios; if UiAutomator and OCR recognize the same content, that content is very likely accurate, so the accuracy of the determined content can be improved by calculating the intersection of the UiAutomator and OCR recognition results.
  • the search module 640 is configured to invoke an installed application for searching to perform a search and display the search result of the application.
  • in this way, an automatic call to the installed search app is implemented, and the user does not need to perform cumbersome operations such as opening the search app, entering the text to be searched in the search bar, pasting the text to be searched into the search bar, or triggering the search; in another embodiment, instead of calling an installed app, the multiple processes including the search are designed to be implemented by a single app.
  • for example, the search results shown in FIG. 2C are obtained by calling the "360 search" app already installed in the phone.
  • a touch-based search device is further provided.
  • the search module 640 is specifically configured to: clear the touch search interface from the current display area, and display the interface of the application, on which the content to be searched and the search results are shown.
  • the touch search interface is cleared in time, which is beneficial to save resource consumption and avoid interference of the touch search interface on other operations of the user.
  • an embodiment of the present invention provides a terminal, including:
  • the touch screen 710 is adapted to receive a user touch operation and provide a display function
  • the touch search interface generator 720 is adapted to acquire a screen capture image corresponding to the screen capture operation when the user performs the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch screen 710 is further adapted to receive a touch operation performed by the user through the touch search interface to select at least a part of the content as the content to be searched.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • the to-be-searched content determiner 730 is adapted to determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • the selected area picture is generated based on the touch area obtained based on the touch operation, and then the selected area picture is subjected to element recognition such as text/graphic.
  • the method of recognizing elements such as text/graphics from a picture is not limited; for example, OCR (Optical Character Recognition) technology may be used, i.e., the background extracts text information from the picture through OCR recognition; UiAutomator automated testing technology may also be used: UiAutomator is an automated testing tool bundled with Android that can extract the text information of the current page, and this technology can obtain 100% correct text.
  • each recognition technology has its own application scenarios; if UiAutomator and OCR recognize the same content, that content is very likely accurate, so the accuracy of the determined content can be improved by calculating the intersection of the UiAutomator and OCR recognition results.
  • the searcher 740 is adapted to search using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • the search request may be implemented by sending the content to be searched to an installed application for searching (such as the 360 search app) or to a search module in an application (such as the search module in 360 Guard);
  • that application then searches for the content to be searched through its search module or function, and displays the search result to the user.
  • in the prior art, when performing a search the user needs to open a search app, input the text to be searched in the search bar or paste it into the search bar, trigger the search, and so on; in this embodiment,
  • when it is detected that the user performs a screenshot operation, a touch search interface is automatically provided to the user, so that the user can quickly and accurately select the content to be searched and search, which saves the above-mentioned cumbersome operations and is fast and convenient; the content of the screenshot is retained on the touch search interface.
  • the user uses the mobile phone to take a screenshot of the desktop; when the user's screenshot operation is detected in the background, the touch operation interface is generated according to the screenshot; the touch operation interface is in the form of a floating layer or a mask, and its display state is translucent or grayscale.
  • the content in the corresponding screenshot can be displayed, such as each icon and name on the desktop, as shown in FIG. 2A; the user performs a touch operation on the touch operation interface, and the area swiped by the touch operation is as shown by the curved line frame in FIG. 2B.
  • the user's intention selects the touched text as the content to be searched; based on the foregoing graphic recognition technology, the content to be searched is identified as “360 film and television Daquan”; then the “360 film and television Daquan” is used for searching, and the search result is shown in FIG. 2C. Show.
  • the above-mentioned touch interaction mode realizes quick selection and search of content, which completely improves the overall interactive experience of user search. This embodiment is applied to a mobile terminal with a touch screen.
  • a terminal is further provided.
  • the touch search interface generator 720 sets the touch search interface to a translucent mode and covers the current display area to form a mask, so that the content of the current display area is displayed through the touch search interface.
  • the translucent touch operation interface serves as a mask, which enables the user to clearly see the changes of the current display area as well as the content of the touch operation interface, so that a touch operation conforming to the user's intention can be performed on the touch operation interface.
  • the mask can also be displayed in a grayscale manner, in order to distinguish the touch operation interface from the intercepted display interface, to indicate to the user that the touch operation interface is currently touched.
  • the touch operation interface displayed in FIG. 2A can be set to a translucent form.
  • a terminal is further provided.
  • the terminal in this embodiment further includes:
  • the display mode changer is adapted to change a display manner of the touch area and/or the untouched area corresponding to the touch operation to distinguish the touch area from the untouched area.
  • the display mode is not limited, and includes changes in brightness, color, and contrast.
  • the touch area is distinguished from the untouched area, which is convenient for the user to determine which content is touched by the touch operation. For example, based on the user touch area shown in FIG. 2B, an effect as shown in FIG. 2D can be achieved: the area touched by the user is highlighted, while the untouched area is significantly darker, presented by the oblique-line effect in FIG. 2D.
  • An embodiment of the present invention provides a terminal, including:
  • the touch screen 710 is adapted to receive a user touch operation and provide a display function
  • the touch search interface generator 720 is adapted to acquire a screen capture image corresponding to the screen capture operation when the user performs the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • the manner of detecting screenshots used by different systems is also different.
  • the screenshot operation of the system can be directly monitored by the FileObserver service of the Android system.
  • the invention is not limited to the Android system and can also be applied to other operating systems.
  • the screenshot may be of a webpage, an image in an animation, a frame of a video, the application interface of an app, the system desktop or an operation interface of the terminal, or a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch screen 710 is adapted to further receive a touch operation performed by the user by touching the search interface to select a content to be searched for at least a portion of the content.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • The content-to-be-searched determiner 730 is adapted to select, according to the touch area corresponding to the touch operation, an area containing the content to be searched on the touch search interface, recognize the content in the area containing the content to be searched, and determine the content to be searched according to the recognition result.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • In this embodiment, based on the touch operation, a limited area is extracted from the touch operation interface to obtain a selected area picture, and recognition is performed on it with the foregoing graphic recognition technology, which improves the recognition effect. Further, the size of the selected area containing the content to be searched should not be smaller than the user's touch area. For example, if the part covered by the user's finger touch operation is inaccurate, generating the selected area picture strictly according to the touch area may cause the text not to be completely acquired; in that case the boundary is extended by a certain threshold based on the user's touch area, producing a slightly larger new selected area picture that contains the text the user did not completely touch. This solves the problem of incomplete text acquisition in the user's touch area and guarantees the integrity of the content to be searched.
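A minimal sketch of the boundary-extension idea described above, assuming the touch area is represented as an android.graphics.Rect in screenshot pixel coordinates; the margin value and the helper name TouchAreaUtil.expand are illustrative.

```java
import android.graphics.Rect;

/** Expands the user's touch area so that partially touched characters are fully covered. */
public final class TouchAreaUtil {

    /**
     * @param touchArea   bounding box of the highlighted touch area, in screenshot pixel coordinates
     * @param marginPx    threshold by which every edge is extended (an assumed, tunable value)
     * @param imageWidth  width of the screenshot, used to clamp the result
     * @param imageHeight height of the screenshot, used to clamp the result
     */
    public static Rect expand(Rect touchArea, int marginPx, int imageWidth, int imageHeight) {
        Rect selected = new Rect(touchArea);
        selected.inset(-marginPx, -marginPx);            // grow the rectangle on all sides
        // Clamp to the screenshot bounds so the crop stays valid.
        if (!selected.intersect(0, 0, imageWidth, imageHeight)) {
            selected.set(touchArea);                     // fall back to the original area
        }
        return selected;
    }
}
```

The resulting rectangle could then be cropped out of the screenshot, for example with Bitmap.createBitmap(screenshot, r.left, r.top, r.width(), r.height()), before recognition is performed.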
  • the searcher 740 is adapted to search using the content to be searched.
  • the step further includes initiating a search request carrying the content to be searched according to an instruction triggered by the user to search based on the content to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the content to be searched to that application, searching for the content to be searched through the application's search module or function, and displaying the search result to the user.
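One hedged way to hand the recognized content to an installed search application on Android is a generic web-search intent, sketched below; the text names the 360 search app specifically, whereas this example simply lets any installed handler of ACTION_WEB_SEARCH receive the query.

```java
import android.app.SearchManager;
import android.content.Context;
import android.content.Intent;

/** Hands the recognized content to an installed search application. */
public final class SearchLauncher {

    public static void search(Context context, String query) {
        // ACTION_WEB_SEARCH lets any installed search app (browser, search client, ...) handle the query.
        Intent intent = new Intent(Intent.ACTION_WEB_SEARCH);
        intent.putExtra(SearchManager.QUERY, query);
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);  // needed when starting from a non-Activity context
        if (intent.resolveActivity(context.getPackageManager()) != null) {
            context.startActivity(intent);
        }
    }
}
```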
  • In combination with the foregoing embodiment, the curved frame in FIG. 2B shows the user touch area. Based on OCR recognition technology, the area containing the content to be searched selected from the touch operation interface is the rectangle shown in FIG. 4; a screenshot of this rectangular area is taken and graphic recognition is performed on it to obtain the text to be searched, "360 film and television Daquan". It should be noted that the user touch area does not completely cover the "360 film and television Daquan" information, so in this embodiment the length and width of the rectangular area to be recognized are not determined strictly according to the touch area but are extended to a certain extent, and a rectangular area completely covering "360 film and television Daquan" is obtained.
  • In one embodiment of the present invention, the text in the screenshot corresponding to the area containing the content to be searched is recognized according to a predetermined recognition strategy for characters touched across a line break. This strategy mainly applies when the characters the user wants to touch lie on two separate lines. Character recognition is mainly based on the boundary of the roughly rectangular highlighted area produced by the user's finger sliding on the mask, i.e. its left, right, top and bottom (x, y) boundary points. However, if the text spans two lines and the finger slides twice, recognition would still be based on the left, right, top and bottom boundary points of the merged area of the two slides, and the range enclosed by these four points is much larger than the two areas actually highlighted by the finger, so recognition is no longer precise on the highlighted words. The solution is therefore to consider the degree of coincidence of the two slides across the line break: if the coincidence is very low (for example, below 30%, and the threshold can be adjusted), the two areas are recognized separately as two screenshots, which improves the accuracy.
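A minimal sketch of the coincidence check described above; how the degree of coincidence is measured is not specified in the text, so the example assumes a horizontal overlap ratio between the two stroke rectangles, together with the adjustable 30% threshold.

```java
import android.graphics.Rect;
import java.util.ArrayList;
import java.util.List;

/** Decides whether two finger strokes are recognized as one region or as two separate lines. */
public final class LineBreakStrategy {

    /** Horizontal overlap of the two stroke rectangles, relative to the narrower one (0..1). */
    static float horizontalOverlap(Rect a, Rect b) {
        int overlap = Math.min(a.right, b.right) - Math.max(a.left, b.left);
        int narrower = Math.min(a.width(), b.width());
        if (overlap <= 0 || narrower <= 0) return 0f;
        return overlap / (float) narrower;
    }

    /** Returns the regions to crop and recognize; 0.3 is the adjustable threshold from the text. */
    public static List<Rect> regionsToRecognize(Rect firstStroke, Rect secondStroke) {
        List<Rect> regions = new ArrayList<>();
        if (horizontalOverlap(firstStroke, secondStroke) < 0.3f) {
            // Low coincidence: the user touched two different lines, so recognize two separate screenshots.
            regions.add(new Rect(firstStroke));
            regions.add(new Rect(secondStroke));
        } else {
            // High coincidence: merge into one bounding rectangle and recognize it once.
            Rect merged = new Rect(firstStroke);
            merged.union(secondStroke);
            regions.add(merged);
        }
        return regions;
    }
}
```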
  • An embodiment of the present invention provides a terminal, including:
  • the touch screen 710 is adapted to receive a user touch operation and provide a display function
  • the touch search interface generator 720 is adapted to acquire a screen capture image corresponding to the screen capture operation when the user performs the screen capture operation, and generate a touch search interface having the same content as the screen capture image according to the screen capture image.
  • The manner of detecting screenshots differs between systems; for the Android system, the system screenshot operation can be directly monitored through the FileObserver service of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • The screenshot may be of a webpage, a picture from an animation, a frame image of a video, an application interface of an app, the system desktop or an operation interface of the terminal, a picture or photo in the user's gallery, and so on.
  • the content to be searched includes at least one of the following: a text, a picture, and a symbol.
  • the touch search interface is presented in the form of a generated or popped layer or a floating window, and a touch operation of the user may be received to identify the touch area.
  • the touch screen 710 is adapted to further receive a touch operation performed by the user by touching the search interface to select a content to be searched for at least a portion of the content.
  • the user touch operation may include clicking or sliding, the touch search interface giving a response based on the user's touch operation, and indicating the touch area in a different color, wireframe, or the like.
  • the to-be-searched content determiner 730 is adapted to determine the content to be searched according to the touch operation.
  • the touch search interface receives a user touch operation to perform touch area identification on at least a part of the content, and the touch area serves as a basis for identifying the content to be searched.
  • In this embodiment, a selected area picture is generated according to the touch area obtained from the touch operation, and then elements such as text and graphics are recognized in the selected area picture. The method of recognizing text, graphics and other elements from a picture is not limited here. For example, OCR (Optical Character Recognition) technology may be used, that is, the background extracts text information from the picture through OCR recognition. As another example, UiAutomator automated testing technology may be used: UiAutomator is an automated testing tool that comes with Android and can be used to extract the text information of the current page, and this technique can obtain 100% correct text.
  • Each recognition technology has its own applicable scenarios. If UiAutomator and OCR recognize the same content, that content is highly accurate; therefore, determining the content to be searched by calculating the intersection of the UiAutomator and OCR recognition results can greatly improve recognition accuracy.
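A hedged sketch of intersecting the two recognition results; representing each result as a list of words is an assumption made for the example.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

/** Keeps only the words that both OCR and UiAutomator recognized, i.e. the high-confidence content. */
public final class RecognitionMerger {

    public static Set<String> intersect(List<String> ocrWords, List<String> uiAutomatorWords) {
        Set<String> result = new LinkedHashSet<>(ocrWords);  // preserve OCR reading order
        result.retainAll(uiAutomatorWords);                  // drop words only one recognizer produced
        return result;
    }
}
```

If the intersection is empty, an implementation might fall back to the OCR result alone, although the text does not prescribe this.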
  • the searcher 740 is adapted to invoke an installed application for searching to perform a search and display the search result of the application.
  • According to the technical solution of this embodiment, an automatic call to the installed search app is implemented, so the user does not need to open the search app, enter the text to be searched in the search bar, paste the text to be searched into the search bar, trigger the search, or perform other tedious operations. In another embodiment, instead of calling an installed app, the multiple processes including the search may be designed to be implemented by a single app.
  • the search results shown in Figure 2C are implemented by calling the "360" search already installed in the phone.
  • a terminal is further provided.
  • Compared with the foregoing embodiments, in the terminal of this embodiment the searcher 740 is specifically adapted to clear the touch search interface from the current display area, display the interface of the application, and display the content to be searched and the search result on that interface.
  • According to this technical solution, clearing the touch search interface in time helps save resource consumption and avoids interference of the touch search interface with the user's other operations.
  • When searching using the content to be searched, the searcher of the terminal needs to interact with a search server, and the search service of the search server is called to complete the search.
  • the search server may be a server corresponding to the terminal, or may be a server corresponding to the search app installed on the terminal.
  • the specific interaction diagram is shown in FIG. 8 , where the terminal is 810 and the search server is 820.
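As a rough illustration of the terminal-to-search-server interaction of FIG. 8, the sketch below issues a plain HTTP GET carrying the content to be searched; the endpoint URL is a placeholder, and the actual protocol between terminal 810 and search server 820 is not specified in the text.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

/** Minimal terminal-side call to a search server; the endpoint below is a placeholder. */
public final class SearchServerClient {

    private static final String ENDPOINT = "https://search.example.com/s?q=";  // hypothetical server

    public static String search(String query) throws Exception {
        URL url = new URL(ENDPOINT + URLEncoder.encode(query, StandardCharsets.UTF_8.name()));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line).append('\n');
            }
            return body.toString();  // search results to render for the user
        } finally {
            conn.disconnect();
        }
    }
}
```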
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 910 When detecting that the system clipboard injects the text content copied or cut by the user, extracting the text content copied or cut by the user, and generating a touch search interface for displaying the text content.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
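A minimal sketch of detecting copy/cut operations through ClipboardManager, as mentioned above; the ClipboardWatcher/Listener names are illustrative.

```java
import android.content.ClipData;
import android.content.ClipboardManager;
import android.content.Context;

/** Listens for copy/cut operations and extracts the text injected into the system clipboard. */
public final class ClipboardWatcher {

    public interface Listener { void onTextCopied(CharSequence text); }

    public static void start(Context context, Listener listener) {
        ClipboardManager clipboard =
                (ClipboardManager) context.getSystemService(Context.CLIPBOARD_SERVICE);
        clipboard.addPrimaryClipChangedListener(() -> {
            ClipData clip = clipboard.getPrimaryClip();
            if (clip != null && clip.getItemCount() > 0) {
                CharSequence text = clip.getItemAt(0).coerceToText(context);
                if (text != null && text.length() > 0) {
                    // Hand the copied or cut text to the touch search interface generator.
                    listener.onTextCopied(text);
                }
            }
        });
    }
}
```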
  • Step 920 Receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • Step 930 Determine the text to be searched according to the touch operation.
  • the user selects the text content to determine the text to be searched.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • step 940 the search is performed using the text to be searched.
  • the step further includes initiating a search request carrying the text to be searched according to an instruction triggered by the user to search based on the text to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the text to be searched to that application, searching for the text to be searched through the application's search module or function, and displaying the search result to the user.
  • According to the technical solution of this embodiment, when performing a search the user does not need to open the search app, input the text to be searched in the search bar, paste the text to be searched into the search bar, trigger the search, or perform other tedious operations. In this embodiment, when a copy or cut operation by the user is detected, a touch search interface is automatically provided so that the user can quickly and accurately select the text to be searched and perform the search, which saves the above tedious operations and is fast and convenient. After the text content is displayed on the touch search interface, the user selects the text to be searched again, which is more accurate; this solves the drawback that some existing search apps can only copy an entire piece of content and cannot precisely search for a single word or several discrete words, thereby improving the accuracy of selecting the text to be searched.
  • With reference to FIG. 9, for the microblog page shown in FIG. 10A viewed by the user on a mobile phone, the user performs a long press on the microblog page to trigger copying of the entire microblog content. After the background detects the content injected into the clipboard, a touch search interface is generated or popped up on the mobile phone; as shown in FIG. 10B, the entire microblog content is displayed on the touch search interface. The user selects the text to be searched by a touch operation on the touch search interface, as shown in FIG. 10C, where the selected word to be searched is the wireframed part. The background then automatically uses the word to be searched selected by the user to search and obtain the result; for example, the 360 search app is called to search and give the search results, as shown in FIG. 10D.
  • The text to be searched supports selection across lines or segments. After cross-line texts are selected, the prompt box or search bar in the interface automatically arranges them in order, as shown in FIG. 10C and FIG. 10D; of course, the text to be searched also supports single-line selection, see FIG. 10C' and FIG. 10D'.
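A hedged sketch of arranging words selected across lines before they are synchronized into the prompt box or search bar; the exact ordering rule is not spelled out in the text, so the example assumes reading order (line first, then position within the line), and the SelectedWord/QueryBuilder names are illustrative.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/** A word the user touched, together with the line and column where it appears on the interface. */
final class SelectedWord {
    final String text;
    final int line;    // 0-based line index in the displayed text content
    final int column;  // 0-based character offset within that line

    SelectedWord(String text, int line, int column) {
        this.text = text;
        this.line = line;
        this.column = column;
    }
}

/** Joins words selected across lines into a single query string in reading order. */
final class QueryBuilder {
    static String build(List<SelectedWord> selection) {
        List<SelectedWord> ordered = new ArrayList<>(selection);
        ordered.sort(Comparator.comparingInt((SelectedWord w) -> w.line)
                               .thenComparingInt(w -> w.column));
        StringBuilder query = new StringBuilder();
        for (SelectedWord word : ordered) {
            query.append(word.text);  // concatenated text shown in the prompt box / search bar
        }
        return query.toString();
    }
}
```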
  • This embodiment is applied to a mobile terminal with a touch screen.
  • a touch-based search method is further provided.
  • step 910 further includes: enlarging the size of the text content displayed on the touch search interface.
  • According to this technical solution, the text content on the touch search interface is enlarged, which makes it easier for the user to accurately select the word to be searched and less likely to make mistakes.
  • the text displayed on the touch search interface in FIG. 10B is larger in size than the text displayed on the microblog interface shown in FIG. 10A.
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 1110 When detecting that the system clipboard injects the text content copied or cut by the user, extracting the text content copied or cut by the user, and generating a touch search interface for displaying the text content, and having a search bar on the touch search interface.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • Step 1120 Receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • Step 1130 Determine the text to be searched according to the touch operation.
  • The user can further select the text content through various touch operations, such as clicking or sliding, to determine the text to be searched that truly meets the search requirement.
  • step 1140 the text to be searched by the user is synchronously displayed in the search bar.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • step 1150 the search is performed using the text to be searched.
  • the step further includes initiating a search request carrying the text to be searched according to an instruction triggered by the user to search based on the text to be searched.
  • the search request may further send the text to be searched to the application by using an application for searching (such as a 360 search app) or a search module in an application (such as a search module in 360 guards).
  • the search module or function of the application searches for the searched text and presents the search result to the user.
  • the text to be searched is synchronously displayed in the search bar, which is convenient for the user to confirm what kind of word to be searched.
  • With reference to FIG. 11, for a text webpage or a notepad as shown in FIG. 12A browsed by the user on a mobile phone, the user selects text content in the webpage or the notepad, and the selected text content is the framed text in the figure. The touch search interface generated on the mobile phone is as shown in FIG. 12B: the text content selected by the user is displayed on the touch search interface, and a search bar is displayed above it. As shown in FIG. 12C, the text to be searched that the user selects by touch operations on the touch search interface is synchronized into the search bar.
  • This embodiment is applied to a mobile terminal with a touch screen.
  • an embodiment of the present invention provides a touch operation based search method, including:
  • Step 1310 When detecting that the system clipboard injects the text content copied or cut by the user, extracting the text content copied or cut by the user, and generating a touch search interface for displaying the text content.
  • Step 1320 Receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • Step 1330 determining the text to be searched according to the touch operation.
  • Step 1340 calling the installed application for searching to perform a search and displaying the search result of the application.
  • According to the technical solution of this embodiment, an automatic call to the installed search app is implemented, so the user does not need to open the search app, enter the text to be searched in the search bar, paste the text to be searched into the search bar, trigger the search, or perform other tedious operations. In another embodiment, instead of calling an installed app, the multiple processes including the search may be designed to be implemented by a single app.
  • the search result shown in FIG. 10D is implemented by calling "360 search" already installed in the mobile phone.
  • a touch operation-based search method is further provided.
  • Compared with the foregoing embodiments, the search method of this embodiment further includes: clearing the text content from the touch search interface, and displaying the search results on the touch search interface.
  • the search result is directly displayed on the touch search interface, which avoids the cumbersome operation of the user switching between the touch search interface and the interface of other search apps, so that the user selects the search word and views the search result. All can be done in one interface.
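A minimal sketch of clearing the text content and showing the results in the same touch search interface, assuming the interface contains a TextView for the copied text and a WebView for the result page; the view types and the result URL are assumptions made for the example.

```java
import android.view.View;
import android.webkit.WebView;
import android.widget.TextView;

/** Replaces the selectable text with the search results inside the same touch search interface. */
public final class ResultPresenter {

    /**
     * @param textArea  the view that showed the copied text content (cleared here)
     * @param webView   a WebView inside the touch search interface used to render the results
     * @param resultUrl URL of the result page returned for the query (illustrative)
     */
    public static void showResults(TextView textArea, WebView webView, String resultUrl) {
        textArea.setText("");        // clear the text content from the touch search interface
        textArea.setVisibility(View.GONE);
        webView.setVisibility(View.VISIBLE);
        webView.loadUrl(resultUrl);  // the user stays on the same interface to read the results
    }
}
```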
  • an embodiment of the present invention provides a touch operation based search device, including:
  • the touch search interface generating module 1410 is configured to extract text content copied or cut by the user when detecting that the system clipboard injects the text content copied or cut by the user, and generate a touch search interface for displaying the text content.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • the touch operation receiving module 1420 is configured to receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • the to-be-searched text determining module 1430 is configured to determine a character to be searched according to the touch operation.
  • the user selects the text content to determine the text to be searched.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • the search module 1440 is configured to search using the text to be searched.
  • the step further includes initiating a search request carrying the text to be searched according to an instruction triggered by the user to search based on the text to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the text to be searched to that application, searching for the text to be searched through the application's search module or function, and displaying the search result to the user.
  • According to the technical solution of this embodiment, when performing a search the user does not need to open the search app, input the text to be searched in the search bar, paste the text to be searched into the search bar, trigger the search, or perform other tedious operations. In this embodiment, when a copy or cut operation by the user is detected, a touch search interface is automatically provided so that the user can quickly and accurately select the text to be searched and perform the search, which saves the above tedious operations and is fast and convenient. After the text content is displayed on the touch search interface, the user selects the text to be searched again, which is more accurate; this solves the drawback that some existing search apps can only copy an entire piece of content and cannot precisely search for a single word or several discrete words, thereby improving the accuracy of selecting the text to be searched.
  • For example, the user performs a long press on the microblog page to trigger copying of the entire microblog content. After the background detects that the content exists in the clipboard, a touch search interface is generated on the mobile phone and the entire microblog content is displayed on it. The user selects the text to be searched by a touch operation on the touch search interface, as shown in FIG. 10C, where the selected word to be searched is the wireframed part; the background then automatically searches using the word to be searched selected by the user, and the obtained result is as shown in FIG. 10D.
  • the present invention and embodiments can be applied to a user terminal with a touch screen, such as a smart phone, a tablet computer, an ipad, etc., because touch selection of text is realized by touch interaction.
  • a search device based on a touch operation is further provided.
  • the search device of the embodiment further includes:
  • a size enlargement module for enlarging the size of the text content displayed on the touch search interface.
  • the text content on the touch search interface is enlarged, which is convenient for the user to accurately select the word to be searched, and is not easy to make mistakes.
  • the text displayed on the touch search interface in FIG. 10B is larger in size than the text displayed on the microblog interface shown in FIG. 10A.
  • an embodiment of the present invention provides a touch operation based search device, including:
  • The touch search interface generating module 1510 is configured to: when it is detected that text content copied or cut by the user is injected into the system clipboard, extract the text content copied or cut by the user, and generate a touch search interface displaying the text content, with a search bar on the touch search interface.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • the touch operation receiving module 1520 is configured to receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • the to-be-searched text determining module 1530 is configured to determine a character to be searched according to the touch operation.
  • The user can further select the text content through various touch operations, such as clicking or sliding, to determine the text to be searched that truly meets the search requirement.
  • the synchronization display module 1540 is configured to synchronously display the text to be searched by the user into the search bar.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • the search module 1550 is configured to search using the text to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the text to be searched to that application, searching for the text to be searched through the application's search module or function, and displaying the search result to the user.
  • the text to be searched is synchronously displayed in the search bar, which is convenient for the user to confirm what kind of word to be searched.
  • The user selects text content in the webpage, and the selected text content is the framed text in the figure; the background then detects that the content exists in the clipboard. The touch search interface generated on the mobile phone is as shown in FIG. 12B: the text content selected by the user is displayed on the touch search interface, and a search bar is displayed above it. As shown in FIG. 12C, the text to be searched that the user selects by touch operations on the touch search interface is synchronized into the search bar.
  • a search device based on a touch operation is further provided.
  • Compared with the foregoing embodiments, in the search device of this embodiment the search module further clears the text content from the touch search interface and displays the search results on the touch search interface.
  • the search result is directly displayed on the touch search interface, which avoids the cumbersome operation of the user switching between the touch search interface and the interface of other search apps, so that the user selects the search word and views the search result. All can be done in one interface.
  • an embodiment of the present invention provides a terminal, including:
  • the touch screen 1610 is adapted to receive a user touch operation and provide a display function
  • the touch search interface generator 1620 is adapted to extract text content copied or cut by the user when detecting that the system clipboard injects the text content copied or cut by the user, and generate a touch search interface for displaying the text content.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • the touch screen 1610 is adapted to further receive a touch operation of the user to select a text to be searched for at least a part of the text content by touching the search interface.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • the to-be-searched text determiner 1630 is adapted to determine the text to be searched according to the touch operation.
  • the user selects the text content to determine the text to be searched.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • the searcher 1640 is adapted to search using the text to be searched.
  • the step further includes initiating a search request carrying the text to be searched according to an instruction triggered by the user to search based on the text to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the text to be searched to that application, searching for the text to be searched through the application's search module or function, and displaying the search result to the user.
  • According to the technical solution of this embodiment, when performing a search the user does not need to open the search app, input the text to be searched in the search bar, paste the text to be searched into the search bar, trigger the search, or perform other tedious operations. In this embodiment, when a copy or cut operation by the user is detected, a touch search interface is automatically provided so that the user can quickly and accurately select the text to be searched and perform the search, which saves the above tedious operations and is fast and convenient. After the text content is displayed on the touch search interface, the user selects the text to be searched again, which is more accurate; this solves the drawback that some existing search apps can only copy an entire piece of content and cannot precisely search for a single word or several discrete words, thereby improving the accuracy of selecting the text to be searched.
  • For example, the user performs a long press on the microblog page to trigger copying of the entire microblog content. After the background detects that the content exists in the clipboard, a touch search interface is generated on the mobile phone and the entire microblog content is displayed on it. The user selects the text to be searched by a touch operation on the touch search interface, as shown in FIG. 10C, where the selected word to be searched is the wireframed part; the background then automatically searches using the word to be searched selected by the user, and the obtained result is as shown in FIG. 10D.
  • the present invention and embodiments can be applied to a user terminal with a touch screen, such as a smart phone, a tablet computer, an ipad, etc., because touch selection of text is realized by touch interaction.
  • a terminal is further provided.
  • the terminal of the embodiment further includes:
  • a size amplifier that is suitable for amplifying the size of the text content displayed on the touch search interface.
  • the text content on the touch search interface is enlarged, which is convenient for the user to accurately select the word to be searched, and is not easy to make mistakes.
  • the text displayed on the touch search interface in FIG. 10B is larger in size than the text displayed on the microblog interface shown in FIG. 10A.
  • an embodiment of the present invention provides a terminal, including:
  • the touch screen 1710 is adapted to receive a user touch operation and provide a display function
  • The touch search interface generator 1720 is adapted to, when it is detected that text content copied or cut by the user is injected into the system clipboard, extract the text content copied or cut by the user, and generate a touch search interface displaying the text content, with a search bar on the touch search interface.
  • The detected text content copied or cut by the user may be text content in various interfaces or various applications (such as apps); no matter what kind of text content the user selects, as long as the system clipboard is triggered, the invention can extract the text content from the clipboard.
  • The generated touch search interface displaying the text content may further be presented to the user in a pop-up form. Further, the extracted text content is displayed in the touch search interface in an enlarged manner, which makes it easier for the user to select the text precisely when performing the touch operation.
  • Changes to the system clipboard can be detected directly by calling the ClipboardManager control of the Android system. The invention is not limited to the Android system; other operating systems can also be covered.
  • The touch screen 1710 is further adapted to receive, through the touch search interface, the user's touch operation of selecting the text to be searched from at least a part of the text content.
  • The user can select part or all of the text content displayed in the touch search interface on the touch screen by clicking or sliding (for example, with a finger or a stylus), and can also select or deselect part of the text content through repeated touch operations. The selected text is marked with a frame, a deepened color, or a color change, and is visually distinguished from the unselected text.
  • the to-be-searched text determiner 1730 is adapted to determine the character to be searched according to the touch operation.
  • The user can further select the text content through various touch operations, such as clicking or sliding, to determine the text to be searched that truly meets the search requirement.
  • the synchronization display 1740 is adapted to synchronously display the text to be searched by the user into the search bar.
  • the touch search interface includes a prompt box or a search bar.
  • The selected text content is synchronized into the prompt box or the search bar as the character or word to be searched.
  • the searcher 1750 is adapted to search using the text to be searched.
  • The search request may further be made by invoking an installed application for searching (such as the 360 search app) or a search module in an application (such as the search module in 360 Guards), sending the text to be searched to that application, searching for the text to be searched through the application's search module or function, and displaying the search result to the user.
  • the text to be searched is synchronously displayed in the search bar, which is convenient for the user to confirm what kind of word to be searched.
  • The user selects text content in the webpage, and the selected text content is the framed text in the figure; the background then detects that the content exists in the clipboard. The touch search interface generated on the mobile phone is as shown in FIG. 12B: the text content selected by the user is displayed on the touch search interface, and a search bar is displayed above it. As shown in FIG. 12C, the text to be searched that the user selects by touch operations on the touch search interface is synchronized into the search bar.
  • a terminal is further provided.
  • Compared with the foregoing embodiments, in the terminal of this embodiment the searcher further clears the text content from the touch search interface and displays the search results on the touch search interface.
  • the search result is directly displayed on the touch search interface, which avoids the cumbersome operation of the user switching between the touch search interface and the interface of other search apps, so that the user selects the search word and views the search result. All can be done in one interface.
  • When searching using the word to be searched, the searcher of the terminal needs to interact with a search server, and the search service of the search server is called to complete the search.
  • the search server may be a server corresponding to the terminal, or may be a server corresponding to the search app installed on the terminal. The specific interaction diagram is shown in FIG. 18, where the terminal is 1810 and the search server is 1820.
  • modules in the devices of the embodiments can be adaptively changed and placed in one or more devices different from the embodiment.
  • the modules or units or components of the embodiments may be combined into one module or unit or component, and further they may be divided into a plurality of sub-modules or sub-units or sub-components.
  • All features disclosed in this specification (including the accompanying claims, abstract and drawings), and all processes or units of any method or device so disclosed, may be combined in any combination.
  • Each feature disclosed in this specification (including the accompanying claims, the abstract and the drawings) may be replaced by alternative features that provide the same, equivalent or similar purpose.
  • the various component embodiments of the present invention may be implemented in hardware, or in a software module running on one or more processors, or in a combination thereof.
  • a microprocessor or digital signal processor can be used in practice.
  • the invention can also be implemented as a device or device program (e.g., a computer program and a computer program product) for performing some or all of the methods described herein.
  • a program implementing the invention may be stored on a computer readable medium or may be in the form of one or more signals.
  • Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • Figure 19 schematically illustrates a block diagram of a computing device for performing the method in accordance with the present invention.
  • the computing device conventionally includes a processor 1910 and a computer program product or computer readable medium in the form of a memory 1920.
  • the memory 1920 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), an EPROM, a hard disk, or a ROM.
  • Memory 1920 has a memory space 1930 for program code 1931 for performing any of the method steps described above.
  • storage space 1930 for program code may include various program code 1931 for implementing various steps in the above methods, respectively.
  • the program code can be read from or written to one or more computer program products.
  • Such computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 20.
  • the storage unit may have storage segments, storage spaces, and the like that are arranged similarly to the memory 1920 in the computing device of FIG. 19.
  • the program code can be compressed, for example, in an appropriate form.
  • the storage unit comprises computer readable code 1931' for performing the steps of the method according to the invention, i.e., code that can be read by a processor such as 1910; when the code is run by the computing device, it causes the computing device to perform the various steps of the methods described above.
  • the present invention is applicable to computer systems/servers that can operate with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations suitable for use with the computer system/server include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, and distributed cloud computing environments including any of the above, and the like.
  • the computer system/server can be described in the general context of computer system executable instructions (such as program modules) being executed by a computer system.
  • program modules may include routines, programs, target programs, components, logic, data structures, and the like that perform particular tasks or implement particular abstract data types.
  • the computer system/server can be implemented in a distributed cloud computing environment where tasks are performed by remote processing devices that are linked through a communication network.
  • program modules may be located on a local or remote computing system storage medium including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a terminal and a touch-operation-based search method and apparatus, relating mainly to the field of information search technology, and mainly aims to provide a technical solution for quickly inputting text to be searched and performing a search. The method includes: when it is detected that a user performs a screen capture operation, acquiring the screenshot corresponding to the screen capture operation, and generating, according to the screenshot, a touch search interface having the same content as the screenshot; receiving a touch operation performed by the user through the touch search interface to select the content to be searched from at least a part of the content; determining the content to be searched according to the touch operation; and searching using the content to be searched. According to the present invention, the user does not need to perform tedious operations such as opening a search app and pasting cut or copied content into the search bar of the search app, which is fast and convenient; the content of the screenshot is retained on the touch search interface, so that the user can select the content to be searched on the touch search interface quickly and accurately.

Description

终端以及基于触摸操作的搜索方法和装置 技术领域
本发明涉及信息搜索技术领域,具体而言,涉及终端以及基于触摸操作的搜索方法和装置。
背景技术
目前,搜索服务已经应用到移动终端上,例如,智能手机上往往安装了各种搜索app(应用程序)用于进行搜索。
现有的移动终端上的搜索服务,都是基于搜索栏输入的。而由于智能手机等移动终端的触摸屏输入不便捷的问题,导致搜索体验非常差,效率非常低。尤其是当用户在使用手机时基于屏幕上的字符、图像等有各种即时的搜索需求时,再打开搜索app,弹出搜索栏中再输入,非常的不便捷。
发明内容
鉴于上述问题,提出了本发明以便提供克服上述问题或者至少部分地解决上述问题的终端以及基于触摸操作的搜索方法和装置。
依据本发明的一个方面,提供了一种基于触摸操作的搜索方法,其包括:当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面;接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;根据所述触摸操作确定所述待搜索内容;使用所述待搜索内容进行搜索。
依据本发明的另一方面,还提供了一种基于触摸操作的搜索装置,其包括:触摸搜索界面生成模块,用于当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面;触摸操作接收模块,用于接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;待搜索内容确定模块,用于根据所述触摸操作确定所述待搜索内容;搜索模块,用于使用所述待搜索内容进行搜索。
依据本发明的又一方面,还提供了一种终端,其包括:
触摸屏,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器,适于当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面;
所述触摸屏,适于进一步接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;
待搜索内容确定器,适于根据所述触摸操作确定所述待搜索内容;
搜索器,适于使用所述待搜索内容进行搜索。
依据本发明的一个方面,提供了一种基于触摸操作的搜索方法,其包括:
当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
根据所述触摸操作确定所述待搜索文字;
使用所述待搜索文字进行搜索。
依据本发明的另一方面,还提供了一种基于触摸操作的搜索装置,其包括:
触摸搜索界面生成模块,用于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
触摸操作接收模块,用于接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
待搜索文字确定模块,用于根据所述触摸操作确定所述待搜索文字;
搜索模块,用于使用所述待搜索文字进行搜索。
依据本发明的又一方面,还提供了一种终端,其包括:
触摸屏,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器,适于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
所述触摸屏,适于进一步接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
待搜索文字确定器,适于根据所述触摸操作确定所述待搜索文字;
搜索器,适于使用所述待搜索文字进行搜索。
根据本发明的又一个方面,提出了一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算设备上运行时,导致所述计算设备执行上文所述的基于触摸操作的搜索方法。
根据本发明的再一个方面,提出了一种计算机可读介质,其中存储了上述的计算机程序。
根据大量经验可知,用户在手机等移动终端上进行复制、剪切等操作,通常是为了将剪切或复制的文字、图像等作为待搜索内容,以将其复制到搜索app界面中的搜索栏中,然后执行搜索操作;根据本发明的技术方案,检测到用户进行截图操作时,自动根据截取的图片为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索内容并进行搜索;这个过程中不需要用户进行打开搜索app、将剪切或复制的内容粘贴到搜索app的搜索栏中等繁琐的操作,快捷便利;触摸搜索界面上保留了截图的内容,方便用户在触摸搜索界面在进行待搜索内容的选择,快捷而准确。
上述说明仅是本发明技术方案的概述,为了能够更清楚了解本发明的技术手段,而可依照说明书的内容予以实施,并且为了让本发明的上述和其它目的、特征和优点能够更明显易懂,以下特举本发明的具体实施方式。
附图说明
通过阅读下文优选实施方式的详细描述,各种其他的优点和益处对于本领域普通技术人员将变得清楚明了。附图仅用于示出优选实施方式的目的,而并不认为是对本发明的限制。而且在整个附图中,用相同的参考符号表示相同的部件。在附图中:
图1示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图2A示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图2B示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图2C示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图2D示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图3示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图4示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图5示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图6示出了根据本发明的一个实施例的基于触摸操作的搜索装置的框图;
图7示出了根据本发明的一个实施例的终端的框图;
图8示出了根据本发明的一个实施例的终端与搜索服务器交互的示意图;
图9示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图10A示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图10B示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图10C示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图10D示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图10C’示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图10D’示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图11示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图12A示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图12B示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图12C示出了根据本发明的一个实施例的基于触摸操作的搜索方法的工作示意图;
图13示出了根据本发明的一个实施例的基于触摸操作的搜索方法的流程图;
图14示出了根据本发明的一个实施例的基于触摸操作的搜索装置的框图;
图15示出了根据本发明的一个实施例的基于触摸操作的搜索装置的框图;
图16示出了根据本发明的一个实施例的终端的框图;
图17示出了根据本发明的一个实施例的终端的框图;
图18示出了根据本发明的一个实施例的终端与搜索服务器的交互示意图
图19示意性地示出了用于执行根据本发明的方法的计算设备的框图;以及
图20示意性地示出了用于保持或者携带实现根据本发明的方法的程序代码的存储单元。
具体实施例
下面将参照附图更详细地描述本公开的示例性实施例。虽然附图中显示了本公开的示例性实施例,然而应当理解,可以以各种形式实现本公开而不应被这里阐述的实施例所限制。相反,提供这些实施例是为了能够更透彻地理解本公开,并且能够将本公开的范围完整的传达给本领域的技术人员。
如图1所示,本发明的一个实施例提供了一种基于触摸操作的搜索方法,其包括:
步骤110,当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统采用的检测截图的方式也有所不同,对于android系统,可以直接通过android系统的FileObserver服务监听系统截屏操作。本发明不限定在Andriod系统,在其他操作系统都可以涵盖的。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某频图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
步骤120,接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
步骤130,根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
步骤140,使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行截图操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索内容并进行搜索,节省了上述的繁琐操作,快捷便利;触摸搜索界面上保留了截图的内容,方便用户在触摸搜索界面在进行待搜索内容的选择,快捷而准确,解决了现有部 分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字精准性。
根据图1,用户使用手机对其桌面进行截图;后台检测到用户的截图操作时,根据截图生成了触摸操作界面,触摸操作界面为浮层或蒙板形式,显示状态为半透明或灰度,能够显示对应截图中的内容,比如桌面上的各个图标及名称,具体如图2A所示;用户在触摸操作界面上进行触摸操作,触摸操作滑动过的区域如图2B中曲线线框所示,可知用户的意图中选择触摸到的文字作为待搜索内容;基于前述的图形识别技术,识别出待搜索内容为“360影视大全”;则使用“360影视大全”进行搜索,搜索结果如图2C所示。通过上述触摸交互方式实现对内容快捷选取和搜索,完全提高了用户搜索的整体交互体验。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种基于触摸操作的搜索方法,相比于前述的实施例,本实施例的搜索方法中,步骤110,还包括:
将触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使当前显示区域的内容透过触摸搜索界面进行显示。根据本实施例的技术方案,半透明的触摸操作界面作为蒙板,既能够使得用户看清楚当前显示区域的变化,又能够看清楚触摸操作界面的内容,从而在触摸操作界面上进行符合用户意向的触摸操作。进一步地,还可以将蒙板以灰度方式进行显示,这为了将触摸操作界面与截取的显示界面进行区别,向用户表明当前是对触摸操作界面进行触摸操作。例如,图2A所示显示的触摸操作界面,就可以设置为半透明的形式。
本发明的一个实施例中还提供了一种基于触摸操作的搜索方法,相比于前述的实施例,本实施例的搜索方法中,在步骤130之前,还包括:
改变触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将触摸区域与未触摸区域进行区分。在本实施例中,对显示方式不进行限制,其包括亮度、颜色、对比度方面的改变。根据本实施例的技术方案,将触摸区域与未触摸区域进行区别,有利于用户确定自己的触摸操作到底摸到了哪些内容。例如,根据图2B所示的用户触摸区域,可以实现如图2D所示的效果:用户触摸区域进行高亮显示;未触摸区域明显较暗,在图2D中通过斜线效果呈现。
如图3所示,本发明的一个实施例提供了一种基于触摸操作的搜索方法,其包括:
步骤310,当检测到用户进行截屏操作时,获取截屏操作对应的图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统采用的检测截图的方式也有所不同,对于android系统,可以直接通过android系统的FileObserver服务监听系统截屏操作。本发明不限定在Andriod系统,在其他操作系统都可以涵盖的。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某频图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
步骤320,接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
步骤330,根据触摸操作对应的触摸区域,在触摸搜索界面上选择包含待搜索内容的区域。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,基于触摸操作,从触摸操作界面上提取有限的区域可以得到选择区域图片,基于前述的图形识别技术进行识别,可以提高识别效果;进一步地,选取的包含待搜索内容的区域的大小不应小于用户触摸区域,例如,若用户手指触摸操作摸出的部分不准确,按触摸区域生成选择区域图片会造成文字不能完整获取的情形,则基于用户触摸区域,将边界扩展一定阈值,相对于严格按照触摸区域,边界扩展后就有了一张略大的新选择区域图片,新选择区域图片能够将用户未完整触摸的文字包含,这样就解决了用户触摸区域文字获取不完整的问题,保证了待搜索内容获取的完整性。
步骤340,对包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定待搜索内容。
步骤350,使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
结合前述的实施例,图2B中曲线线框展示了用户触摸区域,则基于OCR识别技术,从触摸操作界面中选择的包含待搜索内容的区域如图4所示的矩形,对该矩形区域进行截图并进行图形识别得到“360影视大全” 的待搜索文字;需要注意的是,用户触摸区域并没有完整覆盖“360影视大全”的信息,所以本实施例中并没有严格按照触摸区域来确定待识别矩形区域的长宽,而是进行了一定的扩展,得到了完整涵盖“360影视大全”的矩形区域。
在本发明的一个实施例中,根据预定的换行摸字的识别策略在包含待搜索内容的区域对应的截图中识别出文字,这种策略主要应用于用户想要摸出的字分别在两行的机制。由于摸字识别主要是基于用户手指滑动在蒙板上产生的类似矩形高亮区域的像素点的边界(左右上下(x,y)四个点)来识别的。但是如果文本属于两行,手指滑动两次,也依然会根据两次合并后区左右上下四个像素点,这样四个像素点圈出的范围就比手指滑动两次高亮区域要大很多,就使得没有精准在高亮词上,因此解决的方法是:考虑换行摸字这两次他们的重合度高低,如果重合度很低(例如30%以下,阈值可以再调节),就作为两张截图来分别识别,这样就提高了精准程度。
如图5所示,本发明的一个实施例提供了一种基于触摸操作的搜索方法,其包括:
步骤510,当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统采用的检测截图的方式也有所不同,对于android系统,可以直接通过android系统的FileObserver服务监听系统截屏操作。本发明不限定在Andriod系统,在其他操作系统都可以涵盖的。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某频图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
步骤520,接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
步骤530,根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
步骤540,调用已安装的用于搜索的应用程序执行搜索,并显示应用程序的搜索结果。根据本实施例的技术方案,实现了对已安装搜索app的自动调用,避免用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在另一实施例中,也可以不调用已安装app,而是将包含搜索的多个过程设计为由单独一个app来实现。例如,图2C中所展示的搜索结果,就是调用手机中已经安装的“360”搜索来实现。
本发明的一个实施例中还提供了一种基于触摸操作的搜索方法,相比于前述的实施例,本实施例的搜索方法中,步骤540,具体包括:从当前显示区域上清除触摸搜索界面,并显示应用程序的界面,将待搜索内容和搜索结果显示在界面上。根据本实施例的技术方案,及时将触摸搜索界面清除,有利于节省资源消耗,以及避免触摸搜索界面对用户进行其他操作的干扰。
如图6所示,本发明的一个实施例提供了一种基于触摸操作的搜索装置,其包括:
触摸搜索界面生成模块610,用于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统采用的检测截图的方式也有所不同,对于android系统,可以直接通过android系统的FileObserver服务监听系统截屏操作。本发明不限定在Andriod系统,在其他操作系统都可以涵盖的。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某频图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸操作接收模块620,用于接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定模块630,用于根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
搜索模块640,用于使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行截图操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索内容并进行搜索,节省了上述的繁琐操作,快捷便利;触摸搜索界面上保留了截图的内容,方便用户在触摸搜索界面在进行待搜索内容的选择,快捷而准确,解决了现有部分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字精准性。
根据图6,用户使用手机对其桌面进行截图;后台检测到用户的截图操作时,根据截图生成了触摸操作界面,触摸操作界面为浮层或蒙板形式,显示状态为半透明或灰度,能够显示对应截图中的内容,比如桌面上的各个图标及名称,具体如图2A所示;用户在触摸操作界面上进行触摸操作,触摸操作滑动过的区域如图2B中曲线线框所示,可知用户的意图中选择触摸到的文字作为待搜索内容;基于前述的图形识别技术,识别出待搜索内容为“360影视大全”;则使用“360影视大全”进行搜索,搜索结果如图2C所示。通过上述触摸交互方式实现对内容快捷选取和搜索,完全提高了用户搜索的整体交互体验。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种基于触摸操作的搜索装置,相比于前述的实施例,本实施例的搜索装置中,触摸搜索界面生成模块610将触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使当前显示区域的内容透过触摸搜索界面进行显示。根据本实施例的技术方案,半透明的触摸操作界面作为蒙板,既能够使得用户看清楚当前显示区域的变化,又能够看清楚触摸操作界面的内容,从而在触摸操作界面上进行符合用户意向的触摸操作。进一步地,还可以将蒙板以灰度方式进行显示,这为了将触摸操作界面与截取的显示界面进行区别,向用户表明当前是对触摸操作界面进行触摸操作。例如,图2A所示显示的触摸操作界面,就可以设置为半透明的形式。
本发明的一个实施例中还提供了一种基于触摸操作的搜索装置,相比于前述的实施例,本实施例的搜索装置中,还包括:
显示方式改变模块,用于改变触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将触摸区域与未触摸区域进行区分。在本实施例中,对显示方式不进行限制,其包括亮度、颜色、对比度方面的改变。根据本实施例的技术方案,将触摸区域与未触摸区域进行区别,有利于用户确定自己的触摸操作到底摸到了哪些内容。例如,根据图2B所示的用户触摸区域,可以实现如图2D所示的效果:用户触摸区域进行高亮显示;未触摸区域明显较暗,在图2D中通过斜线效果呈现。
本发明的一个实施例提供了一种基于触摸操作的搜索装置,其包括:
触摸搜索界面生成模块610,用于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统采用的检测截图的方式也有所不同,对于android系统,可以直接通过android系统的FileObserver服务监听系统截屏操作。本发明不限定在Andriod系统,在其他操作系统都可以涵盖的。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某频图像,可能是针对某个 app的应用界面,可能是针对终端的系统桌面或操作界面,可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸操作接收模块620,用于接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定模块630,用于根据触摸操作对应的触摸区域,在触摸搜索界面上选择包含待搜索内容的区域,并对包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,基于触摸操作,从触摸操作界面上提取有限的区域可以得到选择区域图片,基于前述的图形识别技术进行识别,可以提高识别效果;进一步地,选取的包含待搜索内容的区域的大小不应小于用户触摸区域,例如,若用户手指触摸操作摸出的部分不准确,按触摸区域生成选择区域图片会造成文字不能完整获取的情形,则基于用户触摸区域,将边界扩展一定阈值,相对于严格按照触摸区域,边界扩展后就有了一张略大的新选择区域图片,新选择区域图片能够将用户未完整触摸的文字包含,这样就解决了用户触摸区域文字获取不完整的问题,保证了待搜索内容获取的完整性。
搜索模块640,用于使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
结合前述的实施例,图2B中曲线线框展示了用户触摸区域,则基于OCR识别技术,从触摸操作界面中选择的包含待搜索内容的区域如图4所示的矩形,对该矩形区域进行截图并进行图形识别得到“360影视大全”的待搜索文字;需要注意的是,用户触摸区域并没有完整覆盖“360影视大全”的信息,所以本实施例中并没有严格按照触摸区域来确定待识别矩形区域的长宽,而是进行了一定的扩展,得到了完整涵盖“360影视大全”的矩形区域。
在本发明的一个实施例中,根据预定的换行摸字识别策略在包含待搜索内容的区域对应的截图中识别出文字,这种策略主要应用于用户想要摸出的字分属两行的情形。摸字识别主要是基于用户手指滑动在蒙板上产生的类似矩形的高亮区域的像素点边界(左、右、上、下四个(x,y)点)来识别的;但如果文本分属两行,手指滑动了两次,若仍然根据两次滑动合并后区域的左右上下四个像素点来确定识别范围,该范围就会比两次滑动形成的高亮区域大很多,使得识别区域没有精准落在高亮的词上。因此解决的方法是:计算换行摸字时两次滑动区域的重合度,如果重合度很低(例如30%以下,该阈值可以再调节),就作为两张截图来分别识别,这样就提高了识别的精准程度。
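换行摸字的判断可以基于"两次滑动区域的重合度"来实现,例如取两个外接矩形的交集面积与较小矩形面积之比,低于阈值(例如30%)则分别截图识别;以下Java代码仅为该策略的一种示意,重合度的具体定义与阈值均为假设:

```java
import android.graphics.Rect;

public final class LineWrapDetector {
    /** 计算两次滑动外接矩形的重合度:交集面积 / 较小矩形面积(示意定义)。 */
    public static float overlapRatio(Rect a, Rect b) {
        Rect inter = new Rect();
        if (!inter.setIntersect(a, b)) {
            return 0f; // 两个矩形完全不相交
        }
        float interArea = inter.width() * inter.height();
        float minArea = Math.min(a.width() * a.height(), b.width() * b.height());
        return minArea <= 0 ? 0f : interArea / minArea;
    }

    /** 重合度低于阈值(此处假设为0.3)时,按换行摸字处理,两次滑动区域分别截图识别。 */
    public static boolean shouldSplit(Rect first, Rect second) {
        return overlapRatio(first, second) < 0.3f;
    }
}
```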
本发明的一个实施例提供了一种基于触摸操作的搜索装置,其包括:
触摸搜索界面生成模块610,用于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统,采用的检测截屏的方式也有所不同:对于Android系统,可以直接通过Android系统的FileObserver服务监听系统截屏操作。本发明不限定于Android系统,也可以应用于其他操作系统。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某帧图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,也可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸操作接收模块620,用于接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定模块630,用于根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
搜索模块640,用于调用已安装的用于搜索的应用程序执行搜索,并显示应用程序的搜索结果。根据本实施例的技术方案,实现了对已安装搜索app的自动调用,使用户进行搜索时不需要执行打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在另一实施例中,也可以不调用已安装app,而是将包含搜索的多个过程设计为由单独一个app来实现。例如,图2C中所展示的搜索结果,就是调用手机中已经安装的"360搜索"来实现的。
本发明的一个实施例中还提供了一种基于触摸操作的搜索装置,相比于前述的实施例,本实施例的搜索装置中,搜索模块640具体用于:从当前显示区域上清除触摸搜索界面,并显示应用程序的界面,将待搜索内容和搜索结果显示在界面上。根据本实施例的技术方案,及时将触摸搜索界面清除,有利于节省资源消耗,以及避免触摸搜索界面对用户进行其他操作的干扰。
如图7所示,本发明的一个实施例提供了一种终端,其包括:
触摸屏710,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器720,适于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统,采用的检测截屏的方式也有所不同:对于Android系统,可以直接通过Android系统的FileObserver服务监听系统截屏操作。本发明不限定于Android系统,也可以应用于其他操作系统。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某帧图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,也可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸屏710,适于进一步接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定器730,适于根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
搜索器740,适于使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要执行打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行截图操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索内容并进行搜索,省去了上述的繁琐操作,快捷便利;触摸搜索界面上保留了截图的内容,方便用户在触摸搜索界面上进行待搜索内容的选择,快捷而准确,解决了现有部分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字的精准性。
根据图7,用户使用手机对其桌面进行截图;后台检测到用户的截图操作时,根据截图生成了触摸操作界面,触摸操作界面为浮层或蒙板形式,显示状态为半透明或灰度,能够显示对应截图中的内容,比如桌面上的各个图标及名称,具体如图2A所示;用户在触摸操作界面上进行触摸操作,触摸操作滑动过的区域如图2B中曲线线框所示,可知用户意图选择触摸到的文字作为待搜索内容;基于前述的图形识别技术,识别出待搜索内容为"360影视大全";则使用"360影视大全"进行搜索,搜索结果如图2C所示。通过上述触摸交互方式实现对内容的快捷选取和搜索,全面提升了用户搜索的整体交互体验。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种终端,相比于前述的实施例,本实施例的终端中,触摸搜索界面生成器720将触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使当前显示区域的内容透过触摸搜索界面进行显示。根据本实施例的技术方案,半透明的触摸操作界面作为蒙板,既能够使得用户看清楚当前显示区域的变化,又能够看清楚触摸操作界面的内容,从而在触摸操作界面上进行符合用户意向的触摸操作。进一步地,还可以将蒙板以灰度方式进行显示,这是为了将触摸操作界面与截取的显示界面进行区别,向用户表明当前是对触摸操作界面进行触摸操作。例如,图2A所显示的触摸操作界面,就可以设置为半透明的形式。
本发明的一个实施例中还提供了一种终端,相比于前述的实施例,本实施例的终端中,还包括:
显示方式改变器,适于改变触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将触摸区域与未触摸区域进行区分。在本实施例中,对显示方式不进行限制,其包括亮度、颜色、对比度方面的改变。根据本实施例的技术方案,将触摸区域与未触摸区域进行区别,有利于用户确定自己的触摸操作到底摸到了哪些内容。例如,根据图2B所示的用户触摸区域,可以实现如图2D所示的效果:用户触摸区域进行高亮显示;未触摸区域明显较暗,在图2D中通过斜线效果呈现。
本发明的一个实施例提供了一种终端,其包括:
触摸屏710,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器720,适于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统,采用的检测截屏的方式也有所不同:对于Android系统,可以直接通过Android系统的FileObserver服务监听系统截屏操作。本发明不限定于Android系统,也可以应用于其他操作系统。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某帧图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,也可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸屏710,适于进一步接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定器730,适于根据触摸操作对应的触摸区域,在触摸搜索界面上选择包含待搜索内容的区域,并对包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,基于触摸操作,从触摸操作界面上提取有限的区域得到选择区域图片,再基于前述的图形识别技术进行识别,可以提高识别效果。进一步地,选取的包含待搜索内容的区域的大小不应小于用户触摸区域:例如,若用户手指触摸操作摸出的部分不准确,严格按触摸区域生成选择区域图片会造成文字不能完整获取,此时可以基于用户触摸区域将边界向外扩展一定阈值,得到一张略大的新选择区域图片;新选择区域图片能够包含用户未完整触摸到的文字,这样就解决了用户触摸区域文字获取不完整的问题,保证了待搜索内容获取的完整性。
搜索器740,适于使用待搜索内容进行搜索。
该步骤进一步包括根据用户触发的基于待搜索内容进行搜索的指令,发起携带待搜索内容的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索内容发给所述应用,通过所述应用的搜索模块或功能来对待搜索内容进行搜索,并给出搜索结果显示给用户。
结合前述的实施例,图2B中曲线线框展示了用户触摸区域,则基于OCR识别技术,从触摸操作界面中选择的包含待搜索内容的区域如图4所示的矩形,对该矩形区域进行截图并进行图形识别得到"360影视大全"的待搜索文字;需要注意的是,用户触摸区域并没有完整覆盖"360影视大全"的信息,所以本实施例中并没有严格按照触摸区域来确定待识别矩形区域的长宽,而是进行了一定的扩展,得到了完整涵盖"360影视大全"的矩形区域。
在本发明的一个实施例中,根据预定的换行摸字识别策略在包含待搜索内容的区域对应的截图中识别出文字,这种策略主要应用于用户想要摸出的字分属两行的情形。摸字识别主要是基于用户手指滑动在蒙板上产生的类似矩形的高亮区域的像素点边界(左、右、上、下四个(x,y)点)来识别的;但如果文本分属两行,手指滑动了两次,若仍然根据两次滑动合并后区域的左右上下四个像素点来确定识别范围,该范围就会比两次滑动形成的高亮区域大很多,使得识别区域没有精准落在高亮的词上。因此解决的方法是:计算换行摸字时两次滑动区域的重合度,如果重合度很低(例如30%以下,该阈值可以再调节),就作为两张截图来分别识别,这样就提高了识别的精准程度。
本发明的一个实施例提供了一种终端,其包括:
触摸屏710,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器720,适于当检测到用户进行截屏操作时,获取截屏操作对应的截屏图片,并根据截屏图片生成与截屏图片具有相同内容的触摸搜索界面。
在本实施例中,对于不同的系统,采用的检测截屏的方式也有所不同:对于Android系统,可以直接通过Android系统的FileObserver服务监听系统截屏操作。本发明不限定于Android系统,也可以应用于其他操作系统。
截图可能是针对某个网页,可能是针对某部动漫的图片,可能是针对视频的某帧图像,可能是针对某个app的应用界面,可能是针对终端的系统桌面或操作界面,也可能是用户的图片库中的图片或照片,等等。进一步地,待搜索内容包括以下至少一种:文字、图片、符号。
所述触摸搜索界面以生成或弹出的图层或浮窗形式展现,可以接收用户的触摸操作以标识出触摸区域。
触摸屏710,适于进一步接收用户通过触摸搜索界面进行的对内容的至少一部分进行待搜索内容选择的触摸操作。
用户触摸操作可以包括点选或滑动,所述触摸搜索界面基于用户触摸操作给予响应,并以不同的颜色、线框等方式标示出触摸区域。
待搜索内容确定器730,适于根据触摸操作确定待搜索内容。
本实施例中,所述触摸搜索界面接收用户触摸操作对上述内容的至少一部分进行触摸区域标识,所述触摸区域作为识别待搜索内容的基础。
在本实施例中,根据基于触摸操作得到的触摸区域生成选择区域图片,然后对选择区域图片进行文字/图形等元素识别。这里对从图片中识别文字/图形等元素的方法不进行限制;例如,图片OCR(Optical Character Recognition,光学字符识别)识别技术,即,后台通过OCR识别从图片中提取文字信息;再例如,UiAutomator自动化测试技术:UiAutomator是android自带自动化测试工具,可以用来提取当前页面文本信息,这种技术可以获取100%正确文字。各识别技术存在不同的适用场景,其中,UiAutomator与OCR如果识别出相同的内容则代表该内容准确性很强,所以通过计算UiAutomator与OCR识别结果的交集确定待搜索内容,能极大提高识别准确度。
搜索器740,适于调用已安装的用于搜索的应用程序执行搜索,并显示应用程序的搜索结果。根据本实施例的技术方案,实现了对已安装搜索app的自动调用,使用户进行搜索时不需要执行打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在另一实施例中,也可以不调用已安装app,而是将包含搜索的多个过程设计为由单独一个app来实现。例如,图2C中所展示的搜索结果,就是调用手机中已经安装的"360搜索"来实现的。
本发明的一个实施例中还提供了一种终端,相比于前述的实施例,本实施例的终端中,搜索器740具体用于:从当前显示区域上清除触摸搜索界面,并显示应用程序的界面,将待搜索内容和搜索结果显示在界面上。根据本实施例的技术方案,及时将触摸搜索界面清除,有利于节省资源消耗,以及避免触摸搜索界面对用户进行其他操作的干扰。
根据以上的实施例,进一步地,在使用待搜索词进行搜索时,需要终端(的搜索器)与搜索服务器进行交互,调用搜索服务器的搜索服务来完成搜索。该搜索服务器可以是对应于终端的服务器,也可以是对应终端上安装的搜索app的服务器,具体的交互示意图如图8所示,其中终端为810,搜索服务器为820。
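终端与搜索服务器的交互可以是一次携带待搜索词的HTTP请求,如下Java示意代码所示;其中服务器地址与参数名均为举例假设,实际应以所调用的搜索服务为准:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public final class SearchClient {
    /** 向搜索服务器发起携带待搜索词的请求,返回搜索结果文本(示意代码,地址为假设)。 */
    public static String search(String keyword) throws Exception {
        String api = "https://search.example.com/s?q=" + URLEncoder.encode(keyword, "UTF-8");
        HttpURLConnection conn = (HttpURLConnection) new URL(api).openConnection();
        conn.setConnectTimeout(5000);
        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        } finally {
            conn.disconnect();
        }
        return body.toString();
    }
}
```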
如图9所示,本发明的一个实施例中提供了一种基于触摸操作的搜索方法,包括:
步骤910,当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步以弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精确选取。
对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
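以下Java示意代码展示通过ClipboardManager监听系统剪切板并提取注入的文字内容的做法;其中回调接口Listener为示例假设:

```java
import android.content.ClipboardManager;
import android.content.Context;

public final class ClipboardWatcher {
    /** 假设的回调接口:剪切板注入文字后通知上层生成触摸搜索界面。 */
    public interface Listener {
        void onTextCopied(String text);
    }

    /** 注册剪切板变化监听,提取用户复制或剪切的文字内容(示意代码)。 */
    public static void watch(final Context context, final Listener listener) {
        final ClipboardManager cm =
                (ClipboardManager) context.getSystemService(Context.CLIPBOARD_SERVICE);
        cm.addPrimaryClipChangedListener(new ClipboardManager.OnPrimaryClipChangedListener() {
            @Override
            public void onPrimaryClipChanged() {
                if (cm.hasPrimaryClip() && cm.getPrimaryClip().getItemCount() > 0) {
                    CharSequence text = cm.getPrimaryClip().getItemAt(0).coerceToText(context);
                    if (text != null && text.length() > 0) {
                        listener.onTextCopied(text.toString());
                    }
                }
            }
        });
    }
}
```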
步骤920,接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
步骤930,根据触摸操作确定待搜索文字。
进一步地,基于上述步骤用户对文字内容进行选择,确定出待搜索文字。其中一种情况,触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
步骤940,使用待搜索文字进行搜索。
该步骤进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行复制、剪切操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索文字并进行搜索,节省了上述的繁琐操作,快捷便利;触摸搜索界面上显示文字内容后,用户再进行待搜索文字的选取,更加精确,解决了现有部分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字精准性。
根据图9,对于用户在手机上浏览的如图10A所示的微博页面,用户在微博页面上进行长按后触发复制整条微博内容;后台检测到剪贴板中注入内容后,在手机上生成或弹出触摸搜索界面,如图10B所示,触摸搜索界面上显示整条微博内容;用户在触摸搜索界面上通过触摸操作选择待搜索的文字,如图10C所示,其中选择的待搜索词为线框部分;后台自动使用用户选择的待搜索词进行搜索得到搜索结果,比如调起360搜索app进行搜索并给出搜索结果,如图10D所示。进一步地,待搜索文字支持跨行或跨段选取,界面中的提示框/搜索栏会自动将跨行文字按照文字先后顺序进行组织,见附图10C和10D;当然待搜索文字也支持单行选取,参见附图10C'和10D'。
通过上述触摸交互方式实现对文字快捷选取和搜索,完全提高了用户搜索的整体交互体验。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种基于触摸操作的搜索方法,相比于前述的实施例,本实施例的搜索方法中,步骤910,还包括:
将触摸搜索界面上显示的文字内容的尺寸放大。根据本实施例的技术方案,将触摸搜索界面上的文字内容进行放大,有利于用户精确地选择待搜索词,不容易出错。例如,图10B中的触摸搜索界面上显示的文字,其尺寸要大于图10A所示的微博界面上显示的文字。
如图11所示,本发明的一个实施例中提供了一种基于触摸操作的搜索方法,包括:
步骤1110,当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面,触摸搜索界面上具有搜索栏。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步为弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精度选取。
对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
步骤1120,接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
步骤1130,根据触摸操作确定待搜索文字。
进一步地,基于上述步骤,用户可以通过各种触摸操作,比如点选或滑动的方式,对文字内容进行进一步的选择,确定出真正符合搜索需求的待搜索文字。
步骤1140,将用户选择的待搜索文字,同步显示到搜索栏中。
触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
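将用户选中的文字同步到搜索栏,可以维护一个按选择先后排序的词列表,并在每次选择变化后刷新搜索栏文本;以下Java片段仅为示意,控件与类名均为示例假设:

```java
import android.widget.EditText;
import java.util.ArrayList;
import java.util.List;

public final class QueryBarSyncer {
    private final List<String> selectedWords = new ArrayList<>();
    private final EditText searchBar;

    public QueryBarSyncer(EditText searchBar) {
        this.searchBar = searchBar;
    }

    /** 用户触摸选中一个词:加入列表(已存在则视为取消选择),并同步刷新搜索栏。 */
    public void toggleWord(String word) {
        if (!selectedWords.remove(word)) {
            selectedWords.add(word); // 按选择先后顺序组织,支持跨行、跨段选取
        }
        searchBar.setText(join(selectedWords));
    }

    private static String join(List<String> words) {
        StringBuilder sb = new StringBuilder();
        for (String w : words) {
            sb.append(w);
        }
        return sb.toString();
    }
}
```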
步骤1150,使用待搜索文字进行搜索。
该步骤进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,将待搜索文字同步显示到搜索栏中,有利于用户确认自己到底选择了什么样的待搜索词。
根据图11,对于用户在手机上浏览的如图12A所示的文字网页或记事本,用户对网页页面或记事本中的文字内容进行选择,所选择的文字内容为图中被框的文字;后台检测到剪贴板中存在内容后,在手机上生成的触摸搜索界面如图12B所示,触摸搜索界面上显示用户选择的文字内容,且触摸搜索界面的上方具有搜索栏;如图12C,用户在触摸搜索界面上通过触摸操作选择的待搜索文字,均同步到搜索栏中。本实施例应用于带触摸屏的移动终端。
如图13所示,本发明的一个实施例中提供了一种基于触摸操作的搜索方法,包括:
步骤1310,当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面。
步骤1320,接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
步骤1330,根据触摸操作确定待搜索文字。
步骤1340,调用已安装的用于搜索的应用程序执行搜索,并显示应用程序的搜索结果。根据本实施例的技术方案,实现了对已安装搜索app的自动调用,使用户进行搜索时不需要执行打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在另一实施例中,也可以不调用已安装app,而是将包含搜索的多个过程设计为由单独一个app来实现。例如,图10D中所展示的搜索结果,就是调用手机中已经安装的"360搜索"来实现的。
本发明的一个实施例中还提供了一种基于触摸操作的搜索方法,相比于前述的实施例,本实施例的搜索方法中,还包括:从触摸搜索界面上清除文字内容,以及在触摸搜索界面上显示搜索结果。根据本实施例的技术方案,将搜索结果直接显示在触摸搜索界面,避免了用户在触摸搜索界面与其他搜索app的界面之间进行切换的繁琐操作,使得用户的选择待搜索词和查看搜索结果都在一个界面上完成即可。
如图14所示,本发明的一个实施例中提供了一种基于触摸操作的搜索装置,包括:
触摸搜索界面生成模块1410,用于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步为弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精度选取。
在本实施例中,对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
触摸操作接收模块1420,用于接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
待搜索文字确定模块1430,用于根据触摸操作确定待搜索文字。
进一步地,基于上述步骤用户对文字内容进行选择,确定出待搜索文字。其中一种情况,触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
搜索模块1440,用于使用待搜索文字进行搜索。
该步骤进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行复制、剪切操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索文字并进行搜索,节省了上述的繁琐操作,快捷便利;触摸搜索界面上显示文字内容后,用户再进行待搜索文字的选取,更加精确,解决了现有部分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字精准性。
根据图14,对于用户在手机上浏览的如图10A所示的微博页面,用户在微博页面上进行长按后触发复制整条微博内容;后台检测到剪贴板中存在内容后,在手机上生成的触摸搜索界面如图10B所示,触摸搜索界面上显示整条微博内容;用户在触摸搜索界面上通过触摸操作选择待搜索的文字,如图10C所示,其中选择的待搜索词为线框部分;后台自动使用用户选择的待搜索词进行搜索,得到的结果如图10D所示。因通过触摸交互方式实现对文字的触摸选取,本发明及实施例可以应用于带有触摸屏的用户终端,比如智能手机、平板电脑、ipad等。
本发明的一个实施例中还提供了一种基于触摸操作的搜索装置,相比于前述的实施例,本实施例的搜索装置中,进一步还包括:
尺寸放大模块,用于将触摸搜索界面上显示的文字内容的尺寸放大。根据本实施例的技术方案,将触摸搜索界面上的文字内容进行放大,有利于用户精确地选择待搜索词,不容易出错。例如,图10B中的触摸搜索界面上显示的文字,其尺寸要大于图10A所示的微博界面上显示的文字。
如图15所示,本发明的一个实施例中提供了一种基于触摸操作的搜索装置,包括:
触摸搜索界面生成模块1510,用于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面,触摸搜索界面上具有搜索栏。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步为弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精度选取。
对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
触摸操作接收模块1520,用于接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
待搜索文字确定模块1530,用于根据触摸操作确定待搜索文字。
进一步地,基于上述步骤,用户可以通过各种触摸操作,比如点选或滑动的方式,对文字内容进行进一步的选择,确定出真正符合搜索需求的待搜索文字。
同步显示模块1540,用于将用户选择的待搜索文字,同步显示到搜索栏中。
触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
搜索模块1550,用于使用待搜索文字进行搜索。
进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,将待搜索文字同步显示到搜索栏中,有利于用户确认自己到底选择了什么样的待搜索词。
根据图15,对于用户在手机上浏览的如图12A所示的网页,用户对网页页面的文字内容进行选择,所选择的文字内容为图中被框的文字;后台检测到剪贴板中存在内容后,在手机上生成的触摸搜索界面如图12B所示,触摸搜索界面上显示用户选择的文字内容,且触摸搜索界面的上方具有搜索栏;如图12C,用户在触摸搜索界面上通过触摸操作选择的待搜索文字,均同步到搜索栏中。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种基于触摸操作的搜索装置,相比于前述的实施例,本实施例的搜索装置中,进一步还包括:搜索模块从触摸搜索界面上清除文字内容,以及在触摸搜索界面上显示搜索结果。根据本实施例的技术方案,将搜索结果直接显示在触摸搜索界面,避免了用户在触摸搜索界面与其他搜索app的界面之间进行切换的繁琐操作,使得用户的选择待搜索词和查看搜索结果都在一个界面上完成即可。
如图16所示,本发明的一个实施例中提供了一种终端,包括:
触摸屏1610,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器1620,适于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步为弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精度选取。
在本实施例中,对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
触摸屏1610,适于进一步接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
待搜索文字确定器1630,适于根据触摸操作确定待搜索文字。
进一步地,基于上述步骤用户对文字内容进行选择,确定出待搜索文字。其中一种情况,触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
搜索器1640,适于使用待搜索文字进行搜索。
该步骤进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,用户进行搜索时并不需要打开搜索app、在搜索栏中输入待搜索文字、将待搜索文字粘贴到搜索栏中、触发搜索等繁琐操作;在本实施例中,检测到用户进行复制、剪切操作时,自动为用户提供触摸搜索界面,以供用户快捷准确地选择待搜索文字并进行搜索,节省了上述的繁琐操作,快捷便利;触摸搜索界面上显示文字内容后,用户再进行待搜索文字的选取,更加精确,解决了现有部分搜索app只能复制整段内容而不能精确搜索单个词或某几个离散词汇的弊端,提高了选取待搜索文字精准性。
根据图16,对于用户在手机上浏览的如图10A所示的微博页面,用户在微博页面上进行长按后触发复制整条微博内容;后台检测到剪贴板中存在内容后,在手机上生成的触摸搜索界面如图10B所示,触摸搜索界面上显示整条微博内容;用户在触摸搜索界面上通过触摸操作选择待搜索的文字,如图10C所示,其中选择的待搜索词为线框部分;后台自动使用用户选择的待搜索词进行搜索,得到的结果如图10D所示。因通过触摸交互方式实现对文字的触摸选取,本发明及实施例可以应用于带有触摸屏的用户终端,比如智能手机、平板电脑、ipad等。
本发明的一个实施例中还提供了一种终端,相比于前述的实施例,本实施例的终端中,进一步还包括:
尺寸放大器,适于将触摸搜索界面上显示的文字内容的尺寸放大。根据本实施例的技术方案,将触摸搜索界面上的文字内容进行放大,有利于用户精确地选择待搜索词,不容易出错。例如,图10B中的触摸搜索界面上显示的文字,其尺寸要大于图10A所示的微博界面上显示的文字。
如图17所示,本发明的一个实施例中提供了一种终端,包括:
触摸屏1710,适于接收用户触摸操作并提供显示功能;
触摸搜索界面生成器1720,适于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取用户复制或剪切的文字内容,并生成显示文字内容的触摸搜索界面,触摸搜索界面上具有搜索栏。
在本实施例中,检测到的用户复制或剪切的文字内容可以涵盖各种界面或各种应用(比如app)中的文字内容,无论用户对何种文字内容进行选取操作,只要触发系统剪贴板,本发明即能提取剪贴板中的文字内容。所述生成显示文字内容的触摸搜索界面可以进一步为弹出的形式显示给用户。进一步地,提取的文字内容以放大的方式显示在所述触摸搜索界面中,这样便于用户触摸操作时对文字的精度选取。
对于剪贴板的监控,不同系统有不同的方式,例如,对于Android系统,可以直接通过调用Android系统的ClipboardManager控件来检测系统剪切板。本发明不限定于Android系统,也可以应用于其他操作系统。
触摸屏1710,适于进一步接收用户通过触摸搜索界面对文字内容中的至少一部分进行待搜索文字选择的触摸操作。
进一步地,用户可以通过点选或滑动(比如手指或触控笔)在触摸屏上,对触摸搜索界面中显示的文字内容的一部分或全部进行选择操作,也可以通过反复触摸操作来选择/取消部分文字内容。选择的文字以加框、颜色加深或变色等形式进行标识,与非选择文字进行视觉上的区分。
待搜索文字确定器1730,适于根据触摸操作确定待搜索文字。
进一步地,基于上述步骤,用户可以通过各种触摸操作,比如点选或滑动的方式,对文字内容进行进一步的选择,确定出真正符合搜索需求的待搜索文字。
同步显示器1740,适于将用户选择的待搜索文字,同步显示到搜索栏中。
触摸搜索界面包含一个提示框或搜索栏,用户在选择所述待搜索文字时会将选择的文字内容同步到提示框或搜索栏中,作为待搜索文字或待搜索词。
搜索器1750,适于使用待搜索文字进行搜索。
进一步包括根据用户触发的基于待搜索文字进行搜索的指令,发起携带待搜索文字的搜索请求。所述搜索请求进一步可以通过调起已安装的用于搜索的应用(比如360搜索app)或某一应用中的搜索模块(比如360卫士中的搜索模块),将所述待搜索文字发给所述应用,通过所述应用的搜索模块或功能来对待搜索文字进行搜索,并给出搜索结果显示给用户。
根据本实施例的技术方案,将待搜索文字同步显示到搜索栏中,有利于用户确认自己到底选择了什么样的待搜索词。
根据图17,对于用户在手机上浏览的如图12A所示的网页,用户对网页页面的文字内容进行选择,所选择的文字内容为图中被框的文字;后台检测到剪贴板中存在内容后,在手机上生成的触摸搜索界面如图12B所示,触摸搜索界面上显示用户选择的文字内容,且触摸搜索界面的上方具有搜索栏;如图12C,用户在触摸搜索界面上通过触摸操作选择的待搜索文字,均同步到搜索栏中。本实施例应用于带触摸屏的移动终端。
本发明的一个实施例中还提供了一种终端,相比于前述的实施例,本实施例的终端中,进一步还包括:搜索器从触摸搜索界面上清除文字内容,以及在触摸搜索界面上显示搜索结果。根据本实施例的技术方案,将搜索结果直接显示在触摸搜索界面,避免了用户在触摸搜索界面与其他搜索app的界面之间进行切换的繁琐操作,使得用户的选择待搜索词和查看搜索结果都在一个界面上完成即可。
根据以上的实施例,进一步地,在使用待搜索词进行搜索时,需要终端(的搜索器)与搜索服务器进行交互,调用搜索服务器的搜索服务来完成搜索。该搜索服务器可以是对应于终端的服务器,也可以是对应终端上安装的搜索app的服务器,具体的交互示意图如图18所示,其中终端为1810,搜索服务器为1820。
在此提供的算法和显示不与任何特定计算机、虚拟系统或者其它设备固有相关。各种通用系统也可以与基于在此的示教一起使用。根据上面的描述,构造这类系统所要求的结构是显而易见的。此外,本发明也不针对任何特定编程语言。应当明白,可以利用各种编程语言实现在此描述的本发明的内容,并且上面对特定语言所做的描述是为了披露本发明的最佳实施方式。
在此处所提供的说明书中,说明了大量具体细节。然而,能够理解,本发明的实施例可以在没有这些具体细节的情况下实践。在一些实例中,并未详细示出公知的方法、结构和技术,以便不模糊对本说明书的理解。
类似地,应当理解,为了精简本公开并帮助理解各个发明方面中的一个或多个,在上面对本发明的示例性实施例的描述中,本发明的各个特征有时被一起分组到单个实施例、图、或者对其的描述中。然而,并不应将该公开的方法解释成反映如下意图:即所要求保护的本发明要求比在每个权利要求中所明确记载的特征更多的特征。更确切地说,如下面的权利要求书所反映的那样,发明方面在于少于前面公开的单个实施例的所有特征。因此,遵循具体实施方式的权利要求书由此明确地并入该具体实施方式,其中每个权利要求本身都作为本发明的单独实施例。
本领域那些技术人员可以理解,可以对实施例中的设备中的模块进行自适应性地改变并且把它们设置在与该实施例不同的一个或多个设备中。可以把实施例中的模块或单元或组件组合成一个模块或单元或组件,以及此外可以把它们分成多个子模块或子单元或子组件。除了这样的特征和/或过程或者单元中的至少一些是相互排斥之外,可以采用任何组合对本说明书(包括伴随的权利要求、摘要和附图)中公开的所有特征以及如此公开的任何方法或者设备的所有过程或单元进行组合。除非另外明确陈述,本说明书(包括伴随的权利要求、摘要和附图)中公开的每个特征可以由提供相同、等同或相似目的的替代特征来代替。
此外,本领域的技术人员能够理解,尽管在此所述的一些实施例包括其它实施例中所包括的某些特征而不是其它特征,但是不同实施例的特征的组合意味着处于本发明的范围之内并且形成不同的实施例。例如,在下面的权利要求书中,所要求保护的实施例的任意之一都可以以任意的组合方式来使用。
本发明的各个部件实施例可以以硬件实现,或者以在一个或者多个处理器上运行的软件模块实现,或者以它们的组合实现。本领域的技术人员应当理解,可以在实践中使用微处理器或者数字信号处理器(DSP)来实现根据本发明实施例的基于触摸操作的搜索装置中的一些或者全部部件的一些或者全部功能。本发明还可以实现为用于执行这里所描述的方法的一部分或者全部的设备或者装置程序(例如,计算机程序和计算机程序产品)。这样的实现本发明的程序可以存储在计算机可读介质上,或者可以具有一个或者多个信号的形式。这样的信号可以从因特网网站上下载得到,或者在载体信号上提供,或者以任何其他形式提供。
例如,图19示意性地示出了用于执行根据本发明的方法的计算设备的框图。该计算设备传统上包括处理器1910和以存储器1920形式的计算机程序产品或者计算机可读介质。存储器1920可以是诸如闪存、EEPROM(电可擦除可编程只读存储器)、EPROM、硬盘或者ROM之类的电子存储器。存储器1920具有用于执行上述方法中的任何方法步骤的程序代码1931的存储空间1930。例如,用于程序代码的存储空间1930可以包括分别用于实现上面的方法中的各种步骤的各个程序代码1931。这些程序代码可以从一个或者多个计算机程序产品中读出或者写入到这一个或者多个计算机程序产品中。这些计算机程序产品包括诸如硬盘,紧致盘(CD)、存储卡或者软盘之类的程序代码载体。这样的计算机程序产品通常为如参考图20所述的便携式或者固定存储单元。该存储单元可以具有与图19的计算设备中的存储器1920类似布置的存储段、存储空间等。程序代码可以例如以适当形式进行压缩。通常,存储单元包括用于执行根据本发明的方法步骤的计算机可读代码1931’,即可以由例如诸如1910之类的处理器读取的代码,这些代码当由计算设备运行时,导致该计算设备执行上面所描述的方法中的各个步骤。
应该注意的是上述实施例对本发明进行说明而不是对本发明进行限制,并且本领域技术人员在不脱离所附权利要求的范围的情况下可设计出替换实施例。在权利要求中,不应将位于括号之间的任何参考符号构造成对权利要求的限制。单词“包含”不排除存在未列在权利要求中的元件或步骤。位于元件之前的单词“一”或“一个”不排除存在多个这样的元件。本发明可以借助于包括有若干不同元件的硬件以及借助于适当编程的计算机来实现。在列举了若干装置的单元权利要求中,这些装置中的若干个可以是通过同一个硬件项来具体体现。单词第一、第二、以及第三等的使用不表示任何顺序。可将这些单词解释为名称。
本发明可以应用于计算机系统/服务器,其可与众多其它通用或专用计算系统环境或配置一起操作。适于与计算机系统/服务器一起使用的众所周知的计算系统、环境和/或配置的例子包括但不限于:个人计算机系统、服务器计算机系统、瘦客户机、厚客户机、手持或膝上设备、基于微处理器的系统、机顶盒、可编程消费电子产品、网络个人电脑、小型计算机系统、大型计算机系统和包括上述任何系统的分布式云计算技术环境,等等。
计算机系统/服务器可以在由计算机系统执行的计算机系统可执行指令(诸如程序模块)的一般语境下描述。通常,程序模块可以包括例程、程序、目标程序、组件、逻辑、数据结构等等,它们执行特定的任务或者实现特定的抽象数据类型。计算机系统/服务器可以在分布式云计算环境中实施,分布式云计算环境中,任务是由通过通信网络链接的远程处理设备执行的。在分布式云计算环境中,程序模块可以位于包括存储设备的本地或远程计算系统存储介质上。
本文中所称的“一个实施例”、“实施例”或者“一个或者多个实施例”意味着,结合实施例描述的特定特征、结构或者特性包括在本发明的至少一个实施例中。此外,请注意,这里“在一个实施例中”的词语例子不一定全指同一个实施例。

Claims (38)

  1. 一种基于触摸操作的搜索方法,其包括:
    当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并生成与所述截屏图片具有相同内容的触摸搜索界面;
    接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;
    根据所述触摸操作确定所述待搜索内容;
    使用所述待搜索内容进行搜索。
  2. 根据权利要求1所述的方法,其中,根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面,还包括:
    将所述触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使所述当前显示区域的内容透过所述触摸搜索界面进行显示。
  3. 根据权利要求1-2任一项所述的方法,其中,根据所述触摸操作确定所述待搜索内容,具体包括:
    根据所述触摸操作对应的触摸区域,在所述触摸搜索界面上选择包含所述待搜索内容的区域;
    对所述包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定所述待搜索内容。
  4. 根据权利要求1-3任一项所述的方法,其中,在根据所述触摸操作确定所述待搜索内容之前,还包括:
    改变所述触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将所述触摸区域与所述未触摸区域进行区分。
  5. 根据权利要求1-4任一项所述的方法,其中,使用所述待搜索内容进行搜索,具体包括:
    调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  6. 根据权利要求1-5任一项所述的方法,其中,调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果,具体包括:
    从所述当前显示区域上清除所述触摸搜索界面,并显示所述应用程序的界面,将所述待搜索内容和所述搜索结果显示在所述界面上。
  7. 根据权利要求1至6中任一项的方法,其中,所述待搜索内容包括以下至少一种:
    文字、图片、符号。
  8. 一种基于触摸操作的搜索装置,其包括:
    触摸搜索界面生成模块,用于当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面;
    触摸操作接收模块,用于接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;
    待搜索内容确定模块,用于根据所述触摸操作确定所述待搜索内容;
    搜索模块,用于使用所述待搜索内容进行搜索。
  9. 根据权利要求8所述的装置,其中,
    所述触摸搜索界面生成模块将所述触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使所述当前显示区域的内容透过所述触摸搜索界面进行显示。
  10. 根据权利要求8-9任一项所述的装置,其中,
    所述待搜索内容确定模块根据所述触摸操作对应的触摸区域,在所述触摸搜索界面上选择包含所述待搜索内容的区域,并对所述包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定所述待搜索内容。
  11. 根据权利要求8-10任一项所述的装置,其中,还包括:
    显示方式改变模块,用于改变所述触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将所述触摸区域与所述未触摸区域进行区分。
  12. 根据权利要求8-11任一项所述的装置,其中,
    所述搜索模块调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  13. 根据权利要求8-12任一项所述的装置,其中,
    所述搜索模块从所述当前显示区域上清除所述触摸搜索界面,并显示所述应用程序的界面,将所述待搜索内容和所述搜索结果显示在所述界面上。
  14. 根据权利要求8-13任一项所述的装置,其中,
    所述待搜索内容包括以下至少一种:文字、图片、符号。
  15. 一种终端,其包括:
    触摸屏,适于接收用户触摸操作并提供显示功能;
    触摸搜索界面生成器,适于当检测到用户进行截屏操作时,获取所述截屏操作对应的截屏图片,并根据所述截屏图片生成与所述截屏图片具有相同内容的触摸搜索界面;
    所述触摸屏,适于进一步接收所述用户通过所述触摸搜索界面进行的对所述内容的至少一部分进行待搜索内容选择的触摸操作;
    待搜索内容确定器,适于根据所述触摸操作确定所述待搜索内容;
    搜索器,适于使用所述待搜索内容进行搜索。
  16. 根据权利要求15所述的终端,其中,
    所述触摸搜索界面生成器将所述触摸搜索界面设置为半透明方式,并覆盖在当前显示区域上形成蒙板,以使所述当前显示区域的内容透过所述触摸搜索界面进行显示。
  17. 根据权利要求15-16任一项所述的终端,其中,
    所述待搜索内容确定器根据所述触摸操作对应的触摸区域,在所述触摸搜索界面上选择包含所述待搜索内容的区域,并对所述包含所述待搜索内容的区域中的内容进行识别,根据识别结果确定所述待搜索内容。
  18. 根据权利要求15-17任一项所述的终端,其中,还包括:
    显示方式改变器,适于改变所述触摸操作对应的触摸区域和/或未触摸区域的显示方式,以将所述触摸区域与所述未触摸区域进行区分。
  19. 根据权利要求15-18任一项所述的终端,其中,
    所述搜索器调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  20. 根据权利要求15-19任一项所述的终端,其中,
    所述搜索器从所述当前显示区域上清除所述触摸搜索界面,并显示所述应用程序的界面,将所述待搜索内容和所述搜索结果显示在所述界面上。
  21. 根据权利要求15-20任一项的终端,其中,所述待搜索内容包括以下至少一种:
    文字、图片、符号。
  22. 一种基于触摸操作的搜索方法,其包括:
    当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
    接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
    根据所述触摸操作确定所述待搜索文字;
    使用所述待搜索文字进行搜索。
  23. 根据权利要求22所述的方法,其中,在生成显示所述文字内容的触摸搜索界面,还包括:
    将所述触摸搜索界面上显示的所述文字内容的尺寸放大。
  24. 根据权利要求22所述的方法,其中,所述触摸搜索界面上具有搜索栏;在使用所述待搜索文字进行搜索之前,还包括:
    将所述用户选择的所述待搜索文字,同步显示到所述搜索栏中。
  25. 根据权利要求22所述的方法,其中,使用所述待搜索文字进行搜索,具体包括:
    调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  26. 根据权利要求22至25中任一项所述的方法,其中,使用所述待搜索文字进行搜索,还包括:
    从所述触摸搜索界面上清除所述文字内容,以及在所述触摸搜索界面上显示搜索结果。
  27. 一种基于触摸操作的搜索装置,其包括:
    触摸搜索界面生成模块,用于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
    触摸操作接收模块,用于接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
    待搜索文字确定模块,用于根据所述触摸操作确定所述待搜索文字;
    搜索模块,用于使用所述待搜索文字进行搜索。
  28. 根据权利要求27所述的装置,其中,还包括:
    尺寸放大模块,用于将所述触摸搜索界面上显示的所述文字内容的尺寸放大。
  29. 根据权利要求27所述的装置,其中,所述触摸搜索界面上具有搜索栏;所述装置还包括:
    同步显示模块,用于将所述用户选择的所述待搜索文字,同步显示到所述搜索栏中。
  30. 根据权利要求27所述的装置,其中,
    所述搜索模块调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  31. 根据权利要求27至30中任一项所述的装置,其中,
    所述搜索模块从所述触摸搜索界面上清除所述文字内容,以及在所述触摸搜索界面上显示搜索结果。
  32. 一种终端,其包括:
    触摸屏,适于接收用户触摸操作并提供显示功能;
    触摸搜索界面生成器,适于当检测到系统剪切板注入用户复制或剪切的文字内容时,提取所述用户复制或剪切的文字内容,并生成显示所述文字内容的触摸搜索界面;
    所述触摸屏,适于进一步接收所述用户通过所述触摸搜索界面对所述文字内容中的至少一部分进行待搜索文字选择的触摸操作;
    待搜索文字确定器,适于根据所述触摸操作确定所述待搜索文字;
    搜索器,适于使用所述待搜索文字进行搜索。
  33. 根据权利要求32所述的终端,其中,还包括:
    尺寸放大器,适于将所述触摸搜索界面上显示的所述文字内容的尺寸放大。
  34. 根据权利要求32所述的终端,其中,所述触摸搜索界面上具有搜索栏;所述终端还包括:
    同步显示器,适于将所述用户选择的所述待搜索文字,同步显示到所述搜索栏中。
  35. 根据权利要求32所述的终端,其中,
    所述搜索器调用已安装的用于搜索的应用程序执行搜索,并显示所述应用程序的搜索结果。
  36. 根据权利要求32至35中任一项所述的终端,其中,
    所述搜索器从所述触摸搜索界面上清除所述文字内容,以及在所述触摸搜索界面上显示搜索结果。
  37. 一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算设备上运行时,导致所述计算设备执行根据权利要求1-7中的任一项所述的基于触摸操作的搜索方法,或导致所述计算设备执行根据权利要求22-26中的任一项所述的基于触摸操作的搜索方法。
  38. 一种计算机可读介质,其中存储了如权利要求37所述的计算机程序。
PCT/CN2015/095863 2014-12-26 2015-11-27 终端以及基于触摸操作的搜索方法和装置 WO2016101768A1 (zh)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN201410827079.5 2014-12-26
CN201410826801.3A CN104778194A (zh) 2014-12-26 2014-12-26 基于触摸操作的搜索方法和装置
CN201410826911.X 2014-12-26
CN201410827079.5A CN104778195A (zh) 2014-12-26 2014-12-26 终端和基于触摸操作的搜索方法
CN201410826911.XA CN104537051B (zh) 2014-12-26 2014-12-26 终端和基于触摸操作的搜索方法
CN201410827065.3A CN104536688A (zh) 2014-12-26 2014-12-26 基于触摸操作的搜索方法和装置
CN201410827065.3 2014-12-26
CN201410826801.3 2014-12-26

Publications (1)

Publication Number Publication Date
WO2016101768A1 true WO2016101768A1 (zh) 2016-06-30

Family

ID=56149213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/095863 WO2016101768A1 (zh) 2014-12-26 2015-11-27 终端以及基于触摸操作的搜索方法和装置

Country Status (1)

Country Link
WO (1) WO2016101768A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113032264A (zh) * 2021-03-29 2021-06-25 网易(杭州)网络有限公司 页面视图控件的检测方法及装置
WO2022242302A1 (zh) * 2021-05-17 2022-11-24 北京字节跳动网络技术有限公司 文本搜索方法, 装置, 可读介质及电子设备

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102667764A (zh) * 2009-08-07 2012-09-12 谷歌公司 用于为视觉查询的多个区域展示搜索结果的用户接口
CN102929926A (zh) * 2012-09-20 2013-02-13 百度在线网络技术(北京)有限公司 一种基于浏览内容的取词搜索方法及装置
CN103092520A (zh) * 2013-01-25 2013-05-08 广东欧珀移动通信有限公司 一种屏幕图像截取方法、装置及触摸屏移动设备
CN103186671A (zh) * 2013-03-28 2013-07-03 百度在线网络技术(北京)有限公司 用于移动终端的搜索方法、搜索系统及移动终端
CN104536995A (zh) * 2014-12-12 2015-04-22 北京奇虎科技有限公司 基于终端界面触控操作进行搜索的方法及系统
CN104537051A (zh) * 2014-12-26 2015-04-22 北京奇虎科技有限公司 终端和基于触摸操作的搜索方法
CN104536688A (zh) * 2014-12-26 2015-04-22 北京奇虎科技有限公司 基于触摸操作的搜索方法和装置
CN104778194A (zh) * 2014-12-26 2015-07-15 北京奇虎科技有限公司 基于触摸操作的搜索方法和装置
CN104778195A (zh) * 2014-12-26 2015-07-15 北京奇虎科技有限公司 终端和基于触摸操作的搜索方法


Similar Documents

Publication Publication Date Title
US10489047B2 (en) Text processing method and device
US11003349B2 (en) Actionable content displayed on a touch screen
CN107256109B (zh) 信息显示方法、装置及终端
US11756246B2 (en) Modifying a graphic design to match the style of an input design
WO2016101717A1 (zh) 基于触摸交互的搜索方法及装置
CN105190644B (zh) 用于使用触摸控制的基于图像的搜索的技术
US20150277571A1 (en) User interface to capture a partial screen display responsive to a user gesture
WO2016091095A1 (zh) 基于终端界面触控操作进行搜索的方法及系统
WO2016095689A1 (zh) 基于终端界面多次触控操作进行识别搜索的方法及系统
US20120044179A1 (en) Touch-based gesture detection for a touch-sensitive device
US10685256B2 (en) Object recognition state indicators
US20150058790A1 (en) Electronic device and method of executing application thereof
US11556605B2 (en) Search method, device and storage medium
WO2014176938A1 (en) Method and apparatus of retrieving information
US10824306B2 (en) Presenting captured data
US20150058710A1 (en) Navigating fixed format document in e-reader application
US10970476B2 (en) Augmenting digital ink strokes
WO2016101768A1 (zh) 终端以及基于触摸操作的搜索方法和装置
WO2016018682A1 (en) Processing image to identify object for insertion into document
CA3003002C (en) Systems and methods for using image searching with voice recognition commands
KR20120133149A (ko) 데이터 태깅 장치, 그의 데이터 태깅 방법 및 데이터 검색 방법
US20240118803A1 (en) System and method of generating digital ink notes
CN114995698A (zh) 图像处理方法、装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15871837

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15871837

Country of ref document: EP

Kind code of ref document: A1