WO2016101717A1 - Touch-interaction-based search method and apparatus - Google Patents

Touch-interaction-based search method and apparatus

Info

Publication number
WO2016101717A1
WO2016101717A1 (application PCT/CN2015/094151; CN2015094151W)
Authority
WO
WIPO (PCT)
Prior art keywords
search
touch
current interface
user
sliding
Application number
PCT/CN2015/094151
Other languages
English (en)
French (fr)
Inventor
谢军样
张倩倩
吴帅
郑相振
凌灵
颜显进
Original Assignee
北京奇虎科技有限公司
奇智软件(北京)有限公司
Application filed by 北京奇虎科技有限公司 (Beijing Qihu Technology Co., Ltd.) and 奇智软件(北京)有限公司 (Qizhi Software (Beijing) Co., Ltd.)
Priority to US 15/539,943 (published as US20170351371A1)
Publication of WO2016101717A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • G06F3/0487 - Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 - Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/951 - Indexing; Web crawling techniques

Definitions

  • the present invention relates to the field of Internet search, and in particular, to a touch interaction based search method and apparatus.
  • Search services on mobile terminals such as smartphones, for example the various search applications (apps), are all based on a search box.
  • To use such a service, the user must open the search app, enter a search term in the search box, and tap OK to trigger the search operation.
  • the present invention has been made in order to provide a touch interaction based search method and corresponding apparatus that overcomes the above problems or at least partially solves or alleviates the above problems.
  • a touch interaction based search method including: receiving a trigger instruction for a search issued by a user based on the current interface; receiving a touch sliding operation performed by the user on the current interface, and determining a sliding region according to the touch sliding operation; and
  • An object is extracted based on the sliding region, and a search is performed for the object.
  • a touch interaction based search device comprising:
  • a receiving module configured to receive a trigger instruction that the user performs a touch search based on the current interface
  • the receiving module is further adapted to receive a touch sliding operation performed by the user on the current interface
  • a region determining module adapted to determine a sliding region according to the sliding operation
  • An object recognition module adapted to extract an object therein based on the sliding area
  • a search module adapted to search for the object.
  • a computer program comprising computer readable code which, when run on a computing device, causes the computing device to perform any of the touch interaction based search methods described above.
  • a computer readable medium storing a computer program as described above is provided.
  • the user issues a trigger instruction for a search based on the current interface; then, according to the touch sliding operation performed by the user on the current interface, a sliding region is determined; an object is then extracted from the sliding region, and a search is performed for the extracted object. As can be seen, the search of the embodiments of the present invention does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching.
  • the embodiments of the present invention delimit a sliding region through the user's touch sliding operation and directly extract and search the objects within it, which solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience.
  • the sliding region fully reflects the user's search intent, overcoming the drawback that some existing search apps can only copy an entire block of content and cannot precisely search for a single word or a few discrete words, thereby improving the precision of the search terms.
  • FIG. 1 is a flow chart schematically showing a process of a touch interaction based search method according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram showing a short message interface according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram showing the addition of an operational layer on the interface of FIG. 2, in accordance with one embodiment of the present invention
  • FIG. 4 is a schematic diagram showing the query result of a search for a courier tracking number according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram showing an IM chat record according to an embodiment of the present invention.
  • FIG. 6 is a schematic view showing an operation layer added to the interface shown in FIG. 5 according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram showing an address of a "360 company headquarters" according to an embodiment of the present invention.
  • FIG. 8 is a schematic block diagram showing a structure of a touch interaction based search device according to an embodiment of the present invention.
  • FIG. 9 is a schematic block diagram showing another structure of a touch interaction based search device according to an embodiment of the present invention.
  • Figure 10 is a schematic block diagram of a computing device for performing a touch interaction based search method in accordance with the present invention.
  • Fig. 11 schematically shows a storage unit for holding or carrying program code implementing a touch interaction based search method according to the present invention.
  • FIG. 1 illustrates a process flow diagram of a touch interaction based search method in accordance with one embodiment of the present invention. Referring to FIG. 1, the method includes at least steps S102 to S106:
  • Step S102: Receive a trigger instruction for a search issued by the user based on the current interface;
  • Step S104: Receive a touch sliding operation performed by the user on the current interface, and determine a sliding region according to the touch sliding operation;
  • Step S106: Extract an object based on the sliding region, and perform a search for the extracted object.
  • the embodiment of the invention implements a touch interaction based search method.
  • the user issues a trigger instruction for a search based on the current interface; then, according to the touch sliding operation performed by the user on the current interface, a sliding region is determined; an object is then extracted from the sliding region, and a search is performed for the extracted object. As can be seen, the search of the embodiments of the present invention does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching.
  • the embodiments of the present invention delimit a sliding region through the user's touch sliding operation and directly extract and search the objects within it, which solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience.
  • the sliding region fully reflects the user's search intent, overcoming the drawback that some existing search apps can only copy an entire block of content and cannot precisely search for a single word or a few discrete words, thereby improving the precision of the search terms.
  • the touch interaction based search method provided by the embodiments of the present invention is applicable to any terminal that provides touch interaction, especially the now-common mobile terminals equipped with touch screens.
  • the user operates the touch screen with a finger or stylus and can visibly and deliberately delimit the sliding region, better reflecting the user's search intent.
  • after the sliding region is captured, a screenshot of the corresponding region is obtained.
  • the screenshot reflects the content of the sliding region exactly as displayed, avoiding discrepancies between the captured data, elements or objects and what is actually on screen, and preserving the authenticity and integrity of the data.
  • the screenshot may be of a webpage, an image from an animation, a frame of a video, the interface of an app, the terminal's desktop, or a picture or photo in the user's gallery, and so on. As can be seen, the content of a screenshot can be quite rich; therefore, after capturing, the embodiment of the present invention must recognize the sliding-region screenshot and extract the one or more objects it contains.
  • the extracted object may include at least one of the following: text, a picture, a symbol.
  • a variety of recognition technologies can be used, for example OCR (Optical Character Recognition) on the picture, in which the background extracts text information from the picture through OCR recognition; or, for example, the UiAutomator automated-testing technology: UiAutomator is an automated test tool that ships with Android and can be used to extract the text information of the current page, and this technique can obtain 100% correct text. Each recognition technology has its own applicable scenarios, and combining UiAutomator with OCR can greatly improve recognition accuracy.
  • to make the touch interaction based search more visible to the user, a translucent operable layer may be presented on the current interface.
  • the translucent operable layer covers the current interface, so the user can still see the interface clearly while performing a sliding operation on the layer according to their intent, so that the sliding region determined by the sliding operation accurately contains the content the user wants to search.
  • in one implementation, a misted-glass erasing effect appears on the interface.
  • as the user's finger slides over the touch screen, the mist in the swiped area is erased, revealing images such as text and pictures beneath.
  • the operable layer can be implemented using a floating control that triggers it.
  • the floating control can provide a search trigger entry in the current interface, and the user inputs the trigger instruction through this entry to start the subsequent flow.
  • the search trigger entry may be a clickable circle, square, polygon, etc.; to avoid interfering with normal use of the touch interface, it is usually placed at a corner of the screen, and once triggered it brings up the translucent operable layer, after which the subsequent flow proceeds.
  • FIG. 2 shows a schematic diagram of a short message interface in accordance with one embodiment of the present invention. Referring to FIG. 2, the search trigger entry is set to a double-ring shape which, when tapped, triggers the display of the translucent operable layer.
  • the touch interaction based search method provided by the embodiment of the present invention is now described taking text search as an example. Since text is "felt out" through touch interaction, the method can, more vividly, be called touch-word search. This embodiment applies to a mobile terminal with a touch screen.
  • when the screen is lit, in any interface, the user taps the touch-word-search trigger entry generated by the floating control; a mask layer is then generated on the interface (i.e., the translucent operable layer mentioned above, "the mask" for short). The user slides a finger over the positions they want to select, rubbing out highlighted areas; the text covered by the highlighted areas is the text the user wants recognized and searched. Tapping the OK button below pops up a screenshot of the highlighted area; the text in the screenshot is recognized from this screenshot and entered into the search box above, and the user taps the search button to complete a quick touch search.
  • this embodiment recognizes the text in the screenshot according to a predetermined strategy for words touched across a line break, which mainly applies to the case where the words the user wants to find lie on two separate lines.
  • touch-word recognition is based mainly on the boundary of the roughly rectangular highlighted region produced by the user's finger sliding on the mask, i.e., its left, right, top and bottom pixel coordinates (four (x, y) points).
  • if the text spans two lines and the finger swipes twice, recognition would still use the four boundary pixels of the merged region, whose extent is much larger than the two areas actually highlighted by the finger, so recognition is no longer precise to the highlighted words.
  • the solution is to consider the degree of overlap between the two line-spanning swipes: if the overlap is very low (for example below 30%; the threshold is adjustable), the two swipes are treated as two separate screenshots and recognized independently, which improves precision.
  • the flow of this embodiment can be summarized as: tap the touch-word-search trigger entry -> system screenshot and UiAutomator obtain the current screen data -> crop the image and pass the text data to the background for OCR analysis -> obtain the OCR result -> call the search engine to perform the search.
  • with the touch search method of this embodiment, the user's cumbersome soft-keyboard input can be omitted, which is very convenient.
  • if the highlighted area the user rubs out is inaccurate and some text is only partially captured, the boundary is expanded by a certain threshold (for example, 30%) beyond the area the user touched; relative to the original screenshot, the expanded boundary yields a new screenshot slightly larger than the original.
  • the new screenshot can include the half-captured text from the interface, which solves the problem of incomplete text capture in the user's touch area and ensures the integrity of the searchable content obtained.
  • the short message interface shown in FIG. 2 is taken as an illustration.
  • a courier tracking number appears in the short message content.
  • what the user cares about is the current status of the delivery: where it has been shipped to, how long it will take to reach them, the courier's contact information, and so on. This information must be looked up on the network using the courier tracking number.
  • by tapping the search trigger entry, which is set to a double-ring shape, the user adds a translucent operable layer to the short message interface; in this example it resembles frosted glass.
  • FIG. 3 shows a schematic diagram of adding an operable layer to the interface of FIG. 2, in accordance with one embodiment of the present invention. The user then slides on the operable layer; the swiped area loses its translucent effect, clearly displaying and recognizing the object within it; see the digits representing the courier tracking number in FIG. 3.
  • with a finger swipe, the embodiment of the present invention lets the user select the courier tracking number and search for it, obtaining the query result simply and quickly and greatly improving the user experience.
  • FIG. 5 shows a schematic diagram of an IM chat record in accordance with one embodiment of the present invention.
  • User A mentions a place, but User B does not know it.
  • User B taps the search trigger entry, and a translucent operable layer is added over the chat record.
  • FIG. 6 shows a schematic diagram of the addition of an operable layer on the interface shown in FIG. 5. The user then slides a finger over "360 company headquarters" to remove the translucent effect, clearly displaying and recognizing "360 company headquarters", as shown in FIG. 6.
  • Figure 7 shows a schematic diagram of the address of "360 Corporate Headquarters” in accordance with one embodiment of the present invention.
  • with a finger swipe, the user selects the designated place and searches for it, obtaining the query result simply and quickly and greatly improving the user experience.
  • FIG. 8 is a block diagram showing the structure of a touch interaction based search device according to an embodiment of the present invention. Referring to Figure 8, the device includes at least:
  • the receiving module 810 is adapted to receive a trigger instruction that the user performs a touch search based on the current interface
  • the receiving module 810 is further adapted to receive a touch sliding operation performed by the user on the current interface
  • the area determining module 820 is coupled to the receiving module 810, and is adapted to determine a sliding area according to the sliding operation;
  • An object identification module 830 coupled to the region determining module 820, is adapted to extract an object therein based on the sliding region;
  • a search module 840 coupled to the object recognition module 830, is adapted to search for the extracted object.
  • FIG. 9 illustrates another schematic structural diagram of a touch interaction based search device according to an embodiment of the present invention. Referring to FIG. 9, in addition to the structure shown in FIG. 8, the touch interaction based search device includes:
  • a layer setting module 910, coupled to the receiving module 810 and the area determining module 820, adapted to present a translucent operable layer on the current interface after the trigger instruction for a touch search issued by the user based on the current interface is received;
  • the area determining module 820 is further adapted to determine the sliding region according to the sliding operation performed by the user on the operable layer.
  • optionally, the operable layer is implemented using a floating control that triggers it.
  • the receiving module 810 is further adapted to receive the trigger instruction input by the user through the search trigger entry provided by the floating control in the current interface.
  • the object recognition module 830 is further adapted to capture the sliding region to obtain a screenshot of it, and to recognize and extract the one or more objects contained in the screenshot.
  • the object comprises at least one of the following: text, picture, symbol.
  • the user issues a trigger instruction for a search based on the current interface; then, according to the touch sliding operation performed by the user on the current interface, a sliding region is determined; an object is then extracted from the sliding region, and a search is performed for the extracted object. As can be seen, the search of the embodiments of the present invention does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching.
  • the embodiments of the present invention delimit a sliding region through the user's touch sliding operation and directly extract and search the objects within it, which solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience.
  • the sliding region fully reflects the user's search intent, overcoming the drawback that some existing search apps can only copy an entire block of content and cannot precisely search for a single word or a few discrete words, thereby improving the precision of the search.
  • modules in the devices of the embodiments can be adaptively changed and placed in one or more devices different from the embodiment.
  • the modules or units or components of the embodiments may be combined into one module or unit or component, and further they may be divided into a plurality of sub-modules or sub-units or sub-components.
  • any combination of all features disclosed in this specification (including the accompanying claims, abstract and drawings) and of all processes or units of any method or device so disclosed may be employed.
  • Each feature disclosed in this specification (including the accompanying claims, the abstract and the drawings) may be replaced by alternative features that provide the same, equivalent or similar purpose.
  • the various component embodiments of the present invention may be implemented in hardware, or in a software module running on one or more processors, or in a combination thereof.
  • a microprocessor or digital signal processor may be used in practice to implement some or all of the functionality of some or all of the components of the touch interaction based search device in accordance with embodiments of the present invention.
  • the invention can also be implemented as a device or device program (e.g., a computer program and a computer program product) for performing some or all of the methods described herein.
  • a program implementing the invention may be stored on a computer readable medium or may take the form of one or more signals.
  • such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • Figure 10 illustrates a computing device that can implement a touch interaction based search method in accordance with the present invention.
  • the computing device conventionally includes a processor 1010 and a computer program product or computer readable medium in the form of a memory 1020.
  • the memory 1020 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), an EPROM, a hard disk, or a ROM.
  • the memory 1020 has a memory space 1030 for program code 1031 for performing any of the method steps described above.
  • the memory space 1030 for program code may include individual pieces of program code 1031 for implementing the various steps of the above methods respectively.
  • the program code can be read from or written to one or more computer program products.
  • Such computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 11.
  • the storage unit may have storage segments, storage spaces, and the like arranged similarly to the memory 1020 in the computing device of FIG. 10.
  • the program code can be compressed, for example, in an appropriate form.
  • the storage unit includes computer readable code 1031', i.e., code that can be read by a processor such as the processor 1010, and which, when run by a computing device, causes the computing device to perform the steps of each of the methods described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention discloses a touch-interaction-based search method and apparatus, including: receiving a trigger instruction for a search issued by a user based on the current interface; receiving a touch sliding operation performed by the user on the current interface, and determining a sliding region according to the touch sliding operation; and extracting an object based on the sliding region and performing a search for the object. The present invention solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience.

Description

Touch-interaction-based search method and apparatus

Technical Field
The present invention relates to the field of Internet search, and in particular to a touch-interaction-based search method and apparatus.
Background
With the development of Internet services, mobile terminals, owing to their mobility and convenience, have increasingly become the main carrier of Internet services. People increasingly tend to look up information, browse the web, and enjoy games and entertainment on the mobile terminals they carry with them.
Search services on mobile terminals (for example, smartphones), such as the various search applications (apps), are all based on input into a search box. To use them, the user must open the search app, enter a search term in the search box, and tap OK to trigger the search operation.
However, a touch screen requires touch input from the user, and the search box, being narrow and small, is inconvenient for input, resulting in a poor search experience and low efficiency. In particular, when the user has immediate search needs based on the characters, images, etc. on the screen while using a smartphone, having to open a search app and then type into a pop-up search box is very inconvenient.
Summary of the Invention
In view of the above problems, the present invention is proposed in order to provide a touch-interaction-based search method and a corresponding apparatus that overcome the above problems or at least partially solve or alleviate them.
According to one aspect of the present invention, a touch-interaction-based search method is provided, including:
receiving a trigger instruction for a search issued by a user based on the current interface;
receiving a touch sliding operation performed by the user on the current interface, and determining a sliding region according to the touch sliding operation;
extracting an object based on the sliding region, and performing a search for the object.
According to another aspect of the present invention, a touch-interaction-based search apparatus is provided, including:
a receiving module, adapted to receive a trigger instruction for a touch search issued by a user based on the current interface;
the receiving module being further adapted to receive a touch sliding operation performed by the user on the current interface;
a region determining module, adapted to determine a sliding region according to the sliding operation;
an object recognition module, adapted to extract an object based on the sliding region;
a search module, adapted to perform a search for the object.
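As an illustrative aid only, the four claimed modules can be sketched in Python; the class and method names below are assumptions for illustration, not part of the patent:

```python
# Hypothetical sketch of the claimed module structure; all names are illustrative.

class ReceivingModule:
    """Receives the trigger instruction and the touch sliding operation."""
    def receive_trigger(self, event):
        return event.get("type") == "search_trigger"
    def receive_slide(self, event):
        return event.get("points", [])

class RegionDeterminingModule:
    """Determines the sliding region from the sliding operation, here as the
    bounding box (left, top, right, bottom) of the touched points."""
    def determine_region(self, points):
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return (min(xs), min(ys), max(xs), max(ys))

class ObjectRecognitionModule:
    """Extracts the object (here simply text) found in the sliding region."""
    def extract(self, text_by_region, region):
        return text_by_region.get(region, "")

class SearchModule:
    """Performs a search for the extracted object (stubbed)."""
    def search(self, obj):
        return "results for: " + obj
```

A caller would chain the modules in order: receive the slide, determine the region, extract the object, then search.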
According to yet another aspect of the present invention, a computer program is provided, comprising computer readable code which, when run on a computing device, causes the computing device to perform any of the touch-interaction-based search methods described above.
According to still another aspect of the present invention, a computer readable medium is provided, storing the computer program described above.
The beneficial effects of the present invention are as follows:
In the embodiments of the present invention, the user issues a trigger instruction for a search based on the current interface; then, according to the touch sliding operation performed by the user on the current interface, a sliding region is determined; an object is then extracted from the sliding region, and a search is performed for the extracted object. As can be seen from the above, the search of the embodiments of the present invention does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching. Rather, the embodiments of the present invention delimit a sliding region through the user's touch sliding operation and directly extract and search the objects within it, which solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience. In addition, the sliding region fully reflects the user's search intent, overcoming the drawback that some existing search apps can only copy an entire block of content and cannot precisely search for a single word or a few discrete words, thereby improving the precision of the search terms.
The above description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented in accordance with the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more readily apparent, specific embodiments of the present invention are set forth below.
Brief Description of the Drawings
Various other advantages and benefits will become clear to those of ordinary skill in the art from reading the following detailed description of the preferred embodiments. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered limiting of the present invention. Throughout the drawings, the same reference symbols denote the same components. In the drawings:
FIG. 1 schematically shows a process flow chart of a touch-interaction-based search method according to an embodiment of the present invention;
FIG. 2 schematically shows a short-message interface according to an embodiment of the present invention;
FIG. 3 schematically shows the addition of an operable layer on the interface of FIG. 2 according to an embodiment of the present invention;
FIG. 4 schematically shows the query result of a search for a courier tracking number according to an embodiment of the present invention;
FIG. 5 schematically shows an IM chat record according to an embodiment of the present invention;
FIG. 6 schematically shows the interface of FIG. 5 after an operable layer has been added, according to an embodiment of the present invention;
FIG. 7 schematically shows the address of the "360 company headquarters" according to an embodiment of the present invention;
FIG. 8 schematically shows a structural diagram of a touch-interaction-based search apparatus according to an embodiment of the present invention;
FIG. 9 schematically shows another structural diagram of a touch-interaction-based search apparatus according to an embodiment of the present invention;
FIG. 10 schematically shows a block diagram of a computing device for performing the touch-interaction-based search method according to the present invention; and
FIG. 11 schematically shows a storage unit for holding or carrying program code implementing the touch-interaction-based search method according to the present invention.
Detailed Description
The present invention is further described below in conjunction with the drawings and specific embodiments. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope conveyed fully to those skilled in the art.
To solve the above technical problem, an embodiment of the present invention provides a touch-interaction-based search method. FIG. 1 shows a process flow chart of a touch-interaction-based search method according to an embodiment of the present invention. Referring to FIG. 1, the method includes at least steps S102 to S106:
Step S102: receive a trigger instruction for a search issued by a user based on the current interface;
Step S104: receive a touch sliding operation performed by the user on the current interface, and determine a sliding region according to the touch sliding operation;
Step S106: extract an object based on the sliding region, and perform a search for the extracted object.
The embodiments of the present invention implement a touch-interaction-based search method. In the embodiments of the present invention, the user issues a trigger instruction for a search based on the current interface; then, according to the touch sliding operation performed by the user on the current interface, a sliding region is determined; an object is then extracted from the sliding region, and a search is performed for the extracted object. As can be seen from the above, the search of the embodiments of the present invention does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching. Rather, the embodiments of the present invention delimit a sliding region through the user's touch sliding operation and directly extract and search the objects within it, which solves the problem that keyboard input (including soft-keyboard input) is slow and inconvenient for search, removes operations such as opening the search box and copying and pasting, makes the search operation simple and easy, lets a touch-interaction terminal search whatever the user wishes whenever they wish, saves time, and improves the user experience. In addition, the sliding region fully reflects the user's search intent, overcoming the drawback that some existing search apps can only copy an entire block of content and cannot precisely search for a single word or a few discrete words, thereby improving the precision of the search terms.
The touch interaction based search method provided by embodiments of the present invention is applicable to any terminal offering touch interaction, in particular the touch-screen mobile terminals that are common today. By operating the touch screen with a finger or stylus, the user can visibly and deliberately delimit a sliding region, better reflecting the user's search intent. Specifically, after the sliding region is captured, a screenshot of the sliding region is obtained. The screenshot faithfully reflects the content of the sliding region, avoiding discrepancies between the captured data, elements, or objects and the actual situation, and increasing the authenticity and completeness of the data. The screenshot may be of a web page, an image from a comic, a frame of a video, an app's interface, the terminal's desktop, a picture or photo in the user's gallery, and so on. As can be seen, the screenshot may contain rich content; therefore, after taking the screenshot, embodiments of the present invention recognize the sliding-region screenshot and extract one or more objects contained in it. Preferably, the extracted objects include at least one of: text, pictures, symbols. Various recognition technologies may be used. For example, image OCR (Optical Character Recognition), in which the backend extracts text information from the image; or UiAutomator automated testing technology: UiAutomator is Android's built-in automated testing tool and can be used to extract the text of the current page, obtaining 100% correct text. Different recognition technologies suit different scenarios, and combining UiAutomator with OCR can greatly improve recognition accuracy.
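One way the UiAutomator/OCR combination described above could work is to prefer exact text from UI nodes whose bounds intersect the sliding region, falling back to OCR on the screenshot otherwise. The following Python sketch is a hypothetical illustration; `ui_text_nodes` (a list of `(text, bounds)` pairs from a UI dump) and the `ocr` callback are assumed interfaces, not APIs defined by the patent:

```python
def rects_intersect(a, b):
    """True if axis-aligned rectangles (l, t, r, b) overlap."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def extract_objects(region, ui_text_nodes, ocr):
    """Prefer exact text from UI-dump nodes whose bounds intersect
    the sliding region; fall back to OCR on the screenshot when the
    region covers no known text node (e.g. text inside an image)."""
    hits = [text for text, bounds in ui_text_nodes
            if rects_intersect(region, bounds)]
    return hits if hits else ocr(region)
```

This mirrors the stated trade-off: UI-dump text is exact where available, while OCR handles content the UI dump cannot see.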
To make the touch interaction based search method more visible to the user, after the user's trigger instruction for a touch search based on the current interface is received, a semi-transparent operable layer may be presented on the current interface. The semi-transparent operable layer covers the current interface, allowing the user both to see the interface clearly and to perform a sliding operation on the layer that matches the user's intent, so that the sliding region determined by the sliding operation accurately contains the content the user wants to search for. In implementation, an effect resembling wiping mist off glass appears on the interface: as the user's finger slides across the touch screen, the "mist" in the wiped area is erased, revealing the text, pictures, and other images beneath.
The operable layer may be implemented by a triggerable floating control. In implementation, the floating control provides a search trigger entry in the current interface, and the user inputs the trigger instruction through this entry to launch the subsequent flow. The search trigger entry may be a clickable circle, square, polygon, and so on; to keep the touch interface usable, it is usually placed at a corner of the screen. Once triggered, it brings up the semi-transparent operable layer, after which the subsequent flow proceeds. Fig. 2 shows an SMS interface according to an embodiment of the present invention. Referring to Fig. 2, the search trigger entry takes the shape of a double ring; when it is tapped, display of the semi-transparent operable layer is triggered.
The touch interaction based search method provided by embodiments of the present invention is now described taking text search as an example. Since the text is selected by touch, this may more vividly be called "touch-word search". This embodiment applies to a mobile terminal with a touch screen.
With the screen lit on any interface, the user taps the touch-word search trigger entry generated by the floating control, and a mask layer (the semi-transparent operable layer mentioned above, "mask" for short) is generated over the interface. By sliding a finger over the desired location, the user wipes out a highlighted area; the text covered by the highlighted area is the text the user wants recognized and searched. Tapping the confirm button below pops up a screenshot of the highlighted area; the text recognized from this screenshot is entered into the search box above, and tapping the search button completes a quick touch-word search.
In this embodiment, text is recognized in the screenshot according to a predetermined line-wrap recognition strategy, which applies mainly when the words the user wants to select span two lines. Touch-word recognition is based on the pixel boundary of the roughly rectangular highlighted area produced by the finger slide on the mask (the four extreme (x, y) points: left, right, top, bottom). If the text spans two lines and the finger slides twice, recognition would still use the four extreme points of the merged area of the two slides; the rectangle enclosed by these four points is much larger than the two highlighted areas themselves, so it no longer fits the highlighted words precisely. The solution is to consider the degree of overlap between the two slides of a line-wrapped selection: if the overlap is low (for example below 30%; the threshold can be tuned), they are treated as two separate screenshots and recognized separately, which improves precision.
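The overlap heuristic for line-wrapped selections can be sketched as follows. The rectangle format `(left, top, right, bottom)` and the use of the smaller rectangle's area as the overlap denominator are assumptions, since the patent does not define exactly how the overlap degree is computed:

```python
def overlap_ratio(a, b):
    """Intersection area of rectangles (l, t, r, b), divided by the
    area of the smaller rectangle; 0.0 when they do not overlap."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    if w <= 0 or h <= 0:
        return 0.0
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return (w * h) / min(area(a), area(b))

def split_or_merge(a, b, threshold=0.3):
    """Below the overlap threshold, treat the two highlighted areas
    as separate screenshots; otherwise merge them into one region."""
    if overlap_ratio(a, b) < threshold:
        return [a, b]
    merged = (min(a[0], b[0]), min(a[1], b[1]),
              max(a[2], b[2]), max(a[3], b[3]))
    return [merged]
```

Two disjoint strips on consecutive text lines are kept separate, while two largely coincident slides over the same words are merged.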
The flow of this embodiment can be summarized as: tap the touch-word search trigger entry -> take a system screenshot and obtain the current screen data with UiAutomator -> crop the image, analyze the text data, and pass it to backend OCR analysis -> obtain the OCR result -> invoke a search engine to search. The touch-word search of this embodiment spares the user tedious soft keyboard input and is very convenient.
In this embodiment, if the highlighted part wiped out by the user's finger is inaccurate and some text is cut off incompletely, the boundary of the touched region is expanded by a certain threshold (for example 30%). Compared with the original screenshot, the expanded boundary yields a new, slightly larger screenshot that can include the half-cut text. This solves the problem of incomplete text capture in the touched region and ensures the completeness of the searchable content.
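The boundary expansion could look like the following sketch. Applying the 30% ratio to each side of the region and clamping to the screen bounds are assumptions about details the patent leaves open:

```python
def expand_region(region, ratio=0.3, screen=None):
    """Expand a (l, t, r, b) region outward by `ratio` of its width
    and height on each side, so text half-cut at the edge of the
    touched region is included; clamp to the screen if given."""
    l, t, r, b = region
    dx, dy = (r - l) * ratio, (b - t) * ratio
    l, t, r, b = l - dx, t - dy, r + dx, b + dy
    if screen is not None:
        sl, st, sr, sb = screen
        l, t, r, b = max(l, sl), max(t, st), min(r, sr), min(b, sb)
    return (l, t, r, b)
```

For example, a 10x10 region at (10, 10, 20, 20) grows to (7, 7, 23, 23) before clamping.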
Embodiment 1
Take the SMS interface shown in Fig. 2 as an illustration, where the SMS content contains an express delivery tracking number. What the user cares about is the current status of the parcel: where it has been shipped to, how long until it arrives, the courier's contact number, and so on. This information must be looked up on the network using the tracking number.
In this embodiment, the user taps the search trigger entry, shaped as a double ring, to add a semi-transparent operable layer over the SMS interface, resembling frosted glass in this example. Fig. 3 shows the interface of Fig. 2 with the operable layer added, according to an embodiment of the present invention. The user then slides over the operable layer; the slid-over area loses its semi-transparency, clearly displaying and recognizing the objects in it, namely the digits of the tracking number shown in Fig. 3.
A search is then performed with the tracking number in Fig. 3, yielding the query results shown in Fig. 4.
It can thus be seen that, in this embodiment of the present invention, a finger slide suffices to select the tracking number, search for it, and obtain the query results, which is simple and fast and greatly improves the user experience.
Embodiment 2
This embodiment takes instant messaging (IM) as an example. Fig. 5 shows an IM chat record according to an embodiment of the present invention, in which user A mentions a place that user B does not know. User B triggers the search trigger entry, adding a semi-transparent operable layer over the chat record. Fig. 6 shows the interface of Fig. 5 with the operable layer added. The user then slides a finger over "360 company headquarters", removing the semi-transparency and clearly displaying and recognizing "360 company headquarters", as shown in Fig. 6.
A search is then performed for the recognized "360 company headquarters", obtaining its specific address. Fig. 7 shows the address of "360 company headquarters" according to an embodiment of the present invention.
It can thus be seen that, in this embodiment of the present invention, a finger slide suffices to select a given place, search for it, and obtain the query results, which is simple and fast and greatly improves the user experience.
These embodiments use text only as an example; in practical application, searches for pictures, symbols, and other objects work similarly, and those skilled in the art can implement them accordingly from the above embodiments, so details are not repeated here.
Based on the same inventive concept, an embodiment of the present invention provides a touch interaction based search apparatus to support the touch interaction based search method of any of the above embodiments. Fig. 8 shows a structural diagram of a touch interaction based search apparatus according to an embodiment of the present invention. Referring to Fig. 8, the apparatus comprises at least:
a receiving module 810, adapted to receive a user's trigger instruction for a touch search based on the current interface;
the receiving module 810 being further adapted to receive a touch sliding operation performed by the user on the current interface;
a region determination module 820, coupled to the receiving module 810, adapted to determine a sliding region according to the sliding operation;
an object recognition module 830, coupled to the region determination module 820, adapted to extract objects from the sliding region;
a search module 840, coupled to the object recognition module 830, adapted to search for the extracted objects.
Fig. 9 shows another structural diagram of a touch interaction based search apparatus according to an embodiment of the present invention. Referring to Fig. 9, in addition to the structure shown in Fig. 8, the apparatus further comprises:
a layer setting module 910, coupled to both the receiving module 810 and the region determination module 820, adapted to present a semi-transparent operable layer on the current interface after the user's trigger instruction for a touch search based on the current interface is received;
the region determination module 820 being further adapted such that the sliding operation is performed on the operable layer.
In a preferred embodiment, the operable layer is implemented by a triggerable floating control.
In a preferred embodiment, the receiving module 810 may be further adapted to receive the trigger instruction input by the user through a search trigger entry provided by the floating control in the current interface.
In a preferred embodiment, the object recognition module 830 is further adapted to:
capture the sliding region to obtain a corresponding sliding-region screenshot; and
recognize the sliding-region screenshot to extract the one or more objects contained in it.
In a preferred embodiment, the objects include at least one of: text, pictures, symbols.
The touch interaction based search method and apparatus provided by embodiments of the present invention achieve the following beneficial effects:
In embodiments of the present invention, the user issues a search trigger instruction based on the current interface; a sliding region is then determined from the touch sliding operation the user performs on the current interface, objects are extracted from the sliding region, and a search is performed for the extracted objects. As can be seen, the search of these embodiments does not require opening a search app, typing the object into a search box, or copying and pasting a selected object into a search box before searching. Instead, the user's touch sliding operation delimits a sliding region, and the objects within it are extracted and searched directly. This solves the problem that searching by keyboard input (including soft keyboards) is slow and inconvenient, eliminates operations such as opening a search box and copying and pasting, makes searching simple and easy, enables "search whenever you want" on touch interaction terminals, saves time, and improves the user experience. In addition, the sliding region fully reflects the user's search intent, overcoming the drawback of some existing search apps that can only copy an entire passage and cannot precisely search for a single word or several discrete words, thereby improving the precision of the search.
The specification provided herein describes numerous specific details. It will be understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure the understanding of this description.
Similarly, it should be appreciated that, in order to streamline the disclosure and aid understanding of one or more of the various inventive aspects, in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of the invention.
Those skilled in the art will appreciate that the modules in the device of an embodiment may be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units, or components of an embodiment may be combined into one module, unit, or component, and furthermore may be divided into multiple sub-modules, sub-units, or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract, and drawings) may be replaced by an alternative feature serving the same, equivalent, or similar purpose.
Furthermore, those skilled in the art will appreciate that, although some embodiments described herein include certain features included in other embodiments and not others, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the touch interaction based search device according to embodiments of the invention. The invention may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the invention may be stored on a computer-readable medium or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, Fig. 10 shows a computing device that may implement the touch interaction based search method according to the invention. The computing device conventionally includes a processor 1010 and a computer program product or computer-readable medium in the form of a memory 1020. The memory 1020 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read-Only Memory), EPROM, a hard disk, or ROM. The memory 1020 has a storage space 1030 for program code 1031 for performing any of the method steps described above. For example, the storage space 1030 for program code may include individual program codes 1031 for implementing the various steps of the above methods. These program codes may be read from or written into one or more computer program products. These computer program products include program code carriers such as a hard disk, a compact disc (CD), a memory card, or a floppy disk. Such computer program products are typically portable or fixed storage units as described with reference to Fig. 11. The storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 1020 in the computing device of Fig. 10. The program code may, for example, be compressed in an appropriate form. Typically, the storage unit includes computer-readable code 1031', i.e. code readable by a processor such as 1010, which, when run by a computing device, causes the computing device to perform the steps of the methods described above.
"One embodiment", "an embodiment", or "one or more embodiments" herein means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Also, note that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
It should be noted that the above embodiments illustrate rather than limit the invention, and that those skilled in the art may design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
Furthermore, it should be noted that the language used in this specification has been chosen principally for readability and instructional purposes, and not to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The disclosure of the invention is illustrative, not restrictive, with respect to the scope of the invention, which is defined by the appended claims.

Claims (14)

  1. A touch interaction based search method, comprising:
    receiving a user's trigger instruction for a search based on a current interface;
    receiving a touch sliding operation performed by the user on the current interface, and determining a sliding region according to the touch sliding operation;
    extracting objects from the sliding region, and searching for the objects.
  2. The method according to claim 1, wherein, after receiving the user's trigger instruction for a touch search based on the current interface, the method comprises: presenting a semi-transparent operable layer on the current interface;
    and performing the sliding operation on the current interface comprises:
    performing the sliding operation on the operable layer.
  3. The method according to claim 2, wherein the operable layer is implemented by a triggerable floating control.
  4. The method according to claim 3, wherein receiving the user's trigger instruction for a touch search based on the current interface comprises:
    providing, by the floating control, a search trigger entry in the current interface; and
    receiving the trigger instruction input by the user through the search trigger entry.
  5. The method according to any one of claims 1 to 4, wherein extracting objects from the sliding region comprises:
    capturing the sliding region to obtain a corresponding sliding-region screenshot; and
    recognizing the sliding-region screenshot to extract one or more objects contained therein.
  6. The method according to any one of claims 1 to 5, wherein the objects comprise at least one of: text, pictures, symbols.
  7. A touch interaction based search apparatus, comprising:
    a receiving module, adapted to receive a user's trigger instruction for a touch search based on a current interface;
    the receiving module being further adapted to receive a touch sliding operation performed by the user on the current interface;
    a region determination module, adapted to determine a sliding region according to the sliding operation;
    an object recognition module, adapted to extract objects from the sliding region;
    a search module, adapted to search for the objects.
  8. The apparatus according to claim 7, further comprising:
    a layer setting module, adapted to present a semi-transparent operable layer on the current interface after the user's trigger instruction for a touch search based on the current interface is received;
    the region determination module being further adapted such that:
    the sliding operation is performed on the operable layer.
  9. The apparatus according to claim 8, wherein the operable layer is implemented by a triggerable floating control.
  10. The apparatus according to claim 9, wherein the receiving module is further adapted to:
    receive the trigger instruction input by the user through a search trigger entry provided by the floating control in the current interface.
  11. The apparatus according to any one of claims 7 to 10, wherein the object recognition module is further adapted to:
    capture the sliding region to obtain a corresponding sliding-region screenshot; and
    recognize the sliding-region screenshot to extract one or more objects contained therein.
  12. The apparatus according to any one of claims 7 to 11, wherein the objects comprise at least one of: text, pictures, symbols.
  13. A computer program, comprising computer-readable code which, when run on a computing device, causes the computing device to perform the touch interaction based search method according to any one of claims 1-6.
  14. A computer-readable medium storing the computer program according to claim 13.

