WO2018018801A1 - Interaction Method and Apparatus for Search Results - Google Patents

Interaction Method and Apparatus for Search Results

Info

Publication number
WO2018018801A1
WO2018018801A1 · PCT/CN2016/106442 · CN2016106442W
Authority
WO
WIPO (PCT)
Prior art keywords
search result
user
interaction
instruction
search
Prior art date
Application number
PCT/CN2016/106442
Other languages
English (en)
French (fr)
Inventor
秦首科
刘晓春
张泽明
韩友
刘小月
曹密
程小华
王山雨
徐培治
江焱
陈震
邱学忠
Original Assignee
百度在线网络技术(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 百度在线网络技术(北京)有限公司
Priority to US16/322,102 (US11100180B2)
Priority to JP2018554057A (JP7126453B2)
Publication of WO2018018801A1


Classifications

    • G06F3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06F16/954: Retrieval from the web; navigation, e.g. using categorised browsing
    • G06F16/9535: Search customisation based on user profiles and personalisation
    • G06F16/538: Presentation of query results (still image data)
    • G06F16/951: Indexing; web crawling techniques
    • G06F16/9538: Presentation of query results
    • G10L15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223: Execution procedure of a spoken command

Definitions

  • the present invention relates to the field of search engine technologies, and in particular, to an interaction method and apparatus for search results.
  • the embodiments of the present invention provide an interaction method and device for search results, which can implement diverse interactions between users and rich media search results, and effectively enhance the user's immersive, scenario-based experience.
  • an embodiment of the present invention provides an interaction method for a search result, where the search result is a rich media search result, including:
  • determining that the user of the electronic device performs the interactive operation on the search result through a browser includes:
  • the preset identifier includes: a first identifier used to start the camera of the electronic device, and further includes:
  • when the interactive instruction includes an instruction to trigger the first identifier, image data of the user is collected by the camera, and the image data is displayed in a search result page.
  • the preset identifier further includes: a second identifier used to start the microphone of the electronic device, and further includes:
  • the voice data of the user is collected by the microphone when the interactive instruction includes an instruction to trigger the second identifier.
  • it also includes:
  • an embodiment of the present invention provides an interaction method for a search result, where the search result is a rich media search result, including:
  • the associated data of the search results that match the interactive instruction is read and presented to the user.
  • it also includes:
  • the associated data of the search result is not displayed on the search result page until the associated data of the search result matching the interactive instruction is read.
  • it also includes:
  • an embodiment of the present invention provides an interaction device for a search result, where the search result is a rich media search result, including:
  • a first extraction module configured to extract an interaction instruction from the interaction operation when it is determined that the user of the electronic device performs an interactive operation on the search result through the browser;
  • a generating module configured to generate, according to a preset rule, a guiding interaction step that matches the interaction instruction
  • a prompting module configured to prompt the user according to the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • the first extraction module is further configured to:
  • the preset identifier includes: a first identifier used to start the camera of the electronic device, and further includes:
  • an acquiring module configured to: when the interactive instruction includes an instruction to trigger the first identifier, collect image data of the user through the camera, and display the image data in a search result page.
  • the preset identifier further includes: a second identifier used to start the microphone of the electronic device, where the collecting module is further configured to:
  • the voice data of the user is collected by the microphone when the interactive instruction includes an instruction to trigger the second identifier.
  • it also includes:
  • a first receiving module configured to receive an instruction that the user triggers the guiding interaction step on the search result page
  • an execution module configured to execute the guiding interaction step according to the instruction for triggering the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • it also includes:
  • a second receiving module configured to receive search result configuration information input by the user, and configure the search result according to the configuration information.
  • an embodiment of the present invention provides an interaction device for a search result, where the search result is a rich media search result, including:
  • a second extraction module configured to extract an interaction instruction based on the interaction operation when determining, by the user of the electronic device, the interaction operation of the search result through the browser;
  • a reading module configured to read association data of the search result that matches the interaction instruction, and display the associated data to the user.
  • it also includes:
  • a display module configured not to display the associated data of the search result on the search result page before the associated data matching the interaction instruction is read.
  • it also includes:
  • a third receiving module configured to receive search result configuration information input by the user, and configure the search result according to the configuration information.
  • the interaction method and device for search results provided by the embodiments of the present invention prompt the user according to the guiding interaction step, so that the user interacts with the rich media search result according to that step, thereby enabling diverse interactions between the user and rich media search results and effectively enhancing the user's immersive, scenario-based experience.
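The three-step flow summarized above (S11 extract an interaction instruction from the user's operation, S12 generate a matching guiding interaction step from preset rules, S13 prompt the user) can be sketched as follows; the rule table, identifier names, and prompt texts are illustrative assumptions, not taken from the patent:

```python
# Sketch of the claimed flow: S11 extract instruction -> S12 generate
# guiding step from preset rules -> S13 prompt the user.
# The rule contents and identifier names are invented for illustration.

PRESET_RULES = {
    "trigger_camera": "Drag an item from the results onto the live model to try it on.",
    "trigger_microphone": "Describe your body type or the clothing you want by voice.",
}

def extract_instruction(interaction_operation: dict) -> str:
    # S11: determine which preset identifier the user triggered in the browser.
    return interaction_operation["triggered_identifier"]

def generate_guiding_step(instruction: str) -> str:
    # S12: look up the guiding interaction step that matches the instruction.
    return PRESET_RULES[instruction]

def prompt_user(guiding_step: str) -> str:
    # S13: prompt the user so they can interact according to the guiding step.
    return f"Hint: {guiding_step}"

operation = {"triggered_identifier": "trigger_camera"}
prompt = prompt_user(generate_guiding_step(extract_instruction(operation)))
```

Since the description allows the preset rule to live either in the electronic device's database or on the server side, the dictionary here stands in for either store.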
  • FIG. 1 is a schematic flow chart of an interaction method for search results according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of an interaction method for search results according to another embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a search result page according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a model configuration of a search result page according to an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a user interacting with a search result by a swipe interaction manner according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a user interacting with a search result by a zoom interaction method according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of another search result page according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a user interacting with search results through a touch interaction manner according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of a scenario-based experience according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of another user interacting with search results through a touch interaction manner according to an embodiment of the present invention.
  • FIG. 13 is a schematic diagram of another user interacting with search results through a touch interaction manner in an embodiment of the present invention.
  • FIG. 14 is a schematic flowchart diagram of an interaction method for search results according to another embodiment of the present invention.
  • FIG. 15 is a schematic diagram of a user interacting with a search result by means of browsing and clicking interaction in an embodiment of the present invention.
  • FIG. 16 is a schematic structural diagram of an interaction apparatus for search results according to an embodiment of the present invention.
  • FIG. 17 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention.
  • FIG. 18 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention.
  • FIG. 19 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention.
  • FIG. 1 is a schematic flow chart of an interaction method for search results according to an embodiment of the present invention. This embodiment is described with the method configured in an interaction apparatus for search results.
  • the interactive method for search results can be applied to a search engine in a browser of an electronic device.
  • the search result is a rich media search result.
  • the electronic device is, for example, a personal computer (PC), a cloud device, or a mobile device such as a smart phone or a tablet computer.
  • the interaction method for the search results includes:
  • the user may enter a search term in a search box of a search engine (e.g., the Baidu search engine).
  • the user may input a search term in a search box of the search engine on the electronic device to obtain a rich media search result corresponding to the search term.
  • users can interact diversely with rich media search results.
  • determining that the user of the electronic device performs the interactive operation of the search result through the browser may include:
  • the preset identifier includes: a first identifier for starting the camera of the electronic device; the interaction method for the search result further includes: when the interactive instruction includes the instruction for triggering the first identifier, collecting the user's image data through the camera and displaying the image data on the search result page.
  • the user may input a search term in a search box of the search engine on the electronic device and obtain a rich media search result corresponding to the search term; when it is detected that the user triggers, in the search result, the first identifier for starting the camera of the electronic device, it is determined that the user performs the interactive operation on the search result through the browser, the user's image data is collected by the camera, and the image data is displayed on the search result page.
  • FIG. 3 is a schematic diagram of a search result page according to an embodiment of the present invention.
  • FIG. 3 includes a search box 31, a search result 32, a camera 33, a microphone 34, and a model display area 35.
  • when the search term entered in Baidu's search box 31 is "fashionable women's clothing", the search result 32 for "fashionable women's clothing" is displayed on the left side of the search page; when the user triggers the first identifier of the camera 33, the camera 33 collects the user's image data and displays it on a live model in the model display area 35 of the search result page. For example, referring to FIG. 4, the display area 42 correspondingly displays the user's image data, and the user can interact with the rich media search result based on the image data, for example, by trying on the fashionable women's clothing in the search result 32, which is not limited.
  • the preset identifier further includes: a second identifier for starting the microphone of the electronic device; the interaction method for the search result further includes: when the interactive instruction includes the instruction for triggering the second identifier, collecting the user's voice data through the microphone.
  • the user may input a search term in a search box of a search engine on the electronic device and obtain a search result corresponding to the search term; when it is detected that the user triggers, in the search result, the second identifier for starting the microphone of the electronic device, it is determined that the user performs the interactive operation on the search result through the browser, and the user's voice data is collected through the microphone.
  • the user's voice data is collected through the microphone, so that the user can interact with the rich media search result based on the voice data, enhancing the user experience.
  • the user's voice data is collected through the microphone 34, and the live model displayed in the model display area 35 of the search result page is configured according to the voice data; the user can interact with the rich media search result based on the voice data, for example, by configuring the live model in the model display area 35 based on the voice data, which is not limited.
  • when it is detected that the user of the electronic device triggers, through the browser, the first identifier in the search result, the instruction for triggering the first identifier is extracted; when it is detected that the user triggers the second identifier used to activate the microphone of the electronic device, an instruction for triggering the second identifier is extracted.
  • S12 Generate a guiding interaction step that matches the interaction instruction according to the preset rule.
  • the preset rule may be pre-configured in the database of the electronic device, or may be configured on the server side, which is not limited thereto.
  • when the interactive instruction includes an instruction to trigger the first identifier for starting the camera of the electronic device, a guiding interaction step matching that instruction is generated according to the preset rule to prompt the user.
  • when the interactive instruction includes an instruction for triggering the second identifier for starting the microphone of the electronic device, a guiding interaction step matching that instruction is generated according to the preset rule to prompt the user.
  • S13 Prompt the user according to the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • the user is prompted according to the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • a guiding interaction step matching the instruction for triggering the first identifier for starting the camera 33 is generated according to a preset rule, and the user is prompted according to the guiding step, so that the user sets the live model in the model display area 35 according to the guiding interaction step. For example, after the camera 33 collects the user's image data, the user can drag items from the search result 32 of "fashionable women's clothing" on the left side of the search page onto the live model, so that the user tries on the fashionable women's clothing in the search result 32 based on the image data, thereby improving the user's immersive, scenario-based experience.
  • a guiding interaction step matching the instruction for triggering the second identifier for starting the microphone 34 is generated according to a preset rule, and the user is prompted according to the guiding step, so that the user sets the model in the model display area 35 according to the guiding interaction step. For example, when it is inconvenient for the user to capture their own image data, a piece of voice data describing their body type can be input through the microphone 34, so that the live model in the model display area 35 is configured based on the voice data to implement try-on and dress-up; alternatively, the user can input voice data through the microphone 34 to use a voice search function, intelligently searching for the required clothing and quickly implementing the try-on and dress-up, which is not limited.
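The two capture paths above (the first identifier starts the camera and shows the image data on the result page; the second starts the microphone and collects voice data) could be dispatched as in this minimal sketch, where handle_instruction, capture_image, and capture_voice are invented stand-ins for real browser and device APIs:

```python
# Dispatch an extracted interaction instruction to the matching capture device.
# capture_image/capture_voice are placeholders for real camera/microphone APIs.

def capture_image() -> dict:
    return {"kind": "image", "data": "<frame>"}

def capture_voice() -> dict:
    return {"kind": "voice", "data": "<audio>"}

def handle_instruction(instruction: str, page: dict) -> dict:
    if instruction == "trigger_first_identifier":       # camera identifier
        # Image data is displayed in the model display area of the result page.
        page["model_display_area"] = capture_image()
    elif instruction == "trigger_second_identifier":    # microphone identifier
        # Voice data is used to configure the model or to drive voice search.
        page["voice_input"] = capture_voice()
    return page

page = handle_instruction("trigger_first_identifier", {})
```

In a real browser the two branches would correspond to camera and microphone permission-gated capture calls; the sketch only shows how one extracted instruction selects between them.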
  • the user is prompted according to the guiding interaction step, so that the user interacts with the rich media search result according to the guiding interaction step, thereby enabling diverse interactions between the user and the rich media search result and effectively improving the user's immersive, scenario-based experience.
  • FIG. 5 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described with the method configured in an interaction apparatus for search results.
  • the interaction method for the search results includes:
  • the user may enter a search term in a search box of a search engine (e.g., the Baidu search engine).
  • the user may input a search term in a search box of the search engine on the electronic device and obtain a rich media search result corresponding to the search term; when the user triggers, through the browser, the first identifier in the search result for starting the camera of the electronic device, it is determined that the user performs the interactive operation on the search result through the browser; the user's image data is collected by the camera and displayed on the search result page, so that the user can interact with the rich media search result based on the image data, thereby improving the user experience.
  • FIG. 3 is a schematic diagram of a search result page according to an embodiment of the present invention.
  • FIG. 3 includes a search box 31, a search result 32, a camera 33, a microphone 34, and a model display area 35.
  • when the search term entered in Baidu's search box 31 is "fashionable women's clothing", the search result 32 for "fashionable women's clothing" is displayed on the left side of the search page; when the user triggers the first identifier of the camera 33, the camera 33 collects the user's image data and displays it on a live model in the model display area 35 of the search result page. For example, referring to FIG. 4, the display area 42 correspondingly displays the user's image data, and the user can interact with the rich media search result based on the image data, for example, by trying on the fashionable women's clothing in the search result 32, which is not limited.
  • the user may input a search term in the search box of the search engine on the electronic device and obtain a search result corresponding to the search term; when the user triggers, in the search result, the second identifier for starting the microphone of the electronic device, it is determined that the user performs the interactive operation on the search result through the browser, and the user's voice data is collected through the microphone.
  • the user's voice data is collected through the microphone, so that the user can interact with the rich media search result based on the voice data, enhancing the user experience.
  • the user's voice data is collected through the microphone 34, and the live model displayed in the model display area 35 of the search result page is configured according to the voice data; the user can interact with the rich media search result based on the voice data, for example, by configuring the live model in the model display area 35 based on the voice data, which is not limited.
  • when it is detected that the user of the electronic device triggers, through the browser, the first identifier in the search result, the instruction for triggering the first identifier is extracted; when it is detected that the user triggers the second identifier used to activate the microphone of the electronic device, an instruction for triggering the second identifier is extracted.
  • S53 Generate a guiding interaction step that matches the interaction instruction according to the preset rule.
  • when the interactive instruction includes an instruction to trigger the first identifier for starting the camera of the electronic device, a guiding interaction step matching that instruction is generated according to the preset rule to prompt the user.
  • when the interactive instruction includes an instruction for triggering the second identifier for starting the microphone of the electronic device, a guiding interaction step matching that instruction is generated according to the preset rule to prompt the user.
  • S54 Receive an instruction that the user triggers the step of guiding the interaction on the search result page.
  • the browser receives an instruction indicating that the user triggers the guiding interaction step on the rich media search result page, so as to perform the guiding interaction step according to that instruction.
  • S55 Perform the guiding interaction step according to the instruction that triggers the guiding interaction step.
  • the guiding interaction step is performed according to the instruction that triggers it, so that the user can interact with the rich media search result according to the guiding interaction step.
  • S56 Prompt the user according to the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • the user is prompted according to the guiding interaction step, so that the user interacts with the search result according to the guiding interaction step.
  • a guiding interaction step matching the instruction for triggering the first identifier for starting the camera 33 is generated according to a preset rule, and the user is prompted according to the guiding step, so that the user sets the live model in the model display area 35 according to the guiding interaction step. For example, after the camera 33 collects the user's image data, the user can drag items from the search result 32 of "fashionable women's clothing" on the left side of the search page onto the live model, so that the user tries on the fashionable women's clothing in the search result 32 based on the image data, thereby improving the user's immersive, scenario-based experience.
  • a guiding interaction step matching the instruction for triggering the second identifier for starting the microphone 34 is generated according to a preset rule, and the user is prompted according to the guiding step, so that the user sets the model in the model display area 35 according to the guiding interaction step. For example, when it is inconvenient for the user to capture their own image data, a piece of voice data describing their body type can be input through the microphone 34, so that the live model in the model display area 35 is configured based on the voice data to implement try-on and dress-up; alternatively, the user can input voice data through the microphone 34 to use a voice search function, intelligently searching for the required clothing and quickly implementing the try-on and dress-up, which is not limited.
  • S57 Receive search result configuration information input by the user, and configure the search result according to the configuration information.
  • the user may influence or modify the information on the search result page of the electronic device by touching, sliding, and zooming, or the user may input new image or audio information to modify the content displayed on the search result page, which is not limited.
  • by receiving the search result configuration information input by the user and configuring the search result according to that information, the user can obtain personalized search information, thereby effectively improving the user's immersive, scenario-based experience.
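Step S57 (receive configuration information input by the user and configure the search result accordingly) might reduce to updating the result's display state, as in this sketch; the gesture names and state fields are invented for illustration:

```python
# Sketch of S57: apply user-supplied configuration (touch/slide/zoom or new
# audio input) to the search result's display state. Field names are invented.

def apply_configuration(result_state: dict, config: dict) -> dict:
    state = dict(result_state)  # leave the original state untouched
    if "zoom" in config:
        # Pinch-zoom scales the current zoom level.
        state["zoom_level"] = state.get("zoom_level", 1.0) * config["zoom"]
    if "slide" in config:
        # A slide rotates the view; keep the angle within 0-359 degrees.
        state["view_angle"] = (state.get("view_angle", 0) + config["slide"]) % 360
    if "voice_note" in config:
        # New audio input reconfigures the displayed model.
        state["model_profile"] = config["voice_note"]
    return state

state = apply_configuration({"view_angle": 350}, {"slide": 20, "zoom": 2.0})
```

Returning a new state dictionary rather than mutating the input keeps the original result renderable if the user cancels the configuration.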
  • FIG. 6 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described with the method configured in an interaction apparatus for search results.
  • the interaction method for the search results includes:
  • the interaction operation may be, for example, positioning the search result, moving in the space of the search result, and performing touch, swipe, zoom, and the like on the search result, which is not limited.
  • FIG. 7 includes a swipe direction identifier 71; when the user touches the swipe direction identifier 71 in the rich media search result, the vehicle can be rotated 360° to obtain an immersive, on-the-spot experience.
  • FIG. 8 includes a zoom identifier 81.
  • through the zoom identifier 81, the craftsmanship and materials the user is interested in can be viewed closely, so as to obtain detailed product information.
  • S62 Read association data of the search result matched with the interaction instruction, and display the associated data to the user.
  • the associated data of the search result matching the interactive instruction is read and displayed to the user; the associated data of the rich media search result is automatically extracted by the search engine, so that the user obtains the required information to the maximum extent, thereby improving the user experience.
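Reading the associated data that matches an interaction instruction (S62) is essentially a keyed lookup into data the search engine extracted for the rich media result in advance; the table below is an invented example modeled on the vehicle scenario in the figures:

```python
# Invented association table modeled on the vehicle example: each (result,
# instruction) pair maps to the associated data the browser should display.
ASSOCIATED_DATA = {
    ("mercedes-benz-glc", "click:interior"): [
        "leather steering wheel",
        "ultra-clear electronic screen",
    ],
    ("mercedes-benz-glc", "swipe:rotate"): ["360-degree exterior views"],
}

def read_associated_data(result_id: str, instruction: str) -> list:
    # S62: read the associated data matching the instruction (empty if none).
    return ASSOCIATED_DATA.get((result_id, instruction), [])

shown = read_associated_data("mercedes-benz-glc", "click:interior")
```

Returning an empty list for an unknown instruction mirrors the described behavior in which nothing beyond the base result is displayed until a matching interaction occurs.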
  • FIG. 9 includes a search box 91 and a search result 92.
  • when the search term input by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search results 92 for "Mercedes-Benz glc"; at this time, only data related to the Mercedes-Benz glc model is shown. Referring to FIG. 10, which includes the Mercedes-Benz glc paint process 101 and the interior 102: when the user clicks on the interior 102 of the Mercedes-Benz glc, referring to FIG. 11, which includes a leather steering wheel 111 and an ultra-clear electronic screen 112, the browser displays the associated data, such as the leather steering wheel 111 and the ultra-clear electronic screen 112, to the user.
  • FIG. 12 includes a search box 121, a search result 122, a kitchen direction identifier 123, and a bedroom direction identifier 124. When the user inputs the search term "decoration" in the search box 121 of the search engine, the search page correspondingly displays the search result 122 of "decoration". When the user touches the bedroom direction identifier 124, referring to FIG. 13, which includes a living room direction identifier 131, the user can see the decoration data of the bedroom; the user can also touch the living room direction identifier 131 in FIG. 13 to return to the living room shown in FIG. 12, which is not limited.
  • the associated data of the rich media search result matching the interaction instruction is displayed to the user, which enables diverse interactions between the user and the rich media search result and effectively enhances the user's immersive, scenario-based experience.
  • FIG. 14 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described by way of example with the interaction method for search results configured in an interaction apparatus for search results.
  • The interaction operation may be, for example, positioning the search result, moving within the space of the search result, or touching, swiping, or zooming the search result, which is not limited herein.
  • As an example, see FIG. 7, which includes a swipe direction identifier 71. When the user touches the swipe direction identifier 71 in the rich media search result, a 360° rotating view of the vehicle model can be realized, thereby effectively enhancing the user's immersive, scenario-based experience.
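  • The swipe-to-rotate behavior described above can be sketched as a small pure function. This is an illustrative assumption, not the disclosed implementation: the patent does not specify how swipe distance maps to rotation, and the names and the degrees-per-pixel constant below are invented for the sketch.

```typescript
// Illustrative sketch only: map a horizontal swipe on a rich media search
// result to a rotation angle of the displayed vehicle model, so a long
// enough swipe yields a full 360° view. DEGREES_PER_PIXEL is an assumed
// tuning constant, not a value from the patent.
const DEGREES_PER_PIXEL = 0.5;

function rotateModel(currentAngle: number, swipeDeltaX: number): number {
  // Normalize into [0, 360) so the model can be spun indefinitely
  // in either direction.
  const next = (currentAngle + swipeDeltaX * DEGREES_PER_PIXEL) % 360;
  return next < 0 ? next + 360 : next;
}
```

  • Under this assumed constant, a 720-pixel swipe would turn the model through one full revolution.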
  • As another example, see FIG. 8, which includes a zoom identifier 81. When the user touches the zoom identifier 81 in the rich media search result, the user can carefully examine the craftsmanship and materials of interest and comprehensively obtain information on the customer's products and services.
  • S142: The associated data of the search result is not displayed on the search result page.
  • For example, see FIG. 9, which includes a search box 91 and a search result 92. When the search term entered by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search result 92 for "Mercedes-Benz glc". At this time, only the picture data related to the Mercedes-Benz glc model is shown; when the user performs no operation, the associated data of the search result is not displayed on the search result page.
  • S143: Read the associated data of the search result that matches the interaction instruction, and display the associated data to the user.
  • As an example, see FIG. 9, which includes a search box 91 and a search result 92. When the search term entered by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search result 92 for "Mercedes-Benz glc". At this time, only data related to the Mercedes-Benz glc model is shown. See FIG. 10, which includes the Mercedes-Benz glc paint process 101 and the interior 102. When the user clicks on the interior 102 of the Mercedes-Benz glc, see FIG. 11, which includes a leather steering wheel 111 and an ultra-clear electronic screen 112; the browser displays the associated data, such as the leather steering wheel 111 and the ultra-clear electronic screen 112, to the user.
  • As another example, see FIG. 12, which includes a search box 121, a search result 122, a kitchen direction identifier 123, and a bedroom direction identifier 124. When the user enters the search term "decoration" in the search box 121 of the search engine, the search page correspondingly displays the search result 122 for "decoration". At this time, only decoration data related to the living room is shown. When the user touches the bedroom direction identifier 124 in FIG. 12, see FIG. 13, which includes a living room direction identifier 131; the user can then see the decoration data of the bedroom. Of course, the user can also touch the living room direction identifier 131 in FIG. 13 to return to the living room shown in FIG. 12, which is not limited herein.
  • S144: Receive search result configuration information entered by the user, and configure the search result according to the configuration information.
  • Optionally, the browser of the electronic device can receive search result configuration information entered by the user and configure the search result according to the configuration information, enabling the user to obtain personalized search information and effectively enhancing the user experience.
  • As an example, see FIG. 15, which includes a drag direction identifier 151, a coffee table 152, and a bench 153. When the user feels that the living room decoration lacks a certain piece of furniture, the rich media search result can be configured according to the user's own needs. For example, when there is no bench 153 around the coffee table 152 in the living room, the user can browse the various types of benches 153 on the right side of the rich media search result page, click on a preferred bench, and drag it next to the coffee table.
  • By receiving the search result configuration information entered by the user and configuring the search result according to the configuration information, the user can obtain personalized search information, thereby effectively improving the user experience.
  • FIG. 16 is a schematic structural diagram of an interaction apparatus for search results according to an embodiment of the present invention.
  • The interaction apparatus 160 for search results may be implemented by software, hardware, or a combination of the two, and may include a first extraction module 161, a generation module 162, and a prompt module 163, wherein:
  • the first extraction module 161 is configured to, when it is determined that the user of the electronic device performs an interaction operation on the search result through the browser, extract an interaction instruction based on the interaction operation;
  • the generation module 162 is configured to generate, according to a preset rule, a guided interaction step matching the interaction instruction; and
  • the prompt module 163 is configured to prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  • In some embodiments, the interaction apparatus 160 for search results may further include the following.
  • The first extraction module 161 is further configured to: when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
  • The preset identifier comprises a first identifier for starting the camera of the electronic device and a second identifier for starting the microphone of the electronic device.
  • The collecting module 164 is configured to: when the interaction instruction includes an instruction that triggers the first identifier, collect image data of the user through the camera and display the image data on the search result page; and, when the interaction instruction includes an instruction that triggers the second identifier, collect voice data of the user through the microphone.
  • The first receiving module 165 is configured to receive an instruction by which the user triggers the guided interaction step on the search result page.
  • The execution module 166 is configured to execute the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  • The second receiving module 167 is configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
  • In this embodiment, the user is prompted according to the guided interaction step so that the user interacts with the rich media search result according to the guided interaction step, thereby enabling diverse interactions between the user and the rich media search result and effectively enhancing the user's immersive, scenario-based experience.
  • FIG. 18 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention.
  • The interaction apparatus 180 for search results may be implemented by software, hardware, or a combination of the two, and may include a second extraction module 181 and a reading module 182, wherein:
  • the second extraction module 181 is configured to, when it is determined that the user of the electronic device performs an interaction operation on the search result through the browser, extract an interaction instruction based on the interaction operation; and
  • the reading module 182 is configured to read the associated data of the search result that matches the interaction instruction and display the associated data to the user.
  • In some embodiments, the interaction apparatus 180 for search results may further include the following.
  • The display module 183 is configured to refrain from displaying the associated data of the search result on the search result page before the associated data of the search result that matches the interaction instruction is read.
  • The third receiving module 184 is configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
  • In this embodiment, the associated data of the rich media search result that matches the interaction instruction is displayed to the user, which enables diverse interactions between the user and the rich media search result and effectively enhances the user's immersive, scenario-based experience.
  • The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Acoustics & Sound (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interaction method and apparatus for search results. The method includes: when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extracting an interaction instruction based on the interaction operation (S11); generating, according to a preset rule, a guided interaction step matching the interaction instruction (S12); and prompting the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step (S13). The method enables diverse interactions between the user and rich media search results and effectively enhances the user's immersive, scenario-based experience.

Description

Interaction Method and Apparatus for Search Results
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201610591921.9, entitled "Interaction Method and Apparatus for Search Results", filed with the Chinese Patent Office on July 25, 2016 by Baidu Online Network Technology (Beijing) Co., Ltd., the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present invention relates to the technical field of search engines, and in particular to an interaction method and apparatus for search results.
BACKGROUND
When a user searches for information through a browser on an electronic device, the user may wish to obtain an immersive, scenario-based experience through the search page. For example, the user may wish to interact with search results in diverse ways such as browsing, listening, touching, swiping, zooming, speaking, and photographing.
In the related art, no immersive, scenario-based experience is provided through the search result page.
SUMMARY
In view of this, embodiments of the present invention provide an interaction method and apparatus for search results, which can implement diverse interactions between a user and rich media search results and effectively enhance the user's immersive, scenario-based experience.
To achieve the above objective, embodiments of the present invention adopt the following technical solutions.
In a first aspect, an embodiment of the present invention provides an interaction method for search results, where the search results are rich media search results, including:
when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extracting an interaction instruction based on the interaction operation;
generating, according to a preset rule, a guided interaction step matching the interaction instruction; and
prompting the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
Optionally, the determining that a user of an electronic device performs an interaction operation on a search result through a browser includes:
when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determining that the user performs the interaction operation on the search result through the browser.
Optionally, the preset identifier includes a first identifier for starting a camera of the electronic device, and the method further includes:
when the interaction instruction includes an instruction that triggers the first identifier, collecting image data of the user through the camera, and displaying the image data on the search result page.
Optionally, the preset identifier further includes a second identifier for starting a microphone of the electronic device, and the method further includes:
when the interaction instruction includes an instruction that triggers the second identifier, collecting voice data of the user through the microphone.
Optionally, the method further includes:
receiving an instruction by which the user triggers the guided interaction step on the search result page; and
executing the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
Optionally, the method further includes:
receiving search result configuration information entered by the user, and configuring the search result according to the configuration information.
In a second aspect, an embodiment of the present invention provides an interaction method for search results, where the search results are rich media search results, including:
when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extracting an interaction instruction based on the interaction operation; and
reading associated data of the search result that matches the interaction instruction, and displaying the associated data to the user.
Optionally, the method further includes:
before the reading of the associated data of the search result that matches the interaction instruction, refraining from displaying the associated data of the search result on the search result page.
Optionally, the method further includes:
receiving search result configuration information entered by the user, and configuring the search result according to the configuration information.
In a third aspect, an embodiment of the present invention provides an interaction apparatus for search results, where the search results are rich media search results, including:
a first extraction module, configured to, when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extract an interaction instruction based on the interaction operation;
a generation module, configured to generate, according to a preset rule, a guided interaction step matching the interaction instruction; and
a prompt module, configured to prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
Optionally, the first extraction module is further configured to:
when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
Optionally, the preset identifier includes a first identifier for starting a camera of the electronic device, and the apparatus further includes:
a collecting module, configured to, when the interaction instruction includes an instruction that triggers the first identifier, collect image data of the user through the camera and display the image data on the search result page.
Optionally, the preset identifier further includes a second identifier for starting a microphone of the electronic device, and the collecting module is further configured to:
when the interaction instruction includes an instruction that triggers the second identifier, collect voice data of the user through the microphone.
Optionally, the apparatus further includes:
a first receiving module, configured to receive an instruction by which the user triggers the guided interaction step on the search result page; and
an execution module, configured to execute the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
Optionally, the apparatus further includes:
a second receiving module, configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
In a fourth aspect, an embodiment of the present invention provides an interaction apparatus for search results, where the search results are rich media search results, including:
a second extraction module, configured to, when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extract an interaction instruction based on the interaction operation; and
a reading module, configured to read associated data of the search result that matches the interaction instruction and display the associated data to the user.
Optionally, the apparatus further includes:
a display module, configured to refrain from displaying the associated data of the search result on the search result page before the associated data of the search result that matches the interaction instruction is read.
Optionally, the apparatus further includes:
a third receiving module, configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
According to the interaction method and apparatus for search results provided in the embodiments of the present invention, the user is prompted according to the guided interaction step so that the user interacts with the rich media search result according to the guided interaction step, which enables diverse interactions between the user and the rich media search result and effectively enhances the user's immersive, scenario-based experience.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of an interaction method for search results according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention;
FIG. 3 is a schematic diagram of a search result page according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of model configuration on a search result page according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention;
FIG. 6 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention;
FIG. 7 is a schematic diagram of a user's diverse interaction with search results through swiping according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a user's diverse interaction with search results through zooming according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of another search result page according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a user's diverse interaction with search results through touching according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a scenario-based experience according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of another user's diverse interaction with search results through touching according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of another user's diverse interaction with search results through touching according to an embodiment of the present invention;
FIG. 14 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention;
FIG. 15 is a schematic diagram of a user's diverse interaction with search results through browsing and clicking according to an embodiment of the present invention;
FIG. 16 is a schematic structural diagram of an interaction apparatus for search results according to an embodiment of the present invention;
FIG. 17 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention;
FIG. 18 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention;
FIG. 19 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention.
DETAILED DESCRIPTION
The embodiments of the present invention are described in detail below with reference to the accompanying drawings.
It should be clear that the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
FIG. 1 is a schematic flowchart of an interaction method for search results according to an embodiment of the present invention. This embodiment is described by way of example with the interaction method for search results configured in an interaction apparatus for search results. The interaction method for search results can be applied to a search engine in a browser of an electronic device.
In the embodiments of the present invention, the search result is a rich media search result.
The electronic device is, for example, a personal computer (PC), a cloud device, or a mobile device such as a smartphone or a tablet computer.
Referring to FIG. 1, the interaction method for search results includes:
S11: When it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extract an interaction instruction based on the interaction operation.
Optionally, the user may enter a search term in the search box of a search engine (for example, the Baidu search engine).
For example, the user may enter a search term in the search box of the search engine on the electronic device to obtain a rich media search result corresponding to the search term.
In the embodiments of the present invention, the user can interact with the rich media search result in diverse ways.
In some embodiments, referring to FIG. 2, the determining that the user of the electronic device performs the interaction operation on the search result through the browser may include:
S21: When it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
In the embodiments of the present invention, the preset identifier includes a first identifier for starting the camera of the electronic device, and the interaction method for search results further includes: when the interaction instruction includes an instruction that triggers the first identifier, collecting image data of the user through the camera and displaying the image data on the search result page. For example, the user may enter a search term in the search box of the search engine on the electronic device to obtain a rich media search result corresponding to the search term. When it is detected that the user of the electronic device triggers, through the browser, the first identifier of the search result for starting the camera of the electronic device, it is determined that the user performs the interaction operation on the search result through the browser; image data of the user is then collected through the camera and displayed on the search result page. Determining that the user performs the interaction operation through the browser, collecting the user's image data through the camera, and displaying the image data on the search result page enable the user to interact with the rich media search result based on the image data, which improves the user experience.
As an example, see FIG. 3, which is a schematic diagram of a search result page according to an embodiment of the present invention and includes a search box 31, a search result 32, a camera 33, a microphone 34, and a model display area 35. When the search term entered by the user in the Baidu search box 31 is "fashion womenswear", the left side of the search page correspondingly displays the search result 32 for "fashion womenswear". When the user triggers the first identifier for starting the camera 33 of the electronic device, image data of the user is collected through the camera 33 and displayed on the live model in the model display area 35 of the search result page. For example, see FIG. 4: when the user's image data is collected through the camera 41, the model display area 42 on the right side of the search page correspondingly displays the user's image data, and the user can interact with the rich media search result based on the image data, for example, trying on the fashion womenswear in the search result 32, which is not limited herein.
In the embodiments of the present invention, the preset identifier further includes a second identifier for starting the microphone of the electronic device, and the interaction method for search results further includes: when the interaction instruction includes an instruction that triggers the second identifier, collecting voice data of the user through the microphone.
For example, the user may enter a search term in the search box of the search engine on the electronic device to obtain a search result corresponding to the search term. When it is detected that the user of the electronic device triggers, through the browser, the second identifier of the search result for starting the microphone of the electronic device, it is determined that the user performs the interaction operation on the search result through the browser, and voice data of the user is collected through the microphone. This enables the user to interact with the rich media search result based on the voice data, which improves the user experience.
As an example, see FIG. 3: when the user triggers the second identifier for starting the microphone 34 of the electronic device, voice data of the user is collected through the microphone 34 to trigger the interaction, so that the live model in the model display area 35 of the search result page is configured according to the user's voice data. The user can interact with the rich media search result based on the voice data, for example, configuring the live model in the model display area 35 based on the voice data, which is not limited herein.
Optionally, when it is detected that the user of the electronic device triggers, through the browser, the first identifier of the search result for starting the camera of the electronic device, the instruction that triggers the first identifier is extracted; when it is detected that the user of the electronic device triggers, through the browser, the second identifier of the search result for starting the microphone of the electronic device, the instruction that triggers the second identifier is extracted.
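The instruction-extraction step above can be sketched as a simple mapping from the triggered preset identifier to an interaction instruction. This is a minimal sketch under stated assumptions: the disclosure does not define concrete data structures, so the identifier and instruction names below are invented for illustration.

```typescript
// Illustrative sketch of instruction extraction: when the user triggers a
// preset identifier on the search result page, the browser maps it to an
// interaction instruction. Identifier and instruction names are assumptions,
// not part of the disclosed implementation.
type PresetIdentifier = "camera" | "microphone";
type InteractionInstruction =
  | { kind: "startCamera" }
  | { kind: "startMicrophone" };

function extractInstruction(id: PresetIdentifier): InteractionInstruction {
  switch (id) {
    case "camera":
      // First identifier: start the device camera to collect image data.
      return { kind: "startCamera" };
    case "microphone":
      // Second identifier: start the device microphone to collect voice data.
      return { kind: "startMicrophone" };
  }
}
```

Modeling the instruction as a discriminated union keeps later steps (rule matching, guidance generation) exhaustive at compile time.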
S12: Generate, according to a preset rule, a guided interaction step matching the interaction instruction.
In the embodiments of the present invention, the preset rule may be preconfigured in a database of the electronic device or configured on the server side, which is not limited herein.
For example, when the interaction instruction includes the instruction that triggers the first identifier for starting the camera of the electronic device, a guided interaction step matching that instruction is generated according to the preset rule to prompt the user.
Alternatively, when the interaction instruction includes the instruction that triggers the second identifier for starting the microphone of the electronic device, a guided interaction step matching that instruction is generated according to the preset rule to prompt the user.
S13: Prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
As an example, see FIG. 3: a guided interaction step matching the instruction that triggers the first identifier for starting the camera 33 of the electronic device is generated according to the preset rule, and the user is prompted according to the guided step so that the user sets up the live model in the model display area 35 according to the guided interaction step. For example, when the user's image data is collected through the camera 33, the user can click a picture in the "fashion womenswear" search result 32 on the left side of the search page and drag it onto the live model, so that the user tries on the fashion womenswear in the search result 32 based on the image data, enhancing the user's immersive, scenario-based experience.
As another example, see FIG. 3: a guided interaction step matching the instruction that triggers the second identifier for starting the microphone 34 of the electronic device is generated according to the preset rule, and the user is prompted according to the guided step so that the user sets up the 3D model in the model display area 35 according to the guided interaction step. For example, when it is inconvenient for the user to capture their own image data, the user can input a piece of voice data through the microphone 34 describing their body shape, so that the live model in the model display area 35 is configured based on the voice data to realize try-on and outfit changes. Alternatively, the user can input voice data through the microphone 34 to use a voice search function, allowing the user to intelligently search for the desired clothing and quickly and conveniently realize try-on and outfit changes, which is not limited herein.
In this embodiment, by prompting the user according to the guided interaction step so that the user interacts with the rich media search result according to the guided interaction step, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience.
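The S12/S13 flow, a preset rule table that maps each interaction instruction to guided interaction steps which are then shown to the user as prompts, can be sketched as follows. The rule contents are invented for illustration; the patent only states that the rules may live in a device database or on the server side.

```typescript
// Illustrative sketch of steps S12-S13: a preset rule table maps each
// interaction instruction to its guided interaction steps. The concrete
// rules below are assumptions drawn from the try-on example, not the
// disclosed rule set.
const PRESET_RULES: Record<string, string[]> = {
  startCamera: [
    "Allow camera access to capture your image",
    "Drag a clothing picture onto the model to try it on",
  ],
  startMicrophone: [
    "Allow microphone access",
    "Describe your body shape to configure the model",
  ],
};

function generateGuidedSteps(instruction: string): string[] {
  // Unknown instructions yield no guidance rather than an error,
  // so an unmatched interaction simply produces no prompt.
  return PRESET_RULES[instruction] ?? [];
}
```

The prompt module would then display each returned step to the user in order.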
FIG. 5 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described by way of example with the interaction method for search results configured in an interaction apparatus for search results.
Referring to FIG. 5, the interaction method for search results includes:
S51: When it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
Optionally, the user may enter a search term in the search box of a search engine (for example, the Baidu search engine).
For example, the user may enter a search term in the search box of the search engine on the electronic device to obtain a rich media search result corresponding to the search term. When it is detected that the user of the electronic device triggers, through the browser, the first identifier of the search result for starting the camera of the electronic device, it is determined that the user performs the interaction operation on the search result through the browser; image data of the user is then collected through the camera and displayed on the search result page. This enables the user to interact with the rich media search result based on the image data, which improves the user experience.
As an example, see FIG. 3, which is a schematic diagram of a search result page according to an embodiment of the present invention and includes a search box 31, a search result 32, a camera 33, a microphone 34, and a model display area 35. When the search term entered by the user in the Baidu search box 31 is "fashion womenswear", the left side of the search page correspondingly displays the search result 32 for "fashion womenswear". When the user triggers the first identifier for starting the camera 33 of the electronic device, image data of the user is collected through the camera 33 and displayed on the live model in the model display area 35 of the search result page. For example, see FIG. 4: when the user's image data is collected through the camera 41, the model display area 42 on the right side of the search page correspondingly displays the user's image data, and the user can interact with the rich media search result based on the image data, for example, trying on the fashion womenswear in the search result 32, which is not limited herein.
Alternatively, the user may enter a search term in the search box of the search engine on the electronic device to obtain a search result corresponding to the search term. When it is detected that the user of the electronic device triggers, through the browser, the second identifier of the search result for starting the microphone of the electronic device, it is determined that the user performs the interaction operation on the search result through the browser, and voice data of the user is collected through the microphone. This enables the user to interact with the rich media search result based on the voice data, which improves the user experience.
As an example, see FIG. 3: when the user triggers the second identifier for starting the microphone 34 of the electronic device, voice data of the user is collected through the microphone 34, so that the live model in the model display area 35 of the search result page is configured according to the user's voice data. The user can interact with the rich media search result based on the voice data, for example, configuring the live model in the model display area 35 based on the voice data, which is not limited herein.
S52: When it is determined that the user of the electronic device performs the interaction operation on the search result through the browser, extract an interaction instruction based on the interaction operation.
Optionally, when it is detected that the user of the electronic device triggers, through the browser, the first identifier of the search result for starting the camera of the electronic device, the instruction that triggers the first identifier is extracted; when it is detected that the user of the electronic device triggers, through the browser, the second identifier of the search result for starting the microphone of the electronic device, the instruction that triggers the second identifier is extracted.
S53: Generate, according to a preset rule, a guided interaction step matching the interaction instruction.
In the embodiments of the present invention, the preset rule may be preconfigured in a database of the electronic device or configured on the server side, which is not limited herein.
For example, when the interaction instruction includes the instruction that triggers the first identifier for starting the camera of the electronic device, a guided interaction step matching that instruction is generated according to the preset rule to prompt the user.
Alternatively, when the interaction instruction includes the instruction that triggers the second identifier for starting the microphone of the electronic device, a guided interaction step matching that instruction is generated according to the preset rule to prompt the user.
S54: Receive an instruction by which the user triggers the guided interaction step on the search result page.
Optionally, the browser receives the instruction by which the user triggers the guided interaction step on the rich media search result page, so as to execute the guided interaction step according to that instruction.
S55: Execute the guided interaction step according to the instruction that triggers the guided interaction step.
Optionally, the guided interaction step is executed according to the instruction that triggers it, so that the user interacts with the rich media search result according to the guided interaction step.
S56: Prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
As an example, see FIG. 3: a guided interaction step matching the instruction that triggers the first identifier for starting the camera 33 of the electronic device is generated according to the preset rule, and the user is prompted according to the guided step so that the user sets up the live model in the model display area 35 according to the guided interaction step. For example, when the user's image data is collected through the camera 33, the user can click a picture in the "fashion womenswear" search result 32 on the left side of the search page and drag it onto the live model, so that the user tries on the fashion womenswear in the search result 32 based on the image data, enhancing the user's immersive, scenario-based experience.
As another example, see FIG. 3: a guided interaction step matching the instruction that triggers the second identifier for starting the microphone 34 of the electronic device is generated according to the preset rule, and the user is prompted according to the guided step so that the user sets up the 3D model in the model display area 35 according to the guided interaction step. For example, when it is inconvenient for the user to capture their own image data, the user can input a piece of voice data through the microphone 34 describing their body shape, so that the live model in the model display area 35 is configured based on the voice data to realize try-on and outfit changes. Alternatively, the user can input voice data through the microphone 34 to use a voice search function, allowing the user to intelligently search for the desired clothing and quickly and conveniently realize try-on and outfit changes, which is not limited herein.
S57: Receive search result configuration information entered by the user, and configure the search result according to the configuration information.
For example, the user can influence or modify the information on the search result page of the electronic device through actions such as touching, sliding, and zooming, or the user can input new image or audio information to modify the content displayed on the search result page, which is not limited herein.
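Step S57 can be sketched as applying user-supplied additions and removals to the displayed result. This is a sketch under stated assumptions: the `SearchResultScene` shape and the add/remove form of the configuration information are invented for illustration, since the disclosure leaves the configuration format open.

```typescript
// Illustrative sketch of step S57: search result configuration information
// entered by the user (e.g. dragging a bench next to the coffee table) is
// applied to the rendered rich media result. The data shapes are assumptions.
interface SearchResultScene {
  items: string[];
}

function applyConfiguration(
  scene: SearchResultScene,
  config: { add?: string[]; remove?: string[] }
): SearchResultScene {
  // Remove any items the user dragged out, then append the new ones,
  // returning a fresh scene so the original result stays untouched.
  const removed = new Set(config.remove ?? []);
  const kept = scene.items.filter((item) => !removed.has(item));
  return { items: [...kept, ...(config.add ?? [])] };
}
```

Returning a new scene rather than mutating the old one makes it easy to discard a configuration and return to the original search result.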
In this embodiment, by prompting the user according to the guided interaction step so that the user interacts with the rich media search result according to the guided interaction step, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience; by receiving search result configuration information entered by the user and configuring the search result according to the configuration information, the user can obtain personalized search information, effectively enhancing the user experience.
FIG. 6 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described by way of example with the interaction method for search results configured in an interaction apparatus for search results.
Referring to FIG. 6, the interaction method for search results includes:
S61: When it is determined that the user of the electronic device performs an interaction operation on a search result through the browser, extract an interaction instruction based on the interaction operation.
The interaction operation may be, for example, positioning the search result, moving within the space of the search result, or touching, swiping, or zooming the search result, which is not limited herein.
As an example, see FIG. 7, which includes a swipe direction identifier 71. When the user touches the swipe direction identifier 71 in the rich media search result, a 360° rotating view of the vehicle model can be realized, providing an immersive, on-the-spot experience.
As another example, see FIG. 8, which includes a zoom identifier 81. When the user touches the zoom identifier 81 in the rich media search result, the user can carefully examine the craftsmanship and materials of interest and comprehensively obtain information on the customer's products and services.
S62: Read the associated data of the search result that matches the interaction instruction, and display the associated data to the user.
Optionally, the associated data of the search result that matches the interaction instruction is read and displayed to the user. The associated data of the rich media search result can be mined automatically from the search engine, so that the user obtains the required information to the greatest extent, improving the user experience.
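Step S62 can be sketched as a keyed lookup over associated data mined in advance from the search engine. The key format and sample values below are illustrative assumptions modeled on the examples that follow; they are not part of the disclosed implementation.

```typescript
// Illustrative sketch of step S62: associated data mined from the search
// engine is keyed by (search result, interaction target). When an interaction
// instruction matches a key, the browser reads the data and displays it.
// Keys and values are assumptions based on the Mercedes-Benz glc example.
const ASSOCIATED_DATA: Record<string, string[]> = {
  "mercedes-glc/interior": ["leather steering wheel", "ultra-clear screen"],
  "decoration/bedroom": ["bedroom decoration data"],
};

function readAssociatedData(result: string, target: string): string[] {
  // Before any interaction occurs, nothing extra is shown on the
  // result page; an unmatched instruction yields an empty list.
  return ASSOCIATED_DATA[`${result}/${target}`] ?? [];
}
```

Keeping unmatched lookups empty mirrors S142 below: no associated data appears until an interaction instruction matches.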
As an example, see FIG. 9, which includes a search box 91 and a search result 92. When the search term entered by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search result 92 for "Mercedes-Benz glc". At this time, only data related to the Mercedes-Benz glc model is shown. See FIG. 10, which includes the Mercedes-Benz glc paint process 101 and the interior 102. When the user clicks on the interior 102 of the Mercedes-Benz glc, see FIG. 11, which includes a leather steering wheel 111 and an ultra-clear electronic screen 112; the browser displays the associated data, such as the leather steering wheel 111 and the ultra-clear electronic screen 112, to the user.
As another example, see FIG. 12, which includes a search box 121, a search result 122, a kitchen direction identifier 123, and a bedroom direction identifier 124. When the user enters the search term "decoration" in the search box 121 of the search engine, the search page correspondingly displays the search result 122 for "decoration". At this time, only decoration data related to the living room is shown. When the user touches the bedroom direction identifier 124 in FIG. 12, see FIG. 13, which includes a living room direction identifier 131; the user can then see the decoration data of the bedroom. Of course, the user can also touch the living room direction identifier 131 in FIG. 13 to return to the living room shown in FIG. 12, which is not limited herein.
In this embodiment, by displaying to the user the associated data of the rich media search result that matches the interaction instruction, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience.
FIG. 14 is a schematic flowchart of an interaction method for search results according to another embodiment of the present invention. This embodiment is described by way of example with the interaction method for search results configured in an interaction apparatus for search results.
S141: When it is determined that the user of the electronic device performs an interaction operation on a search result through the browser, extract an interaction instruction based on the interaction operation.
The interaction operation may be, for example, positioning the search result, moving within the space of the search result, or touching, swiping, or zooming the search result, which is not limited herein.
As an example, see FIG. 7, which includes a swipe direction identifier 71. When the user touches the swipe direction identifier 71 in the rich media search result, a 360° rotating view of the vehicle model can be realized, effectively enhancing the user's immersive, scenario-based experience.
As another example, see FIG. 8, which includes a zoom identifier 81. When the user touches the zoom identifier 81 in the rich media search result, the user can carefully examine the craftsmanship and materials of interest and comprehensively obtain information on the customer's products and services.
S142: Do not display the associated data of the search result on the search result page.
For example, see FIG. 9, which includes a search box 91 and a search result 92. When the search term entered by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search result 92 for "Mercedes-Benz glc". At this time, only the picture data related to the Mercedes-Benz glc model is shown; when the user performs no operation, the associated data of the search result is not displayed on the search result page.
S143: Read the associated data of the search result that matches the interaction instruction, and display the associated data to the user.
As an example, see FIG. 9, which includes a search box 91 and a search result 92. When the search term entered by the user in the search box 91 of the search engine is "Mercedes-Benz glc", the search page correspondingly displays the search result 92 for "Mercedes-Benz glc". At this time, only data related to the Mercedes-Benz glc model is shown. See FIG. 10, which includes the Mercedes-Benz glc paint process 101 and the interior 102. When the user clicks on the interior 102 of the Mercedes-Benz glc, see FIG. 11, which includes a leather steering wheel 111 and an ultra-clear electronic screen 112; the browser displays the associated data, such as the leather steering wheel 111 and the ultra-clear electronic screen 112, to the user.
As another example, see FIG. 12, which includes a search box 121, a search result 122, a kitchen direction identifier 123, and a bedroom direction identifier 124. When the user enters the search term "decoration" in the search box 121 of the search engine, the search page correspondingly displays the search result 122 for "decoration". At this time, only decoration data related to the living room is shown. When the user touches the bedroom direction identifier 124 in FIG. 12, see FIG. 13, which includes a living room direction identifier 131; the user can then see the decoration data of the bedroom. Of course, the user can also touch the living room direction identifier 131 in FIG. 13 to return to the living room shown in FIG. 12, which is not limited herein.
S144: Receive search result configuration information entered by the user, and configure the search result according to the configuration information.
Optionally, the browser of the electronic device can receive search result configuration information entered by the user and configure the search result according to the configuration information, enabling the user to obtain personalized search information and effectively enhancing the user experience.
As an example, see FIG. 15, which includes a drag direction identifier 151, a coffee table 152, and a bench 153. When the user feels that the living room decoration lacks a certain piece of furniture, the rich media search result can be configured according to the user's own needs. For example, when there is no bench 153 around the coffee table 152 in the living room, the user can browse the various types of benches 153 on the right side of the rich media search result page, click on a preferred bench, and drag it next to the coffee table.
In this embodiment, by displaying to the user the associated data of the rich media search result that matches the interaction instruction, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience; by receiving search result configuration information entered by the user and configuring the search result according to the configuration information, the user can obtain personalized search information, effectively enhancing the user experience.
FIG. 16 is a schematic structural diagram of an interaction apparatus for search results according to an embodiment of the present invention. The interaction apparatus 160 for search results may be implemented by software, hardware, or a combination of the two, and may include a first extraction module 161, a generation module 162, and a prompt module 163, wherein:
the first extraction module 161 is configured to, when it is determined that the user of the electronic device performs an interaction operation on a search result through the browser, extract an interaction instruction based on the interaction operation;
the generation module 162 is configured to generate, according to a preset rule, a guided interaction step matching the interaction instruction; and
the prompt module 163 is configured to prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
In some embodiments, referring to FIG. 17, the interaction apparatus 160 for search results may further include the following.
Optionally, the first extraction module 161 is further configured to: when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
Optionally, the preset identifier includes a first identifier for starting the camera of the electronic device and a second identifier for starting the microphone of the electronic device.
The collecting module 164 is configured to: when the interaction instruction includes an instruction that triggers the first identifier, collect image data of the user through the camera and display the image data on the search result page; and, when the interaction instruction includes an instruction that triggers the second identifier, collect voice data of the user through the microphone.
The first receiving module 165 is configured to receive an instruction by which the user triggers the guided interaction step on the search result page.
The execution module 166 is configured to execute the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
The second receiving module 167 is configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
It should be noted that the explanations of the interaction method embodiments in FIG. 1 to FIG. 5 above also apply to the interaction apparatus 160 for search results of this embodiment; the implementation principles are similar and are not repeated here.
In this embodiment, by prompting the user according to the guided interaction step so that the user interacts with the rich media search result according to the guided interaction step, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience.
FIG. 18 is a schematic structural diagram of an interaction apparatus for search results according to another embodiment of the present invention. The interaction apparatus 180 for search results may be implemented by software, hardware, or a combination of the two, and may include a second extraction module 181 and a reading module 182, wherein:
the second extraction module 181 is configured to, when it is determined that the user of the electronic device performs an interaction operation on a search result through the browser, extract an interaction instruction based on the interaction operation; and
the reading module 182 is configured to read the associated data of the search result that matches the interaction instruction and display the associated data to the user.
In some embodiments, referring to FIG. 19, the interaction apparatus 180 for search results may further include:
a display module 183, configured to refrain from displaying the associated data of the search result on the search result page before the associated data of the search result that matches the interaction instruction is read; and
a third receiving module 184, configured to receive search result configuration information entered by the user and configure the search result according to the configuration information.
It should be noted that the explanations of the interaction method embodiments in FIG. 6 to FIG. 15 above also apply to the interaction apparatus 180 for search results of this embodiment; the implementation principles are similar and are not repeated here.
In this embodiment, by displaying to the user the associated data of the rich media search result that matches the interaction instruction, diverse interactions between the user and the rich media search result can be achieved, effectively enhancing the user's immersive, scenario-based experience.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The above are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (18)

  1. An interaction method for search results, wherein the search results are rich media search results, comprising the following steps:
    when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extracting an interaction instruction based on the interaction operation;
    generating, according to a preset rule, a guided interaction step matching the interaction instruction; and
    prompting the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  2. The interaction method for search results according to claim 1, wherein the determining that a user of an electronic device performs an interaction operation on a search result through a browser comprises:
    when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determining that the user performs the interaction operation on the search result through the browser.
  3. The interaction method for search results according to claim 2, wherein the preset identifier comprises a first identifier for starting a camera of the electronic device, and the method further comprises:
    when the interaction instruction includes an instruction that triggers the first identifier, collecting image data of the user through the camera, and displaying the image data on a search result page.
  4. The interaction method for search results according to claim 2, wherein the preset identifier further comprises a second identifier for starting a microphone of the electronic device, and the method further comprises:
    when the interaction instruction includes an instruction that triggers the second identifier, collecting voice data of the user through the microphone.
  5. The interaction method for search results according to claim 1, further comprising:
    receiving an instruction by which the user triggers the guided interaction step on the search result page; and
    executing the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  6. The interaction method for search results according to claim 1, further comprising:
    receiving search result configuration information entered by the user, and configuring the search result according to the configuration information.
  7. An interaction method for search results, wherein the search results are rich media search results, comprising the following steps:
    when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extracting an interaction instruction based on the interaction operation; and
    reading associated data of the search result that matches the interaction instruction, and displaying the associated data to the user.
  8. The interaction method for search results according to claim 7, further comprising:
    before the reading of the associated data of the search result that matches the interaction instruction, refraining from displaying the associated data of the search result on the search result page.
  9. The interaction method for search results according to claim 7, further comprising:
    receiving search result configuration information entered by the user, and configuring the search result according to the configuration information.
  10. An interaction apparatus for search results, wherein the search results are rich media search results, comprising:
    a first extraction module, configured to, when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extract an interaction instruction based on the interaction operation;
    a generation module, configured to generate, according to a preset rule, a guided interaction step matching the interaction instruction; and
    a prompt module, configured to prompt the user according to the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  11. The interaction apparatus for search results according to claim 10, wherein the first extraction module is further configured to:
    when it is detected that the user of the electronic device triggers a preset identifier of the search result through the browser, determine that the user performs the interaction operation on the search result through the browser.
  12. The interaction apparatus for search results according to claim 11, wherein the preset identifier comprises a first identifier for starting a camera of the electronic device, and the apparatus further comprises:
    a collecting module, configured to, when the interaction instruction includes an instruction that triggers the first identifier, collect image data of the user through the camera and display the image data on a search result page.
  13. The interaction apparatus for search results according to claim 11, wherein the preset identifier further comprises a second identifier for starting a microphone of the electronic device, and the collecting module is further configured to:
    when the interaction instruction includes an instruction that triggers the second identifier, collect voice data of the user through the microphone.
  14. The interaction apparatus for search results according to claim 10, further comprising:
    a first receiving module, configured to receive an instruction by which the user triggers the guided interaction step on the search result page; and
    an execution module, configured to execute the guided interaction step according to the instruction that triggers the guided interaction step, so that the user interacts with the search result according to the guided interaction step.
  15. The interaction apparatus for search results according to claim 10, further comprising:
    a second receiving module, configured to receive search result configuration information entered by the user, and configure the search result according to the configuration information.
  16. An interaction apparatus for search results, wherein the search results are rich media search results, comprising:
    a second extraction module, configured to, when it is determined that a user of an electronic device performs an interaction operation on a search result through a browser, extract an interaction instruction based on the interaction operation; and
    a reading module, configured to read associated data of the search result that matches the interaction instruction, and display the associated data to the user.
  17. The interaction apparatus for search results according to claim 16, further comprising:
    a display module, configured to refrain from displaying the associated data of the search result on the search result page before the associated data of the search result that matches the interaction instruction is read.
  18. The interaction apparatus for search results according to claim 16, further comprising:
    a third receiving module, configured to receive search result configuration information entered by the user, and configure the search result according to the configuration information.
PCT/CN2016/106442 2016-07-25 2016-11-18 Interaction method and apparatus for search results WO2018018801A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/322,102 US11100180B2 (en) 2016-07-25 2016-11-18 Interaction method and interaction device for search result
JP2018554057A JP7126453B2 (ja) Interactive method and apparatus for search results

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610591921.9 2016-07-25
CN201610591921.9A CN106250425B (zh) Interaction method and apparatus for search results

Publications (1)

Publication Number Publication Date
WO2018018801A1 true WO2018018801A1 (zh) 2018-02-01

Family

ID=57604632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/106442 WO2018018801A1 (zh) Interaction method and apparatus for search results

Country Status (4)

Country Link
US (1) US11100180B2 (zh)
JP (1) JP7126453B2 (zh)
CN (1) CN106250425B (zh)
WO (1) WO2018018801A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968798B * 2019-10-25 2023-11-24 贝壳找房(北京)科技有限公司 House listing display method, apparatus, readable storage medium, and processor
CN114339434A (zh) * 2020-09-30 2022-04-12 阿里巴巴集团控股有限公司 Method and apparatus for displaying the try-on effect of goods
US11797632B2 (en) * 2021-03-01 2023-10-24 Microsoft Technology Licensing, Llc Image reranking and presentation for visual exploration

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1632832A * 2003-12-24 2005-06-29 毛新 Online clothes fitting on one's own image
CN101216841A * 2008-01-14 2008-07-09 南京搜拍信息技术有限公司 Interactive image search system and method
CN105512931A * 2015-12-09 2016-04-20 北京镜联视界科技有限公司 Method and apparatus for online intelligent eyeglasses fitting

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001325608A 2000-05-15 2001-11-22 Nippon Telegr & Teleph Corp <Ntt> Image display method, image display apparatus, recording medium recording an image display program, and electronic payment method
JP2001325297A 2000-05-17 2001-11-22 Nec Software Hokuriku Ltd Product image display system, product image browsing method, and recording medium
JP2004318359A 2003-04-15 2004-11-11 Casio Comput Co Ltd Product sales apparatus, product sales system, and program
JP2009508274A 2005-09-13 2009-02-26 スペースタイムスリーディー・インコーポレーテッド System and method for providing a three-dimensional graphical user interface
US8751502B2 (en) * 2005-11-29 2014-06-10 Aol Inc. Visually-represented results to search queries in rich media content
US8306969B2 (en) * 2008-09-23 2012-11-06 Microsoft Corporation Linking search queries to rich media themes
US8316037B1 (en) * 2009-01-30 2012-11-20 Google Inc. Providing remedial search operation based on analysis of user interaction with search results
CN101819663A (zh) * 2009-08-27 2010-09-01 珠海琳琅信息科技有限公司 Virtual clothing try-on system
US8386454B2 (en) 2009-09-20 2013-02-26 Yahoo! Inc. Systems and methods for providing advanced search result page content
WO2012039054A1 (ja) 2010-09-24 2012-03-29 株式会社フォーサイド・ドット・コム Book content distribution system and content server
TW201235867A (en) * 2011-02-18 2012-09-01 Hon Hai Prec Ind Co Ltd System and method for searching related terms
JP2014522005A (ja) 2011-03-31 2014-08-28 フェイスケーキ マーケティング テクノロジーズ,インコーポレイテッド Targeted marketing system and method
US20130238612A1 (en) * 2012-03-08 2013-09-12 Xerox Corporation Method and apparatus for providing refined search results for a query based on one or more user interactions
CN104090923B (zh) * 2012-05-04 2018-06-26 北京奇虎科技有限公司 Method and apparatus for displaying rich media information in a browser
US8843483B2 (en) * 2012-05-29 2014-09-23 International Business Machines Corporation Method and system for interactive search result filter
CN202870858U (zh) * 2012-06-12 2013-04-10 杭州宙捷科技有限公司 Intelligent interactive fitting system and apparatus
WO2013188603A2 (en) * 2012-06-12 2013-12-19 Yahoo! Inc Systems and methods involving search enhancement features associated with media modules
US9305102B2 (en) * 2013-02-27 2016-04-05 Google Inc. Systems and methods for providing personalized search results based on prior user interactions
US9256687B2 (en) * 2013-06-28 2016-02-09 International Business Machines Corporation Augmenting search results with interactive search matrix
US20150220647A1 (en) * 2014-02-01 2015-08-06 Santosh Kumar Gangwani Interactive GUI for clustered search results
CN104063521B (zh) * 2014-07-17 2018-09-11 百度在线网络技术(北京)有限公司 Search service implementation method and apparatus
JP6436762B2 (ja) 2014-12-25 2018-12-12 株式会社野村総合研究所 Information processing apparatus and service providing method
CN104899305A (zh) * 2015-06-12 2015-09-09 百度在线网络技术(北京)有限公司 Recommendation method and apparatus for mobile search
CN105575198B (zh) * 2015-12-18 2019-01-11 北京美院帮网络科技有限公司 Method and apparatus for displaying teaching videos

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1632832A * 2003-12-24 2005-06-29 毛新 Online clothes fitting on one's own image
CN101216841A * 2008-01-14 2008-07-09 南京搜拍信息技术有限公司 Interactive image search system and method
CN105512931A * 2015-12-09 2016-04-20 北京镜联视界科技有限公司 Method and apparatus for online intelligent eyeglasses fitting

Also Published As

Publication number Publication date
US20190188231A1 (en) 2019-06-20
CN106250425B (zh) 2020-11-03
JP2019516182A (ja) 2019-06-13
JP7126453B2 (ja) 2022-08-26
US11100180B2 (en) 2021-08-24
CN106250425A (zh) 2016-12-21

Similar Documents

Publication Publication Date Title
JP6363758B2 (ja) Gesture-based tagging to view related content
US10896457B2 (en) Synchronized audiovisual responses to user requests
JP7356206B2 (ja) Content recommendation and display
US10572556B2 (en) Systems and methods for facilitating enhancements to search results by removing unwanted search results
US20190138815A1 (en) Method, Apparatus, User Terminal, Electronic Equipment, and Server for Video Recognition
US20180088969A1 (en) Method and device for presenting instructional content
WO2018077214A1 (zh) Information search method and apparatus
WO2016091044A1 (zh) Hot word recommendation method, apparatus, system, device, and computer storage medium
WO2017190471A1 (zh) TV shopping information processing method and apparatus
CN107430626A (zh) Providing suggested voice-based action queries
JP2019507417A (ja) User interface for multivariable search
WO2015172359A1 (zh) Object search method and apparatus
CN104281656B (zh) Method and apparatus for adding tag information in an application
TW201523426A (zh) Actionable content displayed on a touch screen
WO2014000645A1 (zh) Picture-based interaction method, apparatus, and server
CN109002338A (zh) Page rendering and page decoration information processing method and apparatus
CN107015979B (zh) Data processing method and apparatus, and intelligent terminal
CN109325143B (zh) Method and apparatus for creating a playlist, storage medium, and processor
WO2018018801A1 (zh) Interaction method and apparatus for search results
TW201248450A (en) Background audio listening for content recognition
CN107622074A (zh) Data processing method and apparatus, and computing device
CN105183763A (zh) Method and apparatus for implementing a background of a search result page
WO2015200914A1 (en) Techniques for simulating kinesthetic interactions
CN109977390A (zh) Method and apparatus for generating text
US20170147694A1 (en) Method and system for providing interaction driven electronic social experience

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018554057

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16910380

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16910380

Country of ref document: EP

Kind code of ref document: A1