CN113900618B - Information playing method and device, electronic equipment and storage medium - Google Patents


Publication number
CN113900618B
CN113900618B (application number CN202111162316.7A)
Authority
CN
China
Prior art keywords
information
target
control
application
triggered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111162316.7A
Other languages
Chinese (zh)
Other versions
CN113900618A (en)
Inventor
Jiang Lei (江雷)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Koubei Network Technology Co Ltd
Original Assignee
Zhejiang Koubei Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Koubei Network Technology Co Ltd filed Critical Zhejiang Koubei Network Technology Co Ltd
Priority to CN202111162316.7A priority Critical patent/CN113900618B/en
Publication of CN113900618A publication Critical patent/CN113900618A/en
Application granted granted Critical
Publication of CN113900618B publication Critical patent/CN113900618B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0641 Shopping interfaces

Abstract

The embodiments of the application provide an information playing method and device, an electronic device, and a storage medium. The method comprises the following steps: when a target application is in a visually impaired user operation mode, detecting whether a target control in a page of the target application is the current focus element; if so, playing target information by voice, wherein the target information is obtained from the information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode. According to the information playing method provided by the embodiments of the application, when a target control in a page of the target application is the current focus element, the played target information is derived from the information the target control can display after being triggered in the non-visually-impaired user operation mode, thereby solving the problem of what information to play by voice when a control in an application page is selected as the current focus element.

Description

Information playing method and device, electronic equipment and storage medium
This application is a divisional application of the Chinese patent application filed on May 7, 2021, with application number 202110502860.5 and entitled "Information playing method and device, electronic equipment and storage medium".
Technical Field
The present application relates to the field of computer technology, and in particular to an information playing method. The application also relates to an information playing device, an electronic device, and a storage medium.
Background
In order to enable visually impaired people to operate smart mobile terminals, currently mainstream smart mobile terminals generally support a visually impaired user operation mode. In this mode, the terminal provides voice assistance so that a visually impaired user can interact with it conveniently and effectively. For example, when a user selects a control in a page of a take-out APP while the terminal is in barrier-free mode, the terminal takes the selected control as the current focus element and plays, by voice, information prompting the control's name, so that the visually impaired user knows which control is currently selected. However, for some controls in an application page, playing only the control's name when it is selected as the current focus element cannot meet the needs of visually impaired users. Therefore, what information to play by voice when a control in an application page is selected as the current focus element has become a problem to be solved.
In other scenarios, the same problem may also arise of what information to play by voice when a control in an application page is selected as the current focus element.
Disclosure of Invention
The embodiments of the present application provide an information playing method and device, an electronic device, and a storage medium, so as to solve the problem of what information to play by voice when a control in an application page is selected as the current focus element.
The embodiment of the application provides an information playing method, which comprises the following steps:
when a target application is in a visually impaired user operation mode, detecting whether a target control in a page of the target application is the current focus element;
if so, playing target information by voice, wherein the target information is obtained from information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
Optionally, the information playing method provided in the embodiment of the present application further includes: obtaining a control that hides information in the page as the target control;
the information that can be displayed after the target control is triggered is information hidden in the page that can be displayed once the target control is triggered.
Optionally, that the information that can be displayed after the target control is triggered is information hidden in the page includes: the information that can be displayed after the target control is triggered is information selected from the information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
Optionally, the target control is a control for providing delivery service for visually impaired users;
that the information that can be displayed after the target control is triggered is information selected from the information displayable in the non-visually-impaired user operation mode includes: the information that can be displayed after the target control is triggered is delivery service information selected from the information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
Optionally, the detecting whether the target control in the page of the target application is the current focus element includes: detecting whether the target control is the current focus element by detecting whether a target operation on the target control is obtained.
Optionally, the target operation on the target control includes at least one of the following operations:
a click operation on the target control;
a long-press operation on the target control;
a slide operation on the target control.
Optionally, the detecting whether the target control in the page of the target application is the current focus element includes: detecting whether the target control is the current focus element by detecting whether a target operation on the element combination to which the target control belongs is obtained, wherein the element combination to which the target control belongs is an associated combination formed by the target control and other elements in the page.
Optionally, the playing target information by voice includes: sequentially playing, in a preset playback order, the element-combination information corresponding to the element combination, wherein the element-combination information is obtained from information that can be displayed after the element combination is triggered when the target application is in a non-visually-impaired user operation mode, and the element-combination information carries the target information.
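As a minimal illustration of the sequential playback in this optional step (a sketch only; the data structures and names below are hypothetical and not part of the claimed method), the element-combination information can be ordered before playback as follows:

```python
# Illustrative sketch: play element-combination info entries in a preset
# playback order; entries not covered by the preset order fall back to
# their top-to-bottom position in the page.

def order_playback(items, preset_order):
    """Return info entries sorted by preset playback order, then by
    page position for entries the preset order does not mention."""
    rank = {name: i for i, name in enumerate(preset_order)}
    return sorted(items,
                  key=lambda it: (rank.get(it["name"], len(preset_order)),
                                  it["position"]))
```

A playback loop would then feed the ordered entries to the voice output one by one.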
Optionally, the playing target information by voice includes:
obtaining target document information for introducing the target information;
and playing the target document information by voice.
Optionally, the information playing method provided in the embodiment of the present application further includes:
obtaining the target information when the target application is in a non-visually-impaired user operation mode;
pre-configuring the target document information for the target information;
and establishing a correspondence among the target control, the target information, and the target document information.
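The correspondence established above can be sketched as a lookup table keyed by a control identifier. This is only an illustration under assumed names (the ids, category labels, and text below are hypothetical, not taken from the patent):

```python
# Illustrative sketch of the correspondence among target control,
# target information, and target document information.

def build_correspondence():
    """For each target control id, record the target information it can
    reveal in the non-visually-impaired mode, plus the pre-configured
    document (script) text that introduces that information."""
    return {
        "btn_all_categories": {
            "target_info": ["Snack and fast food (5000)",
                            "Chinese cuisine (1500)"],
            "document_info": "Hidden under this button are meal categories: "
                             "Snack and fast food (5000), "
                             "Chinese cuisine (1500).",
        },
    }

def lookup_document_info(correspondence, control_id):
    """Return the pre-configured script text for a control, or None
    when no correspondence entry exists for it."""
    entry = correspondence.get(control_id)
    return entry["document_info"] if entry else None
```

When the control becomes the current focus element, the user side resolves the control id through this table and plays the resulting text.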
Optionally, the information playing method provided in the embodiment of the present application further includes:
sending, to a server side, a correspondence acquisition request message requesting the correspondence among the target control, the target information, and the target document information;
and obtaining the correspondence among the target control, the target information, and the target document information returned by the server side for the correspondence acquisition request message.
Optionally, the obtaining target document information for introducing the target information includes: obtaining the target document information according to the correspondence among the target control, the target information, and the target document information.
Optionally, the information playing method provided in the embodiment of the present application further includes: sending, to a server side, a document acquisition request message requesting the target document information;
the obtaining target document information for introducing the target information includes: obtaining the target document information returned by the server side for the document acquisition request message.
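The request/response exchange just described can be sketched as follows (the message shapes and field names are assumptions for illustration; the patent does not specify a wire format):

```python
# Illustrative sketch: the document acquisition request carries the
# target control's identification; the server resolves it against its
# stored correspondence and returns the document information.

SERVER_TABLE = {  # hypothetical server-side correspondence
    "btn_all_categories": "Hidden under this button are the meal categories.",
}

def make_document_request(control_id):
    """User side: build a request message carrying the control's id."""
    return {"type": "document_request", "control_id": control_id}

def handle_document_request(message):
    """Server side: look up the document info for the requested control;
    an empty string stands in for 'no entry found'."""
    text = SERVER_TABLE.get(message["control_id"], "")
    return {"type": "document_response", "document_info": text}
```

The user side would then play `document_info` by voice on receipt of the response.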
Optionally, the target control includes at least one of the following types of controls:
a button-type control;
a bubble-type control;
a floating-layer-type control.
In another embodiment of the present application, there is also provided an information playing device, including:
a focus detection unit, configured to detect whether a target control in a page of a target application is the current focus element when the target application is in a visually impaired user operation mode;
an information playing unit, configured to play target information by voice when the target control is the current focus element, wherein the target information is obtained from information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
In another embodiment of the present application, there is also provided an electronic device including:
a processor;
and a memory for storing a computer program which, when run by the processor, performs the information playing method provided by the embodiments of the present application.
In another embodiment of the present application, there is further provided a storage medium storing a computer program which, when executed by a processor, performs the information playing method provided in the embodiments of the present application.
Compared with the prior art, the application has the following advantages:
the embodiments of the application provide an information playing method, which comprises the following steps: when a target application is in a visually impaired user operation mode, detecting whether a target control in a page of the target application is the current focus element; if so, playing target information by voice, wherein the target information is obtained from the information that can be displayed after the target control is triggered when the target application is in a non-visually-impaired user operation mode. According to this method, when a target control in a page of the target application is the current focus element, the played target information is derived from the information the target control can display after being triggered in the non-visually-impaired user operation mode, thereby solving the problem of what information to play by voice when a control in an application page is selected as the current focus element.
Drawings
Fig. 1 is a schematic diagram of a first scenario of an information playing method according to an embodiment of the present application.
Fig. 2 is a second schematic view of an information playing method according to an embodiment of the present application.
Fig. 3 is a third schematic view of an information playing method according to an embodiment of the present application.
Fig. 4 is a flowchart of an information playing method provided in the first embodiment of the present application.
Fig. 5 is a schematic diagram of an information playing device according to a second embodiment of the present application.
Fig. 6 is a schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be embodied in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
In order to present the information playing method provided by the embodiments of the present application more clearly, an application scenario of the method is introduced first. The method is typically applied to a scenario in which a visually impaired user operates a takeaway application to order food while the takeaway application is in the visually impaired user operation mode, and can also be applied to scenarios in which other applications are operated by a visually impaired user while in that mode, for example: a visually impaired user operating a music playing application to play music while the music playing application is in the visually impaired user operation mode.
In addition, the information playing method may be applied to other same or similar scenarios. In the embodiments of the present application, the method is described by taking as the specific example the scenario in which a visually impaired user operates the takeaway application to order food in the visually impaired user operation mode; for the detailed execution process or steps of the method, refer to this scenario.
The execution body of the information playing method provided by the embodiments of the present application is a user side capable of implementing the method, generally an electronic device on which the target application is installed. The user side is implemented as follows: a program or software implementing the information playing method is pre-configured on the electronic device corresponding to the user side, or a module implementing the method is pre-configured in the target application.
The electronic device is typically a smartphone, or one of a range of computers including tablet computers. The target application is generally an APP (application program) or a computer application. The following takes the electronic device being a mobile phone and the target application being a takeaway APP as an example to describe in detail the scenario in which a visually impaired user operates the takeaway APP to order food while the APP is in the visually impaired user operation mode.
Fig. 1 is a schematic diagram of a first scenario of the information playing method according to an embodiment of the present application. Fig. 1 includes the user side 101. After the user side 101 turns on the visually impaired user operation mode based on a trigger operation of the user, the takeaway APP is in the visually impaired user operation mode. The visually impaired user operation mode may be an operation mode of the mobile phone itself, or an operation mode of only specific applications on the phone, including the target application. The so-called visually impaired user operation mode is an operation mode that provides voice assistance for a visually impaired user, so that the user can interact effectively with the electronic device or the applications on it.
When the target takeaway APP is in the visually impaired user operation mode, the user side 101 enters the target takeaway APP after detecting the user's trigger operation for opening the application, and displays a page of the target takeaway APP. After displaying the page, the user side 101 obtains the controls that hide information in the page as target controls. That is, a target control is a control that can display further information after being triggered. In the specific implementation process, among all controls of the page, the controls through which hidden information in the page can be obtained are taken as target controls, such as the expand-more button ("∨") and the filter button in Fig. 1.
the target controls comprise controls of at least one of the following control button types; a bubble type control; a control of the floating layer type. That is, the so-called target control may be one or more of a button, a bubble, and a float layer.
Fig. 2 is a schematic diagram of a second scenario of the information playing method according to an embodiment of the present application. Fig. 2 includes the user side 101. After obtaining the controls that hide information in the page as target controls, the user side 101 further detects whether a target control in the page of the target takeaway APP is the current focus element. Specifically, the user side 101 does so by detecting whether a target operation on the target control is obtained, where the target operation includes, but is not limited to, a click operation on the target control.
If the user side detects a click operation on the target control, it generates a focus frame for the clicked target control and takes the control bearing the focus frame as the current focus element.
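A minimal sketch of this focus behaviour, under an assumed page model (the class and control ids below are hypothetical):

```python
# Illustrative sketch: a click on a target control draws a focus frame
# on it and makes it the current focus element of the page.

class PageFocus:
    def __init__(self, target_controls):
        self.target_controls = set(target_controls)
        self.current_focus = None  # id of the control bearing the focus frame

    def on_click(self, control_id):
        """If the clicked control is a target control, move the focus
        frame to it; return whether the current focus element changed."""
        if control_id in self.target_controls:
            self.current_focus = control_id
            return True
        return False
```

Detecting that `current_focus` now names a target control is what triggers the voice playback step that follows.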
Fig. 3 is a schematic diagram of a third scenario of the information playing method according to an embodiment of the present application. Fig. 3 includes the user side 101 and the server side 102. The server side 102 is a computing device providing services such as data processing and storage for the target takeaway APP, and is generally implemented as a server or a server cluster.
When the user side 101 detects that the target control is the current focus element, the user side 101 plays the target information by voice. Specifically, playing the target information by voice is implemented as: obtaining target document information for introducing the target information, and playing the target document information by voice. In the implementation process, the user side 101 sends to the server side 102 a document acquisition request message requesting the target document information; the message carries the identification information of the target control. The target document information is text pre-configured for introducing the target information. The target information is obtained from the information that can be displayed after the target control is triggered when the target takeaway APP is in the non-visually-impaired user operation mode.
Take the target control being an "All categories" button as an example. When the target takeaway APP is in the non-visually-impaired user operation mode, the information that can be displayed after the button is triggered is: "Snack and fast food, 5000", "Chinese cuisine, 1500", "Dessert and drinks, 500", and "Snack barbecue, 100"; these four entries are the target information. The document information for introducing the target information may then be: "Hidden under the All categories button is an introduction to the meal categories; from top to bottom in the page, the categories are: Snack and fast food, 5000; Chinese cuisine, 1500; Dessert and drinks, 500; Snack barbecue, 100."
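The composition of such introduction text from the hidden category list can be sketched as follows (a hypothetical helper; the wording template is an assumption, only the example category values come from the text above):

```python
# Illustrative sketch: compose document (script) information from the
# category list a button would reveal, in top-to-bottom page order.

def compose_document_info(button_name, categories):
    """Build the introduction text played when the button is focused.
    `categories` is a list of (name, count) pairs in page order."""
    listing = "; ".join(f"{name}, {count}" for name, count in categories)
    return (f"Hidden under the {button_name} button is an introduction to "
            f"the meal categories, from top to bottom: {listing}.")
```

In practice this text could be pre-configured on the server side rather than composed on the fly.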
It should be noted that the embodiments of the present application do not specifically limit the form or content of the target document information, as long as it clearly and briefly introduces the information that can be displayed after the target control is triggered when the target takeaway APP is in the non-visually-impaired user operation mode.
After obtaining the document acquisition request message carrying the identification information of the target control, the server side 102 obtains the target document information according to the correspondence among the identification information, the target information, and the target document information, and sends the target document information to the user side 101 in response to the document acquisition request message.
After the user side 101 obtains the target document information, it plays the target document information by voice.
The embodiments of the present application do not specifically limit the application scenario of the information playing method; for example, the method may also be applied to other scenarios in which an application in the visually impaired user operation mode is operated, which are not described in detail here. The application-scenario embodiment above is provided to facilitate understanding of the information playing method and does not limit the method.
First embodiment
A method for playing information is provided in a first embodiment of the present application, and is described below with reference to fig. 4.
Fig. 4 is a flowchart of an information playing method provided in the first embodiment of the present application.
In step S401, when the target application is in the visually impaired user operation mode, it is detected whether the target control in the page of the target application is the current focus element.
The execution body of the information playing method provided in the first embodiment of the present application is a user side capable of implementing the method, generally an electronic device on which the target application is installed. The user side is implemented as follows: a program or software implementing the method is pre-configured on the electronic device corresponding to the user side, or a module implementing the method is pre-configured in the target application. That is, the method may be applicable only to the target application, to several specific applications including the target application, or to all applications installed on the electronic device corresponding to the user side.
The electronic device is typically a smartphone, or one of a range of computers including tablet computers. The target application is typically an APP or a computer application. Because the information playing method provided in the first embodiment is aimed mainly at visually impaired users, the electronic device corresponding to the user side in the first embodiment is generally one that supports touch-screen operation. In addition, it may also be an electronic device operated through auxiliary devices such as a mouse.
To enable the information playing method provided in the first embodiment, a program or software implementing the method needs to be pre-configured on the electronic device corresponding to the user side, or a module implementing the method needs to be pre-configured in the target application.
In the first embodiment of the present application, the target application includes, but is not limited to, a takeaway application, a music playing application, a ride-hailing application, and a social application; specific implementation forms include a takeaway APP, music playing software, and so on.
The visually impaired user operation mode may be a mode pre-configured on the electronic device corresponding to the user side, or a mode pre-configured for specific applications; it provides voice assistance for visually impaired users so that they can interact effectively with the electronic device or the applications on it. Specifically, when a user selects an element in a page displayed by the electronic device, or in a page of the target application, as the current focus element through preset trigger operations such as clicking, long-pressing, sliding, or hovering the mouse over a target area for a preset duration, the electronic device plays preset information by voice, including but not limited to: the name information of the element, and information obtained from what the control can display after being triggered when the target application is in the non-visually-impaired user operation mode.
The so-called non-visually-impaired user operation mode is typically the normal operation mode of the electronic device, for example: the operation mode corresponding to the device's default settings.
In the first embodiment of the present application, before detecting whether the target control in the page of the target application is the current focus element, the target control needs to be determined in advance. A target control is a control that hides information in the page and can display further information after being triggered. In the specific implementation process, before the detection, the controls in the page of the target application are identified, and the controls through which hidden information in the page can be obtained are obtained as target controls; that is, they are calibrated as target controls.
The controls for hiding information in a page, i.e., the target controls, include but are not limited to "expand more" controls, "view more" controls, "filter" controls, "get more" controls, "show details" controls, and "catalog" controls. In a page, "expand more" and "get more" controls typically appear as chevron icons such as "∨" and ">"; "view more" and "show details" controls typically appear as ellipsis or arrow icons; "filter" controls typically appear as funnel icons; and "catalog" controls typically appear as list icons such as "≡".
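The calibration step can be sketched as a scan over the page's controls, under an assumed control model (the type tags and ids below are hypothetical labels for the control kinds just listed):

```python
# Illustrative sketch: scan a page's controls and calibrate those whose
# kind indicates hidden information as target controls.

HIDDEN_INFO_TYPES = {"expand_more", "view_more", "filter",
                     "get_more", "show_details", "catalog"}

def find_target_controls(page_controls):
    """Return the ids of controls through which hidden page
    information can be obtained, in page order."""
    return [c["id"] for c in page_controls
            if c["type"] in HIDDEN_INFO_TYPES]
```

Controls outside these kinds, such as plain labels, are left out of the target set.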
In the first embodiment of the present application, there are two specific implementations for detecting whether the target control in the page of the target application is the current focus element.
The first way: detect whether the target control is the current focus element by detecting whether a target operation for the target control is obtained. In this case, the target control is an element that exists alone in the page of the target application. The preset target operation for the target control includes at least one of the following: a clicking operation on the target control; a long-press operation on the target control; a sliding operation on the target control. The clicking operation may be a single click or a double click; the long-press operation may be pressing the target control for longer than a preset duration; and the sliding operation may be sliding upward, downward, leftward, rightward, and so on.
The second way: detect whether the target control is the current focus element by detecting whether a target operation for the element combination to which the target control belongs is obtained. The element combination to which the target control belongs is a combination, having an association relation, formed by the target control and other elements in the page; in this case the target control is one element of the combination. A common element combination is one obtained by combining the target control with a text element in the page, for example the combination of the text "All categories" with an "expand more" control, formed by nesting the target control with other elements in the page.
The preset target operation for the element combination to which the target control belongs includes at least one of the following: a clicking operation on the element combination; a long-press operation on the element combination; a sliding operation on the element combination. The clicking operation may be a single click or a double click; the long-press operation may be pressing the element combination for longer than a preset duration; and the sliding operation may be sliding upward, downward, leftward, rightward, and so on.
In the first embodiment of the present application, if a target operation for the target control, or for the element combination to which the target control belongs, is detected, the target control in the page of the target application is regarded as the current focus element.
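The two detection paths can be sketched together as follows. All names here are hypothetical; `PageElement` is a minimal stand-in for either a standalone control or an element combination that nests the control.

```python
# Preset operations that count as a target operation in this sketch.
TARGET_OPERATIONS = {"click", "double_click", "long_press", "slide"}

class PageElement:
    """Minimal stand-in for a page element or an element combination."""
    def __init__(self, name, members=()):
        self.name = name
        self.members = list(members)  # non-empty only for element combinations

def is_current_focus(operation, operated, target_control):
    """Return True if `operation` on `operated` makes `target_control` the focus element."""
    if operation not in TARGET_OPERATIONS:
        return False
    if operated is target_control:
        # First way: the operation targets the standalone control directly.
        return True
    # Second way: the operation targets an element combination containing the control.
    return target_control in operated.members
```

Either path ends in the same state, which is why step S402 (voice playback) can be shared by both implementations.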
In the first embodiment of the present application, the target control includes at least one of the following types of controls: a button-type control; a bubble-type control; a floating-layer-type control. That is, the so-called target control may be one or more of a button, a bubble, and a floating layer.
In the first embodiment of the present application, if it is detected that the target control in the page of the target application is the current focus element, step S402 is performed.
In step S402, the target information is played in a voice manner, where the target information is obtained according to information that can be displayed after the target control is triggered when the target application is in the non-visually impaired user operation mode.
The information that can be displayed after the target control is triggered is information hidden in the page that becomes visible once the target control is triggered. The target information corresponding to an "expand more" control is the information that the control can display after being triggered when the target application is in the non-visually impaired user operation mode. Taking a takeaway APP as an example, the information that a "category" expand-more control can display after being triggered when the takeaway APP is in the non-visually impaired user operation mode is generally more takeaway category information, such as: "snack cuisine 5000 class", "Chinese cuisine 1500 class", "dessert drink 500 class", and "snack barbecue 100 class". The information that a "specific dish" expand-more control can display after being triggered in that mode is generally delivery service information corresponding to the dish, such as: the delivery distance from the dish to the user, the delivery duration of the dish, the delivery fee of the dish, and the resources the dish is eligible for, where the eligible resources include but are not limited to "full-reduction discount resources" and "red-packet resources" applicable to the dish.
In order to provide a more targeted service for visually impaired users, in the information playing method provided by the first embodiment of the present application, the information that can be displayed after the target control is triggered is information hidden in the page, selected from the information that the control can display after being triggered when the target application is in the non-visually impaired user operation mode. Specifically, information with high relevance to the service provided by the target application may be selected. Because this selection can be personalized according to the type and usage scenario of the target application, the needs of visually impaired users with respect to the target control can be better met, and more targeted information can be provided for them.
For a takeaway APP, the information that can be displayed after the target control is triggered is delivery service information selected from the information the control can display after being triggered when the target application is in the non-visually impaired user operation mode. The so-called delivery service information includes, but is not limited to: dish type information, store category information, dish category information, dish introduction information, delivery distance information from the dish to the user, dish delivery duration information, dish delivery fee information, information on resources the dish is eligible for, and so on. In this case, the target control is a control for providing a delivery service for visually impaired users.
For music playing applications, the information that can be displayed after the target control is triggered is music service information selected from the information that can be displayed after the target control is triggered when the target application is in a non-visually impaired user operation mode. The so-called music service information includes, but is not limited to: "song ranking information", "song categorization information", "music type information", "singer introduction information", and the like. At this time, the target control is a control for providing a music service for visually impaired users.
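The per-application selection described for the takeaway and music examples can be sketched as a simple filter. This is an assumption-laden illustration: the application types, the key names, and the dictionary shape are all hypothetical; a real implementation would select by whatever service-relevance criteria the target application defines.

```python
# Hypothetical mapping from application type to its service-relevant info keys.
SERVICE_INFO_KEYS = {
    "takeaway": {"delivery_distance", "delivery_time", "delivery_fee", "resources"},
    "music": {"song_ranking", "song_category", "music_type", "singer_intro"},
}

def select_service_info(app_type, revealable):
    """Keep only the service-relevant items from everything the control reveals."""
    keys = SERVICE_INFO_KEYS.get(app_type, set())
    return {k: v for k, v in revealable.items() if k in keys}
```

Items outside the application's service set (for example, a banner advertisement) are dropped, so playback stays targeted.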
In the first embodiment of the present application, when whether the target control in the page of the target application is the current focus element is detected by detecting whether a target operation for the element combination to which the target control belongs is obtained, the specific implementation of playing the target information by voice is as follows: sequentially play the element combination information corresponding to the element combination according to a preset playing order, where the element combination information is obtained according to the information that the element combination can display after being triggered when the target application is in the non-visually impaired user operation mode, and the element combination information carries the target information.
In the first embodiment of the present application, the specific implementation of playing the target information by voice is: first, obtain target document information for introducing the target information; then, play the target document information by voice.
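The two steps can be sketched as a pair of functions. The phrasing of the generated document text is only an example modeled on the "All categories" and "Expand more" illustrations below; `speak` again stands in for a text-to-speech engine.

```python
def build_document_info(control_name, target_info):
    """Step 1: compose document text introducing the target information."""
    items = ", ".join(target_info)
    return (f"The hidden information under '{control_name}' is, "
            f"from top to bottom in the page: {items}.")

def play_target_info(control_name, target_info, speak):
    """Step 2: obtain the document info, then play it by voice."""
    text = build_document_info(control_name, target_info)
    speak(text)
    return text
```

Separating composition from playback matches the three retrieval strategies described later: the same `speak` step works whether the document text is built locally or fetched from a server.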
In the first embodiment of the present application, when the target control in the page of the target application is the current focus element, the target information obtained according to the information that the target control can display after being triggered when the target application is in the non-visually impaired user operation mode is played. In this way, a visually impaired user can learn the information that would be displayed after the target control is triggered without having to operate it, which provides convenience for visually impaired users and improves their experience of using the target application.
Taking as an example a target control that is an "All categories ∨" button in a takeaway APP page: when the takeaway APP is in the non-visually impaired user operation mode, the information the button can display after being triggered is "snack cuisine 5000 class", "Chinese cuisine 1500 class", "dessert drink 500 class", and "snack barbecue 100 class"; these are therefore the target information. In this case, the document information for introducing the target information may be: "The hidden information under the 'All categories ∨' button is introduction information of dish categories; from top to bottom in the page, in order of category size, they are: snack cuisine 5000 class, Chinese cuisine 1500 class, dessert drink 500 class, and snack barbecue 100 class."
Taking as an example a target control that is an "Expand more ∨" button in a page of music playing software: when the music playing software is in the non-visually impaired user operation mode, the information the button can display after being triggered is "Song 1", "Song 2", "Song 3", and "Song 4"; these are therefore the target information. In this case, the document information for introducing the target information may be: "The hidden information under the 'Expand more ∨' button is introduction information of songs; from top to bottom in the page, in order of song popularity, they are: Song 1, Song 2, Song 3, and Song 4."
In the first embodiment of the present application, the so-called ways of obtaining the target document information for introducing the target information include, but are not limited to, the following three ways:
the first way is: firstly, obtaining target information in advance when a target application is in a non-vision obstruction user operation mode; secondly, pre-configuring target document information for target information; thirdly, establishing a corresponding relation among the target control, the target information and the target document information; and finally, obtaining the target document information according to the corresponding relation among the target control, the target information and the target document information. That is, the user side obtains the target information when the target application is in the non-vision-impaired user operation mode in advance; pre-configuring target document information for target information; and establishing a corresponding relation among the target control, the target information and the target document information and storing the corresponding relation in the user side.
The second mode is as follows: sending a corresponding relation acquisition request message for requesting to acquire the corresponding relation among the target control, the target information and the target document information to the server; obtaining a corresponding relation among a target control, target information and target document information sent by a server side aiming at a corresponding relation acquisition request message; and obtaining the target document information according to the corresponding relation among the target control, the target information and the target document information. That is, the server obtains the target information in advance when the target application is in the non-visually impaired user operation mode; pre-configuring target document information for target information; and establishing a corresponding relation among the target control, the target information and the target document information, storing the corresponding relation in the server, sending a corresponding relation acquisition request message to the server when the user side has a demand, acquiring the corresponding relation among the target control, the target information and the target document information, and acquiring the target document information according to the corresponding relation among the target control, the target information and the target document information provided by the server.
The third way is: a document acquisition request message for requesting to acquire target document information is sent to a server; and obtaining target document information sent by the server side aiming at the document acquisition request message. That is, the server obtains the target information in advance when the target application is in the non-visually impaired user operation mode; pre-configuring target document information for target information; and establishing a corresponding relation among the target control, the target information and the target document information, storing the corresponding relation in the server, sending a document acquisition request message to the server when the user side has a demand, acquiring the corresponding relation among the target control, the target information and the target document information by the server aiming at the document acquisition request message, and acquiring the target document information according to the corresponding relation among the target control, the target information and the target document information and providing the target document information to the user side.
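The three retrieval strategies can be sketched side by side. All endpoint and parameter names here are hypothetical; `fetch_correspondence` and `fetch_doc` stand in for the request/response exchanges with the server described above.

```python
def get_doc_local(correspondence, control_id):
    """Way 1: look up a locally stored control -> (target info, doc) correspondence."""
    return correspondence[control_id]["doc"]

def get_doc_via_correspondence(fetch_correspondence, control_id):
    """Way 2: request the whole correspondence from the server, then look up locally."""
    return fetch_correspondence()[control_id]["doc"]

def get_doc_from_server(fetch_doc, control_id):
    """Way 3: request the document info itself from the server."""
    return fetch_doc(control_id)
```

The trade-off sketched here is where the lookup happens: way 1 needs no network at playback time, way 2 moves maintenance of the correspondence to the server, and way 3 keeps the user side thinnest by returning only the final document text.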
A first embodiment of the present application provides an information playing method, including: when a target application is in a visually impaired user operation mode, detecting whether a target control in a page of the target application is the current focus element; if so, playing target information by voice, where the target information is obtained according to the information that the target control can display after being triggered when the target application is in a non-visually impaired user operation mode. According to this method, when the target control in the page of the target application is the current focus element, the target information is played, thereby solving the problem of what information to play by voice when a control in an application page is selected as the current focus element.
Second embodiment
The second embodiment of the present application further provides an information playing device, corresponding to the application scenario of the information playing method provided by the embodiment of the present application and the information playing method provided by the first embodiment. Since the embodiment of the apparatus is substantially similar to the application scenario of the information playing method provided in the embodiment of the present application and the information playing method provided in the first embodiment, the description is relatively simple, and please refer to the application scenario of the information playing method provided in the embodiment of the present application and the partial description of the information playing method provided in the first embodiment for relevant points. The device embodiments described below are merely illustrative.
Fig. 5 is a schematic diagram of an information playing device according to a second embodiment of the present application.
The information playing device provided in the second embodiment of the present application includes:
a focus detection unit 501, configured to detect, when a target application is in a visually impaired user operation mode, whether a target control in a page of the target application is a current focus element;
the information playing unit 502 is configured to play, by voice, target information when the target control is the current focus element, where the target information is obtained according to the information that the target control can display after being triggered when the target application is in a non-visually impaired user operation mode.
Optionally, the information playing device provided in the second embodiment of the present application further includes: the target control obtaining unit is used for obtaining a control for hiding information in the page as the target control;
the information which can be displayed after the target control is triggered is information which can be displayed after the target control is triggered and is hidden in the page.
Optionally, the information that can be displayed after the target control is triggered is information that can be displayed after the target control is triggered and is hidden in the page, including: the information which can be displayed after the target control is triggered is information selected from the information which can be displayed after the target control is triggered when the target application is in a non-vision obstacle user operation mode.
Optionally, the target control is a control for providing distribution service for visually impaired users;
the information which can be displayed after the target control is triggered is information selected from the information which can be displayed after the target control is triggered when the target application is in a non-vision obstacle user operation mode, and the information comprises: the information which can be displayed after the target control is triggered is distribution service information selected from the information which can be displayed after the target control is triggered when the target application is in a non-vision obstacle user operation mode.
Optionally, the focus detection unit 501 is specifically configured to detect whether the target control is the current focus element by detecting whether a target operation for the target control is obtained.
Optionally, the target operation for the target control includes at least one of the following operations:
clicking operation aiming at the target control;
a long-press operation for the target control;
and performing sliding operation on the target control.
Optionally, the focus detection unit 501 is specifically configured to detect whether the target control is the current focus element by detecting whether a target operation for an element combination to which the target control belongs is obtained, where the element combination to which the target control belongs is an element combination that is formed by the target control and other elements in the page and has an association relationship.
Optionally, the information playing unit 502 is specifically configured to sequentially play, according to a preset playing order, element combination information corresponding to the element combination, where the element combination information is obtained according to information that can be displayed after the element combination is triggered when the target application is in a non-visually impaired user operation mode, and the element combination information carries the target information.
Optionally, the information playing unit 502 is specifically configured to: obtain target document information for introducing the target information; and play the target document information by voice.
Optionally, the information playing device provided in the second embodiment of the present application further includes:
a target information acquisition unit configured to acquire the target information when the target application is in a non-visually impaired user operation mode;
a target document information configuration unit configured to pre-configure the target document information for the target information;
and the corresponding relation establishing unit is used for establishing the corresponding relation among the target control, the target information and the target document information.
Optionally, the information playing device provided in the second embodiment of the present application further includes:
a corresponding relation acquisition request message sending unit, configured to send a corresponding relation acquisition request message to a server, where the corresponding relation acquisition request message is used to request to acquire the corresponding relation among the target control, the target information and the target document information;
The corresponding relation obtaining unit is used for obtaining the corresponding relation among the target control, the target information and the target document information sent by the server side aiming at the corresponding relation obtaining request message.
Optionally, the obtaining the target document information for introducing the target information includes: and obtaining the target document information according to the corresponding relation among the target control, the target information and the target document information.
Optionally, the information playing device provided in the second embodiment of the present application further includes: a document acquisition request message sending unit, configured to send a document acquisition request message for requesting to acquire the target document information to a server;
the obtaining the target document information for introducing the target information includes: and obtaining the target document information sent by the server side aiming at the document acquisition request message.
Optionally, the target control includes at least one of the following types of controls:
a button-type control;
a bubble type control;
a control of the floating layer type.
Third embodiment
The third embodiment of the present application further provides an electronic device, corresponding to the application scenario of the information playing method provided by the embodiment of the present application, the information playing method provided by the first embodiment, and the information playing apparatus provided by the second embodiment of the present application. Since the third embodiment is substantially similar to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and the information playing device provided in the second embodiment of the present application, the description is relatively simple, and the relevant points will only be referred to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and part of the description of the information playing device provided in the second embodiment of the present application. The third embodiment described below is merely illustrative.
Fig. 6 is a schematic diagram of an electronic device according to an embodiment of the present application.
The electronic device includes: a processor 601;
a memory 602, configured to store a computer program, where the computer program, when executed by the processor, performs the information playing method provided in the embodiments of the present application.
It should be noted that, for the detailed description of the electronic device provided in the third embodiment of the present application, reference may be made to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and the related description of the information playing apparatus provided in the second embodiment of the present application, which are not repeated herein.
Fourth embodiment
The fourth embodiment of the present application further provides a computer storage medium, corresponding to the application scenario of the information playing method provided by the embodiment of the present application, the information playing method provided by the first embodiment, and the information playing device provided by the second embodiment of the present application. Since the fourth embodiment is substantially similar to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and the information playing device provided in the second embodiment of the present application, the description is relatively simple; for relevant points, refer to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and the partial description of the information playing device provided in the second embodiment of the present application. The embodiments described below are merely illustrative.
The storage medium stores a computer program that, when executed by a processor, performs the information playing method provided in the embodiments of the present application.
It should be noted that, for the detailed description of the computer storage medium provided in the fourth embodiment of the present application, reference may be made to the application scenario of the information playing method provided in the embodiment of the present application, the information playing method provided in the first embodiment, and the related description of the information playing apparatus provided in the second embodiment of the present application, which are not repeated herein.
While the preferred embodiment has been described, it is not intended to limit the invention thereto, and any person skilled in the art may make variations and modifications without departing from the spirit and scope of the invention, so that the scope of the invention shall be defined by the claims.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include computer readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (Flash RAM). Memory is an example of a computer-readable medium.
Computer readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission media, that can be used to store information accessible by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

Claims (15)

1. An information playing method, comprising:
when a target application is in a visually impaired user operation mode, detecting whether a target operation for an element combination to which a target control belongs is obtained, wherein the element combination to which the target control belongs is an element combination, having an association relation, formed by the target control and other elements; the target control is a control that can display information after being triggered when the target application is in a non-visually impaired user operation mode; and the information that can be displayed after triggering is information hidden in a page that is displayed after the control is triggered;
if so, playing the element combination information corresponding to the element combination in a voice mode, wherein the element combination information is obtained according to information which can be displayed after the element combination is triggered when the target application is in a non-visual obstacle user operation mode.
2. The method according to claim 1, wherein said playing the element combination information corresponding to the element combination by voice includes: and sequentially playing the element combination information corresponding to the element combination according to a preset playing sequence.
3. The method according to claim 1 or 2, wherein the element combination information is obtained from information that can be presented after the element combination is triggered when the target application is in a non-visually impaired user operation mode.
4. The method of claim 1, wherein the other elements comprise text elements.
5. The method of claim 1, wherein the target operation for the element combination to which the target control belongs comprises at least one of:
a click operation on the element combination to which the target control belongs;
a long-press operation on the element combination to which the target control belongs;
a swipe operation on the element combination to which the target control belongs.
6. The method of claim 1, wherein the target control is a control that, after being triggered when the target application is in a non-visually-impaired user operation mode, can reveal information hidden in a page.
7. The method of claim 1, wherein the information presentable after the target control is triggered is information selected from the information that can be presented after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
8. The method of claim 1, wherein the target control is a control for providing delivery services to visually impaired users, and the information presentable after the target control is triggered is delivery service information selected from the information that can be presented after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
9. The method of claim 1, wherein the target control comprises at least one of the following types of controls:
a button-type control;
a bubble-type control;
a floating-layer-type control.
10. The method of claim 1, wherein the element combination information includes target information obtained from information that can be presented after the target control is triggered when the target application is in a non-visually-impaired user operation mode.
11. The method as recited in claim 10, further comprising: obtaining target document information for introducing the target information;
wherein playing the element combination information corresponding to the element combination by voice comprises: playing the target document information by voice.
12. The method of claim 11, wherein obtaining the target document information for introducing the target information comprises:
acquiring the target information in advance when the target application is in the non-visually-impaired user operation mode;
pre-configuring target document information for the target information;
establishing a correspondence among the target control, the target information, and the target document information;
and obtaining the target document information for introducing the target information according to the correspondence among the target control, the target information, and the target document information.
13. The method of claim 11, wherein obtaining the target document information for introducing the target information comprises:
sending, to a server, a correspondence acquisition request message for requesting the correspondence among the target control, the target information, and the target document information;
obtaining the correspondence among the target control, the target information, and the target document information sent by the server in response to the correspondence acquisition request message;
and obtaining the target document information according to the correspondence among the target control, the target information, and the target document information.
14. The method of claim 11, wherein obtaining the target document information for introducing the target information comprises:
sending, to a server, a document acquisition request message for requesting the target document information;
and obtaining the target document information sent by the server in response to the document acquisition request message.
15. An information playback apparatus, comprising:
a focus detection unit, configured to detect, when a target application is in a visually-impaired user operation mode, whether a target operation for an element combination to which a target control belongs is obtained, wherein the element combination to which the target control belongs is an element combination formed by the target control and other elements having an association relation with the target control, the target control is a control that can present information after being triggered when the target application is in a non-visually-impaired user operation mode, and the information that can be presented after triggering is information that is hidden in a page and becomes presentable once the control is triggered;
and an information playing unit, configured to play, by voice, the element combination information corresponding to the element combination if the target operation for the element combination to which the target control belongs is detected, wherein the element combination information is obtained according to information that can be presented after the element combination is triggered when the target application is in the non-visually-impaired user operation mode.
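The core flow claimed above (detect a target operation on an element combination in a visually-impaired operation mode, then voice-play the combination's text in a preset order) can be sketched as follows. The `Element`, `ElementCombination`, and `combined_announcement` names are hypothetical illustrations, not the patent's implementation; a real application would hook into a platform accessibility service and a TTS engine rather than returning a string.

```python
# Minimal sketch of the claimed flow, under assumed data structures.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Element:
    text: str             # text the element can present
    is_control: bool = False  # True for the target control in the combination

@dataclass
class ElementCombination:
    # A target control grouped with other associated elements (e.g. its label).
    elements: List[Element] = field(default_factory=list)

def combined_announcement(combo: ElementCombination,
                          accessibility_mode: bool) -> Optional[str]:
    """Return the text to speak when a target operation (click, long-press,
    swipe) lands on the combination while the app is in the visually-impaired
    operation mode; return None when no announcement should be played."""
    if not accessibility_mode:
        return None
    if not any(e.is_control for e in combo.elements):
        return None  # no target control in this combination
    # Play element texts in a preset order (here: document order).
    return " ".join(e.text for e in combo.elements if e.text)

combo = ElementCombination([
    Element("Delivery options", is_control=True),
    Element("Contactless delivery available"),
])
print(combined_announcement(combo, accessibility_mode=True))
# → Delivery options Contactless delivery available
```

In the claimed design the announcement text may also come from pre-configured "document information" mapped to the control, fetched locally or from a server; the sketch above only shows the local, document-order case.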
CN202111162316.7A 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium Active CN113900618B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111162316.7A CN113900618B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111162316.7A CN113900618B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium
CN202110502860.5A CN112988108B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110502860.5A Division CN112988108B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113900618A CN113900618A (en) 2022-01-07
CN113900618B true CN113900618B (en) 2023-12-19

Family

ID=76337342

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110502860.5A Active CN112988108B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium
CN202111162316.7A Active CN113900618B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110502860.5A Active CN112988108B (en) 2021-05-07 2021-05-07 Information playing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (2) CN112988108B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489833B (en) * 2021-06-29 2022-11-04 维沃移动通信有限公司 Information broadcasting method, device, equipment and storage medium
CN114827740A (en) * 2022-04-15 2022-07-29 海信视像科技股份有限公司 Display device, control device and method for starting visual impairment function service
CN115167680A (en) * 2022-07-14 2022-10-11 腾讯科技(深圳)有限公司 Vibration reminding method, related equipment and computer storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1678982A (en) * 2002-09-30 2005-10-05 微软公司 System and method for making user interface elements known to an application and user
CN106055721A (en) * 2016-07-15 2016-10-26 深圳市联谛信息无障碍有限责任公司 Accessible webpage processing method and relevant equipment
CN106406867A (en) * 2016-09-05 2017-02-15 深圳市联谛信息无障碍有限责任公司 Android system-based screen reading method and apparatus
CN112148408A (en) * 2020-09-27 2020-12-29 深圳壹账通智能科技有限公司 Barrier-free mode implementation method and device based on image processing and storage medium
CN112307390A (en) * 2020-11-26 2021-02-02 广东南方网络信息科技有限公司 Website barrier-free informatization processing method, device, storage medium and system

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
CN103377209A (en) * 2012-04-20 2013-10-30 阿里巴巴集团控股有限公司 Method and terminal for browsing webpages with the help of voices
US9507561B2 (en) * 2013-03-15 2016-11-29 Verizon Patent And Licensing Inc. Method and apparatus for facilitating use of touchscreen devices
KR20160029587A (en) * 2014-09-05 2016-03-15 삼성전자주식회사 Method and apparatus of Smart Text Reader for converting Web page through TTS
CN105487744A (en) * 2014-09-23 2016-04-13 中兴通讯股份有限公司 Method and device for realizing interaction on accessible intelligent terminal
CN105404415A (en) * 2015-10-19 2016-03-16 惠州Tcl移动通信有限公司 Mobile terminal and control method of mobile terminal
CN106648291A (en) * 2016-09-28 2017-05-10 珠海市魅族科技有限公司 Method and device for displaying information and broadcasting information
CN109117047A (en) * 2017-06-22 2019-01-01 西安中兴新软件有限责任公司 terminal control method and device, mobile terminal and computer readable storage medium
CN111263204B (en) * 2018-11-30 2022-09-20 青岛海尔多媒体有限公司 Control method and device for multimedia playing equipment and computer storage medium
CN109803050B (en) * 2019-01-14 2020-09-25 南京点明软件科技有限公司 Full screen guiding clicking method suitable for blind person to operate mobile phone
CN109893852B (en) * 2019-02-26 2022-07-26 北京心智互动科技有限公司 Interface information processing method and device
CN110222550A (en) * 2019-07-11 2019-09-10 上海肇观电子科技有限公司 Information broadcasting method, circuit, casting equipment, storage medium, intelligent glasses
CN111399638B (en) * 2020-02-29 2023-06-30 浙江工业大学 Blind computer and intelligent mobile phone auxiliary control method suitable for blind computer
CN112214155B (en) * 2020-06-09 2022-04-26 北京沃东天骏信息技术有限公司 View information playing method, device, equipment and storage medium
CN112486451B (en) * 2020-11-27 2022-03-11 掌阅科技股份有限公司 Voice broadcasting method, computing device and computer storage medium
CN112578967B (en) * 2020-12-24 2022-04-15 深圳市联谛信息无障碍有限责任公司 Chart information reading method and mobile terminal
CN112711372B (en) * 2020-12-28 2022-03-11 掌阅科技股份有限公司 Page response method in visual impairment mode, computing device and computer storage medium


Also Published As

Publication number Publication date
CN113900618A (en) 2022-01-07
CN112988108B (en) 2021-08-10
CN112988108A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
CN113900618B (en) Information playing method and device, electronic equipment and storage medium
US10529326B2 (en) Suggesting intent frame(s) for user request(s)
US20200311342A1 (en) Populating values in a spreadsheet using semantic cues
US9069443B2 (en) Method for dynamically displaying a personalized home screen on a user device
US10114534B2 (en) System and method for dynamically displaying personalized home screens respective of user queries
US9996222B2 (en) Automatic deep view card stacking
JP2020507861A (en) Method and apparatus for providing search results
AU2014309040B9 (en) Presenting fixed format documents in reflowed format
US10412037B2 (en) Methods and systems for providing notifications to users of a social networking service
US10311500B2 (en) Methods and systems for developer onboarding for software-development products
US11586690B2 (en) Client-side personalization of search results
US10860187B1 (en) Object oriented interactions
EP3207689A1 (en) Suggesting activities
US10002113B2 (en) Accessing related application states from a current application state
US10282736B2 (en) Dynamic modification of a parameter of an image based on user interest
CN107665447B (en) Information processing method and information processing apparatus
US20160334969A1 (en) Methods and Systems for Viewing an Associated Location of an Image
US10613828B2 (en) Dynamic and personalized filtering of media content
US10635729B2 (en) Research application and service
CN113190697A (en) Image information playing method and device
CN111077991A (en) Point reading control method and terminal equipment
US11782976B2 (en) Method for querying information and display device
US10261952B1 (en) Restoring temporal coherence of previously seen ranked content
CN113253909A (en) Control method and device for resource information display and play, storage and electronic equipment
KR20150140947A (en) Content provision method of objects and the apparatus using the method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant