WO2022111047A1 - Voice broadcast method, computing device and computer storage medium - Google Patents

Voice broadcast method, computing device and computer storage medium

Info

Publication number
WO2022111047A1
WO2022111047A1 · PCT/CN2021/121810 · CN2021121810W
Authority
WO
WIPO (PCT)
Prior art keywords
control
controls
voice
user
book
Prior art date
Application number
PCT/CN2021/121810
Other languages
English (en)
Chinese (zh)
Inventor
杨树彬
Original Assignee
掌阅科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 掌阅科技股份有限公司 filed Critical 掌阅科技股份有限公司
Publication of WO2022111047A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44521Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F9/44526Plug-ins; Add-ons

Definitions

  • the present disclosure relates to the field of computer technology, and in particular, to a voice broadcast method, a computing device, and a computer storage medium.
  • the operation control method adopted is to generate program effects through touch operations, display them on the screen, and then control and operate the displayed effects.
  • the smart terminal usually has a voice broadcast function, and the user can learn which controls are clicked through voice broadcast.
  • in the existing voice broadcast method, each time the user triggers a control, the control focus content of that control is voice broadcast in response to the trigger operation. Because an application contains many controls, the user frequently triggers different controls when clicking on the screen, resulting in frequent broadcasts, so the user cannot quickly find the desired control; operating the controls is cumbersome, which also degrades the user experience.
  • the present disclosure is proposed to provide a voice broadcast method, computing device and computer storage medium that overcome the above problems or at least partially solve the above problems.
  • a voice broadcasting method, comprising: associating at least two controls in a preset area of an application page that satisfy an association condition; and, in response to a user's triggering operation on any control in the application page, determining at least one control associated with the control and sequentially voice broadcasting the control focus content of the at least two associated controls.
  • a computing device including: a processor, a memory, a communication interface, and a communication bus, and the processor, the memory, and the communication interface communicate with each other through the communication bus;
  • the memory is used to store at least one executable instruction, and the executable instruction causes the processor to perform the following operations:
  • in response to the user's triggering operation on any control in the application page, at least one control associated with the control is determined, and the control focus content of the at least two associated controls is sequentially voice broadcast.
  • a non-volatile computer-readable storage medium, where at least one executable instruction is stored in the non-volatile computer-readable storage medium, and the executable instruction causes a processor to execute the operations corresponding to the above-mentioned voice broadcast method.
  • a computer program product comprising a computer program stored on the above-mentioned non-volatile computer-readable storage medium.
  • in response to the user's trigger operation, at least one control associated with the triggered control is determined, and the control focus content of the at least two associated controls is sequentially voice broadcast, realizing the effect of broadcasting the focus content of multiple associated controls with one trigger operation. This simplifies user operation, enables the user to quickly find the desired control without frequently triggering controls, and avoids frequent switching between trigger operations and voice broadcasts, thereby improving the user experience.
  • FIG. 1 shows a schematic flowchart of a voice broadcast method according to an embodiment of the present disclosure
  • FIG. 2A shows a schematic flowchart of a voice broadcast method according to another embodiment of the present disclosure
  • FIG. 2B is a schematic diagram of the operation areas corresponding to controls;
  • FIG. 2C is a schematic diagram of a merge operation area;
  • FIG. 2D is a schematic diagram of an activation control;
  • FIG. 3 shows a schematic structural diagram of a computing device according to an embodiment of the present disclosure.
  • FIG. 1 shows a schematic flowchart of a voice broadcast method according to an embodiment of the present disclosure. As shown in Figure 1, the method includes the following steps:
  • Step S101: with respect to at least two controls in a preset area in the application page, determine whether the at least two controls satisfy the association condition; if so, go to step S102.
  • the application page contains many controls, and these controls are distributed throughout the application page.
  • the application page is divided into multiple areas, each area contains multiple controls, and the size of each area can be flexibly set. For example, the sizes of the multiple areas may be the same or different, which is not specifically limited here.
  • the association condition is a condition that should be satisfied for establishing an association between the at least two controls. If at least two controls satisfy the association condition, step S102 is executed; if the association condition is not met, the method ends.
  • Step S102: the at least two controls are associated with each other.
  • the at least two controls may be subjected to association processing; for example, the association relationship between the at least two controls may be recorded, or the at least two controls may be associated in other ways, for example, by merging the operation areas corresponding to the at least two controls.
  • This embodiment does not specifically limit the method used for associating at least two controls, and other methods capable of associating at least two controls also belong to the protection scope of the present disclosure.
  • By associating at least two controls, an association is established between them, thereby providing an implementation basis for subsequently voice broadcasting the control focus content of multiple controls through one trigger operation.
  • Step S103: in response to the user's triggering operation on any control in the application page, determine at least one control associated with the control, and sequentially voice broadcast the control focus content of the at least two associated controls.
  • At least one control that is associated with the control can be determined by querying.
  • the control focus content of the at least two associated controls is broadcast in sequence. The control focus content describes what a control is, for example, the name of the control. Through one trigger operation, the effect of voice broadcasting the focus content of multiple controls is realized, which simplifies user operations so that the user does not need to trigger controls frequently, thereby improving the user experience.
  • the method provided by the above embodiments of the present disclosure provides an implementation basis for subsequently voice broadcasting the control focus content of multiple controls through one trigger operation by associating at least two controls that satisfy the association condition. In response to the user's triggering operation on a control, at least one control associated with that control is determined, and the control focus content of the at least two associated controls is sequentially voice broadcast, realizing the effect of broadcasting the focus content of multiple associated controls with one trigger operation. User operation is simplified, so that the user can quickly find the desired control without frequently triggering controls, avoiding frequent switching between trigger operations and voice broadcasts and thereby improving the user experience.
  • FIG. 2A shows a schematic flowchart of a voice broadcast method according to another embodiment of the present disclosure. As shown in Figure 2A, the method includes the following steps:
  • the application page contains many controls, and these controls are distributed throughout the application page.
  • the application page is divided into multiple areas, each area contains multiple controls, and the size of each area can be flexibly set; for example, the sizes of the multiple areas may be the same or different, which is not specifically limited here.
  • the controls in the application page include: no-click event control and click-event control.
  • a no-click event control means that there is no follow-up response after the user clicks the control; usually this type of control does not have a corresponding jump address. A click event control responds to the click behavior after the user clicks it; usually a click event control corresponds to a jump address, and a page jump can be realized according to the jump address.
  • Any control in the application page corresponds to an operation area, and a trigger operation at any position of the operation area is considered as a trigger operation of the control.
  • the voice broadcast method provided in this embodiment can be applied to visually impaired scenarios, that is, when a blind or visually impaired person uses an intelligent terminal and cannot see the screen, or cannot see it clearly, the method assists such users in knowing what the triggered control is. Of course, it is not limited to visually impaired scenarios; other scenarios that require voice broadcast are also applicable.
  • Step S201: if at least two controls in the preset area in the application page are no-click event controls, determine whether the distance between the operation areas corresponding to the at least two controls is less than the preset distance; if so, go to step S205.
  • judging whether the at least two controls can be associated is done specifically by judging whether the distance between the operation areas corresponding to the at least two controls is less than the preset distance. The preset distance can be flexibly set according to actual needs.
  • the preset distance is a critical value. When the distance between the operation areas corresponding to the at least two controls is less than the preset distance, it means that the at least two controls are very close to each other and satisfy the association condition, so the method jumps to step S205. When the distance is greater than or equal to the preset distance, the at least two controls are too far apart, do not satisfy the association condition, and the method ends.
  • when the positional relationship of the at least two controls is an up-down relationship or a left-right relationship, the corresponding operation areas have the same up-down or left-right relationship, and the distance between the operation areas here refers to the up-down distance or the left-right distance, respectively.
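The distance test of step S201 can be sketched in code. The following Python sketch is illustrative only: the rectangle representation, the function names, and the preset distances used in the example are assumptions, not details from the disclosure; overlapping or touching areas count as a distance of zero.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """An operation area, as (left, top, right, bottom) edges."""
    left: float
    top: float
    right: float
    bottom: float

def gap(a: Rect, b: Rect) -> float:
    """Horizontal or vertical gap between two operation areas.

    For a left-right layout the vertical term is 0 and the gap is the
    horizontal separation; for an up-down layout it is the reverse.
    """
    dx = max(a.left - b.right, b.left - a.right, 0.0)
    dy = max(a.top - b.bottom, b.top - a.bottom, 0.0)
    return max(dx, dy)

def satisfies_association(a: Rect, b: Rect, preset_distance: float) -> bool:
    # Step S201: associate only when the gap is below the preset distance.
    return gap(a, b) < preset_distance

nickname = Rect(10, 0, 60, 20)   # hypothetical "nickname" area
level = Rect(65, 0, 90, 20)      # hypothetical "level" area, 5 px to the right
print(satisfies_association(nickname, level, preset_distance=8))  # True
print(satisfies_association(nickname, level, preset_distance=3))  # False
```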
  • Step S202: if at least two controls in the preset area in the application page include a no-click event control and a click event control, determine whether there is relevance between the control focus content of the no-click event control and the control focus content of the click event control; if yes, execute step S203.
  • A no-click event control means that there is no follow-up response after the user clicks the control; usually this type of control does not have a corresponding jump address. A click event control responds to the click behavior after the user clicks it; usually a click event control corresponds to a jump address, according to which a page jump can be realized.
  • since the no-click event control has no corresponding jump address, if the at least two controls in the preset area of the application page contain both a no-click event control and a click event control, it cannot be determined based on jump addresses whether the two controls can be associated. Instead, if there is relevance between the control focus content of the no-click event control and that of the click event control, step S203 is executed; the method ends when there is no relevance between the focus contents of the controls.
  • Step S203: judge whether the distance between the operation areas corresponding to the at least two controls is smaller than the preset distance; if yes, perform step S205.
  • the preset distance can be flexibly set according to actual needs.
  • the preset distance is a critical value. When the distance between the operation areas corresponding to the at least two controls is smaller than the preset distance, it means that the at least two controls are very close and the association condition is satisfied, so the method jumps to step S205. If the distance is greater than or equal to the preset distance, the at least two controls are too far apart, do not satisfy the association condition, and the method ends.
  • Step S204: if at least two controls in the preset area in the application page are click event controls, determine whether the jump addresses corresponding to the at least two controls are the same; if so, go to step S205.
  • Each of the click event controls corresponds to a jump address. Therefore, whether to associate at least two controls can be determined based on the jump address.
  • the following method can be used to determine whether the at least two controls satisfy the association condition: judge whether the jump addresses corresponding to the at least two controls are the same. If they are the same jump address, clicking each of the at least two controls produces the same response, namely jumping to the same page, so it can be determined that the at least two controls satisfy the association condition and the method jumps to step S205. If the jump addresses are different, clicking the controls jumps to different pages, it is determined that the at least two controls do not satisfy the association condition, and the method ends.
  • Step S205: it is determined that the at least two controls satisfy the association condition.
  • the at least two controls satisfy the association condition, so that the at least two controls can be associated with each other.
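The three association tests (steps S201, S202/S203 and S204) can be combined into one dispatch function. Everything below is an illustrative assumption: the `Control` fields, the word-overlap relevance heuristic, and the default preset distance are not specified by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Control:
    focus: str                        # control focus content, e.g. the name
    area: Tuple[int, int, int, int]   # operation area (left, top, right, bottom)
    jump: Optional[str] = None        # jump address; None for no-click controls

def _gap(a, b):
    al, at, ar, ab = a
    bl, bt, br, bb = b
    return max(al - br, bl - ar, at - bb, bt - ab, 0)

def _relevant(a: Control, b: Control) -> bool:
    # Assumed relevance test: the focus contents share at least one word.
    return bool(set(a.focus.lower().split()) & set(b.focus.lower().split()))

def satisfies_association(a: Control, b: Control, preset: int = 10) -> bool:
    if a.jump is None and b.jump is None:       # step S201: two no-click controls
        return _gap(a.area, b.area) < preset
    if (a.jump is None) != (b.jump is None):    # steps S202/S203: mixed pair
        return _relevant(a, b) and _gap(a.area, b.area) < preset
    return a.jump == b.jump                     # step S204: two click controls
```

For instance, two click event controls sharing the hypothetical jump address `"app://settings"` would be associated regardless of their distance, while two distant, unrelated controls would not.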
  • Step S206: merge the operation areas corresponding to the at least two controls to obtain the merge operation area.
  • the association of the at least two controls can be realized by merging the operation areas corresponding to the at least two controls.
  • in addition, the user-operable area is expanded, so that the user does not need to look at the position of a control before each effective click. This effectively solves the problem that the operation area corresponding to a control is too small and inconvenient for the user to click; for the visually impaired in particular, the user experience is improved.
  • the operation area corresponding to each control is fixed. Therefore, the operation areas corresponding to the at least two controls can be merged using the principle of minimum rectangular coverage. Specifically, the smallest rectangular area containing the operation areas corresponding to the at least two controls is determined, and that smallest rectangular area is taken as the merge operation area. After the merge operation area is determined, the area coordinates corresponding to it can also be determined, to facilitate subsequently determining the controls covered by the merge operation area. Alternatively, the association relationship between the merge operation area and the at least two controls is recorded.
  • FIG. 2B is a schematic diagram of the operation areas corresponding to controls. As shown in FIG. 2B, the avatar control, the level control and the nickname control each have a corresponding operation area, and the three controls satisfy the association condition. Therefore, the operation areas corresponding to the three controls can be merged; the merge operation area obtained after merging is shown in FIG. 2C.
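The minimum-rectangle-coverage merge of step S206 can be sketched as follows; the coordinates are made up, loosely modeled on the avatar/nickname/level example of FIG. 2B and FIG. 2C.

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom)

def merge_areas(areas: List[Rect]) -> Rect:
    """Smallest rectangle covering all given operation areas (step S206)."""
    lefts, tops, rights, bottoms = zip(*areas)
    return (min(lefts), min(tops), max(rights), max(bottoms))

# Hypothetical coordinates for the FIG. 2B example
avatar = (10, 10, 50, 50)
nickname = (60, 10, 140, 30)
level = (60, 35, 100, 50)
print(merge_areas([avatar, nickname, level]))  # (10, 10, 140, 50)
```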
  • Step S207: in response to a user triggering any merge operation area in the application page, determine at least two controls covered by the merge operation area, and sequentially voice broadcast the control focus content of the at least two controls covered by the merge operation area.
  • the position coordinate matching method can be used to determine which merge operation area the user's trigger operation is directed at. In response to the user's trigger operation on any merge operation area in the application page, it is necessary to determine the at least two controls covered by that merge operation area. Each control corresponds to its own operation area and position coordinates. Therefore, the at least two controls covered by the merge operation area can be determined by finding which controls' position coordinates fall within the range of the position coordinates of the merge operation area, or by querying for the at least two controls associated with the merge operation area. After determining the at least two controls covered by the merge operation area, the control focus content of those controls is sequentially voice broadcast, realizing the effect of voice broadcasting the focus content of multiple controls through one trigger operation. This simplifies user operation so that the user does not need to trigger controls frequently, thereby improving the user experience.
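A minimal sketch of step S207's position-coordinate matching, under the assumption that merge operation areas are kept in a registry mapping each area to the focus contents of the controls it covers (the registry contents and names are illustrative):

```python
from typing import Dict, List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom)

# Hypothetical registry: merge operation area -> covered focus contents.
MERGED: Dict[Rect, List[str]] = {
    (10, 10, 140, 50): ["avatar", "nickname", "level"],
}

def covering_area(x: int, y: int) -> Optional[Rect]:
    """Position-coordinate matching: find the merge area containing (x, y)."""
    for (l, t, r, b) in MERGED:
        if l <= x <= r and t <= y <= b:
            return (l, t, r, b)
    return None

def on_trigger(x: int, y: int, speak=print) -> List[str]:
    """Step S207: broadcast, in order, the focus content of every covered control."""
    area = covering_area(x, y)
    if area is None:
        return []
    for focus in MERGED[area]:
        speak(focus)  # one trigger -> several broadcasts, in sequence
    return MERGED[area]
```

In a real accessibility service, `speak` would hand each string to a text-to-speech engine rather than print it.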
  • Step S208: in response to the user's triggering operation on any merge operation area in the application page again, perform page jump processing according to the jump address corresponding to the control covered by the merge operation area.
  • the at least two controls covered by the merge operation area correspond to a jump address, so a trigger operation can be performed on the merge operation area again. In that case, page jump processing is performed according to the jump address corresponding to the control covered by the merge operation area, jumping to the page corresponding to the jump address.
  • for example, the merge operation area covers the "avatar" control, the "nickname" control and the "level" control. The three controls correspond to the same jump address, and the jump address corresponds to the setting page; triggering the merge operation area again realizes the page jump, and the page after the jump is the setting page.
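Steps S207 and S208 together form a two-stage trigger: the first trigger broadcasts, the renewed trigger jumps. A hypothetical sketch, with illustrative control names and jump address:

```python
class MergeArea:
    """Two-stage trigger for a merge operation area (steps S207/S208).

    The first trigger broadcasts the covered controls' focus content;
    the second performs the page jump. Names and the jump address are
    illustrative assumptions, not details from the disclosure.
    """

    def __init__(self, focuses, jump_address):
        self.focuses = focuses
        self.jump_address = jump_address
        self.announced = False

    def trigger(self, speak, navigate):
        if not self.announced:
            for focus in self.focuses:   # first trigger: sequential broadcast
                speak(focus)
            self.announced = True
        else:                            # renewed trigger: page jump
            navigate(self.jump_address)

area = MergeArea(["avatar", "nickname", "level"], "app://settings")
area.trigger(speak=print, navigate=print)  # broadcasts the three names
area.trigger(speak=print, navigate=print)  # jumps to the settings page
```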
  • an activation control may also be included in the application page. The activation control is usually represented by a "More" button: clicking the "More" button calls out control focus content, usually in the form of a drop-down list, as shown in FIG. 2D. In order to enable the user to know which control focus content is included in the activation control, in response to the user's triggering operation on any activation control in the application page, the control focus content corresponding to the activation control is sequentially voice broadcast.
  • for example, the drop-down list corresponds to the following control focus content: importing local books, wireless book transfer, cloud bookshelf, organizing bookshelves, book sorting, scanning, and backup and synchronization. In response to the trigger operation, these items are voice broadcast in sequence, instead of only broadcasting the focus content of the first control (importing local books) while the focus content of the other controls in the drop-down list is never broadcast. Broadcasting the focus content of all the controls, rather than just that of the first control, enriches the functions available to the visually impaired.
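A minimal sketch of this behavior: one trigger on the activation control broadcasts every drop-down item in order, not only the first. The list contents are taken from the example above; the function name is an assumption.

```python
# Drop-down items of the hypothetical "More" activation control.
DROPDOWN = [
    "importing local books", "wireless book transfer", "cloud bookshelf",
    "organizing bookshelves", "book sorting", "scanning",
    "backup and synchronization",
]

def on_activation_trigger(items, speak=print):
    """Broadcast the focus content of every control in the drop-down list."""
    for item in items:   # sequential broadcast of the whole list
        speak(item)
    return len(items)

print(on_activation_trigger(DROPDOWN))  # broadcasts 7 items, returns 7
```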
  • the method further includes: in response to a triggering operation of the editing control by the user, voice broadcast of the control focus content corresponding to the editing control, and switching the input mode from the keyboard input mode to the voice input mode.
  • the editing control in this optional embodiment refers to a control that provides text input operations, for example, a search control or a comment control, etc.
  • the editing control can be triggered, for example, by clicking on it. In response to the user's trigger operation on the editing control, the control focus content corresponding to the editing control is voice broadcast to inform the user which control was triggered. Further, the voice input mode is adopted; that is, the input mode is switched from the keyboard input mode to the voice input mode, so that the user does not need to manually input text but achieves the corresponding editing purpose through voice input.
  • taking the "search input box" control as an example, in response to the user's trigger operation, the control focus content corresponding to the control is voice broadcast, for example, the voice broadcast "search input box", and the input mode is switched from the keyboard input mode to the voice input mode.
  • then the user can describe the search content by voice to realize the search, and after the search results are obtained, the search results can also be voice broadcast in sequence.
  • This optional implementation manner may also be applied to other scenarios, for example, scenarios such as writing an idea or a book review for an e-book, or replying to comments from others, which will not be described in detail here.
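The editing-control behavior described above can be sketched as follows; the class, the default focus content, and the mode strings are illustrative assumptions:

```python
class EditingControl:
    """On trigger: broadcast the focus content, then switch keyboard
    input to voice input (hypothetical sketch of the optional embodiment)."""

    def __init__(self, focus="search input box"):
        self.focus = focus
        self.input_mode = "keyboard"

    def on_trigger(self, speak=print):
        speak(self.focus)          # e.g. voice broadcast "search input box"
        self.input_mode = "voice"  # user now edits by speaking
        return self.input_mode

box = EditingControl()
box.on_trigger()       # broadcasts "search input box"
print(box.input_mode)  # voice
```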
  • the method further includes: in response to a triggering operation of the specified electronic book control by the user, voice broadcast of the electronic book identifier corresponding to the specified electronic book control;
  • when the user wants to read an e-book, the user can trigger the designated e-book control. In response to the user's triggering operation on the designated e-book control, the e-book identifier corresponding to the designated e-book control is voice broadcast, so that the user can determine through the voice broadcast whether it is the e-book he or she wants to read. If it is, the user can trigger the designated e-book control again, and in response to this renewed triggering operation, it is determined whether an e-book identifier matching the designated e-book control is stored in the database.
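A hypothetical sketch of this two-stage e-book trigger: the first trigger broadcasts the e-book identifier, the renewed trigger checks the database for a matching identifier. The book title and database contents are made up for illustration.

```python
class EbookControl:
    """First trigger: broadcast the e-book identifier.
    Renewed trigger: check the database for a matching identifier."""

    def __init__(self, book_id, database):
        self.book_id = book_id
        self.database = database     # set of stored e-book identifiers
        self.triggered_once = False

    def trigger(self, speak=print):
        if not self.triggered_once:
            speak(self.book_id)      # announce so the user can confirm the book
            self.triggered_once = True
            return None
        return self.book_id in self.database  # renewed trigger: DB match?

book = EbookControl("Journey to the West",
                    {"Journey to the West", "Dream of the Red Chamber"})
book.trigger()         # broadcasts "Journey to the West"
print(book.trigger())  # True
```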
  • the method provided by the above-mentioned embodiments of the present disclosure provides an implementation basis for subsequently broadcasting the control focus content of multiple controls through one trigger operation by associating at least two controls that satisfy the association condition;
  • the operation areas corresponding to the controls are merged, which expands the user's operable area, so that the user does not need to look at the location of a control before each effective click; this effectively solves the problem that the operation area corresponding to a control is too small and inconvenient to click, and the user experience is improved. In response to the user's triggering operation on the merge operation area, the at least two controls covered by the merge operation area are determined, and the control focus content of the at least two controls covered by the merge operation area is sequentially voice broadcast, realizing the effect of broadcasting the focus content of multiple associated controls with one trigger operation. This simplifies user operation, so that the user can quickly find the desired control without triggering controls frequently, avoiding frequent switching between trigger operations and voice broadcasts and thus improving the user experience.
  • An embodiment of the present disclosure further provides a non-volatile computer-readable storage medium, where the non-volatile computer-readable storage medium stores at least one executable instruction, and the executable instruction can execute the voice broadcast method in any of the foregoing method embodiments.
  • Executable instructions can specifically be used to cause the processor to perform the following operations:
  • in response to the user's triggering operation on any control in the application page, at least one control associated with the control is determined, and the control focus content of the at least two associated controls is sequentially voice broadcast.
  • any control corresponds to an operation area
  • the executable instructions further cause the processor to:
  • if the at least two controls are no-click event controls, determine whether the distance between the operation areas corresponding to the at least two controls is less than the preset distance;
  • any control corresponds to an operation area
  • the executable instructions further cause the processor to:
  • if the at least two controls include a no-click event control and a click event control, determine whether there is relevance between the focus content of the no-click event control and the focus content of the click event control;
  • executable instructions further cause the processor to:
  • if the at least two controls are click event controls, determine whether the jump addresses corresponding to the at least two controls are the same;
  • any control corresponds to an operation area
  • the executable instructions further cause the processor to:
  • the operation areas corresponding to at least two controls are merged to obtain a merged operation area
  • At least two controls covered by the merge operation area are determined, and the control focus content of the at least two controls covered by the merge operation area is sequentially voice broadcast.
  • executable instructions further cause the processor to:
  • executable instructions also cause the processor to:
  • page jump processing is performed according to the jump address corresponding to the control covered by the merge operation area.
  • executable instructions also cause the processor to:
  • control focus content corresponding to the activated control is sequentially voice broadcast.
  • executable instructions also cause the processor to:
  • the focus content of the control corresponding to the editing control is announced by voice, and the input mode is switched from the keyboard input mode to the voice input mode.
  • executable instructions also cause the processor to:
  • FIG. 3 shows a schematic structural diagram of a computing device according to an embodiment of the present disclosure.
  • the specific embodiment of the present disclosure does not limit the specific implementation of the computing device.
  • the computing device may include: a processor (processor) 302 , a communication interface (Communications Interface) 304 , a memory (memory) 306 , and a communication bus 308 .
  • the processor 302 , the communication interface 304 , and the memory 306 communicate with each other through the communication bus 308 .
  • the communication interface 304 is used for communicating with network elements of other devices such as clients or other servers.
  • the processor 302 is configured to execute the program 310, and specifically may execute the relevant steps in the above-mentioned embodiments of the voice broadcasting method.
  • the program 310 may include program code including computer operation instructions.
  • the processor 302 may be a central processing unit (CPU), or an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present disclosure.
  • the one or more processors included in the computing device may be the same type of processors, such as one or more CPUs; or may be different types of processors, such as one or more CPUs and one or more ASICs.
  • The memory 306 is used to store the program 310.
  • The memory 306 may include high-speed RAM, and may also include non-volatile memory, such as at least one disk storage.
  • The program 310 can specifically cause the processor 302 to perform the following operations:
  • determine at least one control associated with the triggered control, and sequentially broadcast by voice the control focus content of the at least two associated controls.
  • Wherein each control corresponds to an operation area,
  • the program 310 further causes the processor 302 to perform the following operations:
  • if the at least two controls are controls without click events, determine whether the distance between the operation areas corresponding to the at least two controls is less than a preset distance;
  • Wherein each control corresponds to an operation area,
  • the program 310 further causes the processor 302 to perform the following operations:
  • if the at least two controls include a control without a click event and a control with a click event, determine whether the focus content of the control without a click event is related to the focus content of the control with a click event;
  • the program 310 further causes the processor 302 to perform the following operations:
  • if the at least two controls are controls with click events, determine whether the jump addresses corresponding to the at least two controls are the same;
  • Wherein each control corresponds to an operation area,
  • the program 310 further causes the processor 302 to perform the following operations:
  • merge the operation areas corresponding to the at least two controls to obtain a merged operation area;
  • determine the at least two controls covered by the merged operation area, and sequentially broadcast by voice the control focus content of the at least two controls covered by the merged operation area.
  • The program 310 further causes the processor 302 to perform the following operations:
  • The program 310 also causes the processor 302 to: perform page jump processing according to the jump address corresponding to the control covered by the merged operation area.
  • The program 310 also causes the processor 302 to: sequentially broadcast by voice the control focus content corresponding to the activated control.
  • The program 310 also causes the processor 302 to: broadcast by voice the control focus content corresponding to the editing control, and switch the input mode from keyboard input to voice input.
  • The program 310 also causes the processor 302 to:
  • The modules in the devices of the embodiments can be adaptively changed and arranged in one or more devices different from the embodiments.
  • The modules, units or components in the embodiments may be combined into one module, unit or component, and may further be divided into multiple sub-modules, sub-units or sub-components. All features disclosed in this specification (including the accompanying claims, abstract and drawings), and all processes or units of any method or device so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or units are mutually exclusive.
  • Each feature disclosed in this specification may be replaced by an alternative feature serving the same, equivalent or similar purpose, unless expressly stated otherwise.
  • Various component embodiments of the present disclosure may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • In practice, a microprocessor or a digital signal processor (DSP) may be used to implement some or all of the functions of some or all of the components according to embodiments of the present disclosure.
  • The present disclosure can also be implemented as a device or apparatus program (e.g., a computer program or computer program product) for performing part or all of the methods described herein.
  • Such a program implementing the present disclosure may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
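The three association conditions evaluated by the program 310 above (operation-area distance for controls without click events, focus-content relatedness for a mixed pair, and jump-address equality for controls with click events) can be sketched as follows. This is a minimal illustration under assumptions, not the disclosed implementation: the `Control` class, the preset-distance value, and the `related` predicate are all placeholders.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class Control:
    focus_content: str                   # text announced when the control gains focus
    area: Tuple[int, int, int, int]      # operation area as (left, top, right, bottom)
    jump_address: Optional[str] = None   # set only for controls with a click event

PRESET_DISTANCE = 16  # illustrative threshold in pixels; the disclosure fixes no value

def area_distance(a, b) -> float:
    """Gap between two rectangular operation areas (0 if they overlap or touch)."""
    dx = max(a[0] - b[2], b[0] - a[2], 0)
    dy = max(a[1] - b[3], b[1] - a[3], 0)
    return (dx * dx + dy * dy) ** 0.5

def satisfies_association(c1: Control, c2: Control,
                          related: Callable[[str, str], bool]) -> bool:
    """Mirror the three cases above for a pair of controls."""
    has1, has2 = c1.jump_address is not None, c2.jump_address is not None
    if not has1 and not has2:
        # both without click events: compare operation-area distance to the preset distance
        return area_distance(c1.area, c2.area) < PRESET_DISTANCE
    if has1 and has2:
        # both with click events: compare jump addresses
        return c1.jump_address == c2.jump_address
    # one of each: check whether the two focus contents are related
    return related(c1.focus_content, c2.focus_content)

def merge_areas(c1: Control, c2: Control):
    """Merged operation area covering both controls."""
    return (min(c1.area[0], c2.area[0]), min(c1.area[1], c2.area[1]),
            max(c1.area[2], c2.area[2]), max(c1.area[3], c2.area[3]))
```

For example, a book-cover icon and its caption that jump to the same reading page would satisfy the third condition, so their merged operation area can be announced and activated as a single unit.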

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Voice broadcasting method, computing device, and computer storage medium. The method comprises: determining whether at least two controls in a preset region of an application page satisfy an association condition (S101); if so, performing association processing on the at least two controls (S102); and, in response to a trigger operation by a user on any control in the application page, determining at least one control associated with that control and sequentially broadcasting, by voice, the control focus contents of the at least two associated controls (S103).
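The abstract's three steps can be sketched as a tiny flow: group associated controls (S101-S102), then broadcast a group's focus contents in sequence when one of them is triggered (S103). This is a rough sketch under assumptions: the grouping-by-key model and the `speak` callback stand in for the real association processing and the platform TTS engine, neither of which the disclosure ties to a specific API.

```python
from collections import namedtuple

# group_key stands in for the outcome of the association check (S101):
# controls that satisfy the association condition share the same key.
Control = namedtuple("Control", ["focus_content", "group_key"])

def associate_controls(controls):
    """S102: association processing - collect controls into associated groups."""
    groups = {}
    for control in controls:
        groups.setdefault(control.group_key, []).append(control)
    return groups

def on_trigger(triggered, groups, speak):
    """S103: on a user's trigger operation, sequentially voice-broadcast the
    control focus content of every control associated with the triggered one."""
    for control in groups[triggered.group_key]:
        speak(control.focus_content)
```

For instance, triggering a book cover that is associated with its title label would announce both focus contents in order, instead of announcing the cover alone.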
PCT/CN2021/121810 2020-11-27 2021-09-29 Voice broadcasting method, computing device and computer storage medium WO2022111047A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011364189.4 2020-11-27
CN202011364189.4A CN112486451B (zh) 2020-11-27 2020-11-27 Voice broadcasting method, computing device and computer storage medium

Publications (1)

Publication Number Publication Date
WO2022111047A1 2022-06-02

Family

ID=74936730

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121810 WO2022111047A1 (fr) 2020-11-27 2021-09-29 Voice broadcasting method, computing device and computer storage medium

Country Status (2)

Country Link
CN (1) CN112486451B (fr)
WO (1) WO2022111047A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117762942A (zh) * 2023-12-26 2024-03-26 深圳航天信息有限公司 Form design and storage method, system and readable storage medium

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN112486451B (zh) * 2020-11-27 2022-03-11 掌阅科技股份有限公司 Voice broadcasting method, computing device and computer storage medium
CN112989243A (zh) * 2021-05-07 2021-06-18 浙江口碑网络技术有限公司 Information playing method, method for obtaining information to be played, apparatus, and electronic device
CN113900618B (zh) * 2021-05-07 2023-12-19 浙江口碑网络技术有限公司 Information playing method and apparatus, electronic device, and storage medium
CN113034249A (zh) * 2021-05-28 2021-06-25 浙江口碑网络技术有限公司 Information playing method and apparatus, and electronic device
CN113190697A (zh) * 2021-06-02 2021-07-30 口碑(上海)信息技术有限公司 Image information playing method and apparatus
CN113489833B (zh) * 2021-06-29 2022-11-04 维沃移动通信有限公司 Information broadcasting method, apparatus, device and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
US20140040742A1 (en) * 2012-08-03 2014-02-06 Lg Electronics Inc. Mobile terminal and controlling method thereof
CN104461346A (zh) * 2014-10-20 2015-03-25 天闻数媒科技(北京)有限公司 Method and apparatus for a visually impaired person to operate a touch screen, and smart touch-screen mobile terminal
CN108874356A (zh) * 2018-05-31 2018-11-23 珠海格力电器股份有限公司 Voice broadcasting method and apparatus, mobile terminal and storage medium
CN110618783A (zh) * 2019-09-12 2019-12-27 北京小米移动软件有限公司 Text broadcasting method, apparatus and medium
CN112486451A (zh) * 2020-11-27 2021-03-12 掌阅科技股份有限公司 Voice broadcasting method, computing device and computer storage medium

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US9405847B2 (en) * 2008-06-06 2016-08-02 Apple Inc. Contextual grouping of a page
KR101709510B1 (ko) * 2011-06-03 2017-02-24 엘지전자 주식회사 Mobile terminal and control method thereof
CN103853355A (zh) * 2014-03-17 2014-06-11 吕玉柱 Electronic device operation method and control device thereof
CN107644647B (zh) * 2016-07-21 2020-10-30 平安科技(深圳)有限公司 Voice follow-up call method and apparatus
CN107273342A (zh) * 2016-11-25 2017-10-20 深圳市联谛信息无障碍有限责任公司 Method for recognizing HTML content in assistive technology
CN109893852B (zh) * 2019-02-26 2022-07-26 北京心智互动科技有限公司 Interface information processing method and apparatus
CN110264316A (zh) * 2019-06-20 2019-09-20 浙江口碑网络技术有限公司 Item information query and display method and apparatus



Also Published As

Publication number Publication date
CN112486451A (zh) 2021-03-12
CN112486451B (zh) 2022-03-11

Similar Documents

Publication Publication Date Title
WO2022111047A1 (fr) Voice broadcasting method, computing device and computer storage medium
EP4087258A1 (fr) Procédé et appareil d'affichage de données de diffusion en direct, dispositif et support de stockage
CN105867925B (zh) 一种快捷方式建立方法和电子设备
CN107438814B (zh) 移动设备及其方法和移动设备仿真器的方法
US20200311342A1 (en) Populating values in a spreadsheet using semantic cues
US10122839B1 (en) Techniques for enhancing content on a mobile device
CN102968338B (zh) 用于电子设备的应用程序分类的方法、装置及电子设备
WO2020221162A1 (fr) Procédé et appareil de recommandation de programme d'application, dispositif électronique, et support
US20200073903A1 (en) Method and device of tagging links included in a screenshot of webpage
CN110806822B (zh) 电子书的交互方法、计算设备及计算机存储介质
CN103092604A (zh) 一种应用程序分类方法和装置
US20140082498A1 (en) Method and mobile terminal device for independently playing a video
WO2022089568A1 (fr) Procédé et appareil de partage de fichiers et dispositif électronique
RU2595524C2 (ru) Устройство и способ обработки содержимого веб-ресурса в браузере
WO2022161431A1 (fr) Procédé d'affichage, appareil et dispositif électronique
US9282178B2 (en) Method for providing call log and electronic device thereof
CN107450808B (zh) 一种浏览器的鼠标指针定位方法及计算设备
CN112711372B (zh) 视障模式下的页面响应方法、计算设备及计算机存储介质
CN107391017B (zh) 文字处理方法、装置、移动终端及存储介质
JP6250151B2 (ja) タッチパッド操作およびダブルタップ・ズーミングに対する独立ヒット・テスト
WO2018010316A1 (fr) Procédé et dispositif de gestion de page de bureau
CN109656444B (zh) 列表定位方法、装置、设备及存储介质
CN113157171A (zh) 分屏模式下的应用显示方法、计算设备及计算机存储介质
WO2016173307A1 (fr) Procédé et dispositif de copie de message, et terminal intelligent
WO2017166640A1 (fr) Procédé et terminal d'appel d'application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896549

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21896549

Country of ref document: EP

Kind code of ref document: A1