CN112905148B - Voice broadcasting control method and device, storage medium and electronic equipment - Google Patents
- Publication number: CN112905148B (application CN202110268418.0A)
- Authority: CN (China)
- Prior art keywords: touch screen, target touch screen data, voice broadcasting
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The application discloses a voice broadcasting control method and device, a computer storage medium, and an electronic device. The control method includes: acquiring touch screen data generated for a resource data service page while the page is in a barrier-free voice broadcasting state; determining target touch screen data matched with the touch screen data, the target touch screen data being preset data usable for broadcast control of the barrier-free voice broadcast of the page, where each item of target touch screen data corresponds to a broadcast control mode; and controlling and adjusting the current voice broadcast of the resource data service page according to the broadcast control mode corresponding to the matched target touch screen data, thereby improving the convenience of the control operation.
Description
Technical Field
The application relates to the technical field of computer applications, and in particular to a voice broadcasting control method and device. The application also relates to a computer storage medium and an electronic device.
Background
Life-service applications have become widespread on electronic terminal devices, bringing convenience and entertainment to daily life while shaping new lifestyles.
However, while most users can easily shop online, order food, watch videos, or listen to electronic audiobooks through these devices, visually impaired users still find it very difficult to operate such online life-service applications.
Disclosure of Invention
The application provides a voice broadcasting control method, which aims to overcome the limitation that existing application service software is difficult for certain user groups to operate.
The application provides a control method for voice broadcasting, which comprises the following steps:
acquiring touch screen data generated for a resource data service page based on the barrier-free voice broadcasting state of the resource data service page;
determining target touch screen data matched with the touch screen data, wherein the target touch screen data are preset and can be used for broadcast control of the barrier-free voice broadcast of the resource data service page, and each item of target touch screen data corresponds to a broadcast control mode;
and controlling and adjusting the current voice broadcast of the resource data service page according to the broadcast control mode corresponding to the matched target touch screen data.
In some embodiments, further comprising:
responding to a voice prompt for inputting target touch screen data while the resource data service page is in the barrier-free voice broadcasting state, and inputting the target touch screen data;
and establishing a corresponding association relation between the input target touch screen data and the broadcasting control mode.
In some embodiments, the responding to the voice prompt for inputting the target touch screen data in the barrier-free voice broadcasting state of the resource data service page includes:
assigning a target touch screen identifier to an operation part performing the touch screen operation;
recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part;
determining the target touch screen identification and the corresponding target touch screen track as the target touch screen data;
the establishing of the corresponding association relationship between the input target touch screen data and the broadcasting control mode comprises the following steps:
Binding the target touch screen data with the broadcasting control mode.
In some embodiments, the recording the target touch screen track of the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part includes:
responding to the voice broadcasting information of the reference touch screen data output in the input voice prompt, and recording a target touch screen track generated by the operation part on the resource data service page according to the reference touch screen data and the target touch screen identification corresponding to the operation part;
or,
and responding to the voice broadcasting information of the target touch screen custom operation output in the input voice prompt, and recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identification corresponding to the operation part.
In some embodiments, the determining target touch screen data that matches the touch screen data includes:
extracting a touch screen identifier and a touch screen track of the touch screen identifier in the touch screen data;
determining whether the touch screen identifier is matched with a target touch screen identifier in the target touch screen data;
If yes, determining whether the touch screen track is matched with a target touch screen track in the target touch screen data;
if yes, the target touch screen data is determined to be matched with the touch screen data.
In some embodiments, the extracting the touch screen identifier and the touch screen trajectory of the touch screen identifier in the touch screen data includes:
and identifying the touch screen identification and the touch screen track of the touch screen identification in the touch screen data through an image identification algorithm.
In some embodiments, the touch screen data comprise single-point touch screen data or multi-point touch screen data; the target touch screen data likewise comprise single-point or multi-point touch screen data.
The application also provides a control device for voice broadcasting, which comprises:
the acquisition unit is used for acquiring touch screen data generated for the resource data service page based on the barrier-free voice broadcasting state of the resource data service page;
the determining unit is used for determining target touch screen data matched with the touch screen data, wherein the target touch screen data are preset and can be used for broadcast control of the barrier-free voice broadcast of the resource data service page, and each item of target touch screen data corresponds to a broadcast control mode;
and the control unit is used for controlling and adjusting the current voice broadcast of the resource data service page according to the broadcast control mode corresponding to the matched target touch screen data.
The application also provides a computer storage medium for storing network platform generated data and a program for processing the network platform generated data;
the program, when read and executed, performs the steps of the control method of voice broadcasting as described above.
The present application also provides an electronic device including:
a processor;
and a memory for storing a program for processing network platform generation data, which when read and executed by the processor, performs the steps of the control method of voice broadcasting as described above.
The application also provides an interaction method of voice broadcasting, which comprises the following steps:
receiving touch screen data generated for a resource data service page based on the barrier-free voice broadcasting state of the resource data service page;
sending a control request according to the touch screen data;
and in response to the target touch screen data determined for the control request, controlling and adjusting the current barrier-free voice broadcast and outputting the correspondingly adjusted voice broadcast of the resource data service page.
In some embodiments, further comprising:
and outputting, according to the control adjustment, explanatory voice information describing the control mode to which the touch screen data correspond.
Compared with the prior art, the application has the following advantages:
in the embodiment of the voice broadcasting control method, touch screen data generated for a resource data service page are acquired while the page is in a barrier-free voice broadcasting state; target touch screen data matched with the touch screen data are determined, the target touch screen data being preset data usable for broadcast control of the barrier-free voice broadcast of the page, where each item of target touch screen data corresponds to a broadcast control mode; and the current voice broadcast of the resource data service page is controlled and adjusted according to the broadcast control mode corresponding to the matched target touch screen data. In this way, a visually impaired user can enter operation gestures according to their own operation habits, the control operations available during barrier-free voice broadcasting of the resource data service page are improved, the limitation that existing network application services impose on certain user groups is broken, the convenience with which visually impaired users operate the resource services provided by online applications is improved, and their experience of daily life is enhanced.
Drawings
Fig. 1 is a flowchart of an embodiment of a control method for voice broadcasting provided by the present application;
fig. 2 is a schematic structural diagram of an embodiment of a control device for voice broadcasting according to the present application;
fig. 3 is a flowchart of an embodiment of an interaction method for voice broadcasting provided by the application;
fig. 4 is an application scenario schematic diagram of an embodiment of an interaction method for voice broadcasting provided by the application;
fig. 5 is a schematic structural diagram of an embodiment of an interactive device for voice broadcasting according to the present application;
fig. 6 is a schematic structural diagram of an embodiment of an electronic device according to the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than those herein described, and those skilled in the art will readily appreciate that the present application may be similarly embodied without departing from the spirit or essential characteristics thereof, and therefore the present application is not limited to the specific embodiments disclosed below.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to limit the application. Descriptive terms used in the present application and in the appended claims, such as "a" or "an", do not limit number or order; they merely distinguish items of the same type of information from one another.
As can be seen from the background section, the voice broadcasting control method provided by the present application arises from the difficulties that visually impaired people face when operating resource application services through terminal devices. With the continuous progress of electronic information technology, networks have become inseparable from daily life; for people with visual impairment, however, completing the operation of an electronic device by sight alone is difficult, although touch and hearing can compensate for the lack of vision.
In the prior art, touch operation of an electronic device usually requires the operator to act on a specific position to control a playing function. For example, for the voice playing control of some commodity shopping application services, the operator must touch the target commodity information displayed on the page to trigger voice playing of the corresponding introduction; to jump from the commodity currently being played to the next commodity, the operator must touch the display area of the next commodity; and operations such as returning to the previous page can only be performed by touching a specific control. Such control clearly remains difficult for visually impaired users. The prior art therefore also provides a tactile film covering the display screen of the electronic device, so that information can be obtained by touch, and information can additionally be obtained by hearing through voice playing. However, both approaches remain inconvenient when facing diverse application service software: different applications have different service interfaces, whereas the tactile film is fixed and can only map some conventional fixed controls of the terminal device, so it cannot cover operations within the application software; and voice broadcasting usually reads out the information on the application service interface sequentially in a set playing order, which makes playing control cumbersome. As a result, operations such as ordering meals or browsing information through application service software on a terminal device remain difficult and confusing for visually impaired people.
In order to help visually impaired people overcome these difficulties and use application software conveniently, the application provides a voice broadcasting control method, which is described in detail below.
Referring to fig. 1, fig. 1 is a flowchart of an embodiment of a control method for voice broadcasting according to the present application, where the embodiment includes:
step S101: acquiring touch screen data generated for a resource data service page based on the barrier-free voice broadcasting state of the resource data service page;
the resource data service page in step S101 may be understood as a service page of application service software installed on the electronic terminal device, for example: the ordering page of the ordering service application software, the shopping page of the shopping software, the playing page of the video playing software and the text display page of the text information application software. That is, the resource data may be meal information, commodity information, video information, text information, or the like.
The barrier-free voice broadcasting can be understood as outputting the resource data in the resource data service page in a voice broadcasting manner, for example: in the barrier-free voice broadcasting state, the ordering service page can voice-broadcast the information displayed in the page; if the information is text, it can be output as speech, for example an electronic book converted into sound for broadcasting. In this embodiment, the barrier-free voice broadcasting state may be a barrier-free voice broadcasting mode that is started when the resource data service page is entered, or the resource service application that provides the page may itself be a barrier-free application service. How the barrier-free voice broadcasting state is determined is not limited, as long as the resource information in the resource data service page can be output and broadcast in a barrier-free manner.
The touch screen data may be understood as data generated by touch operations on the device carrying the resource data service page; for example, when the resource data service page is output on an electronic terminal device, touch screen data are generated by touch operations on the screen of that device, such as sliding and/or tapping operations. The manner of touch screen operation is therefore not limited, and the operation part that generates it may be a user's finger or any other part capable of producing a touch screen operation. In this embodiment, the touch screen data include single-point touch screen data or multi-point touch screen data.
The purpose of step S101 is to acquire touch screen data generated based on the touch screen operation of the resource data service page in the unobstructed voice broadcast state.
In the present embodiment, the touch screen operation of the electronic terminal device by the finger of the user is mainly described as an example, but is not limited to this manner.
The specific implementation of step S101 may be: when the user enters a resource data service page provided by a barrier-free resource data service application through the electronic terminal device, acquiring the touch screen data generated by touch screen operations on the current page; or, when the user enters a resource data service page provided by an ordinary resource data service application, the barrier-free voice broadcasting mode may be enabled according to the output prompt information, and the touch screen data generated by touch screen operations on the page are then acquired on that basis.
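As a minimal Android-flavoured sketch of step S101 (not the application's actual implementation), the listener below captures touch screen data only while the page is in the barrier-free broadcast state; the class name, the pageView parameter, and the barrier-free flag are illustrative assumptions.

```java
import android.view.MotionEvent;
import android.view.View;

/** Collects raw touch screen data for the page while the barrier-free broadcast is active. */
public class TouchScreenCollector {

    private boolean barrierFreeBroadcastActive; // set when the page enters the barrier-free state

    public void attach(View pageView) {
        pageView.setOnTouchListener((v, event) -> {
            if (!barrierFreeBroadcastActive) {
                return false; // outside the barrier-free state, do not intercept touches
            }
            onTouchScreenData(event);
            return true; // consume the event so it can be matched against target gestures
        });
    }

    public void setBarrierFreeBroadcastActive(boolean active) {
        this.barrierFreeBroadcastActive = active;
    }

    /** Hook for step S102: forward the captured event for matching. */
    protected void onTouchScreenData(MotionEvent event) {
        // e.g. record pointer ids and coordinates here
    }
}
```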
Step S102: determining target touch screen data matched with the touch screen data, wherein the target touch screen data are preset and can be used for broadcast control of the barrier-free voice broadcast of the resource data service page, and each item of target touch screen data corresponds to a broadcast control mode;
the target touch screen data in step S102 may be preset standard touch screen data, that is, different touch gestures correspond to different control modes, for example: the single-finger sliding corresponding control mode is to jump to the next playing information adjacent to the current playing information, etc. I.e. the target touch screen data comprises single touch screen data or multi touch screen data. Of course, other control modes and touch gestures are also possible. The meaning of the target touch screen data is only schematically illustrated and is not limiting. Control of the resource data service page may include: pause, skip, return, etc., for a life service application platform, such as commodity shopping, etc., for a resource data service page may include a plurality of UI cards (User Interface: card form of User Interface), switch between UI cards, skip between pages, pause of redundant UI card broadcast, etc., each touch screen operation corresponds to a different broadcast function control. Therefore, standard or appointed touch screen data are needed, so that the acquired touch screen data and the set touch screen data are matched to realize corresponding voice broadcasting control. It should be noted that, the UI card may, for example, use a card to carry a list of categories on an e-commerce platform; in a social media website, a card represents a single activity or thought; in news websites and magazines, cards are used to display different news items, so that the page interacts with the user in a UI card manner.
It should be noted that, in this embodiment, the resource information displayed in the resource service application page is illustrated in the design form of the UI card, and in practice, the form of the resource information displayed in the resource service application page is not limited, and the front-back order of broadcasting may be from top to bottom, from left to right, or may be according to a preset broadcasting order. As shown in fig. 4, the UI card in this embodiment may include the activity information of the application service itself or may be information specific to the merchant, and there is no limitation in designing the resource service application page.
It should be understood that, for the jump control mode, the jump position may also be set according to the sliding distance of the sliding operation, for example: a sliding distance of 2 cm jumps to the adjacent item, while a sliding distance of 4 cm jumps over two positions; this is only an illustration. Alternatively, no sliding distance is set and the corresponding control mode is determined only by the number of touch points of the touch screen operation. In other words, the target touch screen data (touch screen gestures) and the corresponding control modes can be implemented in many ways, and no specific limitation is placed here; a minimal sketch of the distance-based variant is given below.
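Purely as an illustration of the distance-based jump variant described above (the 2 cm and 4 cm thresholds are simply the example values from the text, not fixed parameters), the slide distance could be mapped to a jump count as follows:

```java
/** Illustrative mapping from swipe distance to the number of broadcast items to skip. */
final class SlideJumpMapping {
    static int jumpCountForSlideDistance(float distanceCm) {
        if (distanceCm >= 4f) return 2;  // e.g. a 4 cm swipe jumps over two items
        if (distanceCm >= 2f) return 1;  // e.g. a 2 cm swipe jumps to the adjacent item
        return 0;                        // too short to count as a jump gesture
    }
}
```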
The implementation process of the embodiment may further include:
Step S10a: responding to the voice prompt of the input of the target touch screen data of the resource data service page in the barrier-free voice broadcasting state, and inputting the target touch screen data;
step S10b: and establishing a corresponding association relation between the input target touch screen data and the broadcasting control mode.
In step S10a, different resource application services provide different resource information in their pages, so the target touch screen data can be entered with reference to the relevant information of the resource service pages of different applications; when the broadcast control modes are similar or identical, the entered touch screen data can be applied to a plurality of different resource applications, so the output of the entry voice prompt and the entry conditions are not limited. This embodiment only illustrates the application scenarios of online commodity shopping and of online purchase with offline consumption. The entry voice prompt for the target touch screen data can be output when the resource data service page is entered in the barrier-free voice broadcasting state; the prompt itself can be given as a voice broadcast, which is convenient for visually impaired users. In response to the voice prompt, the target touch screen data, that is, the operation manner, is recorded.
The specific implementation process of the step S10a may include:
step S10a1: a target touch screen identifier is arranged for an operation part of the touch screen operation;
step S10a2: recording a target touch screen track of the operation part on the resource data service page and the target touch screen mark corresponding to the operation part;
step S10a3: and determining the target touch screen identification and the corresponding target touch screen track as the target touch screen data.
The specific implementation of step S10a1 may be: when the operation part is a finger, as in this embodiment, different target touch screen identifiers may be allocated to the fingers used for operation, namely touch point identifiers (pointer IDs). For example, each finger is assigned a different touch point identifier, although it is also possible to assign identifiers only to the commonly used fingers.
The specific implementation of step S10a2 may be: when a finger operates on the screen (a touch gesture), for example a sliding operation of one or more fingers, the sliding track of each finger on the screen is recorded; that is, the touch point identifier is recorded when the finger touches the screen, the movement track of that identifier on the screen is recorded, and recording is completed when the finger leaves the screen. In other words, by monitoring touch events, the target touch screen operation that occurred and the corresponding target touch screen identifier can be recorded, and the two together are determined as the target touch screen data. The target touch screen track may be a movement track, which may include a sliding track, a tap track, a long-press track, and the like, and may of course also include other forms of touch screen track.
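A sketch of steps S10a1 and S10a2 under the assumption of Android MotionEvent input: recording starts when the first finger presses onto the screen and the recorded tracks are read out once the last finger has left. The class name and the float-array representation of track points are illustrative choices, not part of the application.

```java
import android.view.MotionEvent;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Records a target gesture: one trajectory per target touch screen identifier (pointer id). */
public class TargetGestureRecorder {

    // pointer id -> list of (x, y) samples forming the target touch screen track
    private final Map<Integer, List<float[]>> tracks = new HashMap<>();

    public void onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_POINTER_DOWN: {
                int index = event.getActionIndex();
                int pointerId = event.getPointerId(index);   // target touch screen identifier
                List<float[]> track = new ArrayList<>();
                track.add(new float[]{event.getX(index), event.getY(index)});
                tracks.put(pointerId, track);
                break;
            }
            case MotionEvent.ACTION_MOVE:
                for (int i = 0; i < event.getPointerCount(); i++) {
                    List<float[]> track = tracks.get(event.getPointerId(i));
                    if (track != null) {
                        track.add(new float[]{event.getX(i), event.getY(i)});
                    }
                }
                break;
            case MotionEvent.ACTION_UP:
                // last finger left the screen: the recorded tracks form the target touch screen data
                break;
            default:
                break;
        }
    }

    public Map<Integer, List<float[]>> getTargetTouchScreenData() {
        return tracks;
    }
}
```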
The entry of the target touch screen data may be implemented in at least two ways, for example: reference target entry gestures and user-defined entry gestures. A reference target entry gesture is a reference gesture provided to the user for controlling the voice broadcast of the resource service page, for example: a one-finger upward slide jumps to the next item of playing information, a two-finger upward slide jumps over two items, a single tap pauses, a double tap resumes, a left slide returns to the previous page, a right slide enters the next page, and so on, which are not listed exhaustively here. The specific implementation process of step S10a2 may include:
step S10a21: and responding to the voice broadcasting information of the reference touch screen data output in the input voice prompt, and recording a target touch screen track generated by the operation part on the resource data service page according to the reference touch screen data and the target touch screen identification corresponding to the operation part.
Or,
may include:
step S10a23: and responding to the voice broadcasting information of the target touch screen custom operation output in the input voice prompt, and recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identification corresponding to the operation part. The custom operation may be an operation performed by the user according to the operation habit of the user.
The specific implementation process of the step S10a3 may include:
step S10a31: binding the target touch screen data with the broadcasting control mode. The broadcasting control mode can include a plurality of broadcasting control modes, and the corresponding target touch screen data can also include a plurality of broadcasting control modes. If one of the directions slides upwards to jump to the next play information, the two directions slide upwards to jump to interval two play information, the single click is pause, the double click is continuous, the left slide is return to last time, the right slide is enter the next page, and the like, wherein the jump, pause, return, continuous, and the like are the control modes of broadcasting. Different operation gestures are given to each different broadcasting control mode, so that visually impaired people can control various broadcasting functions through touch operation of a screen. Binding the target touch screen data with the broadcasting control mode can acquire which touch screen data corresponds to which broadcasting control mode, so that the barrier-free voice broadcasting mode can be controlled conveniently.
The above is a description of the process of prerecording target touch screen data.
The purpose of step S102 is to determine whether there is the same data as the touch screen data in the target touch screen data, so as to facilitate triggering the control operation on the voice broadcast. Thus, the specific implementation procedure of step S102 may include:
Step S102-1: extracting the touch screen identifier and the touch screen track of that identifier from the touch screen data. Corresponding touch screen data can be acquired by listening to touch screen events (touch events), which generally include: a finger pressing onto the screen, the finger moving on the screen, and the finger leaving the screen. Taking the Android system as an example, the series of touch screen events generated from the moment a finger presses onto the screen until the finger leaves the screen constitutes an event sequence (also referred to as an event stream); for a multi-touch gesture, the sequence runs from the first finger pressing onto the screen until the last finger leaves it. The first event of the sequence is typically a finger pressing onto the screen and the last event is typically a finger leaving the screen. All touch screen operations on the screen, whether by the user's finger or another touch member, are eventually converted into such event sequences. The touch screen track and the touch screen identifier can be obtained by monitoring the whole process from the first finger touching the screen to the last finger leaving it. The touch screen identifier and its touch screen track can be extracted from the screen's touch events by an image recognition algorithm or a gesture recognition algorithm; image recognition algorithms belong to the prior art and are not described further here.
Step S102-2: determining whether the touch screen identifier matches a target touch screen identifier in the target touch screen data. The touch screen identifier is compared with the target touch screen identifiers in the pre-entered target touch screen data; if the identifiers are the same, a match is determined. If they do not match, voice information indicating the operation failure can be output, so that the visually impaired user can better understand the reason for the failure and the success rate of the repeated operation is improved.
Step S102-3: if yes, determining whether the touch screen track matches a target touch screen track in the target touch screen data. When matching, it is also necessary to determine whether the touch screen track is the same as the target touch screen track. The touch screen track can be understood as the process of a finger moving on the screen, or as the tap state of the finger on the screen. For a movement, it is checked whether the path of the touch identifier from its start point to its end point is the same as the movement defined in the target touch screen track; for a tap, it is checked whether the start and end points of the tap operation are the same as the tap state defined in the target touch screen track, where the tap state may be a single tap, a double tap, or more than a double tap, that is, the touch-down and lift-up track of the finger on the screen is monitored. In other words, the touch screen track can be determined according to the different operation manners. If the tracks do not match, voice information indicating the operation failure can be output, so that the visually impaired user can better understand the reason for the failure and the success rate of the repeated operation is improved.
Step S102-4: if yes, the target touch screen data is determined to be matched with the touch screen data.
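A simplified sketch of steps S102-1 to S102-4, assuming both the captured touch screen data and the target touch screen data have already been reduced to a map from pointer identifier to trajectory (as in the recording sketch above); the displacement-based track comparison and its tolerance are illustrative simplifications of the image/gesture recognition mentioned in step S102-1.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

/** Matches captured touch screen data against pre-recorded target touch screen data. */
public class TouchScreenMatcher {

    private static final float MAX_DISPLACEMENT_DIFFERENCE = 80f; // illustrative tolerance in pixels

    public boolean matches(Map<Integer, List<float[]>> touchData,
                           Map<Integer, List<float[]>> targetData) {
        // Step S102-2: the touch screen identifiers must match the target identifiers
        Set<Integer> ids = touchData.keySet();
        if (!ids.equals(targetData.keySet())) {
            return false;
        }
        // Step S102-3: each track must match the corresponding target track
        for (Integer id : ids) {
            if (!trackMatches(touchData.get(id), targetData.get(id))) {
                return false;
            }
        }
        // Step S102-4: identifiers and tracks both match
        return true;
    }

    private boolean trackMatches(List<float[]> track, List<float[]> target) {
        // Crude comparison: the net displacement (end minus start) of both tracks must be similar
        float[] d1 = displacement(track);
        float[] d2 = displacement(target);
        float dx = d1[0] - d2[0];
        float dy = d1[1] - d2[1];
        return Math.sqrt(dx * dx + dy * dy) <= MAX_DISPLACEMENT_DIFFERENCE;
    }

    private float[] displacement(List<float[]> track) {
        float[] start = track.get(0);
        float[] end = track.get(track.size() - 1);
        return new float[]{end[0] - start[0], end[1] - start[1]};
    }
}
```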
Step S103: and controlling and adjusting the current voice broadcasting of the resource data service page according to the broadcasting control mode corresponding to the matched target touch screen data.
The broadcast control modes in step S103 may include various controls, such as the pause, jump, return, and resume described above, where each control function corresponds to different target touch screen data. When data matching the acquired touch screen data are found in the target touch screen data, the current voice broadcast is controlled and adjusted according to the control mode corresponding to the matched target touch screen data. For example, if the matched target touch screen data specify that tapping the screen corresponds to pause and the acquired touch screen data are also a tap, the voice information currently being broadcast is paused; if the acquired touch screen data are a rightward slide that corresponds to a preset rightward-slide gesture in the target touch screen data, and the control mode bound to that gesture is jumping to the next UI card, the voice information currently being broadcast is stopped and the broadcast jumps to the next adjacent UI card. The above is only an example; the actual preset control gestures and control modes can be set in advance according to the resource data application and/or the user's operation habits, so as to provide an operation manner better suited to visually impaired users, break the limitation that existing network application services impose on certain user groups, improve the convenience with which visually impaired users operate the resource services provided by online applications, and enhance their experience of daily life.
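A sketch of step S103 under the assumption that the page's UI cards are read out by Android's TextToSpeech engine; it reuses the illustrative BroadcastControl enum from the binding sketch above, and the card-list handling is a simplification.

```java
import android.speech.tts.TextToSpeech;
import java.util.List;

/** Adjusts the current voice broadcast of the page according to the matched control mode. */
public class BroadcastController {

    private final TextToSpeech tts;
    private final List<String> uiCardTexts; // broadcast content of the page, one entry per UI card
    private int currentCard;

    public BroadcastController(TextToSpeech tts, List<String> uiCardTexts) {
        this.tts = tts;
        this.uiCardTexts = uiCardTexts;
    }

    public void apply(GestureBindingRegistry.BroadcastControl control) {
        switch (control) {
            case PAUSE:
                tts.stop(); // stop the utterance currently being broadcast
                break;
            case RESUME:
                speakCurrentCard();
                break;
            case NEXT_ITEM:
                currentCard = Math.min(currentCard + 1, uiCardTexts.size() - 1);
                speakCurrentCard();
                break;
            case SKIP_TWO_ITEMS:
                currentCard = Math.min(currentCard + 2, uiCardTexts.size() - 1);
                speakCurrentCard();
                break;
            default:
                break; // page jumps / returns would be handled by the page navigation layer
        }
    }

    private void speakCurrentCard() {
        tts.speak(uiCardTexts.get(currentCard), TextToSpeech.QUEUE_FLUSH, null, "card-" + currentCard);
    }
}
```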
It can be understood that, during the voice broadcasting of the resource data service page in the barrier-free voice broadcasting state, voice information about the touch screen operations can be inserted to remind the user how to perform the operation control, that is, the preset target gestures are announced. The visually impaired user can then hear the preset target gestures while listening to the broadcast, without having to memorize them, which makes the operation easier.
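A small sketch of how such a gesture hint could be inserted into the ongoing broadcast, assuming the same TextToSpeech engine; QUEUE_ADD appends the hint after the utterance currently being spoken.

```java
import android.speech.tts.TextToSpeech;

/** Announces the preset gestures during the broadcast so the user need not memorise them. */
final class GestureHintAnnouncer {
    static void announce(TextToSpeech tts) {
        String hint = "Tip: swipe right for the next item, tap once to pause, tap twice to resume.";
        // QUEUE_ADD queues the hint after the card currently being read out
        tts.speak(hint, TextToSpeech.QUEUE_ADD, null, "gesture-hint");
    }
}
```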
The foregoing is a specific description of an embodiment of a method for controlling voice broadcasting according to the present application, which corresponds to the foregoing embodiment of the method for controlling voice broadcasting, and the present application further discloses an embodiment of a device for controlling voice broadcasting, referring to fig. 2, and since the embodiment of the device is substantially similar to the embodiment of the method, the description is relatively simple, and relevant places refer to part of the description of the embodiment of the method. The device embodiments described below are merely illustrative.
As shown in fig. 2, fig. 2 is a schematic structural diagram of an embodiment of a control device for voice broadcasting according to the present application, where the embodiment of the device includes:
an obtaining unit 201, configured to obtain touch screen data generated for a resource data service page based on a non-obstacle voice broadcast state of the resource data service page;
The specific implementation process of the obtaining unit 201 may refer to the description of step S101, and the description is not repeated here.
A determining unit 202, configured to determine target touch screen data matched with the touch screen data, wherein the target touch screen data are preset and can be used for broadcast control of the barrier-free voice broadcast of the resource data service page, and each item of target touch screen data corresponds to a broadcast control mode;
the determining unit 202 is configured to determine whether the touch screen data in the acquiring unit 201 matches with preset target touch screen data, so as to obtain a control manner of the resource data service page, and further needs to preset the target touch screen data, so that the method further includes:
the input unit is used for responding to the input voice prompt of the target touch screen data of the resource data service page in the barrier-free voice broadcasting state and inputting the target touch screen data;
the establishing unit is used for establishing the corresponding association relation between the input target touch screen data and the broadcasting control mode.
Wherein the entry unit may include: an allocation subunit, a recording subunit, and a determination subunit;
The allocation subunit is used for assigning a target touch screen identifier to an operation part performing the touch screen operation;
the recording subunit is used for recording a target touch screen track of the operation part on the resource data service page and the target touch screen identification corresponding to the operation part;
the determining subunit is configured to determine the target touch screen identifier and the corresponding target touch screen track as the target touch screen data.
The recording subunit may specifically be configured to record, in response to voice broadcast information of reference touch screen data output in the input voice prompt, a target touch screen track generated by the operation portion on the resource data service page according to the reference touch screen data and the target touch screen identifier corresponding to the operation portion. Or, the method is used for responding to the voice broadcasting information of the target touch screen user-defined operation output in the input voice prompt, and recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part. The custom operation may be an operation performed by the user according to the operation habit of the user.
The determining subunit may include: and the binding subunit is used for binding the target touch screen data with the broadcasting control mode.
For details of the input unit and the establishing unit, reference may be made to the specific descriptions of the above step S10a and step S10b, and the detailed descriptions will not be repeated here.
The purpose of the determining unit 202 is to determine whether there is the same data as the touch screen data in the target touch screen data, so as to facilitate triggering of a control operation for voice broadcasting. Thus, the specific implementation procedure of the determining unit 202 may include: the device comprises an extraction subunit, an identification matching subunit, a track matching subunit and a determination subunit;
the extraction subunit is used for extracting the touch screen identification and the touch screen track of the touch screen identification in the touch screen data;
the identification matching subunit is used for determining whether the touch screen identification is matched with a target touch screen identification in the target touch screen data;
the track matching subunit is used for determining whether the touch screen track is matched with a target touch screen track in the target touch screen data or not when the matching result of the identification matching subunit is yes;
and the determining subunit is used for determining that the target touch screen data is matched with the touch screen data when the matching result of the track matching subunit is yes.
Reference may be made to the description of the above steps S102-1 to S102-4 for the specific implementation of the determination unit 202, which is only a summary description here.
And the control unit 203 is configured to control and adjust a current voice broadcast of the resource data service page according to the broadcast control mode corresponding to the matched target touch screen data. For the specific implementation process of the control unit 203, reference may be made to the description of step S103, and the detailed description will not be repeated here.
According to the above embodiments of the voice broadcasting control method and device, a visually impaired user can enter target touch screen data in advance, either as user-defined operation gestures matching their own operation habits or based on the reference target touch screen data (reference operation gestures) provided by the system or the application. The user can thereby conveniently control the voice broadcast of the resource information of the resource application service page in the barrier-free voice broadcasting state, obtain the voice information they want, and complete the corresponding acquisition of resource information, avoiding the limitation that network application services impose on certain user groups.
Based on the foregoing, the present application further provides an embodiment of an interaction method for voice broadcasting, as shown in fig. 3 and fig. 4, fig. 3 is a flowchart of an embodiment of an interaction method for voice broadcasting provided by the present application, and fig. 4 is an application scenario schematic diagram of an embodiment of an interaction method for voice broadcasting provided by the present application.
The interaction method comprises the following steps:
step S301: receiving touch screen data generated for a resource data service page based on the barrier-free voice broadcasting state of the resource data service page;
step S302: sending a control request according to the touch screen data;
step S303: and responding to the target touch screen data determined by the control request, controlling and adjusting the current voice broadcasting of the barrier-free voice broadcasting, and outputting the voice broadcasting of the resource data service page after the corresponding control and adjustment.
In this embodiment, the application scenario of a takeaway ordering application service is taken as an example. As shown in fig. 4, the takeaway ordering application service page is displayed on the electronic terminal device. The takeaway ordering application service may itself be an application service for visually impaired people, in which case entering the ordering service page enters the barrier-free voice broadcast state; or a barrier-free voice broadcast mode may be turned on for a general takeaway ordering application service, so as to realize barrier-free voice broadcasting of the page information. In the barrier-free voice broadcasting state, touch screen data can be generated by touch screen operations on the screen of the electronic device displaying the resource data service page.
After the generated touch screen data have been matched as described in steps S101 to S103, the target touch screen data matched with the touch screen data are determined, and the current voice broadcast is then adjusted accordingly to the position of the target voice broadcast, so that the target voice information is output at the electronic terminal device. Take the target touch screen data (target touch screen gesture) as a rightward slide whose corresponding control mode is jumping to the next broadcast position, as shown in fig. 4: after the touch screen data (touch screen gesture) are detected, a control request is sent; in response to the control request, the touch screen data are matched against the target touch screen data, and after a successful match the current voice broadcast is adjusted according to the jump control mode, that is, the broadcast jumps to the position of the second UI card, and the voice broadcast of the takeaway ordering service home page outputs the voice information of the second UI card.
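Tying the earlier illustrative sketches together, a thin orchestration of steps S301 to S303 might look as follows (all class names are the hypothetical ones introduced above, not names from the application):

```java
import java.util.List;
import java.util.Map;

/** End-to-end interaction flow: captured touch screen data in, adjusted voice broadcast out. */
public class VoiceBroadcastInteraction {

    private final TouchScreenMatcher matcher = new TouchScreenMatcher();
    private final GestureBindingRegistry bindings = new GestureBindingRegistry();
    private final BroadcastController controller;
    // gesture label -> pre-recorded target touch screen data
    private final Map<String, Map<Integer, List<float[]>>> targets;

    public VoiceBroadcastInteraction(BroadcastController controller,
                                     Map<String, Map<Integer, List<float[]>>> targets) {
        this.controller = controller;
        this.targets = targets;
        bindings.bindDefaults();
    }

    /** Steps S301/S302: receive the captured touch screen data and raise a control request. */
    public void onControlRequest(Map<Integer, List<float[]>> touchData) {
        for (Map.Entry<String, Map<Integer, List<float[]>>> entry : targets.entrySet()) {
            if (matcher.matches(touchData, entry.getValue())) {
                // Step S303: adjust the current broadcast according to the matched control mode
                controller.apply(bindings.controlFor(entry.getKey()));
                return;
            }
        }
        // No target matched; a spoken failure hint could be announced here instead.
    }
}
```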
It will be appreciated that this embodiment may further include:
according to the control adjustment, explanatory voice information describing the control mode to which the touch screen data correspond is output, so that the visually impaired user knows which control function their own operation has triggered.
Based on the foregoing, the present application further provides a voice broadcast interaction device, as shown in fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a voice broadcast interaction device provided by the present application, where the embodiment of the interaction device includes:
a receiving unit 501, configured to receive touch screen data generated for a resource data service page based on a non-obstacle voice broadcast state of the resource data service page;
a sending unit 502, configured to send a control request according to the touch screen data;
and an output unit 503, configured to respond to the target touch screen data determined by the control request, adjust control of the current voice broadcast of the barrier-free voice broadcast, and output a voice broadcast corresponding to the resource data service page after the control adjustment.
The specific implementation process of the embodiment of the interactive device for voice broadcast provided by the present application may be combined with the content of the embodiment of the interactive method, and will not be repeated here.
Based on the above, the present application also provides a computer storage medium for storing network platform generated data and a program for processing the network platform generated data;
the program, when read and executed, performs the steps involved in a control method of voice broadcasting as described above.
Based on the above, the present application also provides an electronic device, including:
a processor 601;
a memory 602, configured to store a program for processing network platform generated data, where the program, when read and executed by the processor 601, performs the steps involved in the voice broadcasting control method as described above.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
While the application has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the application as defined by the appended claims.
Claims (11)
1. The control method of voice broadcasting is characterized by comprising the following steps:
acquiring touch screen data generated for a resource data service page based on the barrier-free voice broadcasting state of the resource data service page, wherein the resource data service page is a service page of application service software installed on an electronic device, and the touch screen data are generated by touch screen operations on the resource data service page in the barrier-free voice broadcasting state;
determining target touch screen data matched with the touch screen data, wherein the target touch screen data are preset and can be used for broadcast control of the barrier-free voice broadcast of the resource data service page, and each item of target touch screen data corresponds to a broadcast control mode;
controlling and adjusting the current voice broadcast of the resource data service page according to the broadcast control mode corresponding to the matched target touch screen data, wherein, during the voice broadcasting of the resource data service page in the barrier-free voice broadcasting state, voice broadcast information about touch screen operations is also inserted, so as to prompt the user to perform the preset touch screen operations;
wherein the method further comprises:
responding to a voice prompt for inputting target touch screen data while the resource data service page is in the barrier-free voice broadcasting state, and inputting the target touch screen data, comprising: assigning a target touch screen identifier to an operation part performing the touch screen operation, wherein the operation part comprises a finger; recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part; and determining the target touch screen identifier and the corresponding target touch screen track as the target touch screen data;
And establishing a corresponding association relation between the input target touch screen data and the broadcasting control mode.
2. The method for controlling voice broadcasting according to claim 1, wherein,
the establishing of the corresponding association relationship between the input target touch screen data and the broadcasting control mode comprises the following steps:
binding the target touch screen data with the broadcasting control mode.
3. The method for controlling voice broadcasting according to claim 2, wherein said recording the target touch screen track of the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part includes:
responding to the voice broadcasting information of the reference touch screen data output in the input voice prompt, and recording a target touch screen track generated by the operation part on the resource data service page according to the reference touch screen data and the target touch screen identification corresponding to the operation part;
or,
in response to voice broadcasting information of a target touch screen custom operation being output in the entry voice prompt, recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part.
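As an illustrative sketch of the entry step in claim 3, the function below collects the track of one operation part after the entry voice prompt has been output and returns it together with its target touch screen identifier. The event-tuple format and the timeout are assumptions; the real touch event source is platform specific and not defined by the patent.

```python
from time import monotonic
from typing import Iterable, List, Tuple

def record_target_gesture(
    identifier: str,
    touch_events: Iterable[Tuple[str, float, float, bool]],
    timeout_s: float = 3.0,
) -> Tuple[str, Tuple[Tuple[float, float], ...]]:
    """Collect the track of one operation part after the entry prompt is played.

    touch_events is assumed to yield (part_id, x, y, is_lift) tuples from the
    device's touch layer; the actual event source is platform specific.
    """
    track: List[Tuple[float, float]] = []
    started = monotonic()
    for part_id, x, y, is_lift in touch_events:
        if part_id != identifier:
            continue                       # ignore other operation parts
        track.append((x, y))
        if is_lift or monotonic() - started > timeout_s:
            break                          # gesture finished or entry window expired
    return identifier, tuple(track)

# e.g. record_target_gesture("one_finger", simulated_events) would return the
# (target identifier, target track) pair that claim 3 stores as target data.
```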
4. The control method for voice broadcasting according to claim 1, wherein determining the target touch screen data matched with the touch screen data comprises:
extracting, from the touch screen data, a touch screen identifier and the touch screen track of the touch screen identifier;
determining whether the touch screen identifier matches a target touch screen identifier in the target touch screen data;
if so, determining whether the touch screen track matches the target touch screen track in the target touch screen data;
and if so, determining that the target touch screen data matches the touch screen data.
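The two-stage match in claim 4 (identifier first, then track) could look like the sketch below, which accepts the identifier and track directly. The track comparison used here, a mean pointwise distance over resampled tracks with a fixed tolerance, is an assumed metric; the patent does not prescribe how track matching is performed.

```python
import math
from typing import List, Sequence, Tuple

Point = Tuple[float, float]

def _resample(track: Sequence[Point], n: int = 32) -> List[Point]:
    # linear resampling so tracks of different lengths can be compared pointwise
    if len(track) == 1:
        return [track[0]] * n
    out: List[Point] = []
    for i in range(n):
        t = i * (len(track) - 1) / (n - 1)
        lo = int(math.floor(t))
        hi = min(lo + 1, len(track) - 1)
        frac = t - lo
        x = track[lo][0] + frac * (track[hi][0] - track[lo][0])
        y = track[lo][1] + frac * (track[hi][1] - track[lo][1])
        out.append((x, y))
    return out

def matches(observed_id: str, observed_track: Sequence[Point],
            target_id: str, target_track: Sequence[Point],
            tolerance: float = 40.0) -> bool:
    # stage 1: the touch screen identifier must match the target identifier
    if observed_id != target_id:
        return False
    if not observed_track or not target_track:
        return False
    # stage 2: tracks match if the mean pointwise distance is within tolerance
    a, b = _resample(observed_track), _resample(target_track)
    mean_dist = sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return mean_dist <= tolerance
```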
5. The control method for voice broadcasting according to claim 4, wherein extracting the touch screen identifier and the touch screen track of the touch screen identifier from the touch screen data comprises:
identifying, through an image recognition algorithm, the touch screen identifier and the touch screen track of the touch screen identifier in the touch screen data.
6. The control method for voice broadcasting according to claim 1, wherein the touch screen data comprises single-touch screen data or multi-touch screen data, and the target touch screen data comprises single-touch screen data or multi-touch screen data.
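Purely as an illustration of the distinction in claim 6, single-touch screen data can be pictured as one identifier-track pair, while multi-touch screen data carries one pair per operation part. The dictionary shapes and coordinate values below are assumptions, not a format defined by the patent.

```python
# Illustrative shapes only: one operation part vs. several operation parts.
single_touch_data = {
    "one_finger": [(120.0, 640.0), (121.0, 570.0), (123.0, 500.0)],  # one-finger swipe up
}
multi_touch_data = {
    "finger_1": [(100.0, 600.0), (100.0, 500.0)],
    "finger_2": [(220.0, 600.0), (220.0, 500.0)],                    # two-finger swipe up
}
```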
7. A control device for voice broadcasting, comprising:
an acquiring unit, configured to acquire touch screen data generated for a resource data service page while the resource data service page is in a barrier-free voice broadcasting state, wherein the resource data service page is a service page of application service software installed on an electronic device;
a determining unit, configured to determine target touch screen data matched with the touch screen data, wherein the target touch screen data is preset data usable for controlling the barrier-free voice broadcasting of the resource data service page, and a broadcasting control mode of the broadcasting control corresponds to the target touch screen data;
a control unit, configured to control and adjust the current voice broadcasting of the resource data service page according to the broadcasting control mode corresponding to the matched target touch screen data, wherein, while the resource data service page is voice-broadcast in the barrier-free voice broadcasting state, voice broadcasting information about the touch screen operation is also inserted to prompt the user to perform the preset touch screen operation;
wherein the device further comprises:
an entry unit, configured to enter the target touch screen data in response to a voice prompt for entry of target touch screen data for the resource data service page in the barrier-free voice broadcasting state, which comprises: setting a target touch screen identifier for an operation part that performs the touch screen operation, wherein the operation part comprises a finger; recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part; and determining the target touch screen identifier and the corresponding target touch screen track as the target touch screen data; and
an establishing unit, configured to establish an association relationship between the entered target touch screen data and the broadcasting control mode.
8. An interaction method for voice broadcasting, comprising:
receiving touch screen data generated for a resource data service page based on a touch screen operation performed on the resource data service page in a barrier-free voice broadcasting state;
sending a control request according to the touch screen data;
controlling and adjusting the current voice broadcasting of the barrier-free voice broadcasting in response to target touch screen data determined for the control request, and outputting the correspondingly controlled and adjusted voice broadcasting of the resource data service page, wherein, while the resource data service page is voice-broadcast in the barrier-free voice broadcasting state, voice broadcasting information about the touch screen operation is also inserted to prompt the user to perform the preset touch screen operation;
wherein, before the touch screen data generated for the resource data service page in the barrier-free voice broadcasting state is received, the method further comprises:
entering the target touch screen data in response to a voice prompt for entry of target touch screen data for the resource data service page in the barrier-free voice broadcasting state, which comprises: setting a target touch screen identifier for an operation part that performs the touch screen operation, wherein the operation part comprises a finger; recording the target touch screen track generated by the operation part on the resource data service page and the target touch screen identifier corresponding to the operation part; and determining the target touch screen identifier and the corresponding target touch screen track as the target touch screen data; and
establishing an association relationship between the entered target touch screen data and the broadcasting control mode.
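The interaction recited in claim 8 can be pictured as a client that packages the touch screen data into a control request, receives the broadcasting control mode resolved for that request, and adjusts the current voice broadcasting accordingly. The endpoint URL, the JSON fields, and the player object in this sketch are hypothetical and not specified by the patent.

```python
import json
from urllib import request

def send_control_request(identifier: str, track: list,
                         endpoint: str = "https://example.com/broadcast/control") -> dict:
    # package the observed touch screen data into a control request
    payload = json.dumps({"identifier": identifier, "track": track}).encode("utf-8")
    req = request.Request(endpoint, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)          # e.g. {"control_mode": "pause"}

def apply_control(player, decision: dict) -> None:
    # adjust the current voice broadcasting according to the returned control mode;
    # `player` stands in for whatever playback object the device exposes
    mode = decision.get("control_mode")
    if mode == "pause":
        player.pause()
    elif mode == "resume":
        player.resume()
    elif mode == "repeat":
        player.repeat_current()
```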
9. The interaction method for voice broadcasting according to claim 8, further comprising:
outputting, according to the control adjustment, explanatory voice information of the touch screen data corresponding to the control adjustment.
10. A computer storage medium, storing network platform generation data and a program for processing the network platform generation data, wherein
the program, when read and executed, performs the steps of the control method for voice broadcasting according to any one of claims 1 to 6 or the steps of the interaction method for voice broadcasting according to claim 9.
11. An electronic device, comprising:
a processor;
a memory for storing a program for processing network platform generation data, wherein the program, when read and executed by the processor, performs the steps of the control method for voice broadcasting according to any one of claims 1 to 6 or the steps of the interaction method for voice broadcasting according to claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110268418.0A CN112905148B (en) | 2021-03-12 | 2021-03-12 | Voice broadcasting control method and device, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112905148A CN112905148A (en) | 2021-06-04 |
CN112905148B true CN112905148B (en) | 2023-09-22 |
Family
ID=76105007
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110268418.0A Active CN112905148B (en) | 2021-03-12 | 2021-03-12 | Voice broadcasting control method and device, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112905148B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113450762B (en) * | 2021-06-23 | 2024-05-14 | Oppo广东移动通信有限公司 | Text reading method, text reading device, terminal and storage medium |
CN113253909A (en) * | 2021-06-30 | 2021-08-13 | 浙江口碑网络技术有限公司 | Control method and device for resource information display and play, storage and electronic equipment |
CN113778307B (en) * | 2021-09-27 | 2023-09-19 | 口碑(上海)信息技术有限公司 | Information interaction method and device |
CN115766933B (en) * | 2022-10-31 | 2024-06-21 | 中国农业银行股份有限公司 | Barrier-free mode voice broadcasting method, device, equipment and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10143187A (en) * | 1996-11-07 | 1998-05-29 | Ricoh Co Ltd | Work support system |
CN103500067A (en) * | 2013-09-30 | 2014-01-08 | 北京航空航天大学 | Touch screen interactive system combined with clicking, sliding, gesture recognition and voice |
CN104461346A (en) * | 2014-10-20 | 2015-03-25 | 天闻数媒科技(北京)有限公司 | Method and device for visually impaired people to touch screen and intelligent touch screen mobile terminal |
CN105872685A (en) * | 2016-03-24 | 2016-08-17 | 深圳市国华识别科技开发有限公司 | Intelligent terminal control method and system, and intelligent terminal |
CN106055364A (en) * | 2016-05-31 | 2016-10-26 | 广东欧珀移动通信有限公司 | Application starting method and terminal equipment |
CN107831988A (en) * | 2017-11-27 | 2018-03-23 | 维沃移动通信有限公司 | The operating method and mobile terminal of a kind of mobile terminal |
KR101917333B1 (en) * | 2018-01-16 | 2018-11-09 | 주식회사 한길에이치씨 | A user terminal apparatus of voice guidance device for the visually impaired and control method thereof |
CN108874356A (en) * | 2018-05-31 | 2018-11-23 | 珠海格力电器股份有限公司 | Voice broadcasting method and device, mobile terminal and storage medium |
CN109857326A (en) * | 2019-02-01 | 2019-06-07 | 思特沃克软件技术(西安)有限公司 | A kind of vehicular touch screen and its control method |
CN110096211A (en) * | 2019-04-30 | 2019-08-06 | 广东美的厨房电器制造有限公司 | The control method and household electrical appliance of household electrical appliance |
CN111367458A (en) * | 2020-03-10 | 2020-07-03 | 柯旋 | Barrier-free film for intelligent touch screen mobile terminal and barrier-free touch screen method |
WO2020232615A1 (en) * | 2019-05-20 | 2020-11-26 | 深圳市欢太科技有限公司 | Information recommendation method and apparatus, and electronic device and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11625145B2 (en) * | 2014-04-28 | 2023-04-11 | Ford Global Technologies, Llc | Automotive touchscreen with simulated texture for the visually impaired |
US20170171594A1 (en) * | 2015-12-14 | 2017-06-15 | Le Holdings (Beijing) Co., Ltd. | Method and electronic apparatus of implementing voice interaction in live video broadcast |
CN111681640B (en) * | 2020-05-29 | 2023-09-15 | 阿波罗智联(北京)科技有限公司 | Method, device, equipment and medium for determining broadcast text |
2021-03-12: application CN202110268418.0A, patent CN112905148B (en), legal status: Active
Also Published As
Publication number | Publication date |
---|---|
CN112905148A (en) | 2021-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112905148B (en) | Voice broadcasting control method and device, storage medium and electronic equipment | |
WO2020000972A1 (en) | Video access method, client, video access apparatus, terminal, server, and storage medium | |
CN108984081A (en) | A kind of searched page exchange method, device, terminal and storage medium | |
CN114302210A (en) | User interface for viewing and accessing content on an electronic device | |
US10777096B2 (en) | System for assisting in foreign language learning | |
CN112104915B (en) | Video data processing method and device and storage medium | |
CN100531336C (en) | Information processing apparatus and information processing method | |
CN113301361A (en) | Human-computer interaction, control and live broadcast method, device and storage medium | |
CN101796516A (en) | navigation systems and methods | |
CN105260109A (en) | Play speed adjusting method and terminal | |
CN107785037B (en) | Method, system, and medium for synchronizing media content using audio time codes | |
CN111209437B (en) | Label processing method and device, storage medium and electronic equipment | |
CN104205854A (en) | Method and system for providing a display of social messages on a second screen which is synched to content on a first screen | |
EP4412222A1 (en) | Multimedia information processing method and apparatus, and electronic device and storage medium | |
US9769502B2 (en) | Method, apparatus, and system for playing multimedia file | |
CN106468987B (en) | Information processing method and client | |
CN104965874A (en) | Information processing method and apparatus | |
US20180275756A1 (en) | System And Method Of Controlling Based On A Button Having Multiple Layers Of Pressure | |
CN111263204A (en) | Control method and device for multimedia playing equipment and computer storage medium | |
CN103488669A (en) | Information processing apparatus, information processing method and program | |
CN112929725B (en) | Video distribution method, video playing method, electronic device and storage medium | |
CN114449133A (en) | File display method, device, equipment, storage medium and program product | |
WO2024104182A1 (en) | Video-based interaction method, apparatus, and device, and storage medium | |
CN112653931B (en) | Control method and device for resource information playing, storage medium and electronic equipment | |
CN112423066B (en) | Television control method and device, intelligent terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||