CN117519564A - Barrage message publishing method and device - Google Patents


Info

Publication number
CN117519564A
Authority
CN
China
Prior art keywords
barrage
barrage message
application
message
floating window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210887888.XA
Other languages
Chinese (zh)
Inventor
郭晓宁
安祺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210887888.XA priority Critical patent/CN117519564A/en
Publication of CN117519564A publication Critical patent/CN117519564A/en
Pending legal-status Critical Current

Classifications

    • G06F 3/04883: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06F 3/0486: Drag-and-drop
    • G06F 3/04886: GUI interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/451: Execution arrangements for user interfaces
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A barrage message release method and device relate to the technical field of electronics. The method is applied to a first electronic device with a touch screen and comprises the following steps: displaying a floating window of a first application, wherein at least one barrage message is displayed in the floating window; detecting a first touch operation acting on a first barrage message in the floating window, and determining that the first barrage message is selected, wherein the first barrage message comprises one or more barrage messages; detecting a second touch operation acting on the first barrage message, and publishing the first barrage message to a barrage display area in a user interface of a second application; wherein the second touch operation includes dragging the first barrage message to the second application.

Description

Barrage message publishing method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a barrage message publishing method and device.
Background
With the rapid development of information technology, human-computer interaction technology is being used more and more widely. Human-computer interaction technology (human-computer interaction techniques) refers to technology that enables efficient dialogue between humans and computers through computer input and output devices.
Taking the bullet-screen sending function provided by a live broadcast application interface as an example: the user first taps the bullet-screen entry option in the live application interface, which pops up a bullet-screen editing window and opens the soft keyboard of the system input method; the user then types the bullet-screen content in the editing window via the soft keyboard, and finally publishes it by tapping the "Send" button. This bullet-screen sending process is cumbersome to operate and gives a poor user experience.
Similar problems exist when a bullet screen needs to be published while using other similar applications (such as gaming applications), also resulting in poor user experience. Therefore, how to simplify user operations based on human-computer interaction and improve user experience in these scenarios is a technical problem to be solved.
Disclosure of Invention
The embodiments of the present application provide a barrage message publishing method and device, which are used to simplify barrage publishing operations and thereby improve user experience.
In a first aspect, a barrage message publishing method is provided, applied to a first electronic device with a touch screen, the method including: displaying a floating window of a first application, wherein at least one barrage message is displayed in the floating window; detecting a first touch operation acting on a first barrage message in the floating window, and determining that the first barrage message is selected, wherein the first barrage message comprises one or more barrage messages; detecting a second touch operation acting on the first barrage message, and publishing the first barrage message to a barrage display area in a user interface of a second application; wherein the second touch operation includes dragging the first barrage message to the second application.
The first application here refers in the following embodiments to a "transfer station" application. It should be appreciated that the above operation is intended to drag a barrage message stored at a transfer station into an interface of a second application (e.g., a live application) to enable publication of the barrage message in the second application, thereby providing a quick way to publish the barrage message. It should be understood that "transfer station" is only one name of the application in which the solution provided herein is used and is not limiting of the present application.
The first application and the second application may be applications that are pre-installed before the electronic device leaves the factory, or applications that are installed by the user. It should be noted that any software component installed in an electronic device can be regarded as an "application" regardless of size. By way of example, software components that run on top of an operating system are applications in a general sense, and software components within an operating system (which may also be referred to as services) may also be referred to herein as "applications," which are not limited by the present application.
It will be appreciated that the "determine that the first barrage message is selected" operation may be made in response to detecting the first touch operation; the operation of "publishing the first barrage message to the barrage display area in the user interface of the second application" is made in response to detecting the second touch operation.
In one possible implementation, the first touch operation for selecting the first barrage message and the second touch operation acting on the first barrage message may be different stages of one user gesture, i.e. consecutive touch operations. For example, the first touch operation may be a "press" operation acting on the first barrage message, used to select it, and the second touch operation may be the subsequent drag and final "lift" following the "press"; in this case the first touch operation and the second touch operation belong to the same drag gesture.
In another possible implementation, the first touch operation for selecting the first barrage message and the second touch operation acting on the first barrage message may belong to different user gestures, i.e. discontinuous touch operations. For example, when the barrage messages in the floating window of the first application are in a check state (e.g. each barrage message is displayed with a corresponding check box), the first touch operation may be an operation of checking one or more barrage messages via the check boxes (e.g. clicking the check boxes), and the second touch operation may be an operation of dragging the checked barrage message(s) from the floating window of the first application to the barrage display area in the user interface of the second application; in this case the first touch operation and the second touch operation belong to different user gestures.
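As a non-authoritative illustration, the select-then-drag flow described above can be sketched as a small model. All class and method names here (TransferStation, LiveApp, on_press, on_drop) are hypothetical and not taken from the patent:

```python
class TransferStation:
    """Hypothetical model of the first application's floating window."""
    def __init__(self, messages):
        self.messages = list(messages)  # barrage messages shown in the floating window
        self.selected = None

    def on_press(self, index):
        # first touch operation: pressing a message selects it
        self.selected = self.messages[index]

    def on_drop(self, target_app):
        # second touch operation: dragging to the second app publishes it
        if self.selected is not None:
            target_app.publish(self.selected)
            self.selected = None


class LiveApp:
    """Hypothetical second application with a barrage display area."""
    def __init__(self):
        self.barrage_display_area = []

    def publish(self, message):
        self.barrage_display_area.append(message)
```

For example, pressing the first stored message and dropping it on the live application would append that message to the live application's display area.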
In one possible implementation, the user interface of the second application includes a first control, where the first control is used to display a barrage message; the posting the first barrage message to a barrage display area in a user interface of a second application includes: and the first control responds to the second touch operation, receives the first barrage message and issues the first barrage message to a barrage display area of the second application.
In one possible implementation, the user interface of the second application includes a first control, where the first control is used to display a barrage message; the posting the first barrage message to a barrage display area in a user interface of a second application includes: and the first control responds to the second touch operation, receives the first barrage message, and issues the first barrage message to a barrage display area of the second application if the first barrage message meets the message format requirement of the second application.
The first control, which may also be referred to herein as a barrage control, is used to display a barrage message. The first control may be located in a playback window of the second application, e.g. the first control may be located at an upper layer of the playback window and may be set transparent so as to not obscure content in an underlying playback window as much as possible.
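The "message format requirement" check mentioned above can be illustrated with a minimal sketch. The control name, the length rule, and the reject-on-mismatch behaviour are assumptions for illustration only, not the patent's specification:

```python
class BarrageControl:
    """Hypothetical 'first control': receives a dropped message and
    publishes it only if it meets the host app's format requirements."""
    def __init__(self, max_len=50):
        self.max_len = max_len       # assumed format rule for illustration
        self.display_area = []

    def meets_format(self, message):
        # assumed rule: non-empty string no longer than max_len
        return isinstance(message, str) and 0 < len(message) <= self.max_len

    def on_drop(self, message):
        if self.meets_format(message):
            self.display_area.append(message)
            return True
        return False                 # reject non-conforming drops
```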
In one possible implementation, the method further includes: detecting a third touch operation acting on a second barrage message in the barrage display area, and determining that the second barrage message is selected; detecting a fourth touch operation acting on the second barrage message, and displaying the second barrage message in the floating window; the fourth touch operation includes dragging the second barrage message to the floating window.
In one possible implementation, the user interface of the second application includes a first control, where the first control is used to display a barrage message; the detecting of the third touch operation acting on the second barrage message in the barrage display area, determining that the second barrage message is selected includes: and the first control responds to the detection of a third touch operation acting on the second barrage message in the barrage display area, determines that the second barrage message is selected, and acquires the selected second barrage message.
Through the implementation manner, the barrage message (possibly issued by the user or issued by other users) displayed in the barrage display area of the second application can be dragged to the transfer station for storage for subsequent use.
It should be noted that the third touch operation for selecting the second barrage message and the fourth touch operation acting on the second barrage message may be one operation, multiple continuous operations, or multiple discontinuous operations.
In one possible implementation, before the fourth touch operation acting on the second barrage message in the barrage display area is detected, the floating window is in a folded or hidden state; the method further includes: unfolding the folded or hidden floating window when the fourth touch operation acting on the second barrage message in the barrage display area is detected.
In one possible implementation, the method further includes: and after the second barrage message is displayed in the floating window, folding or hiding the unfolded floating window. The above operation of folding or hiding the unfolded floating window occurs after the second barrage message is displayed in the floating window, and may include one of the following cases, by way of example:
case 1: immediately folding or hiding the unfolded floating window after the second barrage message is displayed in the floating window;
Case 2: after displaying the second barrage message in the floating window, if a touch operation of folding or hiding the floating window is detected (such as a user triggering a function key for folding or hiding the floating window in an interface of the first application, or the user dragging the floating window to a right side frame to trigger displaying the floating window as a side bar), in response, the electronic device folds or hides the unfolded floating window;
Case 3: after the second barrage message is displayed in the floating window, if no operation relating to the floating window (such as a touch operation to fold or hide it, an operation on a barrage message in the floating window such as checking, selecting or editing, or an operation dragging a barrage message into it) is detected within a set period of time, the electronic device folds or hides the unfolded floating window when the set period expires.
Through the implementation manner, when the barrage message is required to be dragged into the first application (the transfer station), the floating window of the first application is unfolded, and after the operation is completed, the floating window of the first application is folded or hidden, so that the influence of the floating window of the first application on the use of the second application by a user is reduced.
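The unfold-on-drag and auto-fold behaviour (case 3 above) can be sketched with a logical clock; the timeout value and all names are assumptions for illustration:

```python
class FloatingWindow:
    """Hypothetical fold/unfold behaviour of the transfer-station window."""
    IDLE_TIMEOUT = 5.0               # assumed value, not from the patent

    def __init__(self):
        self.folded = True
        self.last_activity = 0.0

    def on_drag_enter(self, now):
        # unfold when a barrage message is dragged toward the window
        self.folded = False
        self.last_activity = now

    def on_activity(self, now):
        # any check/select/edit/drag operation counts as activity
        self.last_activity = now

    def tick(self, now):
        # case 3: fold again if the window saw no activity for IDLE_TIMEOUT
        if not self.folded and now - self.last_activity >= self.IDLE_TIMEOUT:
            self.folded = True
```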
In one possible implementation, after displaying the second barrage message in the floating window, the method further includes: and transmitting the second barrage message to a second electronic device connected with the first electronic device, so that the first application in the second electronic device receives and stores the second barrage message.
Through the implementation manner, the first application (transfer station) can be arranged in the plurality of electronic devices, and the barrage messages in the first application can be synchronized among the plurality of electronic devices, so that users can issue the barrage messages by using the first application in different electronic devices, and the same user experience can be obtained.
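The cross-device synchronization described above can be sketched as peers forwarding newly stored messages to each other; the class name and the echo-suppression scheme are illustrative assumptions:

```python
class SyncedTransferStation:
    """Hypothetical multi-device transfer station: a message stored on one
    device is forwarded to every connected peer (the patent's second
    electronic device)."""
    def __init__(self):
        self.messages = []
        self.peers = []              # other SyncedTransferStation instances

    def connect(self, peer):
        self.peers.append(peer)

    def store(self, message, _origin=None):
        if message in self.messages:
            return                   # already synced; avoid echo loops
        self.messages.append(message)
        for peer in self.peers:
            if peer is not _origin:
                peer.store(message, _origin=self)
```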
In one possible implementation, the method further includes: detecting a fifth touch operation of a third barrage message acting on the floating window, and determining that the third barrage message is selected; detecting a sixth touch operation acting on the third barrage message, and displaying a barrage message editing window, wherein the barrage message editing window comprises a barrage message editing frame and an input method soft keyboard window, and the third barrage message is displayed in the barrage message editing frame; wherein the sixth touch operation includes dragging the third barrage message to a barrage message entry in a user interface of the second application.
Through the implementation manner, the selected barrage message can be edited conveniently.
In one possible implementation manner, after the detecting the first touch operation of the first barrage message acting in the floating window and determining that the first barrage message is selected, the method further includes: displaying at least one bullet screen release mode option, wherein the bullet screen release mode is used for indicating bullet screen display modes and/or release times; detecting a seventh touch operation of one or more bullet screen release mode options selected, and determining a bullet screen release mode selected; the posting the first barrage message to a barrage display area in a user interface of a second application includes: and according to the selected barrage release mode, releasing the first barrage message to a barrage display area in the user interface of the second application.
Through the implementation manner, the user can select the barrage release mode (such as text color, display speed, display times and the like), so that the user experience can be improved.
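A release mode combining display mode and publish count might be modelled as below; the option fields (color, speed, repeat) are examples drawn from the text above, and the structure is a sketch rather than the patent's implementation:

```python
from dataclasses import dataclass


@dataclass
class ReleaseMode:
    """Hypothetical barrage release-mode options (display mode + times)."""
    color: str = "white"
    speed: str = "normal"
    repeat: int = 1                  # number of times the message is published


def publish_with_mode(display_area, message, mode):
    # publish the message into the display area according to the selected mode
    for _ in range(mode.repeat):
        display_area.append({"text": message,
                             "color": mode.color,
                             "speed": mode.speed})
```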
In one possible implementation, the method further includes: detecting an eighth touch operation acting on a second control in the floating window, so that the floating window enters an editing state; the method comprises the steps of displaying a check box corresponding to each bullet screen message in the floating window in an editing state, and displaying a first function option, wherein the check box is used for checking bullet screen messages, and the first function option is used for correspondingly editing or managing the checked bullet screen messages; or detecting a ninth touch operation acting on the fourth barrage message in the floating window, and displaying a second function option corresponding to the fourth barrage message, wherein the second function option is used for editing or managing the fourth barrage message correspondingly.
In one possible implementation, the second application includes: an audio video application, a live broadcast application, or a gaming application.
In a second aspect, a barrage message publishing method is provided, and is applied to a first electronic device with a touch screen, and the method includes: displaying a user interface of a second application, wherein the user interface of the second application comprises a barrage display area; detecting a third touch operation acting on a second barrage message in the barrage display area, and determining that the second barrage message is selected; detecting a fourth touch operation acting on the second barrage message, and displaying the second barrage message in a floating window of the first application; the fourth touch operation includes dragging the second barrage message to the floating window.
In one possible implementation, the user interface of the second application includes a first control, where the first control is used to display a barrage message; the detecting of the third touch operation acting on the second barrage message in the barrage display area, determining that the second barrage message is selected includes: and the first control responds to the detection of a third touch operation acting on the second barrage message in the barrage display area, determines that the second barrage message is selected, and acquires the selected second barrage message.
In one possible implementation, before the fourth touch operation acting on the second barrage message in the barrage display area is detected, the floating window is in a folded or hidden state; the method further includes: unfolding the folded or hidden floating window when the fourth touch operation acting on the second barrage message in the barrage display area is detected.
In one possible implementation, the method further includes: and after the second barrage message is displayed in the floating window, folding or hiding the unfolded floating window.
In a third aspect, there is provided an electronic device comprising: one or more processors; the one or more memories store one or more computer programs comprising instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of the first aspects or the second aspects.
In a fourth aspect, there is provided a computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of the first aspects or to perform the method of any one of the second aspects.
In a fifth aspect, there is provided a system on a chip comprising one or more processors which, when executing instructions, perform the method of any one of the first aspects, or perform the method of any one of the second aspects.
The advantages of the second aspect to the fifth aspect are described with reference to the advantages of the first aspect, and the description thereof will not be repeated.
Drawings
FIG. 1 is a human-computer interaction process for transmitting a bullet screen at a play interface of a live application;
fig. 2 is a schematic diagram of an internal hardware structure of an electronic device according to an embodiment of the present application;
fig. 3A and fig. 3B are schematic software structures of an electronic device according to an embodiment of the present application;
FIGS. 4-14 are user interface diagrams of some electronic devices provided by embodiments of the present application;
fig. 15 is a schematic signaling interaction diagram in an electronic device according to an embodiment of the present application;
FIG. 16 is a schematic diagram of a user interface implemented based on the flow of FIG. 15 in an embodiment of the present application;
fig. 17 is a schematic signaling interaction diagram in an electronic device according to an embodiment of the present application;
fig. 18 is a schematic diagram of a user interface implemented based on the flowchart of fig. 17 in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the embodiments of the present application, "a plurality of" means two or more. It should be noted that, in the description of the embodiments of the present application, the terms "first," "second," and the like are used only to distinguish between descriptions and are not to be understood as indicating or implying relative importance or a sequential order.
Technical terms related to the embodiments of the present application will be first described below.
(1) Control piece
In computer programming, a control is a graphical user interface element, such as a window or a text box, that displays information which can be changed by the user. A control is a basic visual building block contained within an application program; it holds the data processed by the application and governs the interaction with that data. Different combinations of controls are typically packaged in component toolkits, from which programmers can build graphical user interfaces (GUIs). Most operating systems include a set of controls for programming; a programmer can add them to an application and specify their behavior. Controls are typically defined as classes in object-oriented programming (OOP), and many controls result from class inheritance.
Taking the Android system as an example, the following are some typical controls provided by the system:
image view (ImageView) control: for displaying images (pictures);
TextBox (TextBox) control: for inputting text information in a form;
a button class control.
The control in the embodiment of the application may further include a control developed by a third party, such as a barrage control for implementing barrage management and display functions.
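The idea that controls are OOP classes built by inheritance can be illustrated with a minimal sketch (all names and behaviours here are hypothetical, not Android APIs):

```python
class Control:
    """Base class shared by all controls in this toy example."""
    def __init__(self):
        self.visible = True


class TextBox(Control):
    # inherits visibility handling from Control, adds text input
    def __init__(self):
        super().__init__()
        self.text = ""

    def input(self, s):
        self.text += s


class BarrageView(Control):
    # a third-party-style control managing and displaying barrage messages
    def __init__(self):
        super().__init__()
        self.messages = []

    def add(self, msg):
        self.messages.append(msg)
```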
(2) User gestures
User gestures are an input mode based on human-computer interaction and include, for example, single click, double click, long press, and drag. A drag gesture, one type of user gesture, is a gesture in which the user keeps a finger pressed on the screen while moving it, and can move an element from one location to another. A full drag gesture may include: the user presses at the moment of touching the screen, may then trigger one or more movements, and finally lifts the finger off the screen. Thus, a complete drag gesture generally includes the following state changes: pressed state → moving state(s) → lifted state. The system may detect multiple moving states depending on the duration of the movement.
Different states may trigger different events (from a programming perspective, an event is also understood herein as a function) to achieve different operations. For example, taking an application icon on a terminal screen as an example, when a pressed state is detected, an operation of identifying the application icon selected by the user may be triggered; when a movement state is detected, a movement position tracking operation may be triggered; when the lifted state is detected, an operation of rearranging the application icons may be triggered so as to place the application icon selected by the user to the position where the lifting action is located.
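The pressed → moving → lifted sequence and the per-state events it triggers can be sketched as a simple state machine; the state names and event labels are illustrative, not tied to any particular platform API:

```python
class DragGesture:
    """Toy state machine for a full drag gesture.
    Recorded events stand in for the operations each state may trigger."""
    def __init__(self):
        self.state = "idle"
        self.events = []

    def press(self, pos):
        self.state = "pressed"
        self.events.append(("select", pos))     # e.g. mark the icon as selected

    def move(self, pos):
        if self.state in ("pressed", "moving"):
            self.state = "moving"
            self.events.append(("track", pos))  # e.g. track the drag position

    def lift(self, pos):
        if self.state in ("pressed", "moving"):
            self.state = "lifted"
            self.events.append(("drop", pos))   # e.g. place item at lift point
```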
Currently, when a user watches a live broadcast or a video on a terminal, there is a need to send barrage messages for interactive communication. A barrage message, or barrage, refers to commentary text, expressions, patterns, etc. presented on top of the content (e.g. video) being played. Referring to fig. 1, a human-computer interaction process in which a user sends a barrage message from the play interface of a live application is illustrated. As shown in fig. 1 (a), a bullet screen entry 11 is provided in the play interface 10 of the live application; the bullet screen entry 11 may display a prompt such as "I want to send a bullet screen" to prompt the user to click the entry area to send a barrage message, and may be implemented by a button-type control. As shown in fig. 1 (b), after the user clicks the area where the bullet screen entry 11 is located, the barrage content editing area 12 is displayed in the play interface 10 and the input method soft keyboard window 13 pops up. The barrage content editing area 12 may provide editing functions such as adjusting fonts, text selection, cursor movement, and displaying the entered content. As shown in fig. 1 (c), the user can input the barrage message content using the soft keyboard in the input method soft keyboard window 13 and, once input is complete, click the send button 14 to publish the barrage message. As shown in fig. 1 (d), the barrage message is sent to the server for processing (such as priority-based queuing and information filtering); after processing, the server issues the barrage message, so that it is displayed in the play interface 10.
The above process of sending a barrage message in a live interface requires at least clicking the barrage entry button to display the barrage message editing window, editing the barrage message content in the editing window, and clicking the send button to publish the barrage message. These operations are cumbersome for the user, and the user experience is poor. Moreover, while inputting the barrage message, the user cannot pay attention to the content played on the interface, which further affects the experience of using the application.
Based on the above, the embodiment of the application provides a man-machine interaction method, a man-machine interaction device and a storage medium, which are used for simplifying barrage message release operation, reducing man-machine interaction operation complexity and further improving user experience.
The method and the device can be applied to a terminal device such as a mobile phone: barrage messages can be stored in the first application, and a barrage message stored in the first application can be dragged to the second application for publishing through a simple user gesture (such as a drag gesture). Taking a user watching a live broadcast and sending a barrage message as an example, a barrage message displayed in the floating window of the first application can be dragged into the live interface to complete the publishing of the barrage message. Further, a barrage message displayed in the live interface can be dragged into the floating window of the first application, so that the barrage message is stored in the first application for subsequent use.
In this embodiment of the present application, the first application may implement a function of storing and displaying a barrage message, where the first application may receive and store barrage messages from other applications, and may also provide the stored barrage messages to other applications. Furthermore, the first application can also realize the management function of barrage messages. Based on the functionality of the first application described above, in embodiments of the present application, the first application may be referred to as a "relay station".
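The storage, provision, and management functions of the first application described above may be sketched as follows. This is a minimal illustration under assumed names, not the patent's implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the "transfer station" role: the first application
// receives and stores barrage messages from other applications, provides
// stored messages back on request, and supports simple management (deletion).
class TransferStation {
    private final List<String> messages = new ArrayList<>();

    /** Receive and store a barrage message from another application. */
    void store(String barrage) {
        messages.add(barrage);
    }

    /** Provide a stored barrage message to another application by index. */
    String provide(int index) {
        return messages.get(index);
    }

    /** Management function: delete a stored barrage message. */
    boolean delete(String barrage) {
        return messages.remove(barrage);
    }

    int count() { return messages.size(); }
}
```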
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The embodiment of the application can be applied to electronic equipment with a touch screen. The electronic device includes, but is not limited to, a device running HarmonyOS or another operating system. By way of example, the electronic device may be, for example, a mobile phone, a tablet computer, a personal computer (personal computer, PC), a personal digital assistant (personal digital assistant, PDA), a smart watch, a netbook, a wearable electronic device, an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, a vehicle-mounted device, a smart screen, a smart car, a smart stereo, a robot, etc., and the specific form of the electronic device is not particularly limited in this application.
Referring to fig. 2, a schematic diagram of an internal hardware structure of an electronic device 100 according to an embodiment of the present application is provided.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like. Optionally, the display screen 194 is a touch panel (touch panel), also referred to as a "touch screen", i.e., a sensing display device capable of receiving input signals such as touches.
It is to be understood that the structure illustrated by the examples of this application does not constitute a specific limitation on the electronic device 100. In other examples of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some examples, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some examples, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others. It should be understood that the connection between the modules illustrated in the embodiments of the present application is merely illustrative, and does not limit the structure of the electronic device 100. In other examples of the present application, the electronic device 100 may also employ different interfaces in the above examples, or a combination of interfaces.
The charge management module 140 is configured to receive a charge input from a charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other examples, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some examples, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some examples, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor.
The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some examples, the modem processor may be a stand-alone device. In other examples, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some examples, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The present example illustrates the software architecture of the electronic device 100 using a layered architecture Android system as an example.
Referring to fig. 3A, a software architecture block diagram of an electronic device 100 is illustrated in an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some examples, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3A, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, Bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3A, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and may be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to provide message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or present notifications on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer at least includes a display driver, a camera driver, an audio driver, and a sensor driver.
It should be understood that the modules included in the various layers shown in fig. 3A do not constitute a limitation on the architecture of the electronic device or on the layers in which the modules are deployed. In one embodiment, the modules shown in fig. 3A may be deployed alone, or several modules may be deployed together; the division of the modules in fig. 3A is merely an example. In one embodiment, the names of the modules shown in fig. 3A are exemplary.
Based on the software system architecture shown in fig. 3A, fig. 3B illustrates some components related to the embodiments of the present application.
The application layer includes a first application, which may also be referred to as a "transfer station," for storing bullet messages. Alternatively, the first application may be a system application, or may be a third party application that allows the user to install and uninstall.
The application layer also includes a second application, which may be any application capable of implementing a barrage function. Alternatively, the second application may be a third party application. By way of example, the second application may include a video class application for playing video, a live application for live broadcast, a game class application, an audio class application, and the like. Alternatively, the number of second applications may be one or more. When the number of second applications is plural, different second applications may share use of the barrage message stored in the first application.
In some embodiments of the present application, a control with an extended function may be used, so that the first application and the second application can obtain the barrage message of the application based on the gesture of the user for transmitting to other applications, or receive the barrage message from other applications for barrage message publishing or barrage message storage. The control with the expansion function not only has the original functions of the control (such as element position setting, layout, display effect and the like), but also has the expansion function of the control, such as a data interface function, and the data interface function is used for acquiring and receiving data (barrage information). Further, the data interface may also implement data format checking, which includes performing format checking on the received data according to the data format requirements provided by the application.
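The extended data interface function described above, including format checking against requirements supplied by the host application, may be sketched as follows. The class, method names, and the length-based check are assumptions for illustration only.

```java
// Illustrative sketch of a control's extended function: besides its normal UI
// role, the control exposes a data interface that accepts a barrage message
// and checks it against the format requirements provided by the application.
class BarrageControl {
    private final int maxLength;      // format requirement provided by the app
    private String receivedBarrage;

    BarrageControl(int maxLength) {
        this.maxLength = maxLength;
    }

    /** Data interface: receive a barrage message after a format check. */
    boolean receive(String barrage) {
        if (barrage == null || barrage.isEmpty() || barrage.length() > maxLength) {
            return false;             // reject data that fails the format check
        }
        receivedBarrage = barrage;
        return true;
    }

    /** Data interface: hand this control's barrage message to another application. */
    String obtain() {
        return receivedBarrage;
    }
}
```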
After the development of the control with the functions is completed, the control can be released to a public platform, so that an application program developer can obtain the control, and in the development process of the application program, the control can be used for programming and designing a user interface of the application program, so that the application program can realize the expansion functions. The floating window of the first application (transfer station) in the embodiment of the application may include a control with the above-mentioned extended function, and the control for implementing the barrage function in the playing window of the second application may have the above-mentioned extended function.
The drag service module may be included in the application framework layer. Optionally, the drag service module may transmit the barrage message from the first application to the second application for publishing and/or transmit the barrage message from the second application to the first application for storage based on the drag gesture of the user.
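The routing role of the drag service module may be sketched as follows: on completion of a drag gesture, the dragged barrage message is delivered to a drop target, which may belong to the first application (for storage) or the second application (for publishing). Names and the callback shape are assumptions; on Android this role is played by the system drag-and-drop framework.

```java
// Minimal sketch (assumed names) of the drag service role: deliver a barrage
// message picked up by a drag gesture to the control that receives the drop.
class DragService {
    /** Callback implemented by controls that can receive dragged data. */
    interface DropTarget {
        boolean onDrop(String barrage);
    }

    /** Delivers the dragged barrage message to the drop target. */
    boolean completeDrag(String barrage, DropTarget target) {
        if (barrage == null) {
            return false;          // nothing was picked up by the drag gesture
        }
        return target.onDrop(barrage);
    }
}
```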
The technical solutions involved in the following examples may be implemented in the electronic device 100 having the above-described hardware architecture and software architecture.
In one possible implementation, the first application and the second application are located on the same electronic device. In another possible implementation, the first application and the second application are located on different electronic devices, with a communication connection between the devices. That is, the barrage message may be sent to another device or devices through the transfer station. It should be noted that the type of communication connection between the devices is not limited in this example, and may be a wired connection, or may be a wireless connection, for example, a communication connection provided by the mobile communication module 150 or a communication connection provided by the wireless communication module 160 in fig. 2, or may be another type of communication connection. In some examples, when the barrage message is sent to another device, it may be sent to a designated application, such as a second application; the application to which the bullet screen message is sent may not be limited, in which case, after the bullet screen message is received by another device, the operating system of the device may provide a default storage location to store the bullet screen message, and the stored bullet screen message may be presented by the first application (transfer station), clipboard, or other manner of the other device, and then the bullet screen message may be sent, copied, deleted, or other operations performed in the device.
It should be noted that "transfer station" is merely a name provided by the present application for convenience of description of the solution, and in some examples, should not be taken as limiting the function of the solution.
Taking the electronic device 100 being a mobile phone as an example, the technical solutions provided by the examples of the present application are described in detail below with reference to the accompanying drawings. It will be appreciated that when the electronic device is a mobile phone, some of the user operations described below are typically performed by the user touching the screen with a finger; when the electronic device is a tablet computer, a notebook computer, a desktop computer, a smart large screen or other types of devices, some of the user operations described below may be implemented by a stylus, a mouse, keys, a remote control, voice, or the like.
In this embodiment of the present application, the user interface of the transfer station may be displayed in a floating window manner.
1. Display of floating windows of the transfer station and several possible operations.
This section describes the display of the floating window of the transfer station and several possible operations on it.
(1) And displaying a floating window of the transfer station.
One or more barrage messages may be displayed in the floating window of the transfer station. Alternatively, if the number of barrage messages is large, the user is allowed to view the barrage messages by a sliding operation (such as sliding up or down). Optionally, to facilitate user viewing of the barrage message, a slider bar may be displayed in the floating window for user sliding operation and viewing of the location of the currently displayed barrage message in all barrage messages stored in the transfer station.
In some examples, as shown in fig. 4 (1), a floating window 402 of the transfer station is displayed in the mobile phone display interface 401. The floating window 402 displays a plurality of bullet screen messages stored in the transfer station, such as bullet screen message 1, bullet screen message 2, and bullet screen message 3 shown in the figure.
Alternatively, if the length of the barrage message is short, such as barrage message 1 and barrage message 2, the complete content of the barrage message may be displayed. If the length of the barrage message is longer, such as barrage message 3, then a portion of the content of the barrage message may be displayed, such as only the content of the first few characters of the barrage message, with the other characters omitted with indicators such as ellipses. The user may select the barrage message, such as by pressing the barrage message with a user's finger for a period of time greater than the set period of time T1, so that the content of the barrage message is displayed in its entirety in the pop-up display box 403 for viewing by the user, as shown in fig. 4 (2).
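The display rule above, showing short barrage messages in full and truncating longer ones to their first few characters followed by an ellipsis, may be sketched as follows. The cut-off length is an assumed parameter, not specified by the application.

```java
// Illustrative sketch: short barrage messages are shown in full; long ones are
// truncated to the first maxChars characters with a trailing ellipsis.
class BarragePreview {
    /** Returns the text shown for a barrage message in the floating window. */
    static String preview(String barrage, int maxChars) {
        if (barrage.length() <= maxChars) {
            return barrage;                             // short: show in full
        }
        return barrage.substring(0, maxChars) + "...";  // long: truncate + ellipsis
    }
}
```

Selecting the truncated message (e.g., by a long press) would then display the stored full text, as in display box 403.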
Alternatively, if the number of bullet screen messages is large, the user may be allowed to view other bullet screen messages through a slide-up and slide-down operation, as shown in (3) of fig. 4.
(2) Starting the transfer station and calling out the floating window of the transfer station.
The transfer station may be started by:
-activating the transfer station by triggering a specified function key on the handset, such as a virtual HOME key;
-activating the transfer station by triggering a specified key or key combination on the handset;
-starting the transfer station through an option in a menu; for example, the user can pull down a function menu interface from the top of the mobile phone screen and select the menu command or icon for starting the transfer station in that interface;
-starting the transfer station by voice, i.e. the user can start the transfer station by a corresponding voice command.
The foregoing is merely illustrative of several ways of starting a transfer station, and embodiments of the present application are not limited in this regard.
For example, when the transfer station is started, a floating window 402 as shown in (1) of fig. 4 may be displayed at a specific position (e.g., the upper right corner of the screen) of the current interface, to indicate that the mobile phone has currently started the transfer station. It should be noted that the mobile phone displays the complete floating window at this time, and barrage messages are displayed in the floating window.
(3) And moving a floating window of the transfer station.
In some examples, the cell phone may display the floating window of the transfer station in a fixed position in the screen (e.g., the upper right corner of the screen). In other examples, the mobile phone may also select a floating window for displaying the transfer station at a blank location or a location where the interface content is not important based on the content in the current interface.
In some examples, the user may manually adjust the position of the floating window of the transfer station by dragging the floating window of the transfer station. For example, as shown in fig. 5 (1), the user may press the floating window of the transfer station and drag to an arbitrary position (e.g., upper left corner/upper right corner/lower left corner/lower right corner/middle position of left side frame/middle position of right side frame/screen center position, etc.).
In other examples, the mobile phone determines the preset position nearest to the position where the user releases the drag. For example, the mobile phone is provided with four preset positions, namely the upper left corner, the upper right corner, the lower left corner, and the lower right corner, for displaying the floating window of the transfer station. In one example, the mobile phone may display the floating window of the transfer station in the upper right corner by default. As shown in (2) of fig. 5, the mobile phone may divide the screen into an upper left area, an upper right area, a lower left area, and a lower right area. In response to detecting the operation of the user dragging the floating window of the transfer station, the mobile phone determines, according to the release position, at which preset position to display the floating window. For example, if the release position is in the lower right area, the mobile phone moves the floating window of the transfer station to the lower right corner, as shown in fig. 5 (3). If the release position is in the lower left area, the mobile phone moves the floating window to the lower left corner. If the release position is in the upper left area, the mobile phone moves the floating window to the upper left corner. If the release position is in the upper right area, the mobile phone keeps the floating window displayed in the upper right corner. For another example, the mobile phone is provided with two preset positions for displaying the floating window of the transfer station, such as the top left side (i.e., the upper left corner) and the top right side (i.e., the upper right corner) of the screen.
Then, in response to detecting an operation of the user dragging the floating window of the transfer station, the mobile phone determines, according to the position where the user releases the drag, at which preset position to display the floating window of the transfer station. For example, if the release position is closer to the left frame of the screen, the mobile phone moves the floating window of the transfer station to the upper left corner. If the release position is closer to the right frame of the screen, the mobile phone moves the floating window of the transfer station to the upper right corner.
It will be understood that the rule by which the mobile phone determines the position of the floating window of the transfer station according to the drag operation of the user is merely an example, and the present application is not limited to a specific rule.
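As one illustrative sketch (not part of the present application), the quadrant-based snapping rule described above can be expressed as mapping the release point to the preset corner in the same quadrant of the screen; the function name and screen dimensions are assumptions:

```python
def snap_to_corner(release_x, release_y, screen_w, screen_h):
    """Return the preset corner for a drag-release position.

    The screen is divided into four areas; the floating window snaps
    to the corner of whichever area contains the release point.
    """
    horizontal = "left" if release_x < screen_w / 2 else "right"
    vertical = "upper" if release_y < screen_h / 2 else "lower"
    return f"{vertical} {horizontal}"
```

With the two-position variant (upper left / upper right only), the same idea applies with the vertical component fixed to "upper".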
(4) Folding or hiding the floating window of the transfer station.
Alternatively, to reduce the impact of the floating window of the transfer station on the user's use of a second application (e.g., a live application), the floating window of the transfer station may be folded or hidden (e.g., shown in the form of a side rail) and may be unfolded when desired.
In some examples, after the mobile phone starts the transfer station, the transfer station may be displayed in a floating window by default. For example, after the mobile phone starts the transfer station, a floating window 601 shown in (1) of fig. 6 may be displayed at a specific position (e.g., the upper right corner of the screen) of the current interface, where the floating window 601 is used to indicate that the transfer station is currently running. Note that at this time the mobile phone displays the complete floating window 601, so the barrage messages can be seen in the floating window 601. In some examples, one edge of the floating window 601 is in contact with the bezel of the screen. For example, the right edge of the floating window 601 is in contact with the right bezel of the screen. In other examples, no edge of the floating window 601 is in contact with the bezel of the screen.
In some examples, in response to detecting that the user performs an operation of switching the floating window 601 to a sidebar display, the mobile phone displays an interface as shown in (2) of fig. 6, that is, the original floating window 601 changes to a sidebar 602 for prompting the user that the transfer station is open. The sidebar may show a portion of the contents of the floating window 601 (alternatively described as displaying an incomplete floating window 601) or another type of mark. The switching operation may be, for example, the user pressing the floating window 601 for a long time and then dragging it to the frame of the screen, the user clicking or double-clicking the floating window 601, the user sliding the floating window 601 toward a side of the screen, a sliding gesture near the side of the screen, a voice command, or another type of operation command. In some examples, as shown in fig. 6 (2), the screen area occupied by the sidebar 602 becomes smaller, so the sidebar 602 may no longer display barrage messages. It can be appreciated that the sidebar 602 barely obscures the content of the current interface, facilitating the user's viewing of that content. In other examples, for example where the screen is large, the barrage messages may continue to be displayed even after the floating window 601 is switched to another state.
It will be appreciated that the folded or hidden state of the floating window of the transfer station may take forms other than the sidebar form described above, such as merely displaying a mark of the transfer station (e.g., a small icon or another form of mark). The implementation form of the mark of the transfer station is not limited in this application, and may be, for example, graphic information, text information, or a combination of graphics and text. For example, the mark of the transfer station may be displayed as a text prompt "drag to here" to identify the location of the floating window of the transfer station. The floating window of the transfer station may also be completely hidden, which is not limited in this embodiment of the present application.
In other examples, when the mobile phone detects that the user has not operated the floating window 601 for a preset period of time, the mobile phone may also automatically switch the unfolded floating window 601 to a folded or hidden state such as the sidebar 602, so as to reduce the impact on the user's use of an application such as a live broadcast application.
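The inactivity-based auto-collapse above can be sketched as a simple timer check; the class name, method names, and timeout value are assumptions for illustration only:

```python
import time

class FloatingWindow:
    """Minimal sketch of the auto-collapse behaviour (assumed API)."""

    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.state = "expanded"
        self._last_touch = time.monotonic()

    def on_user_touch(self):
        # Any user operation re-expands the window and resets the timer.
        self._last_touch = time.monotonic()
        self.state = "expanded"

    def tick(self, now=None):
        # Called periodically; collapses to the sidebar form after the
        # preset period of inactivity.
        now = time.monotonic() if now is None else now
        if self.state == "expanded" and now - self._last_touch >= self.timeout_s:
            self.state = "sidebar"
```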
Optionally, after the transfer station is started, in the folded or hidden state, if a touch operation of selecting a barrage message in a play window of a second application (such as a live broadcast application) is detected, or a touch operation of dragging the barrage message to the floating window of the transfer station is detected, the mobile phone may automatically expand the floating window.
For example, when the mobile phone displays the sidebar 602 at the border of the screen, the user may still select a barrage message in the second application to be stored in the transfer station. As shown in fig. 6 (3), when it is detected that the user drags the selected barrage message to the sidebar 602, the mobile phone optionally displays the interface shown in fig. 6 (4), at which time the sidebar 602 automatically expands, i.e., switches back to displaying the floating window 601 (alternatively, it may not switch to the expanded form). Further, when it is detected that the user drags the barrage message into the floating window 601, the mobile phone stores the barrage message in the transfer station, and optionally displays the interface shown in (5) of fig. 6, where the floating window 601 is completely displayed.
In some examples, as shown in fig. 6 (2), after the mobile phone displays the sidebar 602 at the frame of the screen, the user may switch back to display the floating window 601 as shown in fig. 6 (1) by performing an operation of switching back to the floating window display in the unfolded state (for example, an operation in which the user slides from the position of the sidebar 602 and in a direction away from the sidebar 602). For example, as shown in fig. 6 (2), in response to detecting the user's operation of sliding from the position of the side bar 602 in a direction away from the side bar 602, the cellular phone displays the complete floating window 601 as shown in fig. 6 (1).
(5) Exiting the transfer station and closing the floating window of the transfer station.
In some examples, the mobile phone exits the transfer station upon receiving an operation from the user indicating to exit the transfer station. Optionally, the mobile phone deletes the barrage messages stored before exiting the transfer station. Optionally, the mobile phone may instead save the barrage messages stored before exiting in the history data of the transfer station, where the history data may store the barrage messages stored while the transfer station was open. Alternatively, the mobile phone may store all barrage messages in the transfer station.
In other examples, as shown in (1) of fig. 7, a floating window 702 of the transfer station is displayed in the mobile phone display interface 701, and when the mobile phone detects that the user drags the floating window 702 of the transfer station, the mobile phone displays a delete control 703. Optionally, the user's finger does not leave the screen. Further, after detecting that the user drags the floating window 702 of the transfer station onto the delete control 703 and releases the finger, the mobile phone exits the transfer station and displays the interface shown in (2) in fig. 7.
In other examples, when the mobile phone detects that the user has dragged the floating window 702 of the transfer station onto the delete control 703 and released the finger, the mobile phone may stop displaying the floating window 702 of the transfer station; at this point, however, the mobile phone has not exited the transfer station. The user may redisplay the floating window 702 of the transfer station by, for example, selecting or dragging a barrage message in a second application (e.g., a live application).
In other examples, when the mobile phone detects that the user has pressed the floating window 702 of the transfer station for a preset duration, the mobile phone displays the delete control 703 so that the user can exit the transfer station.
In still other examples, the exit instruction of the transfer station may also be a voice instruction, a two-finger click, or a swipe in a particular direction, among other operations.
In other examples, the transfer station may also provide history data. In some examples, the history data of the transfer station includes the barrage messages stored by the transfer station before it last exited. In some examples, after the mobile phone exits the transfer station, the mobile phone clears the barrage messages stored in earlier sessions, and only the barrage messages stored before the current exit are retained in the history data of the transfer station.
(6) Other display modes.
In some examples, the transfer station may also be displayed in a form other than a floating window. For example, in the interface 801 shown in fig. 8 (1), after detecting that the user presses the selected barrage message for a preset period of time, the mobile phone may display the interface shown in fig. 8 (2): the original interface 801 is scaled down, and the content of the transfer station, for example the barrage messages stored in the transfer station, is displayed in an area 804 at the bottom (or the top) of the scaled-down interface 801. In some examples, as shown in fig. 8 (2), the area 804 directly expands the barrage messages stored by the transfer station. Further, in some examples, the user may drag the selected barrage message to the area 804 and release it, thereby saving it to the transfer station.
In other examples, as shown in the interface 801 in fig. 8 (1), after detecting that the user has pressed the selected target object for a preset period of time, the mobile phone may display the interface shown in fig. 8 (3): the original interface 801 is scaled down and moved to the lower right of the screen (for one-handed operation), and the content of the transfer station, for example the barrage messages stored in the transfer station, is displayed in an area 804 on the left side (or at the top) of the scaled-down interface 801. In some examples, as shown in fig. 8 (3), the area 804 directly expands the barrage messages stored by the transfer station. Further, in some examples, the user may drag the selected barrage message to the area 804 and release it, thereby saving it to the transfer station.
2. Barrage message release.
The following describes a process of distributing a barrage message in a transfer station in a second application.
In the embodiment of the application, when the mobile phone detects a first touch operation (such as long press) of one or more barrage messages in a floating window acting on a transfer station, it is determined that the barrage messages are selected; and after the mobile phone detects a second touch operation (such as dragging) acting on the barrage message, issuing the barrage message to a barrage display area in a user interface of a second application. Optionally, the second touch operation includes dragging the barrage message to a second application.
In some examples, after the user selects a barrage message in the transfer station and sends it, the barrage message is still retained in the transfer station; that is, the mobile phone sends the barrage message to the second application in a copy manner. In other examples, when the user selects a barrage message in the transfer station to send to the second application, the barrage message is no longer retained in the transfer station; that is, the mobile phone sends the barrage message to the second application in a cut manner. In a specific implementation, an option may be provided in the system settings of the mobile phone so that the user can choose whether barrage messages are sent in the copy mode or the cut mode. Of course, the mobile phone may default to either mode, which is not limited herein.
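A minimal sketch of the copy and cut sending modes, with assumed names and an in-memory list standing in for the second application's barrage display area:

```python
class TransferStation:
    """Sketch of the copy-vs-cut publishing modes (names assumed)."""

    def __init__(self, mode="copy"):
        assert mode in ("copy", "cut")
        self.mode = mode
        self.messages = []

    def store(self, msg):
        self.messages.append(msg)

    def send_to_app(self, msg, app_feed):
        app_feed.append(msg)          # publish to the barrage display area
        if self.mode == "cut":
            self.messages.remove(msg)  # cut mode: no longer retained
```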
Several possible ways of distributing the barrage message in the transfer station in the second application in the embodiments of the present application are described below.
(1) Sending one barrage message in the transfer station to the second application.
In some examples, the user may drag one barrage message in the transfer station to the second application at a time for barrage message publication. For example, as shown in fig. 9 (1), the mobile phone displays an interface of the second application (e.g., a live application) and a floating window 902 of the transfer station. The user interface of the second application includes a play window 901, and the play window 901 includes a bullet screen display area 903. In response to detecting that the user has pressed the barrage message 904 in the floating window 902 of the transfer station for at least a preset duration, the barrage message 904 in the floating window 902 of the transfer station floats as shown in fig. 9 (2). Optionally, the floating barrage message 904 is displayed in style 1, where style 1 is, for example, a blue background. Alternatively, the floating window 902 of the transfer station may be temporarily hidden. Further, when it is detected that the user drags the floating barrage message 904, the floating barrage message 904 moves following the movement of the user's finger. Optionally, when the floating barrage message 904 is dragged to the bullet screen display area 903 in the play window 901, the floating barrage message 904 is displayed in style 2, which is different from style 1 and is, for example, a green background. It should be noted that, in one implementation, the mobile phone may use a floating layer or floating window to implement the floating barrage message 904. In that case, displaying the floating barrage message 904 in style 1 includes the mobile phone displaying the floating layer or floating window in style 1, or the mobile phone presenting the barrage message within the floating layer or floating window in style 1.
Similarly, displaying the floating barrage message 904 in style 2 includes the mobile phone displaying the floating layer or floating window in style 2, or the mobile phone presenting the barrage message within the floating layer or floating window in style 2. Subsequent descriptions of the display modes may refer to the description here and are not repeated.
In some examples, in response to detecting that the user releases the drag at the bullet screen display area 903, the mobile phone publishes the barrage message 904 from the transfer station in the bullet screen display area of the second application, as shown in (3) of fig. 9.
Of course, in other examples, when it is detected that the user presses the barrage message 904 in the floating window 902 of the transfer station for at least the preset duration, the barrage message 904 may not float as shown in fig. 9 (2). That is, the user can drag the barrage message directly to the bullet screen display area in the live application to post the barrage message there.
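The style-1/style-2 switching while dragging (blue background outside the bullet screen display area, green inside, in the example above) can be sketched as a simple hit test; all names, coordinates, and style values are assumptions:

```python
def drag_style(pos, barrage_area):
    """Return the display style for the floating barrage while dragging.

    pos          -- (x, y) of the dragged floating layer
    barrage_area -- (x, y, w, h) of the bullet screen display area
    """
    x, y, w, h = barrage_area
    inside = x <= pos[0] <= x + w and y <= pos[1] <= y + h
    return "style2_green" if inside else "style1_blue"
```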
(2) Sending a plurality of barrage messages in the transfer station to the second application.
In some examples, a user may drag multiple barrage messages in a transfer station to a second application at a time for barrage message publication.
In some examples, a control is included in the floating window 1001 of the transfer station. In response to detecting that the user operates the control, the mobile phone displays a menu 1003 as shown in fig. 10 (1), the menu 1003 including a multi-select option and, optionally, a history clipboard option. In response to detecting that the user selects the multi-select option, the mobile phone displays the transfer station floating window 1001 as shown in fig. 10 (1), where each barrage message in the floating window 1001 corresponds to a check box; that is, the transfer station enters the multi-select state. The user can select a plurality of barrage messages, and the check boxes corresponding to the selected barrage messages show the checked state. In some examples, as shown in fig. 10 (2), the user holds down the selected barrage message 1004 or barrage message 1005 and drags it to the bullet screen display area in the live application play window. Optionally, as shown in the interface (3) of fig. 10, the mobile phone may temporarily hide the floating window of the transfer station when the user starts to drag the barrage message 1004 or the barrage message 1005, or when the user drags it out of the transfer station. Subsequently, when detecting that the user releases the drag in the bullet screen display area of the live application play window, the mobile phone displays the barrage message 1004 or the barrage message 1005 in the bullet screen display area.
It should be noted that, in the process of dragging the plurality of barrage messages to the second application, the display manner of the barrage messages may be similar to the process of dragging one barrage message, which is not described in detail herein.
(3) When the barrage message is dragged to the second application, the barrage message is subjected to format checking and related processing.
In some examples, after the second application receives the barrage message from the transfer station, it may check whether the format of the barrage message meets the requirements of the second application, for example, whether the barrage message is within a specified number of characters; if the barrage message is determined to meet the message format requirements of the second application, it is published to the barrage display area of the second application.
Optionally, if it is determined that the barrage message does not meet the message format requirement of the second application, appropriate processing may be performed, for example, deleting the portion exceeding the word count limit; alternatively, the user may be prompted that sending failed, and the operation ends.
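The format check and its two fallback behaviours can be sketched as follows, assuming a character-count limit stands in for the second application's format requirements:

```python
MAX_CHARS = 50  # assumed per-application character limit

def check_and_fit(msg, max_chars=MAX_CHARS, truncate=True):
    """Validate a barrage message against the second application's format.

    Returns the message to publish; if the message is too long, either
    truncates it (truncate=True) or returns None to signal send failure.
    """
    if len(msg) <= max_chars:
        return msg
    return msg[:max_chars] if truncate else None
```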
In one possible implementation, after a touch operation acting on one or more barrage messages in the floating window is detected and the barrage message is determined to be selected, the method may further include the following steps: displaying one or more bullet screen release mode options, where a release mode indicates the display mode and/or the number of times of publication; for example, option 1 may indicate that the font color is red, and option 2 may indicate that the message is published twice. After a touch operation selecting one or more of the bullet screen release mode options is detected, the selected release mode is determined. When the barrage is released, the barrage message can then be published to the barrage display area in the user interface of the second application according to the selected release mode.
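A minimal sketch of publishing according to a selected release mode (font color and number of times, as in options 1 and 2 above); the function name and the dictionary representation of a displayed barrage are assumptions:

```python
def publish(msg, feed, color="white", times=1):
    """Publish msg to the barrage feed with the chosen release mode.

    color -- display mode, e.g. "red" for option 1
    times -- number of times to publish, e.g. 2 for option 2
    """
    for _ in range(times):
        feed.append({"text": msg, "color": color})
```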
3. Storing barrage messages.
Several possible implementations of storing the barrage message in the second application into the transfer station are described below.
(1) After long pressing a barrage message displayed by the second application, dragging the barrage message to the floating window of the transfer station.
In some examples, as shown in (1) in fig. 11, a play window 1101 of a second application (such as a live application) is displayed on the mobile phone screen, and the play window 1101 includes a bullet screen display area 1102 in which barrage messages 1103 and 1104 are displayed. In response to detecting that the user has pressed the barrage message 1103 for a preset duration, the mobile phone displays a hover layer (also referred to as a drag hover layer, drag shadow layer, etc.) 1105 as shown in (2) in fig. 11, and the hover layer 1105 displays all or part of the content of the barrage message 1103 selected by the user. In some examples, the mobile phone may display the floating window of the transfer station or the mark of the transfer station after displaying the hover layer 1105 or simultaneously with displaying it (or, of course, before displaying it).
In some examples, after the user has pressed the barrage message 1103 for the preset duration, the user's finger does not leave the screen of the mobile phone. As shown in fig. 11 (3), the hover layer 1105 is located under the user's finger and can move following the movement of the finger on the screen. Further, in some examples, in response to detecting that the user drags the hover layer 1105 near the floating window 1106 of the transfer station such that the distance between the hover layer 1105 and the floating window 1106 is less than or equal to a threshold, the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station. In other examples, in response to detecting that the user releases the finger after dragging the hover layer 1105 and that the release position is located in the area where the floating window 1106 is located (or that the distance between the release position and the floating window 1106 is less than or equal to a threshold), the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station. In other examples, in response to detecting that the user's finger hovers over the hover layer 1105 for a preset duration, the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station.
In other examples, after the mobile phone displays the hover layer 1105 following the user's long press on the barrage message 1103, the user's finger may leave the screen of the mobile phone. Further, in some examples, the user presses the hover layer 1105 again and drags it closer to the floating window 1106; when detecting that the distance between the hover layer 1105 and the floating window 1106 is less than or equal to a threshold, the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station. In other examples, when it is detected that the user releases the finger after dragging and that the release position is in the area where the floating window 1106 is located (or the distance between the release position and the floating window 1106 is less than or equal to a threshold), the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station. In other examples, in response to detecting that the user drags the hover layer 1105 and the user's finger then hovers for a preset duration, the mobile phone temporarily stores the content of the hover layer 1105 (i.e., the barrage message 1103) in the transfer station.
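The distance-threshold trigger used in the examples above can be sketched as follows; the threshold value and the use of window-center coordinates are assumptions:

```python
import math

def should_store(hover_center, window_center, threshold=80.0):
    """Return True when the dragged hover layer is close enough to the
    transfer-station floating window to trigger storing the barrage
    message (threshold in pixels, illustrative value)."""
    dx = hover_center[0] - window_center[0]
    dy = hover_center[1] - window_center[1]
    return math.hypot(dx, dy) <= threshold
```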
In some examples, as shown in fig. 4 (1), a user may select a plurality of barrage messages that are stored in the transfer station.
(2) Long pressing a barrage message in the second application and then quickly sliding it in a preset direction (for example, toward the floating window of the transfer station).
In some examples, as shown in (1) in fig. 12, the mobile phone displays a play window 1201 of the live application, where the play window 1201 includes a barrage display area in which a barrage message 1203 and a barrage message 1204 are displayed, and the barrage message 1203 is selected. In response to detecting that the user has pressed the barrage message 1203 for at least the preset duration, the mobile phone displays a hover layer 1205 as shown in fig. 12 (2) for presenting the content of the barrage message 1203. At this time, the user's finger does not leave the screen. The hover layer 1205 may be positioned under the user's finger and may move following the movement of the finger. In response to detecting an operation in which the user holds the hover layer 1205 and slides it quickly toward the floating window 1206 of the transfer station (i.e., an operation of holding the hover layer 1205 and flinging it toward the floating window 1206), the mobile phone stores the barrage message 1203 in the transfer station.
Therefore, in some scenes, even if the selected barrage message is far away from the floating window of the transfer station, the barrage message can be stored into the transfer station quickly through the operation of pressing the barrage message for a long time and sliding the barrage message in a preset direction, so that a longer dragging path can be avoided, and the interaction efficiency of a user and a mobile phone is improved.
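The "press and fling toward the floating window" gesture can be sketched as a speed-and-direction test on the drag velocity; all thresholds and names are illustrative assumptions:

```python
import math

def is_fling_toward(velocity, start, target,
                    min_speed=1000.0, max_angle_deg=30.0):
    """Detect a quick slide ("fling") aimed at the transfer-station window.

    velocity -- (vx, vy) of the finger at release, in px/s
    start    -- (x, y) where the fling begins
    target   -- (x, y) center of the floating window
    """
    speed = math.hypot(*velocity)
    if speed < min_speed:          # too slow: an ordinary drag, not a fling
        return False
    to_target = (target[0] - start[0], target[1] - start[1])
    norm = speed * math.hypot(*to_target)
    if norm == 0:
        return False
    # Angle between the fling direction and the direction to the window.
    dot = velocity[0] * to_target[0] + velocity[1] * to_target[1]
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg
```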
It will be appreciated that this example replaces the drag-based storage entry of the foregoing examples with a "flick" entry; for other implementation details, such as whether the finger is allowed to leave the screen, the timing and manner of displaying the floating window, and storing multiple barrage messages at a time, reference may be made to the foregoing examples, and the details are not repeated herein.
(3) After the target object is pressed for a long time, the floating window of the transfer station moves to the vicinity of the target object, and the target object is dragged to the floating window of the transfer station.
In some examples, as shown in (1) in fig. 12, the mobile phone displays a play window 1201 of the live application, where the play window 1201 includes a barrage display area in which a barrage message 1203 and a barrage message 1204 are displayed, and the barrage message 1203 is selected. In response to detecting that the user has pressed the barrage message 1203 for at least the preset duration, the mobile phone displays a hover layer 1205 as shown in fig. 12 (2) for presenting the content of the barrage message 1203. The hover layer 1205 may move following the movement of the finger. In some examples, the mobile phone displays the floating window 1206 of the transfer station directly in the vicinity of the hover layer 1205. In other examples, the mobile phone first displays the floating window 1206 of the transfer station at another location (e.g., the upper right of the screen), and the floating window 1206 then moves to the vicinity of the hover layer 1205. Further, in some examples, in response to detecting that the user holds the hover layer 1205 and drags it into the floating window 1206 of the transfer station (or drags it until the distance between the hover layer 1205 and the floating window 1206 is less than or equal to a certain threshold), the mobile phone stores the barrage message 1203 in the transfer station, as shown in (3) of fig. 12.
Therefore, in some scenes, even if the selected barrage message is far away from the floating window of the transfer station, the floating window of the transfer station can be moved to the vicinity of the barrage message, so that the dragging path is shortened, and the interaction efficiency of the user and the mobile phone is improved.
(4) After the target object is pressed for a long time, a menu bar is popped up, and the option of storing in the transfer station in the menu bar is selected.
In some examples, as shown in fig. 13, the mobile phone displays a play window 1301 of the live application, where the play window 1301 includes a barrage display area 1302, and a barrage message 1303 and a barrage message 1304 are displayed in the barrage display area 1302. When the user selects the barrage message 1303, a menu bar 1305 may be opened by an operation such as a long press (or a double click or a pressure press). The menu bar 1305 may include one or more menus. In some examples, the menu bar 1305 is displayed near the barrage message 1303; in other examples, the menu bar 1305 may be displayed at another location. For example, when the user operates the screen with both hands and selects the barrage message 1303 with the left hand, the menu bar may be displayed near the right hand instead of near the selected barrage message 1303. Further, in some examples, the selected barrage message 1303 is stored in the transfer station by selecting the "store in transfer station" operation in the menu bar 1305.
In other examples, the menu bar 1305 may be a menu that supports multiple selections, i.e., the user may select multiple operations simultaneously; for example, the user may select both "translate" and "store in transfer station". The selected barrage message 1303 is then stored in the transfer station and translated into another language. Optionally, the translated barrage message may also be stored in the transfer station.
(5) After long pressing the barrage message in the second application and dragging the barrage message to the preset range, the barrage message is sucked into the transfer station.
In some examples, the mobile phone displays a play window of the live application, where the play window includes a bullet screen display area in which barrage messages are displayed. After the user presses a barrage message for a long time, the mobile phone displays a hover layer for the barrage message. Further, when it is detected that the user drags the hover layer of the barrage message into a preset area and releases (or does not release) the finger, the hover layer of the barrage message is automatically sucked into the floating window of the transfer station, and the mobile phone stores the barrage message in the transfer station. The preset area is, for example, an area whose distance from the frame or the center of the floating window of the transfer station is within a preset distance value.
Optionally, after the hover layer of the barrage message is dragged into the preset area, the floating window of the transfer station may be enlarged, and the hover layer of the barrage message then automatically and gradually shrinks and enters the floating window of the transfer station, visually giving the user the experience that the hover layer is sucked into the floating window of the transfer station.
It should be noted that this application uses a mobile phone as an example to introduce the implementation of the scheme, and most of the foregoing embodiments use the "long press" operation common on mobile phones as the example operation for selecting a target object. It will be appreciated that other operations may be substituted for the long-press operation, such as a single-click operation (e.g., a finger tap or a mouse click), a double-click operation, a pressure-sensitive operation, and the like.
It should be noted that the present application provides various operations related to the transfer station, for example, various operations for storing a barrage message in the transfer station, or various operations for publishing a barrage message of the transfer station in a second application. For different applications, the functions related to the transfer station may be implemented by using the same operation, or by using different operations.
4. Editing operation of barrage message.
In some embodiments, as shown in (1) of fig. 14, the mobile phone displays a play window 1401 of the live broadcast application, where the play window 1401 includes a barrage display area 1402 and a barrage message entry control 1403, and a floating window 1404 of the transfer station is also displayed on the mobile phone screen. The user selects the barrage message 1405 in the floating window 1404, for example by an operation such as a long press, then drags the barrage message 1405 onto the barrage message entry control 1403 and lifts the finger, as shown in (2) of fig. 14. The mobile phone detects this operation and displays a barrage message editing window 1406, as shown in (3) of fig. 14. The barrage message editing window 1406 includes a barrage message edit box and an input method soft keyboard window; the content of the barrage message 1405 is displayed in the edit box, and the user can edit the barrage message through the soft keyboard. After editing is completed, the user clicks the send button to publish the barrage message, as shown in (4) of fig. 14.
In other embodiments, the mobile phone displays a play window of the live broadcast application, where the play window includes a barrage display area, and a floating window of the transfer station is also displayed on the mobile phone screen. In response to detecting that the user long-presses a barrage message in the floating window for a duration equal to or greater than a preset duration, the mobile phone displays a menu (function options) that includes an editing option. When the mobile phone detects that the user selects the editing option, it displays a barrage message editing window, which includes a barrage message edit box and an input method soft keyboard window. The content of the barrage message is displayed in the edit box, the user can edit the barrage message through the soft keyboard, and after editing is completed, the user clicks the send button to publish the barrage message.
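The long-press trigger used throughout these embodiments ("a duration equal to or greater than a preset duration") reduces to a timestamp comparison. A minimal sketch, in which the 500 ms threshold is an illustrative assumption:

```python
# Sketch: detecting a long press by comparing the time elapsed since the
# "press down" event against a preset duration. The threshold is made up.

PRESET_DURATION_MS = 500

def is_long_press(press_time_ms: int, now_ms: int,
                  preset_ms: int = PRESET_DURATION_MS) -> bool:
    """Return True when the press has been held for at least preset_ms."""
    return now_ms - press_time_ms >= preset_ms

# e.g., a press that began at t=1000 ms and is still held at t=1600 ms
held_long_enough = is_long_press(1000, 1600)
```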
5. Management operations of barrage messages.
In some possible implementations, the embodiments of the present application may also provide a function of managing barrage messages in a transfer station, such as deleting barrage messages.
In some embodiments, the mobile phone displays a play window of the live broadcast application, where the play window includes a barrage display area, and a floating window of the transfer station is also displayed on the mobile phone screen. In response to detecting that the user long-presses a barrage message in the floating window for a duration equal to or greater than a preset duration, the mobile phone displays a menu (function options) that may include a deletion option. When the mobile phone detects that the user selects the deletion option, the barrage message in the floating window of the transfer station is deleted.
In other examples, a control is included in the floating window of the transfer station. In response to detecting that the user operates the control, the mobile phone displays a menu including a multi-choice option. In response to detecting that the user selects the multi-choice option, the mobile phone displays a check box for each barrage message in the interface; that is, the transfer station enters a multi-select state. The user can then select multiple barrage messages, and the check boxes of the selected barrage messages show a checked state. In some examples, when the user presses the selected barrage messages for longer than a set duration, a menu is displayed, which may include a deletion option. When the mobile phone detects that the user selects the deletion option, the selected barrage messages in the floating window of the transfer station are deleted. In this way, multiple barrage messages in the transfer station can conveniently be deleted at once.
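The multi-select deletion flow above can be sketched with a small state object. The class and method names are illustrative, not from the application:

```python
# Sketch: a transfer station entering a multi-select state, checking several
# barrage messages, and deleting all checked messages in one operation.

class TransferStation:
    def __init__(self) -> None:
        self.messages: list[str] = []
        self.multi_select = False       # whether check boxes are shown
        self.checked: set[int] = set()  # indices of checked messages

    def enter_multi_select(self) -> None:
        # Triggered when the user picks the multi-choice menu option.
        self.multi_select = True
        self.checked.clear()

    def toggle_check(self, index: int) -> None:
        if not self.multi_select:
            raise RuntimeError("not in multi-select state")
        self.checked.symmetric_difference_update({index})

    def delete_checked(self) -> None:
        # Triggered when the user picks the deletion menu option.
        self.messages = [m for i, m in enumerate(self.messages)
                         if i not in self.checked]
        self.checked.clear()
        self.multi_select = False

station = TransferStation()
station.messages = ["msg A", "msg B", "msg C"]
station.enter_multi_select()
station.toggle_check(0)
station.toggle_check(2)
station.delete_checked()   # only "msg B" remains
```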
By way of example, in connection with the software architecture shown in fig. 3A or fig. 3B, fig. 15 and fig. 17 show message interaction flows inside the electronic device in the embodiments of the present application. For the changes in the user interface during these flows, reference may be made to the foregoing embodiments.
Referring to fig. 15, a schematic flow chart of dragging a barrage message in a first application (the transfer station) to a second application (e.g., a live application) for publishing is provided in an embodiment of the present application. As shown in fig. 15, the flow may include the following steps:
Step 1: the system service module receives "press down" event information from the underlying layer, and determines that the event needs to be distributed to the first application (the transfer station), because the press-down occurs at a location within the floating window of the first application.
The user presses a finger in the area where the first barrage message is located in the floating window of the first application. In response, the hardware (e.g., a touch sensor) of the electronic device detects a "press down" event and sends the event information to the system service module.
Optionally, the event information may include the location where the "press down" action occurs, the event type, a time stamp, and the like.
Step 2: the system service module sends the "press" event information to the first application.
Step 3: the first application is responsive to the "press down" event to determine a first barrage message that the press down action is to act in the floating window of the first application based on the location at which the press down action occurred, and the first barrage message is selected accordingly.
Step 4: the first application sends a notification to the drag service module, wherein the notification carries the first barrage message.
Step 5: after the drag service module receives the notification, the first barrage message carried by the notification is cached, and events related to the drag gesture, such as a 'lift-up' event, are subscribed to the system service module (such as the event service module).
Step 6: the system service module receives the 'moving' event information from the bottom layer and performs position tracking according to the position where the moving action occurs.
The user moves the finger in a "press down" state, and in response, the hardware of the electronic device (e.g., touch sensor) detects a "move" event and sends the event information to the system services module.
Depending on the distance or duration of the user's finger movement, the hardware of the electronic device may detect multiple "move" events in sequence.
In one possible implementation, the system service module sends the movement event to the drag service module, which may not respond after receiving the movement event. In another possible implementation, the system service module does not send the movement event to the drag service module.
Step 7: the system services module receives "lift-off" event information from the underlying layer.
The user lifts a finger in the bullet screen display area of the second application, and in response, the hardware (e.g., touch sensor) of the electronic device detects a "lift" event and sends the event information to the system services module.
Step 8: and according to the event subscription relationship, the system service module sends the lifting event information to the dragging service module.
Step 9: after receiving the "lift" event information, the drag service module determines that the drag gesture is completed, acquires the first barrage message cached before (when the drag user gesture starts, i.e., when the "press" event occurs), and determines that the first barrage message needs to be distributed to the second application.
Since the position where the lifting action occurs is within the bullet screen display area in the play window of the second application, it is determined that the event needs to be distributed to the second application.
Step 10: the drag service module sends the lift-off event information and the first barrage message to the second application.
Step 11: the second application issues the first barrage message in response to the "lift-off" event and displays the first barrage message in a barrage display area of the second application.
Optionally, the play window of the second application includes a barrage control, where the barrage control is located in the barrage display area and is used to display barrages in the play window. In this embodiment of the present application, the function of the barrage control is extended: the extended barrage control includes a data interface function, which may respond to a "lift" event acting on the control, receive a barrage message from the transfer station, and display the barrage message in the barrage display area.
Step 12: the second application sends a notification to the drag service module to notify the first barrage message that the release is complete.
Step 13: and after the drag service module receives the notification, deleting the first barrage message cached before. Further, the subscription to drag-related operations may be released.
The steps 12 to 13 are optional steps.
An example of a user interface for the flow shown in fig. 15 described above may be as shown in fig. 16.
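The fig. 15 flow can be condensed into a small sketch of the drag service's role. The class and method names loosely follow the modules named in the description; the signatures and all concrete values are illustrative assumptions:

```python
# Sketch of the fig. 15 flow: the first application hands the selected barrage
# message to the drag service on "press down" (steps 1-5); the drag service
# caches it and subscribes to the "lift" event; on "lift" inside the second
# application's barrage area (steps 7-11), the message is published through
# the extended barrage control, and the cache is cleared (steps 12-13).
from typing import Optional

class BarrageControl:
    """Extended barrage control with a data-interface function (step 11)."""

    def __init__(self) -> None:
        self.barrage_area: list[str] = []

    def on_lift(self, message: str) -> None:
        # Receive a barrage message from the transfer station and display it.
        self.barrage_area.append(message)

class DragService:
    def __init__(self) -> None:
        self.cached: Optional[str] = None
        self.subscribed = False

    def start_drag(self, message: str) -> None:
        # Steps 4-5: cache the message, subscribe to drag-related events.
        self.cached = message
        self.subscribed = True

    def on_lift(self, control: BarrageControl) -> None:
        # Steps 8-11: forward the cached message to the second application.
        if self.subscribed and self.cached is not None:
            control.on_lift(self.cached)
            self.cached = None        # steps 12-13 (optional): clear the
            self.subscribed = False   # cache, release the subscription

drag = DragService()
drag.start_drag("first barrage message")  # user presses in the floating window
control = BarrageControl()
drag.on_lift(control)                     # user lifts in the barrage area
```

The reverse flow of fig. 17 is symmetric, with the roles of the two applications swapped.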
Referring to fig. 17, a schematic flow chart of dragging a barrage message displayed in a second application (e.g., a live broadcast application) to a first application (the transfer station) for saving is provided in an embodiment of the present application. As shown in fig. 17, the flow may include the following steps:
step 1: the system services module receives "push" event information from the underlying layer, and determines that the event needs to be distributed to a second application (e.g., a live application) because the push occurs at a location within a bullet screen display area in a play window of the second application.
The user presses a finger on a second bullet screen message in a bullet screen display area of a play window of a bullet screen of the second application, and in response, hardware (such as a touch sensor) of the electronic device detects a "press" event and sends event information to the system service module.
Alternatively, the event information may include location information, event type, time stamp, etc. where the "press down" action occurs.
Step 2: the system service module sends the "press" event information to the second application.
Step 3: the second application determines, in response to the "press down" event, a second barrage message that the press down action acts on the second application in the barrage display area of the play window based on the location at which the press down action occurred, and the second barrage message is selected accordingly.
Optionally, the playing window of the second application includes a barrage control, where the barrage control is located in a barrage display area and is used to display the barrage in the playing window. In this embodiment of the present application, the functions of the barrage control are extended, and the extended barrage control includes a data interface function, where the data interface function may respond to a "press" event acting on the control, and obtain the content of the barrage message acted on by the event.
Step 4: the second application sends a notification to the drag service module, wherein the notification carries a second barrage message.
Step 5: after the drag service module receives the notification, the second barrage message carried by the notification is cached, and events related to the drag gesture, such as a 'lift-up' event, are subscribed to the system service module (such as the event service module).
Step 6: the system service module receives the 'moving' event information from the bottom layer and performs position tracking according to the position where the moving action occurs.
The user moves the finger in a "press down" state, and in response, the hardware of the electronic device (e.g., touch sensor) detects a "move" event and sends the event information to the system services module.
Depending on the distance or duration of the user's finger movement, the hardware of the electronic device may detect multiple "move" events in sequence.
In one possible implementation, the system service module sends the movement event to the drag service module, which may not respond after receiving the movement event. In another possible implementation, the system service module does not send the movement event to the drag service module.
Step 7: the system service module receives "lift" event information from the underlying layer.
The user lifts the finger within the floating window of the first application (the transfer station). In response, the hardware (e.g., a touch sensor) of the electronic device detects a "lift" event and sends the event information to the system service module.
Step 8: and according to the event subscription relationship, the system service module sends the lifting event information to the dragging service module.
Step 9: after receiving the "lift" event information, the drag service module determines that the drag gesture is completed, acquires a second barrage message cached before (when the drag user gesture starts, i.e., when the "press" event occurs), and determines that the second barrage message needs to be distributed to the first application.
Since the location where the lifting action occurs is within the floating window of the first application, it is determined that the event needs to be distributed to the first application.
Step 10: the drag service module sends the lift-up event information and the second barrage message to the first application.
Step 11: the first application saves the second barrage message in the first application in response to the "lift-off" event and displays the second barrage message in the floating window of the first application.
Step 12: the first application sends a notification to the drag service module to notify that the second barrage message is saved.
Step 13: and after the drag service module receives the notification, deleting the second barrage message cached before. Further, the subscription to drag-related operations may be released.
The steps 12 to 13 are optional steps.
An example of a user interface for the flow shown in fig. 17 described above may be as shown in fig. 18.
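In both flows, the routing decision of step 9 reduces to testing which window region contains the lift position. A minimal sketch, in which the rectangle representation and coordinates are illustrative assumptions:

```python
# Sketch: deciding which application a "lift" event is distributed to, based
# on whether the lift position falls inside the transfer station's floating
# window (fig. 17, save) or inside the second application's barrage display
# area (fig. 15, publish).

Rect = tuple[float, float, float, float]  # left, top, right, bottom

def contains(rect: Rect, x: float, y: float) -> bool:
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def route_lift(x: float, y: float,
               floating_window: Rect, barrage_area: Rect) -> str:
    if contains(floating_window, x, y):
        return "first_application"   # save into the transfer station
    if contains(barrage_area, x, y):
        return "second_application"  # publish into the barrage area
    return "none"                    # lift outside both; no distribution

window = (300.0, 600.0, 400.0, 800.0)   # illustrative floating-window rect
area = (0.0, 0.0, 400.0, 200.0)         # illustrative barrage-area rect
target = route_lift(350.0, 700.0, window, area)  # inside floating window
```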
The embodiments of the present application also provide a computer-readable storage medium for storing a computer program which, when executed by a computer, implements the methods provided by the foregoing method embodiments.
The embodiments of the present application also provide a computer program product storing a computer program which, when executed by a computer, implements the methods provided by the foregoing method embodiments.
The embodiments of the present application also provide a chip including a processor coupled with a memory, the processor being configured to invoke a program in the memory so that the chip implements the methods provided by the foregoing method embodiments.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (19)

1. A barrage message distribution method, characterized in that it is applied to a first electronic device having a touch screen, the method comprising:
displaying a floating window of a first application, wherein at least one barrage message is displayed in the floating window;
detecting a first touch operation acting on a first barrage message in the floating window, and determining that the first barrage message is selected, wherein the first barrage message comprises one or more barrage messages;
detecting a second touch operation acting on the first barrage message, and publishing the first barrage message to a barrage display area in a user interface of a second application; wherein the second touch operation includes dragging the first barrage message to the second application.
2. The method of claim 1, wherein the user interface of the second application includes a first control therein, the first control for displaying a barrage message;
the posting the first barrage message to a barrage display area in a user interface of a second application includes:
and the first control responds to the second touch operation, receives the first barrage message and issues the first barrage message to a barrage display area of the second application.
3. The method of claim 1, wherein the user interface of the second application includes a first control therein, the first control for displaying a barrage message;
the posting the first barrage message to a barrage display area in a user interface of a second application includes:
and the first control responds to the second touch operation, receives the first barrage message, and issues the first barrage message to a barrage display area of the second application if the first barrage message meets the message format requirement of the second application.
4. A method according to any one of claims 1-3, wherein the method further comprises:
detecting a third touch operation acting on a second barrage message in the barrage display area, and determining that the second barrage message is selected;
detecting a fourth touch operation acting on the second barrage message, and displaying the second barrage message in the floating window; the fourth touch operation includes dragging the second barrage message to the floating window.
5. The method of claim 4, wherein the user interface of the second application includes a first control therein, the first control for displaying a barrage message;
The detecting of the third touch operation acting on the second barrage message in the barrage display area, determining that the second barrage message is selected includes:
and the first control responds to the detection of a third touch operation acting on the second barrage message in the barrage display area, determines that the second barrage message is selected, and acquires the selected second barrage message.
6. The method of claim 4 or 5, wherein the floating window is in a folded state or a hidden state prior to detecting the fourth touch operation of the second bullet screen message acting in the bullet screen display area;
the method further comprises the steps of:
and when the fourth touch operation of the second barrage message acted on the barrage display area is detected, unfolding the floating window in a folding state or a hiding state.
7. The method of claim 6, wherein the method further comprises:
and after the second barrage message is displayed in the floating window, folding or hiding the unfolded floating window.
8. The method of any one of claims 4-7, wherein after displaying the second barrage message in the floating window, the method further comprises:
And transmitting the second barrage message to a second electronic device connected with the first electronic device, so that the first application in the second electronic device receives and stores the second barrage message.
9. The method of any one of claims 1-8, wherein the method further comprises:
detecting a fifth touch operation of a third barrage message acting on the floating window, and determining that the third barrage message is selected;
detecting a sixth touch operation acting on the third barrage message, and displaying a barrage message editing window, wherein the barrage message editing window comprises a barrage message editing frame and an input method soft keyboard window, and the third barrage message is displayed in the barrage message editing frame; wherein the sixth touch operation includes dragging the third barrage message to a barrage message entry in a user interface of the second application.
10. The method of any of claims 1-9, wherein the detecting a first touch operation on a first barrage message in the floating window, after determining that the first barrage message was selected, further comprises:
displaying at least one bullet screen release mode option, wherein the bullet screen release mode is used for indicating bullet screen display modes and/or release times;
Detecting a seventh touch operation of one or more bullet screen release mode options selected, and determining a bullet screen release mode selected;
the posting the first barrage message to a barrage display area in a user interface of a second application includes:
and according to the selected barrage release mode, releasing the first barrage message to a barrage display area in the user interface of the second application.
11. The method of any one of claims 1-10, wherein the method further comprises:
detecting an eighth touch operation acting on a second control in the floating window, so that the floating window enters an editing state; the method comprises the steps of displaying a check box corresponding to each bullet screen message in the floating window in an editing state, and displaying a first function option, wherein the check box is used for checking bullet screen messages, and the first function option is used for correspondingly editing or managing the checked bullet screen messages; or alternatively
And detecting a ninth touch operation acting on the fourth barrage message in the floating window, and displaying a second function option corresponding to the fourth barrage message, wherein the second function option is used for editing or managing the fourth barrage message correspondingly.
12. The method of any of claims 1-11, wherein the second application comprises: an audio video application, a live broadcast application, or a gaming application.
13. A barrage message distribution method applied to a first electronic device with a touch screen, the method comprising:
displaying a user interface of a second application, wherein the user interface of the second application comprises a barrage display area;
detecting a third touch operation acting on a second barrage message in the barrage display area, and determining that the second barrage message is selected;
detecting a fourth touch operation acting on the second barrage message, and displaying the second barrage message in a floating window of the first application; the fourth touch operation includes dragging the second barrage message to the floating window.
14. The method of claim 13, wherein the user interface of the second application includes a first control therein, the first control for displaying a barrage message;
the detecting of the third touch operation acting on the second barrage message in the barrage display area, determining that the second barrage message is selected includes:
and the first control responds to the detection of a third touch operation acting on the second barrage message in the barrage display area, determines that the second barrage message is selected, and acquires the selected second barrage message.
15. The method of claim 13 or 14, wherein the floating window is in a folded state or a hidden state prior to detecting the fourth touch operation of the second bullet screen message in the bullet screen display area;
the method further comprises the steps of:
and when the fourth touch operation of the second barrage message acted on the barrage display area is detected, unfolding the floating window in a folding state or a hiding state.
16. The method of claim 15, wherein the method further comprises:
and after the second barrage message is displayed in the floating window, folding or hiding the unfolded floating window.
17. An electronic device, comprising: one or more processors; the one or more memories store one or more computer programs comprising instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-12 or perform the method of any of claims 13-16.
18. A computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1-12 or to perform the method of any one of claims 13-16.
19. A system on a chip comprising one or more processors which, when executing instructions, perform the method of any one of claims 1-12 or perform the method of any one of claims 13-16.
CN202210887888.XA 2022-07-26 2022-07-26 Barrage message publishing method and device Pending CN117519564A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210887888.XA CN117519564A (en) 2022-07-26 2022-07-26 Barrage message publishing method and device

Publications (1)

Publication Number Publication Date
CN117519564A true CN117519564A (en) 2024-02-06

Family

ID=89746189

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination