CN111135557A - Interaction method and device for multiple screens - Google Patents


Info

Publication number
CN111135557A
Authority
CN
China
Prior art keywords
option
screen
user
interaction
display interface
Prior art date
Legal status
Granted
Application number
CN201911366530.7A
Other languages
Chinese (zh)
Other versions
CN111135557B (en)
Inventor
王斌
王莹
何志刚
赵文芳
苏威
Current Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center and Samsung Electronics Co Ltd
Priority to CN201911366530.7A
Publication of CN111135557A
Application granted
Publication of CN111135557B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4781 Games

Abstract

Embodiments of the present disclosure disclose an interaction method and apparatus for multiple screens. The method comprises: in response to receiving a user's selection operation on a multi-screen interaction option, projecting the current display interface to a target screen; presenting options for selecting an input device; receiving a selection operation on the options for selecting an input device; and, in response to the selection operation indicating that a virtual input device is to be used, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and displaying the operation interface on the local screen without displaying the current display interface projected to the target screen. The method enables information sharing, reduces the heat generation and energy consumption of the electronic device housing the local screen, and improves the accuracy of information input through the local operation interface.

Description

Interaction method and device for multiple screens
Technical Field
The present disclosure relates to the field of computer technologies, in particular to terminal sharing technology, and more particularly to an interaction method and apparatus for multiple screens.
Background
In the prior art, when a user plays games, plays audio/video, or performs other interactive operations on a terminal, the screen picture of the intelligent terminal device can be transmitted to a large-screen device via WiFi-P2P, and the large-screen device decodes the picture or the audio/video for presentation.
However, in the prior art, while the human-computer interaction is in progress, the interaction picture on the mobile phone is transmitted to the large-screen device through multi-screen sharing, so the mobile phone and the large-screen device play the same interaction picture simultaneously, and the virtual input options of the mobile phone provide no physical touch feedback.
Disclosure of Invention
Embodiments of the present disclosure provide an interaction method and apparatus for multiple screens.
In a first aspect, an embodiment of the present disclosure provides an interaction method for multiple screens, comprising: in response to receiving a user's selection operation on a multi-screen interaction option, projecting the current display interface to a target screen; presenting options for selecting an input device; receiving a selection operation on the options for selecting an input device; and, in response to the selection operation indicating that a virtual input device is to be used, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and displaying the operation interface on the local screen without displaying the current display interface projected to the target screen.
In some embodiments, projecting the current display interface to the target screen in response to receiving the user's selection operation on the multi-screen interaction option comprises: in response to receiving a local user's selection operation on the multi-screen interaction option, projecting a current display interface that includes both the local user's interactive operations and a remote user's interactive operations to the target screen.
In some embodiments, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen comprises: predicting, using AI big data, a user portrait of the user of the current display interface projected to the target screen; and generating, based on the predicted user portrait, an operation interface comprising at least one virtual input option.
In some embodiments, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen comprises: generating an operation interface comprising a preset background and at least one virtual input option based on the current display interface projected to the target screen, wherein the preset background comprises a default background or a background set by the user via a background setting option.
In some embodiments, the virtual input options include: a virtual interactive handle.
In some embodiments, the interaction method further comprises: in response to a user's selection operation on a virtual input option, presenting a vibration whose frequency corresponds to the selected virtual input option.
In some embodiments, the interaction method further comprises: presenting a movable identification of a virtual input option in response to a user's selection operation on the generated virtual input option; in response to the user's movement operation on the virtual input option bearing the movable identification, moving that virtual input option to the target position indicated by the movement operation; and releasing the selection of the virtual input option in response to the user inputting an exit operation from the selection operation.
In some embodiments, the interaction method further comprises: in response to the selection operation indicating that a physical input device is to be used, activating the local physical input device based on a preset correspondence between the local physical input device and the current display interface, and blanking the local screen.
In some embodiments, activating the local physical input device comprises: popping out a physical interaction handle.
In some embodiments, the target screen is larger in size than the local screen.
In some embodiments, projecting the current display interface to the target screen in response to receiving the user's selection operation on the multi-screen interaction option comprises: in response to receiving the user's selection operation on the multi-screen interaction option, projecting the current display interface to a television screen through an external-device adapter of the television.
In a second aspect, an embodiment of the present disclosure provides an interaction apparatus for multiple screens, comprising: a display interface projection unit configured to project the current display interface to a target screen in response to receiving a user's selection operation on a multi-screen interaction option; a selection option presenting unit configured to present options for selecting an input device; a selection operation receiving unit configured to receive a selection operation on the options for selecting an input device; and an operation interface display unit configured to, in response to the selection operation indicating that a virtual input device is to be used, generate an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and display the operation interface on the local screen without displaying the current display interface projected to the target screen.
In some embodiments, the display interface projection unit is further configured to: in response to receiving a local user's selection operation on the multi-screen interaction option, project a current display interface that includes both the local user's interactive operations and a remote user's interactive operations to the target screen.
In some embodiments, the operation interface display unit is further configured to: predict, using AI big data, a user portrait of the user of the current display interface projected to the target screen; and generate, based on the predicted user portrait, an operation interface comprising at least one virtual input option.
In some embodiments, the operation interface display unit is further configured to: generate an operation interface comprising a preset background and at least one virtual input option based on the current display interface projected to the target screen, wherein the preset background comprises a default background or a background set by the user via a background setting option.
In some embodiments, the virtual input options in the operation interface display unit include: a virtual interactive handle.
In some embodiments, the interaction apparatus further comprises: an option vibration presenting unit configured to present, in response to a user's selection operation on a virtual input option, a vibration whose frequency corresponds to the selected virtual input option.
In some embodiments, the interaction apparatus further comprises: a movable identification presenting unit configured to present a movable identification of a virtual input option in response to a user's selection operation on the generated virtual input option; an input option moving unit configured to, in response to the user's movement operation on the virtual input option bearing the movable identification, move that virtual input option to the target position indicated by the movement operation; and an option selection releasing unit configured to release the selection of the virtual input option in response to the user inputting an exit operation from the selection operation.
In some embodiments, the interaction apparatus further comprises: an input device activation unit configured to, in response to the selection operation indicating that a physical input device is to be used, activate the local physical input device based on a preset correspondence between the local physical input device and the current display interface, and blank the local screen.
In some embodiments, the input device activation unit is further configured to: pop out a physical interaction handle.
In some embodiments, the target screen in the display interface projection unit is larger in size than the local screen.
In some embodiments, the display interface projection unit is further configured to: in response to receiving the user's selection operation on the multi-screen interaction option, project the current display interface to a television screen through an external-device adapter of the television.
In a third aspect, an embodiment of the present disclosure provides an electronic device/terminal/server, comprising: one or more processors; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the interaction method for multiple screens as described in any one of the above.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium on which a computer program is stored, the program, when executed by a processor, implementing the interaction method for multiple screens as described in any one of the above.
According to the interaction method and apparatus for multiple screens provided by the embodiments of the present disclosure, first, in response to receiving a user's selection operation on a multi-screen interaction option, the current display interface is projected to a target screen; options for selecting an input device are presented; a selection operation on those options is received; and, in response to the selection operation indicating that a virtual input device is to be used, an operation interface comprising at least one virtual input option is generated based on the current display interface projected to the target screen and displayed on the local screen, without displaying the current display interface projected to the target screen. In this process, the local current display interface can be projected to the target screen for information sharing, while the local screen displays the operation interface comprising at least one virtual input option rather than the projected current display interface, which reduces the heat generation and energy consumption of the electronic device housing the local screen and improves the accuracy of information input through the local operation interface.
In some embodiments, the electronic device may emit vibrations of different frequencies depending on which virtual button is selected, reminding the user and thereby improving the accuracy of the user's operations.
In some embodiments, AI big data is used to predict the user portrait of the user of the current display interface, and an operation interface is generated based on that portrait, which makes the operation interface better targeted and better matched to the user's operating habits.
In some embodiments, the current display interface projected to the target screen is operated via a local physical interaction handle, which improves the accuracy and efficiency of information input.
Drawings
Other features, objects, and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present disclosure may be applied;
FIG. 2a is a schematic flow chart diagram illustrating one embodiment of an interaction method for multiple screens, according to an embodiment of the present disclosure;
FIG. 2b exemplarily illustrates receiving a single user's selection operation on the multi-screen interaction option and projecting the current display interface to a television screen;
FIG. 2c exemplarily illustrates receiving a single user's selection operation on the multi-screen interaction option and projecting a current display interface that interacts with multiple users to a television screen;
FIG. 2d illustrates an operation interface generated when the application to which the current display interface belongs is a first application;
FIG. 2e illustrates an operation interface with a background generated when the application to which the current display interface belongs is a second application;
FIG. 2f illustrates an operation interface generated when the application to which the current display interface belongs is a third application;
FIG. 2g schematically shows different vibration frequencies presented for the operation interface in FIG. 2d;
FIG. 2h schematically shows different vibration frequencies presented for the operation interface in FIG. 2f;
FIG. 3 is an exemplary application scenario of an interaction method for multiple screens according to an embodiment of the present disclosure;
FIG. 4a is a schematic flow chart diagram illustrating yet another embodiment of an interaction method for multiple screens, according to an embodiment of the present disclosure;
FIG. 4b illustrates the physical input options in the physical interaction handle in a convex mode;
FIG. 4c illustrates the physical input options in the physical interaction handle in a concave mode;
FIG. 4d illustrates the physical input options in the physical interaction handle in a flat sensing mode;
FIG. 5 is an exemplary block diagram of one embodiment of an interaction device for multiple screens of the present disclosure;
FIG. 6 is a schematic structural diagram of a computer system suitable for implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the interaction method for multiple screens or the interaction apparatus for multiple screens of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as game applications, browser applications, shopping applications, search applications, instant messaging tools, mailbox clients, social platform software, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices that support browser applications, including but not limited to tablet computers, laptop portable computers, desktop computers, and the like. When they are software, they can be installed in the electronic devices listed above and may be implemented, for example, as multiple pieces of software or software modules providing distributed services, or as a single piece of software or software module; no specific limitation is imposed here.
The server 105 may be a server providing various services, such as a background server providing support for applications running on the terminal devices 101, 102, 103. The background server can analyze and process the received data such as the request and feed back the processing result to the terminal equipment.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules providing distributed services, or as a single piece of software or software module; no specific limitation is imposed here.
In practice, the interaction method for multiple screens provided by the embodiments of the present disclosure may be executed by the terminal devices 101, 102, and 103, and the interaction apparatus for multiple screens may likewise be disposed in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2a, FIG. 2a illustrates a flow 200 of one embodiment of the interaction method for multiple screens according to the present disclosure. The interaction method comprises the following steps:
In step 201, in response to receiving a user's selection operation on a multi-screen interaction option, the current display interface is projected to a target screen.
In this embodiment, the execution body of the interaction method for multiple screens (e.g., the terminal device shown in FIG. 1) may receive, through an input device, the user's selection operation on the multi-screen interaction option.
The multi-screen interaction option is an option for projecting the current display interface of software running on the local screen to another screen for display.
The selection operation may be a preset operation that can be recognized as a selection action.
For example, the terminal device may capture a user's click operation and, if the operation position of the click corresponds to the multi-screen interaction option, identify the click as a selection operation on the multi-screen interaction option.
For another example, the terminal device may capture the user's operation trajectory, check whether the captured trajectory matches a preset operation trajectory, and if so, treat the captured trajectory as a selection operation on the multi-screen interaction option. The preset operation trajectory is a selection operation registered in advance for the multi-screen interaction option.
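As an illustration of these two recognition paths, the following minimal Kotlin sketch (not part of the patent; the geometry types, the naive average-deviation trajectory matcher, and the 48-pixel tolerance are all assumptions made for this example) shows how a tap could be hit-tested against the option's bounds and how a captured trajectory could be compared against a preset one:

```kotlin
import kotlin.math.hypot

// Illustrative sketch only: the patent does not specify a matching algorithm.
data class Pt(val x: Float, val y: Float)
data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    operator fun contains(p: Pt) = p.x in left..right && p.y in top..bottom
}

class SelectionRecognizer(
    private val optionBox: Box,             // screen bounds of the multi-screen interaction option
    private val presetTrajectory: List<Pt>, // trajectory registered in advance for the option
    private val tolerance: Float = 48f      // assumed maximum average deviation, in pixels
) {
    // A click counts as a selection when its position falls inside the option's bounds.
    fun isTapSelection(tap: Pt): Boolean = tap in optionBox

    // A captured trajectory counts as a selection when it stays close to the preset one.
    fun isTrajectorySelection(captured: List<Pt>): Boolean {
        if (captured.size != presetTrajectory.size) return false
        val avgDeviation = captured.zip(presetTrajectory)
            .map { (a, b) -> hypot(a.x - b.x, a.y - b.y) }
            .average()
        return avgDeviation <= tolerance
    }
}
```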
The target screen is a screen indicated by the multi-screen interaction option and establishing a projection relation with the local screen, and the size of the target screen may be greater than, equal to or smaller than that of the local screen, which is not limited in the present application.
When the execution body projects the current display interface to the target screen, the user terminals interacting with the current display interface may be a single user terminal or multiple user terminals.
In some optional implementations of this embodiment, projecting the current display interface to the target screen in response to receiving the user's selection operation on the multi-screen interaction option comprises: in response to receiving a local user's selection operation on the multi-screen interaction option, projecting a current display interface that includes both the local user's interactive operations and a remote user's interactive operations to the target screen.
In this implementation, if multiple user terminals interact with the current display interface, the execution body may act both as the terminal device that transmits the current display interface to the target screen and as one of the terminal devices interacting with that interface, while the remaining terminal devices interact with the current display interface as additional controllers; this improves adaptability to simultaneous interaction by multiple users.
Specifically, when a user selects the multi-screen interaction option, the execution body may determine whether the current operation interface supports simultaneous interaction by multiple terminals, and if so, may provide a many-to-one control mode in which the terminal device that selected the multi-screen interaction option serves as both the picture output and a control terminal, while the other terminal devices serve as control terminals only.
In some optional implementations, projecting the current display interface to the target screen in response to receiving the user's selection operation on the multi-screen interaction option may comprise: in response to receiving the user's selection operation on the multi-screen interaction option, projecting the current display interface to a television screen through an external-device adapter of the television.
In this implementation, using the television screen as the target screen improves the display clarity of the current display interface and thereby the interaction efficiency.
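On an Android terminal, one plausible realization of projecting the current display interface to such a screen (an assumption for illustration, not the transport the patent mandates) is the platform Presentation API, which treats wireless displays and external adapters alike. The class name MirrorPresentation and the layout resource below are hypothetical:

```kotlin
import android.app.Presentation
import android.content.Context
import android.hardware.display.DisplayManager
import android.os.Bundle
import android.view.Display

// Shows the shared "current display interface" on the target screen.
class MirrorPresentation(context: Context, display: Display) : Presentation(context, display) {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.mirrored_interface) // hypothetical layout of the projected interface
    }
}

fun projectToTargetScreen(context: Context): Boolean {
    val dm = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    // Presentation-category displays cover both wireless displays and adapter-attached screens.
    val target = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION).firstOrNull()
        ?: return false // no target screen available yet
    MirrorPresentation(context, target).show()
    return true
}
```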
As shown in FIG. 2b, FIG. 2b exemplarily illustrates a situation in which a single user's selection operation on the multi-screen interaction option is received and the current display interface is projected to a television screen. The operator 210 projects the current display interface to the television screen while an operation interface comprising virtual input options is displayed on the terminal device, and the onlookers 211, 212 watch the operator 210's interaction on the television screen.
As shown in FIG. 2c, FIG. 2c exemplarily illustrates a situation in which a single user's selection operation on the multi-screen interaction option is received and a current display interface interacting with multiple users is projected to a television screen. The operator 220 projects the current display interface to the television screen while an operation interface comprising virtual input options is displayed on the terminal device, and the interactive operations of the other operators 221, 222 on the current interactive interface are likewise displayed on the television screen.
Returning to FIG. 2a, in step 202, options for selecting an input device are presented.
In this embodiment, when the execution body projects the current display interface to the target screen, options for selecting an input device may be presented on the local screen and/or the target screen, for example, an option for selecting a virtual input device and an option for selecting a physical input device.
In step 203, a selection operation on the options for selecting an input device is received.
In this embodiment, while presenting the input device options, the execution body may receive the user's selection operation on those options.
In step 204, in response to the selection operation indicating that a virtual input device is to be used, an operation interface comprising at least one virtual input option is generated based on the current display interface projected to the target screen, and the operation interface is displayed on the local screen without displaying the current display interface projected to the target screen.
In this embodiment, if the selection operation indicates that a virtual input device is to be used, the execution body may generate an operation interface corresponding to the current display interface projected to the target screen. The operation interface may comprise one, two, or more virtual input options. The virtual input options may take the form of virtual interactive keys, or of special virtual input options such as a virtual interactive handle.
When generating the operation interface, the execution body may generate different operation interfaces for different current display interfaces, for different applications to which the current display interface belongs, or for different interaction stages of the same application. The specific manner of generating the operation interface may follow interface-generation techniques in the prior art or in technologies developed in the future, and the present application is not limited in this respect; for example, the operation interface may be generated according to a preset association between operation interfaces and current display interfaces.
After generating the operation interface, the execution body displays it on the local screen and does not display the current display interface projected to the target screen on the local screen. The local screen then serves as the virtual input device, providing the user with an operation interface for inputting information. For example, when the current interactive interface is a game interface, the execution body may generate different key layouts for different games.
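A minimal sketch of this per-application layout selection follows (purely illustrative, since the patent leaves the generation technique open; the option names, package names, and fallback rule are all assumptions):

```kotlin
// Illustrative registry mapping an application to its virtual input options.
enum class VirtualOption { DPAD, JOYSTICK, ATTACK, SKILL_1, SKILL_2, ACCELERATE, DRIFT }

object OperationInterfaceFactory {
    private val layouts: Map<String, List<VirtualOption>> = mapOf(
        // Hypothetical package names standing in for a MOBA-style and a racing-style game.
        "com.example.moba" to listOf(
            VirtualOption.JOYSTICK, VirtualOption.ATTACK,
            VirtualOption.SKILL_1, VirtualOption.SKILL_2
        ),
        "com.example.racing" to listOf(
            VirtualOption.DPAD, VirtualOption.ACCELERATE, VirtualOption.DRIFT
        )
    )
    private val fallback = listOf(VirtualOption.DPAD, VirtualOption.ATTACK)

    // Returns the virtual input options for the application owning the projected interface.
    fun optionsFor(packageName: String): List<VirtualOption> = layouts[packageName] ?: fallback
}
```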
Exemplarily, FIGS. 2d, 2e, and 2f respectively show different operation interfaces generated in the virtual input device for different applications to which the current display interface belongs. FIG. 2d shows an operation interface generated when the application is a first application (for example, the game Glory of Kings, GOK). FIG. 2e shows an operation interface with a background generated when the application is a second application (for example, Naruto). FIG. 2f shows an operation interface generated when the application is a third application (for example, QQ Speed, QQSpeed).
In some optional implementations of this embodiment, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen may comprise: predicting, using AI big data, a user portrait of the user of the current display interface projected to the target screen; and generating, based on the predicted user portrait, an operation interface comprising at least one virtual input option.
In this implementation, when the user uses the virtual input device, the execution body may predict, locally or in the cloud using artificial intelligence (AI) big data, the user portrait of the user of the current display interface projected to the target screen. A user portrait is an effective tool for characterizing a target user and associating user demands with design directions. For example, the user portrait may include the current user's age, sex, interests, hobbies, profession, personality traits, and the like, and may further include the layout of virtual input options, in the virtual input device corresponding to the current interactive interface, for the user category to which the current user belongs, as well as the current user's own layouts of virtual input options in the virtual input devices corresponding to different interactive interfaces.
After predicting the user portrait, the execution body may generate, based on it, an operation interface that conforms to the user portrait, matches the current display interface, and comprises at least one virtual input option.
In this implementation, generating the operation interface based on the predicted user portrait improves how well the generated operation interface targets, and meets the needs of, the user.
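Schematically, and only as an assumption about how such a prediction might be consumed (the patent treats the AI-big-data step itself as a black box), a predicted portrait could parameterize the layout like this, reusing the VirtualOption type from the earlier sketch; every field and rule below is illustrative:

```kotlin
// All fields and layout rules below are illustrative assumptions.
data class UserPortrait(
    val dominantHand: String,        // e.g. "left" or "right"
    val preferredOptionScale: Float  // relative option size the user tends to choose
)

// Stand-in for the local or cloud AI-big-data prediction step.
fun predictPortrait(userId: String): UserPortrait =
    UserPortrait(dominantHand = "right", preferredOptionScale = 1.2f)

// Places options as normalized (x, y) screen coordinates, mirrored for left-handed users.
fun layoutFor(portrait: UserPortrait, options: List<VirtualOption>): Map<VirtualOption, Pair<Float, Float>> {
    val baseX = if (portrait.dominantHand == "left") 0.15f else 0.85f
    return options.mapIndexed { i, option ->
        option to (baseX to 0.2f + i * 0.15f * portrait.preferredOptionScale)
    }.toMap()
}
```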
In some optional implementations of this embodiment, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen may comprise: generating an operation interface comprising a preset background and at least one virtual input option based on the current display interface projected to the target screen, wherein the preset background comprises a default background or a background set by the user via a background setting option.
In this implementation, introducing a default background, or a background the user has set via the background setting option, into the operation interface makes the operation interface better targeted and renders the virtual input options clearer or better suited to the user's needs.
In some optional implementations of this embodiment, the interaction method further comprises: in response to a user's selection operation on a virtual input option, presenting a vibration whose frequency corresponds to the selected virtual input option.
In this implementation, when the user selects one or more virtual input options, the execution body may present a different vibration frequency for each option, prompting the user that the option corresponding to that vibration frequency has been selected and thereby improving the recognizability of the virtual input options.
Referring to FIGS. 2g and 2h, FIG. 2g exemplarily shows different vibration frequencies 231, 232 presented for the operation interface in FIG. 2d, and FIG. 2h exemplarily shows different vibration frequencies 233, 234 presented for the operation interface in FIG. 2f.
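On stock Android, public APIs expose vibration timing patterns and amplitude rather than a raw frequency, so a hedged approximation of the per-option frequencies described above is to give each option its own on/off waveform, as in this sketch (the pattern arithmetic is an assumption):

```kotlin
import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator

// Presents a distinct vibration for each virtual input option.
fun vibrateForOption(context: Context, optionIndex: Int) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    // Longer on/off periods for higher option indices approximate a lower-frequency buzz.
    val period = 40L + optionIndex * 30L
    val pattern = longArrayOf(0, period, period, period) // delay, vibrate, pause, vibrate
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        vibrator.vibrate(VibrationEffect.createWaveform(pattern, -1)) // -1: play once
    } else {
        @Suppress("DEPRECATION")
        vibrator.vibrate(pattern, -1)
    }
}
```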
In some optional implementations of this embodiment, the interaction method further comprises: presenting a movable identification of a virtual input option in response to a user's selection operation on the generated virtual input option; and, in response to the user's movement operation on the virtual input option bearing the movable identification, moving that virtual input option to the target position indicated by the movement operation.
In this implementation, the selection operation is a predefined operation for selecting a virtual input option, and the movement operation is a predefined operation for moving a virtual input option that bears a movable identification.
When the user selects a virtual input option, the local screen of the execution body may present a movable identification on that option or on all virtual input options, and an option bearing the movable identification is in a movable state. When the user performs a movement operation on such an option, the option indicated by the movement operation may be moved to the target position the operation indicates.
The interaction method in this implementation can move virtual input options to target positions as the user requires, improving the flexibility of the layout of virtual input options in the virtual input device.
In some optional implementations of this embodiment, the interaction method further comprises: releasing the selection of the virtual input option in response to the user inputting an exit operation from the selection operation.
In this implementation, if the user has selected a virtual input option by mistake and a movable identification is presented on that option or on all options, the user may input an exit operation from the selection operation to release the selection. The exit operation is a predefined operation for releasing the selection of a virtual input option.
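The whole select-move-release cycle maps naturally onto Android touch callbacks. In the sketch below (an assumption-laden illustration, not the patent's implementation), a long press plays the role of the selection operation that presents the movable identification (shown here by dimming the view), dragging moves the option, and lifting the finger acts as the exit operation that releases the selection:

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View

@SuppressLint("ClickableViewAccessibility")
fun makeOptionMovable(option: View) {
    var movable = false
    var dX = 0f
    var dY = 0f
    option.setOnLongClickListener {
        movable = true
        option.alpha = 0.6f // stand-in for the patent's "movable identification"
        true
    }
    option.setOnTouchListener { v, event ->
        when (event.action) {
            MotionEvent.ACTION_DOWN -> {
                dX = v.x - event.rawX
                dY = v.y - event.rawY
                false // let the long-press detector run
            }
            MotionEvent.ACTION_MOVE -> {
                if (movable) { // move the option toward the target position
                    v.x = event.rawX + dX
                    v.y = event.rawY + dY
                }
                movable
            }
            MotionEvent.ACTION_UP -> { // exit operation: release the selection
                movable = false
                v.alpha = 1f
                false
            }
            else -> false
        }
    }
}
```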
According to the interaction method for multiple screens of this embodiment, the local current display interface can be projected to the target screen for information sharing, while the local screen displays an operation interface comprising at least one virtual input option rather than the projected current display interface, which reduces the heat generation and energy consumption of the electronic device housing the local screen and improves the accuracy of information input through the local operation interface.
An exemplary application scenario of the interaction method for multiple screens of the present disclosure is described below in conjunction with FIG. 3.
FIG. 3 illustrates an exemplary application scenario of the interaction method for multiple screens according to the present disclosure.
As shown in FIG. 3, the interaction method 300 for multiple screens is executed in an electronic device 320 and may comprise:
first, in response to receiving a user's selection operation 302 on a multi-screen interaction option 301, projecting the current display interface 303 to a target screen 304;
then presenting options 305 for selecting an input device;
then receiving a selection operation 306 on the options 305 for selecting an input device;
and then, in response to the selection operation 306 indicating that a virtual input device 307 is to be used, generating an operation interface 309 comprising at least one virtual input option 308 based on the current display interface 303 projected to the target screen 304, and displaying the operation interface 309 on the local screen 310 without displaying the current display interface 303 projected to the target screen 304.
It should be understood that the application scenario illustrated in FIG. 3 is only an exemplary description of the interaction method for multiple screens and does not limit the method. For example, the steps shown in FIG. 3 may be implemented in further detail, and further multi-screen interaction steps may be added on the basis of FIG. 3.
With further reference to FIG. 4a, FIG. 4a illustrates a schematic flow chart of yet another embodiment of the interaction method for multiple screens according to the present disclosure.
As shown in FIG. 4a, the interaction method 400 for multiple screens of this embodiment may comprise the following steps:
In step 401, in response to receiving a user's selection operation on a multi-screen interaction option, the current display interface is projected to a target screen.
In this embodiment, the execution body of the interaction method for multiple screens (e.g., the terminal shown in FIG. 1) may receive, through an input device, the user's selection operation on the multi-screen interaction option.
The multi-screen interaction option is an option for projecting the current display interface of the local screen to another screen for display. The selection operation may be a preset operation recognizable as a selection action. The target screen is the screen indicated by the multi-screen interaction option and having an established projection relationship with the local screen.
In step 402, options for selecting an input device are presented.
In this embodiment, when the execution body projects the current display interface to the target screen, options for selecting an input device may be presented on the local screen and/or the target screen, for example, an option for selecting a virtual input device and an option for selecting a physical input device.
In step 403, a selection operation on the options for selecting an input device is received.
In this embodiment, while presenting the input device options, the execution body may receive the user's selection operation on those options.
In step 404, in response to the selection operation indicating that a physical input device is to be used, the local physical input device is activated and the local screen is blanked, based on a preset correspondence between the local physical input device and the application software or system software to which the current display interface belongs.
In this embodiment, the physical input device may be physical buttons provided on the screen or the housing of the electronic device. When the physical buttons are enabled, the electronic device as a whole serves as a physical interaction handle, and the physical interaction handle may be designed with concave, convex, or flat sensing modes, among others.
Exemplarily, FIGS. 4b, 4c, and 4d show the physical input options in the physical interaction handle in a convex mode, a concave mode, and a flat sensing mode, respectively.
If the selection operation indicates that a physical input device is to be used, the execution body may activate the local physical input device according to the preset correspondence between the local physical input device and the current display interface, improving the accuracy and precision of input. Further, the execution body may blank the local screen to reduce its energy consumption.
In some optional implementations of this embodiment, activating the local physical input device comprises: popping out a physical interaction handle.
In this implementation, activating the local physical input device may pop out the physical interaction handle, improving the efficiency of the user's interaction with the current display interface projected to the target screen.
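A hedged Kotlin sketch of this step on Android follows. The brightness override is one way to approximate "blanking" the local screen while the projection continues, the volume-key mapping is purely an illustrative assumption, and ejecting a physical interaction handle would require vendor-specific hardware support with no public API:

```kotlin
import android.app.Activity
import android.view.KeyEvent
import android.view.WindowManager

// Blanks the local screen while the target screen keeps showing the projected interface.
fun enterPhysicalInputMode(activity: Activity) {
    val params = activity.window.attributes
    params.screenBrightness = WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_OFF
    activity.window.attributes = params
}

// Maps housing buttons to interaction actions; the concrete mapping is assumed.
fun handlePhysicalKey(keyCode: Int): String? = when (keyCode) {
    KeyEvent.KEYCODE_VOLUME_UP -> "confirm"
    KeyEvent.KEYCODE_VOLUME_DOWN -> "cancel"
    else -> null
}
```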
In the interaction method for multiple screens of the embodiment shown in FIG. 4a, unlike the virtual input device of FIG. 2a, a physical input device is used, which can improve the user's operating efficiency.
Those skilled in the art will understand that steps 401, 402, and 403 in FIG. 4a correspond to steps 201, 202, and 203, respectively; the operations and features described for steps 201, 202, and 203 in FIG. 2a therefore also apply to steps 401, 402, and 403 and are not repeated here.
As an implementation of the methods shown in the above figures, an embodiment of the present disclosure provides an embodiment of an interaction apparatus for multiple screens. This apparatus embodiment corresponds to the method embodiments shown in FIGS. 2a to 4d, and the apparatus may be applied specifically to the terminal or server shown in FIG. 1.
As shown in FIG. 5, the interaction apparatus 500 for multiple screens of this embodiment may comprise: a display interface projection unit 510 configured to project the current display interface to a target screen in response to receiving a user's selection operation on a multi-screen interaction option; a selection option presenting unit 520 configured to present options for selecting an input device; a selection operation receiving unit 530 configured to receive a selection operation on the options for selecting an input device; and an operation interface display unit 540 configured to, in response to the selection operation indicating that a virtual input device is to be used, generate an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and display the operation interface on the local screen without displaying the current display interface projected to the target screen.
In some embodiments, the display interface projection unit is further configured to: in response to receiving a local user's selection operation on the multi-screen interaction option, project a current display interface that includes both the local user's interactive operations and a remote user's interactive operations to the target screen.
In some embodiments, the operation interface display unit is further configured to: predict, using AI big data, a user portrait of the user of the current display interface projected to the target screen; and generate, based on the predicted user portrait, an operation interface comprising at least one virtual input option.
In some embodiments, the operation interface display unit is further configured to: generate an operation interface comprising a preset background and at least one virtual input option based on the current display interface projected to the target screen, wherein the preset background comprises a default background or a background set by the user via a background setting option.
In some embodiments, the virtual input options in the operation interface display unit include: a virtual interactive handle.
In some embodiments, the interaction apparatus further comprises (not shown in the figures): an option vibration presenting unit configured to present, in response to a user's selection operation on a virtual input option, a vibration whose frequency corresponds to the selected virtual input option.
In some embodiments, the interaction apparatus further comprises (not shown in the figures): a movable identification presenting unit configured to present a movable identification of a virtual input option in response to a user's selection operation on the generated virtual input option; an input option moving unit configured to, in response to the user's movement operation on the virtual input option bearing the movable identification, move that virtual input option to the target position indicated by the movement operation; and an option selection releasing unit configured to release the selection of the virtual input option in response to the user inputting an exit operation from the selection operation.
In some embodiments, the interaction apparatus further comprises (not shown in the figures): an input device activation unit configured to, in response to the selection operation indicating that a physical input device is to be used, activate the local physical input device based on a preset correspondence between the local physical input device and the current display interface, and blank the local screen.
In some embodiments, the input device activation unit is further configured to: pop out a physical interaction handle.
In some embodiments, the target screen in the display interface projection unit is larger in size than the local screen.
In some embodiments, the display interface projection unit is further configured to: in response to receiving the user's selection operation on the multi-screen interaction option, project the current display interface to a television screen through an external-device adapter of the television.
It should be understood that the units recited in the apparatus 500 correspond to the steps of the method described with reference to FIGS. 2a to 4d. The operations and features described above for the method therefore apply equally to the apparatus 500 and the units it comprises, and are not repeated here.
Referring now to FIG. 6, a schematic structural diagram of an electronic device 600 (e.g., the server or terminal device of FIG. 1) suitable for implementing embodiments of the present disclosure is shown. Terminal devices in embodiments of the present disclosure may include, but are not limited to, notebook computers, desktop computers, and the like. The terminal device/server shown in FIG. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in FIG. 6, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device 600. The processing device 601, the ROM 602, and the RAM 603 are connected to one another via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; storage devices 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 6 illustrates an electronic device 600 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided. Each block shown in FIG. 6 may represent one device or multiple devices as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 609, or installed from the storage device 608, or installed from the ROM 602. When executed by the processing device 601, the computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer-readable medium described in the embodiments of the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the embodiments of the present disclosure, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer-readable signal medium, by contrast, may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wire, optical cable, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to receiving the operation of a user for selecting the multi-screen interaction option, projecting the current display interface to a target screen; presenting an option to select an input device; receiving a selection operation of selecting an option of an input device; and responding to the selection operation instruction, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen by adopting a virtual input device, and displaying the operation interface on the local screen without displaying the current display interface projected to the target screen.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor comprising a display interface projection unit, a selection option presentation unit, a selection operation receiving unit, and an operation interface display unit. In some cases, the name of a unit does not limit the unit itself; for example, the display interface projection unit may also be described as "a unit that projects the current display interface to the target screen in response to receiving a user selection operation on the multi-screen interaction option".
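Read as software, that decomposition might look like the following single-method Java interfaces. The names mirror the units above, but the signatures are invented for illustration and carry no significance beyond it.

// Illustrative only: the four units as single-method Java interfaces, so
// that each may be provided in software or backed by hardware.
public final class InteractionUnits {

    public interface DisplayInterfaceProjectionUnit {
        // Runs when the user selection operation on the multi-screen
        // interaction option is received.
        void projectToTargetScreen(String currentDisplayInterface);
    }

    public interface SelectionOptionPresentationUnit {
        void presentInputDeviceOptions();
    }

    public interface SelectionOperationReceivingUnit {
        void receiveSelection(String inputDeviceOption);
    }

    public interface OperationInterfaceDisplayUnit {
        // Shows the generated operation interface on the local screen in
        // place of the projected display interface.
        void displayOperationInterface(String operationInterface);
    }

    private InteractionUnits() {}
}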
The foregoing description presents only preferred embodiments of the present disclosure and illustrates the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept defined above, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.

Claims (18)

1. An interaction method for multiple screens, comprising:
in response to receiving a user operation selecting a multi-screen interaction option, projecting a current display interface to a target screen;
presenting options for selecting an input device;
receiving a selection operation on an input device option;
and, in response to the selection operation indicating a virtual input device, generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and displaying the operation interface on a local screen instead of the current display interface projected to the target screen.
2. The interaction method according to claim 1, wherein projecting the current display interface to the target screen in response to receiving the user operation selecting the multi-screen interaction option comprises:
in response to receiving a local user's selection operation on the multi-screen interaction option, projecting a current display interface that includes interactive operations of the local user and interactive operations of a remote user to the target screen.
3. The interaction method according to claim 1, wherein generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen comprises:
predicting, using AI big data, a user portrait of the user for the current display interface projected to the target screen;
generating an operation interface comprising at least one virtual input option based on the predicted user portrait.
4. The interaction method according to claim 1, wherein generating an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen comprises:
generating, based on the current display interface projected to the target screen, an operation interface comprising a preset background and at least one virtual input option, wherein the preset background comprises a default background or a background set by the user via a background setting option.
5. The interaction method according to any one of claims 1 to 4, wherein the virtual input option comprises: a virtual interaction handle.
6. The interaction method according to claim 1, wherein the interaction method further comprises:
in response to a user selection operation on a virtual input option, presenting a vibration whose frequency corresponds to the selected virtual input option.
7. The interaction method according to claim 1, wherein the interaction method further comprises:
in response to a user selection operation on a generated virtual input option, presenting a movable identifier for the virtual input option;
in response to a user movement operation on a virtual input option having a movable identifier, moving the virtual input option indicated by the movement operation to a target position indicated by the movement operation;
and, in response to a user-input exit operation on the selection operation, releasing the selection of the virtual input option.
8. The interaction method according to claim 1, wherein the interaction method further comprises:
in response to the selection operation indicating a physical input device, activating a local physical input device based on a preset correspondence between the local physical input device and the current display interface, and blanking the local screen.
9. The interaction method according to claim 8, wherein activating the local physical input device comprises: ejecting a physical interaction handle.
10. The interaction method according to claim 1, wherein the target screen is larger in size than the local screen.
11. The interaction method according to claim 1, wherein projecting the current display interface to the target screen in response to receiving the user operation selecting the multi-screen interaction option comprises:
in response to receiving the user operation selecting the multi-screen interaction option, projecting the current display interface to a television screen through an external-device adapter in the television.
12. An interaction device for multiple screens, comprising:
a display interface projection unit configured to project a current display interface to a target screen in response to receiving a user selection operation on a multi-screen interaction option;
a selection option presentation unit configured to present options for selecting an input device;
a selection operation receiving unit configured to receive a selection operation on an input device option;
and an operation interface display unit configured to, in response to the selection operation indicating a virtual input device, generate an operation interface comprising at least one virtual input option based on the current display interface projected to the target screen, and display the operation interface on a local screen instead of the current display interface projected to the target screen.
13. The interaction device according to claim 12, wherein the display interface projection unit is further configured to:
in response to receiving a local user's selection operation on the multi-screen interaction option, project a current display interface that includes interactive operations of the local user and interactive operations of a remote user to the target screen.
14. The interaction device according to claim 12, wherein the interaction device further comprises:
a movable identifier presentation unit configured to present a movable identifier for a virtual input option in response to a user selection operation on the generated virtual input option;
an input option moving unit configured to move the virtual input option indicated by a movement operation to a target position indicated by the movement operation, in response to a user movement operation on a virtual input option having a movable identifier;
an option selection releasing unit configured to release the selection of the virtual input option in response to a user-input exit operation on the selection operation.
15. The interaction device according to claim 12, wherein the interaction device further comprises:
an input device activation unit configured to, in response to the selection operation indicating a physical input device, activate a local physical input device based on a preset correspondence between the local physical input device and the current display interface, and blank the local screen.
16. The interaction device according to claim 12, wherein the display interface projection unit is further configured to:
in response to receiving the user operation selecting the multi-screen interaction option, project the current display interface to a television screen through an external-device adapter in the television.
17. An electronic device/terminal/server, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-11.
18. A computer-readable medium, on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1-11.
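By way of illustration, the select/move/release cycle described in claims 7 and 14 might be handled as in the following minimal Java sketch. Every name in it is a hypothetical placeholder; nothing here forms part of the claims.

/**
 * Minimal sketch of the select / move / release cycle of claims 7 and 14.
 * All names are hypothetical placeholders.
 */
public class VirtualInputOptionDragger {

    /** A virtual input option with a screen position and a movable flag. */
    public static class VirtualInputOption {
        double x, y;
        boolean movable;  // the "movable identifier" presented on selection
    }

    private VirtualInputOption selected;

    /** Selection operation: present the movable identifier. */
    public void onSelect(VirtualInputOption option) {
        selected = option;
        option.movable = true;   // e.g. highlight the option as draggable
    }

    /** Movement operation: relocate the selected option to the target position. */
    public void onMove(double targetX, double targetY) {
        if (selected != null && selected.movable) {
            selected.x = targetX;
            selected.y = targetY;
        }
    }

    /** Exit operation: release the selection of the virtual input option. */
    public void onExit() {
        if (selected != null) {
            selected.movable = false;
            selected = null;
        }
    }
}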
CN201911366530.7A 2019-12-26 2019-12-26 Interaction method and device for multiple screens Active CN111135557B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911366530.7A CN111135557B (en) 2019-12-26 2019-12-26 Interaction method and device for multiple screens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911366530.7A CN111135557B (en) 2019-12-26 2019-12-26 Interaction method and device for multiple screens

Publications (2)

Publication Number Publication Date
CN111135557A true CN111135557A (en) 2020-05-12
CN111135557B CN111135557B (en) 2024-01-12

Family

ID=70520382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911366530.7A Active CN111135557B (en) 2019-12-26 2019-12-26 Interaction method and device for multiple screens

Country Status (1)

Country Link
CN (1) CN111135557B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009143294A2 (en) * 2008-05-20 2009-11-26 Citrix Systems, Inc. Methods and systems for using external display devices with a mobile computing device
CN102027450A (en) * 2008-05-20 2011-04-20 思杰系统有限公司 Methods and systems for using external display devices with a mobile computing device
US20100245260A1 (en) * 2009-03-26 2010-09-30 Apple Inc. Virtual Input Tools
CN102918490A (en) * 2010-04-01 2013-02-06 思杰系统有限公司 Interacting with remote applications displayed within a virtual desktop of a tablet computing device
CN106412291A (en) * 2016-09-29 2017-02-15 努比亚技术有限公司 Equipment control method and mobile terminal
CN109885746A (en) * 2019-01-17 2019-06-14 平安城市建设科技(深圳)有限公司 Page Dynamic Distribution method, apparatus, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Luyao; JIANG Zengqi; HUANG Tingting; LIU Yunpeng: "Overview of User Portrait Systems Based on Big Data" (基于大数据的用户画像系统概述), vol. 1, no. 02, pages 220-221 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022037533A1 (en) * 2020-08-17 2022-02-24 International Business Machines Corporation Failed user-interface resolution
US11269453B1 (en) 2020-08-17 2022-03-08 International Business Machines Corporation Failed user-interface resolution
GB2613730A (en) * 2020-08-17 2023-06-14 Ibm Failed user-interface resolution
GB2613730B (en) * 2020-08-17 2023-11-15 Ibm Failed user-interface resolution

Also Published As

Publication number Publication date
CN111135557B (en) 2024-01-12

Similar Documents

Publication Publication Date Title
US10341716B2 (en) Live interaction system, information sending method, information receiving method and apparatus
CN113965807B (en) Message pushing method, device, terminal, server and storage medium
CN105979312B (en) Information sharing method and device
CN111408136A (en) Game interaction control method, device and storage medium
US10637804B2 (en) User terminal apparatus, communication system, and method of controlling user terminal apparatus which support a messenger service with additional functionality
KR20140144104A (en) Electronic apparatus and Method for providing service thereof
CN111263175A (en) Interaction control method and device for live broadcast platform, storage medium and electronic equipment
WO2023103956A1 (en) Data exchange method and apparatus, electronic device, storage medium and program product
JP2023522759A (en) MOVIE FILE PROCESSING METHOD, DEVICE, ELECTRONIC DEVICE AND COMPUTER STORAGE MEDIUM
JP2021086626A (en) Method, system, and computer program for providing reputation badge for video chat
CN114727146A (en) Information processing method, device, equipment and storage medium
CN106027631B (en) Data transmission method and device
KR20200120288A (en) Method, system, and non-transitory computer readable record medium for providing multiple group call in one chat room
EP2991289B1 (en) Electronic device and method for sending messages using the same
US20220300144A1 (en) Method, system, and non-transitory computer readable record medium for providing chatroom in 3d form
US20170185422A1 (en) Method and system for generating and controlling composite user interface control
EP2838225A1 (en) Message based conversation function execution method and electronic device supporting the same
KR102299789B1 (en) Method, system and computer readable recording medium for providing social service and video service
CN111135557B (en) Interaction method and device for multiple screens
US20230370686A1 (en) Information display method and apparatus, and device and medium
CN109947528B (en) Information processing method and device
KR102309243B1 (en) Method, system, and computer program for sharing content to chat room in picture-in-picture mode
CN113419650A (en) Data moving method and device, storage medium and electronic equipment
JP2021103520A (en) Method, system, and computer program for expressing emotion in dialog message by use of gestures
Jenner et al. Towards the development of 1-to-n human machine interfaces for unmanned aerial vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant