WO2022089273A1 - Interactive method and apparatus for video call (视频通话的互动方法和装置) - Google Patents

Interactive method and apparatus for video call (视频通话的互动方法和装置)

Info

Publication number
WO2022089273A1
WO2022089273A1 (application PCT/CN2021/124942; priority CN2021124942W)
Authority
WO
WIPO (PCT)
Prior art keywords
input
video call
user
display
preset
Prior art date
Application number
PCT/CN2021/124942
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
胡双双 (HU, Shuangshuang)
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Priority to EP21885001.4A (published as EP4220369A4)
Priority to KR1020237008016A (published as KR20230047172A)
Publication of WO2022089273A1
Priority to US18/138,076 (published as US20230259260A1)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04845 for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/141 between two video terminals, e.g. videophone
    • H04N 7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N 7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H04N 7/15 Conference systems

Definitions

  • the present application belongs to the field of communication technologies, and in particular relates to an interactive method and device for a video call.
  • in the related art, when two users conduct a video call through electronic devices, each user can watch the real picture of the other party on the screen, so as to realize a call experience of seeing and communicating with each other.
  • however, because the video call is mediated by the network and the screen, the related art lacks interactive functions of high interest; the interaction manner is relatively simple and cannot provide a diversified interactive experience for the user during the video call.
  • the purpose of the embodiments of the present application is to provide an interactive method and device for a video call, which can solve the problem of a relatively simple interaction method during a video call in the related art.
  • an embodiment of the present application provides an interactive method for a video call, which is applied to an electronic device having at least one display screen, and the method includes:
  • the first object is displayed in the video call interface according to a first preset display mode corresponding to the first input, wherein the first input includes a touch input on a target part of the first object, and the first preset display mode includes deformation of the target part corresponding to the touch input.
  • an embodiment of the present application provides an interactive device for a video call, which is applied to an electronic device having at least one display screen, and the device includes:
  • a receiving module configured to receive a first input from the second user to the first object corresponding to the first user in the video call interface when the first user conducts a video call with the second user;
  • a display module configured to display the first object in the video call interface according to a first preset display mode corresponding to the first input in response to the first input, wherein the first input includes a touch input to a target part of the first object, and the first preset display manner includes that the target part corresponding to the touch input is deformed.
  • an embodiment of the present application provides an electronic device, the electronic device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor; when the program or instruction is executed by the processor, the steps of the interactive method for a video call according to the first aspect are implemented.
  • an embodiment of the present application provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the interactive method for a video call according to the first aspect are implemented.
  • an embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement the interactive method for a video call according to the first aspect.
  • the second user may perform a first input on the first object corresponding to the first user displayed on the video call interface.
  • the first object may be displayed in a first preset display manner corresponding to the first input.
  • the user can perform a touch operation on the target part of the video call object displayed on the video call interface, and in response to the touch operation, the target part of the video call object displayed on the interface can be deformed accordingly.
  • the user can change the display manner of the video call object on the video call interface through operations performed on the video call interface, so as to achieve interesting interactive effects and effectively improve the user's video calling experience.
  • FIG. 1 is a schematic flowchart of an embodiment of an interactive method for a video call provided by the present application
  • FIG. 2 is one of schematic diagrams of an interactive interface for a video call provided by an embodiment of the present application
  • FIG. 3 is a second schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 4 is a third schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 5 is a fourth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 6 is a fifth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of another embodiment of an interactive method for a video call provided by the present application.
  • FIG. 8 is a sixth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 9 is a seventh schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of still another embodiment of an interactive method for a video call provided by the present application.
  • FIG. 11 is an eighth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 12 is a ninth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 14 is a tenth schematic diagram of an interactive interface for a video call provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an interactive device for a video call provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a hardware structure of an example of an electronic device provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a hardware structure of another example of an electronic device provided by an embodiment of the present application.
  • the terms "first", "second", and the like in the description and claims of the present application are used to distinguish similar objects, not to describe a specific order or sequence. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second", etc. are usually of one type, and the number of such objects is not limited; for example, the first object may be one or more than one.
  • "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates that the associated objects are in an "or" relationship.
  • the related art lacks some interactive functions with high interest, and the interaction method is relatively simple, which cannot provide a diversified interactive experience for the user during the video call.
  • an embodiment of the present application provides an interactive method for a video call.
  • the second user can perform the first input on the first object corresponding to the first user displayed on the video call interface.
  • the first object may be displayed in a first preset display manner corresponding to the first input.
  • the user can perform a touch operation on the target part of the video call object displayed on the video call interface, and in response to the touch operation, the target part of the video call object displayed on the interface can be deformed accordingly.
  • users can be provided with a variety of interactive methods during the video call process.
  • the user can change the display manner of the video call object on the video call interface through operations performed on the video call interface, so as to achieve interesting interactive effects and effectively improve the user's video call experience, which solves the problem that the interaction manner during a video call in the related art is relatively simple.
  • the electronic device in the embodiment of the present application may include at least one of the following: a mobile phone, a tablet computer, a smart wearable device, and other devices having functions of receiving information and displaying information.
  • FIG. 1 is a schematic flowchart of an embodiment of an interactive method for a video call provided by the present application. As shown in FIG. 1 , the interaction method of the video call may include S101 and S102.
  • the video call interface may display a first object corresponding to the first user and a second object corresponding to the second user.
  • FIG. 2 is a schematic diagram of an example of an interactive interface for a video call provided by the present application. As shown in FIG. 2 , objects a and b are displayed on the video call interface of users A and B. The object a is the first object corresponding to user A, and the object b is the second object corresponding to user B.
  • the electronic device receives a first input from a second user to a first object corresponding to the first user in the video call interface, where the first input may include a touch input to a target portion of the first object.
  • the target part may be a certain body part selected by the second user from a plurality of body parts of the first object, for example, the face, shoulder, hand, etc. of the first object.
  • the touch input may be a click input, a double-click input, a drag input, a slide input, etc., of the second user on the target part, or a combination of two or more of these inputs.
  • the first object corresponding to the first user is the object a
  • the first input may be a drag input of the second user on the face position N1 (ie, the target part) of the object a.
  • receiving the first input from the second user to the first object corresponding to the first user in the video call interface may include: in the case that the first user conducts a video call with the second user, receiving a touch input by the second user on the target part of the first object corresponding to the first user in the video call interface.
  • the second object corresponding to the second user may be a panoramic portrait of the second user.
  • the first preset display manner corresponding to the first input may include: deformation of the target part corresponding to the touch input.
  • displaying the first object in the video call interface according to the first preset display manner corresponding to the first input may include: displaying, on the video call interface, the first object with the target part deformed.
  • the first object corresponding to the first user is object a
  • the electronic device receives a drag input of the second user on the face position N1 of object a
  • the face of object a is the target part.
  • the face of the object a can be deformed accordingly.
  • the electronic device receives the user's touch input on the target part of the video call object and, in response to the touch input, can display the video call object whose target part is deformed on the video call interface, so as to realize interesting interaction with the video call object and enrich the ways of interacting during a video call.
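The touch-and-deform response described above can be sketched in a few lines. This is a minimal illustration only, not the patent's implementation; the `DisplayState` structure, the part name "face", and the drag vector are all hypothetical names chosen for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class DisplayState:
    """Render state of a call object (e.g. object a) on the video call interface."""
    deformed_parts: dict = field(default_factory=dict)  # part name -> deformation vector

def apply_touch_input(state: DisplayState, target_part: str, drag_vector: tuple) -> DisplayState:
    """Sketch of S102: in response to a touch input on a target part (received in S101),
    record a deformation of that part so the renderer can warp it accordingly."""
    state.deformed_parts[target_part] = drag_vector
    return state

# The second user drags the face position N1 of object a (the target part):
state = apply_touch_input(DisplayState(), "face", (12, -4))
```

A real implementation would map the recorded vector onto a mesh or warp filter applied to the live video frame; the dictionary here only stands in for that render state.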
  • the first input may further include a drag input, a click input, a double-click input, a slide input, a rotation input, etc., of the second user on the first object, or a combination of two or more of these inputs, and different display modes corresponding to the first input can be set according to specific requirements.
  • the first preset display mode corresponding to the first input is the rotation of the first object.
  • the first preset display manner corresponding to the first input may include: displaying the jumping first object.
  • the first input is that the second user presses the first object with two fingers for a long time, and the two fingers move in opposite directions at the same time
  • the first preset display mode corresponding to the first input may include: the first object is enlarged or reduced.
  • the first object is displayed in the video call interface according to the first preset display mode corresponding to the first input.
  • the video call interface of the electronic device corresponding to the first user may also display the first object according to the first preset display mode, so as to realize the video call experience in which both parties of the video call simultaneously watch the interactive effect.
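The point above, that both parties' interfaces render the same interactive effect, can be sketched as a small broadcast step. This is an assumption-laden illustration: the endpoint identifiers, the `"deform:face"` mode string, and the function name are invented for the sketch and are not drawn from the patent.

```python
def broadcast_display_mode(endpoints: list, obj_id: str, display_mode: str) -> dict:
    """Sketch: deliver the first preset display mode for one call object to every
    endpoint in the call, so the first user's device shows the same interactive
    effect that the second user triggered."""
    return {endpoint: {obj_id: display_mode} for endpoint in endpoints}

# Both devices in the two-party call receive the same view update:
views = broadcast_display_mode(["device_A", "device_B"], "object_a", "deform:face")
```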
  • the second user can perform a first input on the first object corresponding to the first user displayed on the video call interface.
  • the first object may be displayed in a first preset display manner corresponding to the first input.
  • the user can perform a touch operation on the target part of the video call object displayed on the video call interface, and in response to the touch operation, the target part of the video call object displayed on the interface can be deformed accordingly. In this way, users can be provided with a variety of interactive methods during the video call process.
  • the user can change the display manner of the video call object on the video call interface through operations performed on the video call interface, so as to achieve interesting interactive effects and effectively improve the user's video calling experience.
  • the foregoing S101 and S102 are described in detail below with reference to specific embodiments.
  • the method includes: determining the behavior characteristic of the first input; and displaying the first object on the video call interface according to the first preset display mode corresponding to the first input when the behavior characteristic of the first input is consistent with the preset behavior characteristic.
  • the preset behavior feature may be the behavior feature corresponding to the safety behavior.
  • after the electronic device receives the first input, it can save the image corresponding to the current video call interface, determine the behavior characteristic corresponding to the first input by detecting and recognizing the image, and match the behavior characteristic corresponding to the first input with the preset behavior characteristics.
  • if the behavior characteristic corresponding to the first input is consistent with the preset behavior characteristic, it is determined that the first input is a safe behavior, and at this time, the first object is displayed according to the first preset display mode corresponding to the first input.
  • the security behavior may include touch input to non-private parts, eg, touch input to the hand, head, shoulder, arm, and the like.
  • the first input is a second user's touch input on the shoulder of the first object
  • the electronic device receives the first input, and determines that the behavior characteristic of the first input is "touching the shoulder", which corresponds to the safety behavior.
  • the behavioral characteristics are consistent, so the first object whose shoulders are tapped is displayed on the video call interface.
  • the method may further include: displaying prompt information, where the prompt information is used to prompt the second user to stop the unsafe behavior.
  • the first input is a touch input on the chest of the first object by the second user
  • the electronic device receives the first input, and determines that the behavior characteristic of the first input is "touching the chest", which is an obscene behavior and a safe behavior.
  • the corresponding behavior characteristics are inconsistent, so a prompt message "Please stop unsafe behavior!” can be displayed on the video call interface to warn the second user.
  • in this way, behaviors whose behavior characteristics are inconsistent with the preset behavior characteristics, such as indecent behaviors, can be effectively filtered out, so that the security of interactive behaviors during the video call can be guaranteed, thereby improving the security of the video call.
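The safety filter described in the bullets above can be sketched as a simple match against a preset set of safe target parts. This is a hedged illustration: the set of non-private parts, the feature string format, and the return convention are assumptions for the sketch, and a real system would derive the behavior characteristic by image detection and recognition rather than from a part name.

```python
# Assumed preset of non-private (safe) target parts, per the examples in the text.
SAFE_PARTS = {"hand", "head", "shoulder", "arm"}

def handle_first_input(target_part: str) -> tuple:
    """Match the behavior characteristic of the first input against the preset safe
    behavior characteristics; display the effect only when the behavior is safe,
    otherwise show a warning prompt to the second user."""
    if target_part in SAFE_PARTS:
        return ("display", f"touching the {target_part}")
    return ("prompt", "Please stop unsafe behavior!")
```

With this gate in place, "touching the shoulder" passes through to the first preset display mode, while "touching the chest" is filtered and only triggers the warning.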
  • the video call interface may display an interactive control
  • the first input may include a click input on the interactive control
  • the interactive control may be a control for realizing interaction with the first object, for example, an expression control, an action control, and the like.
  • the interactive control corresponds to the first preset display mode.
  • the first preset display mode may include a preset interactive animation
  • S102, in response to the first input, displaying the first object in the video call interface according to the first preset display manner corresponding to the first input, may include: in response to the first input, displaying a preset interactive animation with the first object on the video call interface.
  • the preset interactive animation may be an animation of performing an interactive action on the first object, for example, an animation of holding hands with the first object, an animation of bouncing the first object's head, an animation of patting the shoulder of the first object, an animation of rubbing the first object's face, etc.
  • the first user corresponds to the object a
  • the video call interface displays an interactive control of “pat on the shoulder”
  • the electronic device receives the click input of the second user on the interactive control and, in response to the click input, the video call interface displays an animation of a small hand patting the shoulder of object a.
  • the user can interact with the video call object by clicking on the interactive control, which simplifies the user's operation difficulty, realizes the intimate communication between the two parties in the video call, and further enriches the interaction method during the video call.
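The interactive-control flow above amounts to a lookup from a clicked control to its preset interactive animation. The sketch below illustrates that mapping only; the control labels and animation descriptions are hypothetical stand-ins, not names from the patent.

```python
# Assumed mapping: each interactive control corresponds to one preset interactive animation.
INTERACTIVE_CONTROLS = {
    "pat on the shoulder": "animation of a small hand patting the object's shoulder",
    "hold hands": "animation of holding hands with the object",
    "rub the face": "animation of rubbing the object's face",
}

def on_control_clicked(control: str) -> str:
    """A click input on an interactive control selects its preset interactive
    animation, which is then displayed with the first object."""
    return INTERACTIVE_CONTROLS[control]
```

Because the mapping is fixed in advance, the user only needs a single click to trigger the effect, which is the simplification of operation the text describes.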
  • the first input may further include a second input for moving the first object to a target area of the video call interface.
  • receiving the first input from the second user to the first object corresponding to the first user in the video call interface may include: in the case that the first user conducts a video call with the second user, receiving a second input from the second user to move the first object in the video call interface to the target area of the video call interface; and S102, in response to the first input, displaying the first object in the video call interface according to the first preset display manner corresponding to the first input, may include: displaying the first object in the target area of the video call interface.
  • the target area may be any area selected by the second user in the video call interface.
  • the second input may be a drag input for the second user to move the first object to the target area.
  • the target area is area 1
  • the first object is object a
  • the electronic device receives a drag input from the second user to move object a to area 1, and in response to the drag input, object a is displayed in area 1.
  • displaying the first object in the target area may include: displaying the first object and the second object in the target area.
  • the target area is area 2
  • the first object is object a
  • the second object is object b
  • object b is displayed in area 2.
  • the electronic device receives a drag input from the second user to move the object a to the area 2, and displays the object a and the object b in the area 2 in response to the drag input.
  • the user can move the video call object to any area of the video call interface, and can also move it to the area where the user's own object is located, so that the two parties in the video call can shorten the distance between them, bringing users a new video call experience.
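A minimal sketch of the drag-to-area behavior above: the moved object is removed from its previous area and displayed in the target area, while any object already displayed there (e.g., the user's own object b) stays, so both parties can share one region. Area and object names are hypothetical:

```python
def move_object_to_area(layout: dict, obj: str, target_area: str) -> dict:
    """Return a new layout with obj displayed in target_area.

    layout maps area names to the set of objects displayed in each area.
    Objects already in the target area are kept, so the first and second
    objects can be displayed together.
    """
    new_layout = {area: set(objects) for area, objects in layout.items()}
    for objects in new_layout.values():
        objects.discard(obj)  # remove obj from wherever it was displayed
    new_layout.setdefault(target_area, set()).add(obj)
    return new_layout
```

For example, dragging object a into the area already showing object b would leave both objects displayed in that area.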
  • the video call interface may display a first control or a preset area.
  • FIG. 7 is a schematic flowchart of another embodiment of the video call interaction method provided by the present application.
  • the execution body of the video call interaction method may be an electronic device having at least one display screen.
  • the interactive method for a video call provided by this embodiment of the present application may include S701-S704.
  • the preset area may be set according to specific requirements, and may be any area of the video call interface.
  • the first control and the preset area can be used to enable a "moving mode" of the video call interface, and in the "moving mode", the second user can move the first object.
  • the third input may be a click input, a sliding input, a double-click input, etc. of the second user on the first control or the preset area.
  • the first object is object a
  • the first control is the “Start moving!” control
  • the preset area is area 3
  • the third input is the sliding input to area 3 by the second user.
  • the electronic device receives the sliding input, and turns on the "moving mode" of the video call interface. In the "moving mode", the second user can freely move the object a.
  • the second control may be a tool for moving the first object, for example, a "drag tool" as shown in FIG. 9 .
  • S703 Receive a fourth input of the second user dragging the first object to the target area of the video call interface through the second control.
  • the target area may be any area selected by the second user on the video call interface.
  • the fourth input may be a drag input, a slide input, etc. of the second user on the second control.
  • the second user can move the second control to any body part of the first object, and then drag the first object to the target area by dragging the second control.
  • the first object is object a
  • the second control is “drag tool”
  • the target area is area 4.
  • the second user moves the "drag tool” to the head position of the object a, and then drags the object a to the area 4 through the drag input to the "drag tool”.
  • object a is displayed in area 4.
  • the electronic device can move the video call object only after receiving the user's input on the first control or the preset area, which avoids misoperations caused by the user accidentally touching the video call interface, such as the video call object being moved to a corner of the interface, further enhancing the user experience.
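The S701–S704 flow above (a third input on the first control or preset area enables the "moving mode" and displays the second control; only then does a fourth drag input actually move the object) could be modeled as a small state machine. This is a sketch under assumed names, not the application's implementation:

```python
class VideoCallInterface:
    """Sketch of the moving-mode flow: drags are ignored until the
    moving mode has been explicitly enabled, avoiding misoperations
    from accidental touches."""

    def __init__(self):
        self.moving_mode = False
        self.drag_tool_visible = False  # the "second control"
        self.object_area = "default"

    def on_third_input(self):
        # e.g., a slide input on the preset area or a tap on "Start moving!"
        self.moving_mode = True
        self.drag_tool_visible = True

    def on_fourth_input(self, target_area: str) -> bool:
        """Drag the first object via the second control; returns whether
        the move was applied."""
        if not self.moving_mode:
            return False  # accidental touch: no move occurs
        self.object_area = target_area
        return True
```

The key design point is the guard in `on_fourth_input`: without the mode flag, a stray touch on the interface cannot relocate the video call object.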
  • the user can not only change the display mode of the video call object in the video call interface, but also change the display mode of the object corresponding to the user himself.
  • the interactive method of the video call provided by the embodiment of the present application may further include: receiving a seventh input from the second user to the second object corresponding to the second user in the video call interface; and in response to the seventh input, displaying the second object in the video call interface according to the second preset display mode corresponding to the seventh input.
  • the electronic device receives a seventh input from the second user to the second object corresponding to the second user in the video call interface, where the seventh input may be a drag input, a click input, a double-click input, a slide input, etc. by the second user on the second object, or a combination of two or more of these inputs.
  • different display modes corresponding to the seventh input can be set according to specific requirements.
  • the second preset display manner corresponding to the seventh input may include: displaying the jumping second object.
  • the second preset display mode corresponding to the seventh input may include: displaying the second object enlarged or reduced.
  • the second object is displayed in the video call interface according to the second preset display mode corresponding to the seventh input.
  • the video call interface of the electronic device corresponding to the first user may also display the second object according to the second preset display mode, so as to realize the video call experience in which both parties of the video call simultaneously watch the interactive effect.
  • the user can not only change the display mode of the video call object in the video call interface, but also change the display mode of the user's own corresponding object, thereby further enriching the interactive functions during the video call and providing the user with a comprehensive video call interactive experience.
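One way to model the seventh-input handling above is a table mapping gestures on the user's own object to second preset display modes. The gesture names and transform values are assumptions for illustration:

```python
# Hypothetical mapping from gestures on the second object (the user's own
# object) to second preset display modes such as jumping or scaling.
SECOND_PRESET_DISPLAY = {
    "double_click": {"effect": "jump"},
    "pinch_out": {"effect": "scale", "factor": 1.5},   # enlarge
    "pinch_in": {"effect": "scale", "factor": 0.5},    # reduce
}

def display_mode_for_seventh_input(gesture: str) -> dict:
    """Return the display mode for a gesture; unrecognized gestures
    leave the display unchanged."""
    return SECOND_PRESET_DISPLAY.get(gesture, {"effect": "none"})
```

As the text notes, the same display mode could also be mirrored on the first user's interface so both parties watch the effect simultaneously.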
  • in order to improve the user experience, the user can freely change the video call background, and the video call interface can display a third control and a second object corresponding to the second user.
  • FIG. 10 is a schematic flowchart of still another embodiment of a video call interaction method provided by the present application.
  • the execution body of the video call interaction method may be an electronic device having at least one display screen.
  • the interaction method for a video call provided by this embodiment of the present application may include S1001-S1004.
  • the third control may be a tool for changing the background of the video call
  • the fifth input may be a click input, a double-click input, a slide input, etc. of the second user on the third control.
  • the third control is a “background replacement” control
  • the fifth input is a double-click input of the “background replacement” control by the second user.
  • the scene information includes at least one of the current video call background of the first user, the current video call background of the second user, and preset virtual scene information.
  • the preset virtual scene information can be set according to specific needs.
  • the electronic device can directly acquire the current video call background of the second user through the depth of field effect of the camera and 360 panoramic imaging, and the current video call background can realistically show the real environment where the second user is located. Meanwhile, the electronic device may directly acquire the current video call background of the first user displayed in the video call interface.
  • the preset virtual scene information may include game scene information, landscape scene information, etc., and may also include image information stored locally by the electronic device, for example, picture information in an album.
  • S1003 Receive a sixth input from the second user for selecting target scene information from multiple scene information.
  • the sixth input may be a click input, a double-click input, a slide input, etc. by the second user on the target scene information among the plurality of scene information displayed on the interface.
  • the video call interface displays scene 1 - scene 4, and the sixth input is the second user's click input on scene 2.
  • the second object is object b.
  • the electronic device changes the video call background of object b to scene 2, and displays object b in scene 2.
  • when the electronic device receives the input of the user selecting the target scene information from the plurality of scene information, in response to the input, the user's video call background can be replaced with the target scene information, thereby providing the user with a function of replacing the video call background by themselves during the video call, which can effectively improve the user experience.
  • displaying the second object in the scene corresponding to the target scene information may include: displaying the first object and the second object in the scene corresponding to the target scene information.
  • the first object is object a
  • the second object is object b
  • the video call interface displays the current call background of object a ("the other party's call scene"), the current call background of object b ("current call scene"), and the preset virtual call scenes "Park Scene" and "Super Mario Scene".
  • the electronic device receives the second user's click input on "Park Scene", changes the video call backgrounds of object a and object b to "Park Scene", and simultaneously displays object a and object b in the "Park Scene".
  • the user can select the scene corresponding to the target scene information as the public scene of both parties of the video call, and both parties can appear in the public scene at the same time, thereby simulating a video call experience of face-to-face communication, which can deepen the intimacy between the two parties in the video call.
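The background-replacement flow (S1001–S1004: a fifth input on the third control lists candidate scenes; a sixth input applies the target scene, optionally as a public scene shared by both objects) might be sketched as follows. Function and field names are assumptions:

```python
def available_scenes(first_bg: str, second_bg: str, virtual_scenes: list) -> list:
    """Candidate backgrounds shown after the fifth input: each party's
    current video call background plus the preset virtual scenes."""
    return [first_bg, second_bg, *virtual_scenes]

def apply_scene(state: dict, target_scene: str, shared: bool = False) -> dict:
    """Apply the scene chosen by the sixth input. With shared=True, both
    objects are displayed in the same scene, simulating face-to-face
    communication."""
    state = dict(state)
    state["second_object_scene"] = target_scene
    if shared:
        state["first_object_scene"] = target_scene
    return state
```

With the "Park Scene" example from the text, `shared=True` would place object a and object b in the same park background.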
  • the method may further include: when the target scene information is game scene information, controlling the first object and the second object to perform preset game actions.
  • the video call interface can start the game mode; in the game mode, the video call interface can display game controls, and the first user and the second user can operate the game controls to control the first object and the second object to perform preset game actions, with corresponding game effects displayed on the video call interface.
  • the second user makes the second object launch a hidden weapon to hit the first object, then the first object can fall to the ground, and a scar effect will appear at the hit position.
  • the electronic device can perform video synchronous recording of the interaction process between the first object and the second object in the game mode, and generate a short video for saving or sharing.
  • when the electronic device receives the input of the user selecting the game scene information, it can start the game mode of the video call interface in response to the input. In the game mode, it can control the first object and the second object to perform preset game actions, effectively combining video calls with games and providing users with a new video call experience through this interesting interactive way.
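A hypothetical sketch of a game-mode action handler, using the projectile example from the text (the second object launches a hidden weapon; the hit object falls and shows a scar effect). Action names and effect fields are illustrative assumptions:

```python
def perform_game_action(state: dict, actor: str, action: str) -> dict:
    """Apply a preset game action and return the resulting display state.

    Only the "launch_projectile" action by the second object is modeled
    here; a real game mode would dispatch over many preset actions.
    """
    state = dict(state)
    if actor == "second_object" and action == "launch_projectile":
        # the first object falls to the ground with a scar effect
        # at the hit position
        state["first_object_pose"] = "fallen"
        state["first_object_effect"] = "scar"
    return state
```

The text also mentions recording the interaction as a short video; that step would hook into whatever capture pipeline the device provides and is omitted here.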
  • the electronic device in the embodiment of the present application may be a single screen, a dual screen, a multi-screen, a folding screen, a telescopic screen, etc., which is not limited herein.
  • displaying the first object in the video call interface according to the first preset display mode corresponding to the first input may include: displaying the first object on the first display screen according to the first preset display mode.
  • the electronic device includes a first display screen 1301 and a second display screen 1302 .
  • a first object, i.e., object a, is displayed on the first display screen 1301
  • a second object corresponding to the second user, i.e., object b, is displayed on the second display screen 1302.
  • the electronic device receives the drag input of the second user on the face position N1 of the object a.
  • an object a whose face is correspondingly deformed may be displayed on the first display screen 1301.
  • the two parties of the video call can be displayed on two screens respectively, and the user can perform touch operations on the screen displaying the video call object to change the display mode of the video call object in the video call interface, realizing interesting interactive effects and effectively improving the user's video call experience.
  • the execution body may be an electronic device having at least one display screen, an interactive device for a video call, or a module in the interactive device for executing the interactive method for a video call.
  • in the embodiments of the present application, an interactive device for a video call executing the interactive method for a video call is taken as an example to describe the interactive device for a video call provided by the embodiments of the present application.
  • FIG. 15 is a schematic structural diagram of a video call interaction apparatus provided by the present application, and the video call interaction apparatus can be applied to an electronic device having at least one display screen.
  • the interactive device 1500 for a video call may include: a receiving module 1501 and a display module 1502 .
  • the receiving module 1501 is used for receiving the first input from the second user to the first object corresponding to the first user in the video call interface when the first user and the second user conduct a video call;
  • the display module 1502 is used for displaying, in response to the first input, the first object in the video call interface according to a first preset display mode corresponding to the first input, wherein the first input includes a touch input on a target part of the first object, and the first preset display mode includes deformation of the target part corresponding to the touch input.
  • the second user can perform a first input on the first object corresponding to the first user displayed on the video call interface.
  • the first object may be displayed in a first preset display manner corresponding to the first input.
  • the user can perform a touch operation on the target part of the video call object displayed on the video call interface, and in response to the touch operation, the target part of the video call object displayed on the interface can be deformed accordingly.
  • the user can change the display mode of the video call object on the video call interface through some operations performed on the video call interface, so as to achieve interesting interactive effects and effectively improve the user's video call experience.
  • the apparatus further includes: a determining module 1503, configured to determine the behavior characteristic of the first input; the display module 1502 is specifically configured to display the first object in the video call interface according to the first preset display mode corresponding to the first input when the behavior characteristic of the first input is consistent with the preset behavior characteristic.
  • behaviors whose behavior characteristics are inconsistent with the preset behavior characteristics, such as indecent behaviors, can be effectively filtered out. In this way, the security of the interactive behavior during the video call can be guaranteed, thereby improving the security of the video call.
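The behavior-characteristic check can be read as a whitelist test: the first preset display mode is applied only when the input's behavior characteristic matches a preset set, and everything else is filtered out. A sketch with assumed feature names:

```python
# Hypothetical whitelist of preset behavior characteristics; inputs whose
# characteristic is not in the set (e.g., an indecent gesture) are filtered
# out and never change the display.
PRESET_BEHAVIOR_FEATURES = {"pinch_face", "pat_shoulder", "drag_to_area"}

def should_display(first_input_feature: str) -> bool:
    """Apply the first preset display mode only for whitelisted features."""
    return first_input_feature in PRESET_BEHAVIOR_FEATURES
```

How the behavior characteristic itself is extracted from the raw touch input (gesture classification, trajectory analysis, etc.) is not specified in the text and is left abstract here.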
  • the first input further includes a second input for moving the first object to a target area of the video call interface.
  • the display module 1502 is further configured to display the first object in the target area.
  • the display module 1502 is specifically configured to display the first object and the second object in the target area.
  • the user can move the video call object to any area of the video call interface, and can also move it to the area where the user's own object is located, so that the two parties in the video call can shorten the distance between them, bringing users a new video call experience.
  • the video call interface displays a first control or a preset area
  • the receiving module 1501 is further configured to receive a third input to the first control or preset area
  • the display module 1502 is further configured to display the second control in response to the third input
  • the receiving module 1501 is further configured to receive the fourth input of the second user dragging the first object to the target area of the video call interface through the second control
  • the display module 1502 is further configured to display the first object in the target area in response to the fourth input.
  • the electronic device can move the video call object only after receiving the user's input on the first control or the preset area, which avoids misoperations caused by the user accidentally touching the video call interface, such as the video call object being moved to a corner of the interface, further enhancing the user experience.
  • the video call interface displays a third control and a second object corresponding to the second user.
  • the receiving module 1501 is further configured to receive a fifth input from the second user on the third control; the display module 1502 is further configured to display a plurality of scene information in response to the fifth input; the receiving module 1501 is further configured to receive a sixth input of the second user selecting target scene information from the plurality of scene information; the display module 1502 is further configured to display the second object in the scene corresponding to the target scene information in response to the sixth input.
  • when the electronic device receives the input of the user selecting the target scene information from the plurality of scene information, in response to the input, the user's video call background can be replaced with the target scene information, thereby providing the user with a function of replacing the video call background by themselves during the video call, which can effectively improve the user experience.
  • the scene information includes at least one of the current video call background of the first user, the current video call background of the second user, and preset virtual scene information.
  • the display module 1502 is specifically configured to display the first object and the second object in the scene corresponding to the target scene information.
  • the user can select the scene corresponding to the target scene information as the public scene of both parties of the video call, and both parties can appear in the public scene at the same time, thereby simulating a video call experience of face-to-face communication, which can deepen the intimacy between the two parties in the video call.
  • the interactive device for a video call in this embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, etc., which is not specifically limited in the embodiments of this application.
  • the interactive device for a video call in this embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android (Android) operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
  • the interactive device for video call provided by the embodiment of the present application can implement each process implemented by the method embodiments of FIG. 1 , FIG. 7 , and FIG. 10 , and to avoid repetition, details are not repeated here.
  • FIG. 16 is a schematic diagram of a hardware structure of an example of an electronic device provided by an embodiment of the present application.
  • the electronic device 1600 includes a processor 1601, a memory 1602, and a program or instruction that is stored in the memory 1602 and can be run on the processor 1601
  • when the program or instruction is executed by the processor 1601, each process of the above embodiment of the interactive method for a video call can be achieved, with the same technical effect. To avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the aforementioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 17 is a schematic diagram of a hardware structure of another example of an electronic device provided by an embodiment of the present application.
  • the electronic device 1700 includes but is not limited to: a radio frequency unit 1701, a network module 1702, an audio output unit 1703, an input unit 1704, a sensor 1705, a display unit 1706, a user input unit 1707, an interface unit 1708, a memory 1709, a processor 1710, and other components.
  • the electronic device 1700 may also include a power source (such as a battery) for supplying power to various components, and the power source may be logically connected to the processor 1710 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
  • the structure of the electronic device shown in FIG. 17 does not constitute a limitation on the electronic device.
  • the electronic device may include more or fewer components than those shown in the figure, combine some components, or arrange components differently, which will not be repeated here.
  • the user input unit 1707 is configured to receive the first input from the second user to the first object corresponding to the first user in the video call interface when the first user conducts a video call with the second user; the display unit 1706 is configured to display, in response to the first input, the first object in the video call interface according to the first preset display mode corresponding to the first input, wherein the first input includes a touch input on the target part of the first object, and the first preset display mode includes deformation of the target part corresponding to the touch input.
  • the second user may perform a first input on the first object corresponding to the first user displayed on the video call interface.
  • the first object may be displayed in a first preset display manner corresponding to the first input.
  • the user can perform a touch operation on the target part of the video call object displayed on the video call interface, and in response to the touch operation, the target part of the video call object displayed on the interface can be deformed accordingly.
  • the user can change the display mode of the video call object on the video call interface through some operations performed on the video call interface, so as to achieve interesting interactive effects and effectively improve the user's video call experience.
  • the processor 1710 is configured to determine the behavior characteristics of the first input; the display unit 1706 is specifically configured to display the first object in the video call interface according to the first preset display mode corresponding to the first input when the behavior characteristics of the first input are consistent with the preset behavior characteristics.
  • behaviors whose behavior characteristics are inconsistent with the preset behavior characteristics, such as indecent behaviors, can be effectively filtered out. In this way, the security of the interactive behavior during the video call can be guaranteed, thereby improving the security of the video call.
  • the first input further includes a second input for moving the first object to a target area of the video call interface.
  • the display unit 1706 is further configured to display the first object in the target area.
  • the display unit 1706 is specifically configured to display the first object and the second object in the target area.
  • the user can move the video call object to any area of the video call interface, and can also move it to the area where the user's own object is located, so that the two parties in the video call can shorten the distance between them, bringing users a new video call experience.
  • the video call interface displays a first control or a preset area
  • the user input unit 1707 is further configured to receive a third input to the first control or preset area
  • the display unit 1706 is further configured to display the second control in response to the third input
  • the user input unit 1707 is further configured to receive the fourth input of the second user dragging the first object to the target area of the video call interface through the second control
  • the display unit 1706 is further configured to display the first object in the target area in response to the fourth input.
  • the electronic device can move the video call object only after receiving the user's input on the first control or the preset area, which avoids misoperations caused by the user accidentally touching the video call interface, such as the video call object being moved to a corner of the interface, further enhancing the user experience.
  • the video call interface displays a third control and a second object corresponding to the second user.
  • the user input unit 1707 is further configured to receive a fifth input from the second user on the third control; the display unit 1706 is further configured to display a plurality of scene information in response to the fifth input; the user input unit 1707 is further configured to receive a sixth input of the second user selecting target scene information from the plurality of scene information; the display unit 1706 is further configured to display the second object in the scene corresponding to the target scene information in response to the sixth input.
  • when the electronic device receives the input of the user selecting the target scene information from the plurality of scene information, in response to the input, the user's video call background can be replaced with the target scene information, thereby providing the user with a function of replacing the video call background by themselves during the video call, which can effectively improve the user experience.
  • the scene information includes at least one of the current video call background of the first user, the current video call background of the second user, and preset virtual scene information;
  • the display unit 1706 is specifically configured to display the first object and the second object in the scene corresponding to the target scene information.
  • the user can select the scene corresponding to the target scene information as the public scene of both parties of the video call, and both parties can appear in the public scene at the same time, thereby simulating a video call experience of face-to-face communication, which can deepen the intimacy between the two parties in the video call.
  • the embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, each process of the above embodiment of the interactive method for a video call is implemented.
  • the processor is the processor in the electronic device in the above embodiment.
  • a readable storage medium includes a computer-readable storage medium; examples of the computer-readable storage medium include a non-transitory computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • an embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used for running a program or an instruction to implement each process of the above embodiment of the interactive method for a video call, and can achieve the same technical effect. To avoid repetition, details are not repeated here.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, a system-on-chip, or the like.
  • processors may be, but are not limited to, general purpose processors, special purpose processors, application specific processors, or field programmable logic circuits. It will also be understood that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can also be implemented by special purpose hardware for performing the specified functions or actions, or by a combination of special purpose hardware and computer instructions.
  • the method of the above embodiment can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application can be embodied, in essence or in the part that contributes to the prior art, in the form of a software product; the computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or a CD-ROM) and includes several instructions to make a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the methods described in the various embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/CN2021/124942 2020-10-27 2021-10-20 视频通话的互动方法和装置 WO2022089273A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21885001.4A EP4220369A4 (de) 2020-10-27 2021-10-20 Interaktionsverfahren und -vorrichtung für videoanruf
KR1020237008016A KR20230047172A (ko) 2020-10-27 2021-10-20 영상 통화의 인터렉션 방법과 장치
US18/138,076 US20230259260A1 (en) 2020-10-27 2023-04-22 Interaction method and apparatus for video call

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011167360.2 2020-10-27
CN202011167360.2A CN112363658B (zh) 2020-10-27 2020-10-27 视频通话的互动方法和装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/138,076 Continuation US20230259260A1 (en) 2020-10-27 2023-04-22 Interaction method and apparatus for video call

Publications (1)

Publication Number Publication Date
WO2022089273A1 true WO2022089273A1 (zh) 2022-05-05

Family

ID=74510961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/124942 WO2022089273A1 (zh) 2020-10-27 2021-10-20 Interaction method and apparatus for video call

Country Status (5)

Country Link
US (1) US20230259260A1 (de)
EP (1) EP4220369A4 (de)
KR (1) KR20230047172A (de)
CN (1) CN112363658B (de)
WO (1) WO2022089273A1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112363658B (zh) * 2020-10-27 2022-08-12 Vivo Mobile Communication Co., Ltd. Interaction method and apparatus for video call
CN113286082B (zh) * 2021-05-19 2023-06-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Target object tracking method and apparatus, electronic device, and storage medium
CN115396390A (zh) * 2021-05-25 2022-11-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Interaction method, system and apparatus based on video chat, and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090103211A (ko) * 2008-03-27 2009-10-01 KT Tech Co., Ltd. Mobile terminal having a touch feedback function during a video call and method for providing touch feedback during a video call
CN106067960A (zh) * 2016-06-20 2016-11-02 Nubia Technology Co., Ltd. Mobile terminal and method for processing video data
CN107197194A (zh) * 2017-06-27 2017-09-22 Vivo Mobile Communication Co., Ltd. Video call method and mobile terminal
CN108259810A (zh) * 2018-03-29 2018-07-06 Shanghai Zhangmen Science and Technology Co., Ltd. Video call method, device, and computer storage medium
CN109862434A (zh) * 2019-02-27 2019-06-07 Shanghai Youhui Network Technology Co., Ltd. Makeup call system and method therefor
CN111010526A (zh) * 2019-11-11 2020-04-14 Gree Electric Appliances, Inc. of Zhuhai Interaction method and apparatus in video communication
CN112363658A (zh) * 2020-10-27 2021-02-12 Vivo Mobile Communication Co., Ltd. Interaction method and apparatus for video call

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9245177B2 (en) * 2010-06-02 2016-01-26 Microsoft Technology Licensing, Llc Limiting avatar gesture display
WO2012007034A1 (en) * 2010-07-13 2012-01-19 Nokia Corporation Sending and receiving information
US8730294B2 (en) * 2010-10-05 2014-05-20 At&T Intellectual Property I, Lp Internet protocol television audio and video calling
US8767034B2 (en) * 2011-12-01 2014-07-01 Tangome, Inc. Augmenting a video conference
KR101978219B1 (ko) * 2013-03-15 2019-05-14 LG Electronics Inc. Mobile terminal and control method thereof
CN105554429A (zh) * 2015-11-19 2016-05-04 Zhangying Information Technology (Shanghai) Co., Ltd. Video call display method and video call device
CN105872438A (zh) * 2015-12-15 2016-08-17 LeTV Zhixin Electronic Technology (Tianjin) Co., Ltd. Video call method, apparatus, and terminal
CN107071330A (zh) * 2017-02-28 2017-08-18 Vivo Mobile Communication Co., Ltd. Video call interaction method and mobile terminal
CN107529096A (zh) * 2017-09-11 2017-12-29 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
CN109873971A (zh) * 2019-02-27 2019-06-11 Shanghai Youhui Network Technology Co., Ltd. Makeup call system and method therefor
US20200359892A1 (en) * 2019-05-15 2020-11-19 Rollins Enterprises, Llc. Virtual consultation systems


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4220369A4 *

Also Published As

Publication number Publication date
KR20230047172A (ko) 2023-04-06
EP4220369A1 (de) 2023-08-02
CN112363658A (zh) 2021-02-12
EP4220369A4 (de) 2024-03-20
US20230259260A1 (en) 2023-08-17
CN112363658B (zh) 2022-08-12

Similar Documents

Publication Publication Date Title
WO2022089273A1 (zh) Interaction method and apparatus for video call
WO2022063022A1 (zh) Video preview method and apparatus, and electronic device
WO2017054465A1 (zh) Information processing method, terminal, and computer storage medium
WO2018126957A1 (zh) Method for displaying virtual reality picture and virtual reality device
WO2022012657A1 (zh) Image editing method and apparatus, and electronic device
WO2018192417A1 (zh) Interface rendering method and apparatus
WO2016106997A1 (zh) Screenshot method and apparatus, and mobile terminal
WO2017032078A1 (zh) Interface control method and mobile terminal
WO2023061280A1 (zh) Application display method and apparatus, and electronic device
WO2023050722A1 (zh) Information display method and electronic device
WO2022135409A1 (zh) Display processing method, display processing apparatus, and wearable device
WO2022199454A1 (zh) Display method and electronic device
WO2022135290A1 (zh) Screen capture method and apparatus, and electronic device
WO2022268024A1 (zh) Video playback method and apparatus, and electronic device
WO2023030114A1 (zh) Interface display method and apparatus
WO2023284632A1 (zh) Image display method and apparatus, and electronic device
WO2018000606A1 (zh) Method for switching virtual reality interactive interface and electronic device
CN112148167A (zh) Control setting method and apparatus, and electronic device
WO2022156703A1 (zh) Image display method and apparatus, and electronic device
WO2022156602A1 (zh) Display method, display apparatus, and electronic device
WO2022068721A1 (zh) Screenshot method and apparatus, and electronic device
WO2022111458A1 (zh) Image shooting method and apparatus, electronic device, and storage medium
WO2019105062A1 (zh) Content display method, apparatus, and terminal device
WO2024037419A1 (zh) Display control method and apparatus, electronic device, and readable storage medium
WO2024037438A1 (zh) Split-screen control method and apparatus, electronic device, and readable storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21885001

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20237008016

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021885001

Country of ref document: EP

Effective date: 20230425

NENP Non-entry into the national phase

Ref country code: DE