US20230259260A1 - Interaction method and apparatus for video call - Google Patents


Info

Publication number
US20230259260A1
US20230259260A1 (Application No. US 18/138,076)
Authority
US
United States
Prior art keywords
input
video call
user
displaying
call interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/138,076
Other languages
English (en)
Inventor
Shuangshuang HU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Assigned to VIVO MOBILE COMMUNICATION CO., LTD. reassignment VIVO MOBILE COMMUNICATION CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HU, Shuangshuang
Publication of US20230259260A1 publication Critical patent/US20230259260A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0486: Drag-and-drop
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: Interaction techniques in which the display area of the touch-screen or the surface of the digitising tablet is partitioned into independently controllable areas, e.g. virtual keyboards or menus
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/14: Systems for two-way working
    • H04N7/141: Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142: Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N7/147: Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H04N7/15: Conference systems

Definitions

  • This application relates to the field of communication technologies, and specifically, to an interaction method and apparatus for a video call.
  • Objectives of embodiments of this application are to provide an interaction method and apparatus for a video call.
  • an embodiment of this application provides an interaction method for a video call, applied to an electronic device having at least one display screen.
  • the method includes:
  • receiving, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface; and displaying, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input, where the first input includes a touch input for a target portion of the first object, and the first preset display manner includes that the target portion corresponding to the touch input is deformed.
  • an embodiment of this application provides an interaction apparatus for a video call, applied to an electronic device having at least one display screen.
  • the apparatus includes:
  • a receiving module configured to receive, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface
  • a display module configured to display, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input, where the first input includes a touch input for a target portion of the first object, and the first preset display manner includes that the target portion corresponding to the touch input is deformed.
  • an embodiment of this application further provides an electronic device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor, the program or instruction, when executed by the processor, implementing the steps of the interaction method for a video call according to the first aspect.
  • an embodiment of this application provides a readable storage medium, storing a program or instruction, the program or instruction, when executed by a processor, implementing the steps of the interaction method for a video call according to the first aspect.
  • an embodiment of this application further provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a program or instruction, to implement the interaction method for a video call according to the first aspect.
  • the second user may perform a first input for a first object that corresponds to the first user and that is displayed in a video call interface.
  • an electronic device may display, in response to the first input, the first object according to a first preset display manner corresponding to the first input. For example, a user may perform a touch operation on a target portion of a video call object displayed in a video call interface. In response to the touch operation, the target portion of the video call object displayed in the interface may be correspondingly deformed.
  • a diversified interaction manner can be provided for the user during a video call, and the user can change a display manner of the video call object in the video call interface through some operations implemented on the video call interface, thereby realizing an interesting interactive effect, and effectively improving video call experience of the user.
  • FIG. 1 is a schematic flowchart of an embodiment of an interaction method for a video call according to this application.
  • FIG. 2 is a first schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 3 is a second schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 4 is a third schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 5 is a fourth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 6 is a fifth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 7 is a schematic flowchart of another embodiment of an interaction method for a video call according to this application.
  • FIG. 8 is a sixth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 9 is a seventh schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 10 is a schematic flowchart of still another embodiment of an interaction method for a video call according to this application.
  • FIG. 11 is an eighth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 12 is a ninth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of this application.
  • FIG. 14 is a tenth schematic diagram of an interaction interface for a video call according to an embodiment of this application.
  • FIG. 15 is a structural schematic diagram of an interaction apparatus for a video call according to an embodiment of this application.
  • FIG. 16 is a schematic diagram of a hardware structure of an example of an electronic device according to an embodiment of this application.
  • FIG. 17 is a schematic diagram of a hardware structure of another example of an electronic device according to an embodiment of this application.
  • The terms “first” and “second” are used to distinguish between similar objects, but are not necessarily used to describe a specific sequence or order. It should be understood that terms used in such a way are interchangeable in proper circumstances, so that the embodiments of this application can be implemented in orders other than those illustrated or described herein.
  • Objects distinguished by “first”, “second”, and the like are usually one type, and a number of objects is not limited.
  • there may be one first object or more than one.
  • “and/or” means at least one of the connected objects, and the character “/” generally indicates an “or” relationship between the associated objects.
  • an embodiment of this application provides an interaction method for a video call.
  • the second user may perform a first input for a first object that corresponds to the first user and that is displayed in a video call interface.
  • an electronic device may display, in response to the first input, the first object according to a first preset display manner corresponding to the first input. For example, a user may perform a touch operation on a target portion of a video call object displayed in a video call interface. In response to the touch operation, the target portion of the video call object displayed in the interface may be correspondingly deformed.
  • an execution subject of the interaction method for a video call is an electronic device having at least one display screen
  • the interaction method for a video call provided in this embodiment of this application is described in detail below through specific embodiments with reference to the accompanying drawings. It should be noted that, the foregoing execution subject does not constitute a limitation to this application.
  • the electronic device in this embodiment of this application may include at least one of the following devices capable of receiving information and displaying information, such as a mobile phone, a tablet computer, and a smart wearable device.
  • FIG. 1 is a schematic flowchart of an embodiment of an interaction method for a video call according to this application. As shown in FIG. 1 , the interaction method for a video call may include S 101 and S 102 .
  • the first object corresponding to the first user and a second object corresponding to the second user may be displayed in the video call interface.
  • the first user is a user A
  • the second user is a user B
  • FIG. 2 is a schematic diagram of an example of an interaction interface for a video call according to this application.
  • objects a and b are displayed in a video call interface of the users A and B.
  • the object a is the first object corresponding to the user A
  • the object b is the second object corresponding to the user B.
  • the electronic device receives the first input by the second user for the first object corresponding to the first user in the video call interface, where the first input may include a touch input for a target portion of the first object.
  • the target portion may be a body portion selected by the second user from a plurality of body portions of the first object, for example, a face, a shoulder, a hand, or the like of the first object.
  • the touch input may be a click input, a double-click input, a pull input, a sliding input, a combination of two or more inputs, or the like of the second user for the target portion.
  • the first object corresponding to the first user is an object a
  • the first input may be a pull input by the second user for a face location N 1 (that is, the target portion) of the object a.
  • the receiving, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface may include: receiving, in a case that the first user performs the video call with the second user, the touch input by the second user for the target portion of the first object corresponding to the first user in the video call interface.
  • the second object corresponding to the second user may be a panoramic portrait of the second user.
  • the first preset display manner corresponding to the first input may include that: the target portion corresponding to the touch input is deformed.
  • the displaying, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input may include: displaying the first object whose target portion is deformed in the video call interface.
  • the first object corresponding to the first user is an object a
  • the electronic device receives a pull input by the second user for a face location N 1 of the object a
  • a face of the object a is the target portion.
  • the face of the object a may be correspondingly deformed.
  • an electronic device receives a touch input by a user for a target portion of a video call object, and may display, in response to the touch input, the video call object whose target portion is deformed in a video call interface, to realize an interesting interaction with the video call object, thereby enriching an interaction manner during a video call.
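The deformation described above can be sketched as a simple geometric warp: points of the displayed object near the touch location are displaced along the pull direction, with the effect fading with distance. This is an illustrative sketch under assumed names (`deform_points`, the Gaussian falloff, and the default radius are not from the patent):

```python
import math

def deform_points(points, touch, pull, radius=50.0):
    """Shift each (x, y) point along the pull vector, weighted by a
    Gaussian falloff around the touch location: weight is 1 at the
    touch point and decays toward 0 with distance."""
    tx, ty = touch
    dx, dy = pull
    deformed = []
    for (x, y) in points:
        dist2 = (x - tx) ** 2 + (y - ty) ** 2
        w = math.exp(-dist2 / (2 * radius ** 2))
        deformed.append((x + w * dx, y + w * dy))
    return deformed
```

Applying this to the mesh of the target portion (e.g. the face location N1) would move nearby vertices while leaving the rest of the object untouched, which matches the local deformation effect described.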
  • the first input may include a pull input, a click input, a double-click input, a sliding input, a rotation input, a combination of two or more inputs, or the like by the second user for the first object, and display manners corresponding to different first inputs may be set according to specific requirements.
  • the first preset display manner corresponding to the first input is that the first object rotates.
  • the first preset display manner corresponding to the first input may include: displaying the first object that jumps.
  • the first preset display manner corresponding to the first input may include that: the first object is enlarged or reduced.
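The pairing of first inputs with preset display manners can be sketched as a dispatch table. This is an illustrative sketch, not the patent's implementation; the `InputEvent` type is an assumption, and apart from pull-to-deform and rotation-to-rotate, the specific pairings are placeholders:

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    kind: str            # e.g. "pull", "rotation", "double_click", "slide"
    target_portion: str  # e.g. "face", "shoulder", "hand"

# Hypothetical table binding each kind of first input to a preset
# display manner; display manners corresponding to different first
# inputs may be set according to specific requirements.
DISPLAY_MANNERS = {
    "pull": "deform_target_portion",
    "rotation": "rotate",
    "double_click": "jump",
    "slide": "enlarge",
}

def display_manner_for(event: InputEvent) -> str:
    """Return the first preset display manner corresponding to the input."""
    return DISPLAY_MANNERS.get(event.kind, "no_op")
```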
  • when the first object is displayed in the video call interface according to the first preset display manner corresponding to the first input, the first object may also be displayed, according to the first preset display manner, in a video call interface of an electronic device corresponding to the first user, so that the two parties of the video call synchronously watch the interactive effect.
  • the displaying, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input may include: determining a behavioral feature of the first input; and displaying, in a case that the behavioral feature of the first input is consistent with a preset behavioral feature, the first object in the video call interface according to the first preset display manner corresponding to the first input.
  • the preset behavioral feature may be a behavioral feature corresponding to a safe behavior.
  • the electronic device may save an image corresponding to a current video call interface, detect and recognize the image to determine the behavioral feature corresponding to the first input, and match the behavioral feature corresponding to the first input with the preset behavioral feature.
  • the behavioral feature corresponding to the first input is consistent with the preset behavioral feature, the first input is determined as a safe behavior, and in this case, the first object is displayed according to the first preset display manner corresponding to the first input.
  • the safe behavior may include a touch input for a non-private portion, for example, a touch input for a hand, a head, a shoulder, an arm, or the like.
  • the first input is a touch input by the second user for a shoulder of the first object.
  • the electronic device receives the first input, and determines that the behavioral feature of the first input is “touching the shoulder”, which is consistent with the behavioral feature corresponding to the safe behavior. Therefore, the first object whose shoulder is patted is displayed in the video call interface.
  • the method may further include: displaying prompt information, where the prompt information is used for prompting the second user to stop an unsafe behavior.
  • the first input is a touch input by the second user for a chest of the first object.
  • the electronic device receives the first input, and determines that the behavioral feature of the first input is “touching the chest”, which belongs to an indecent behavior and is inconsistent with the behavioral feature corresponding to the safe behavior. Therefore, prompt information of “Please stop the unsafe behavior” may be displayed in the video call interface, to warn the second user.
  • the behavioral feature of the first input is determined, and the first object is displayed in the video call interface only when the behavioral feature is consistent with the preset behavioral feature, thereby effectively filtering out a behavior corresponding to a behavioral feature inconsistent with the preset behavioral feature, for example, an indecent behavior.
  • the safety of the interaction behavior during the video call can be ensured, thereby improving safety of the video call.
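The safety filter above can be sketched as matching the recognized behavioral feature against a preset safe list before any display manner is applied. A minimal sketch, assuming the recognizer already reduced the first input to a target-portion name (the `SAFE_PORTIONS` set and the return strings are illustrative):

```python
# Preset safe behaviors: touch inputs for non-private portions.
SAFE_PORTIONS = {"hand", "head", "shoulder", "arm"}

def handle_first_input(target_portion: str) -> str:
    """Apply the display manner only when the behavioral feature is
    consistent with a preset safe behavior; otherwise return prompt
    information warning the user."""
    if target_portion in SAFE_PORTIONS:
        return f"display first object with {target_portion} touched"
    return "Please stop the unsafe behavior"
```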
  • an interaction control may be displayed in the video call interface, and the first input may include a click input for the interaction control.
  • the interaction control may be a control for implementing interaction with the first object, for example, an emotion control or an action control.
  • the interaction control corresponds to the first preset display manner.
  • the first preset display manner may include a preset interactive animation
  • the displaying, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input may include: displaying, in response to the first input, the preset interactive animation with the first object in the video call interface.
  • the preset interactive animation may be an animation of performing an interaction action on the first object, for example, an animation of holding hands with the first object, an animation of flicking a head of the first object, an animation of patting on the shoulder of the first object, or an animation of rubbing the face of the first object.
  • the first user corresponds to an object a.
  • an interaction control of “patting on the shoulder” is displayed in the video call interface.
  • the electronic device receives a click input by the second user for the interaction control, and displays, in response to the click input, an animation of using a hand to pat on a shoulder of the object a in the video call interface.
  • a user can implement interaction with a video call object through an operation of clicking an interaction control, which realizes intimate communication between two parties of a video call while simplifying the operation difficulty of the user, and further enriches the interaction manner during the video call.
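The interaction-control flow can be sketched as a lookup from a clicked control to its preset interactive animation. Control identifiers and animation descriptions below are assumptions for illustration:

```python
# Hypothetical binding of interaction controls to preset interactive
# animations performed on the first object.
ANIMATIONS = {
    "pat_shoulder": "hand pats the shoulder of the object",
    "hold_hands": "hand holds the object's hand",
    "flick_head": "hand flicks the head of the object",
    "rub_face": "hand rubs the face of the object",
}

def on_control_clicked(control_id: str) -> str:
    """Play the preset interactive animation bound to the clicked
    interaction control, or report that nothing is bound."""
    return ANIMATIONS.get(control_id, "no animation bound")
```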
  • the first input may further include a second input of moving the first object to a target region in the video call interface.
  • the receiving, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface may include: receiving, in a case that the first user performs the video call with the second user, the second input of moving the first object in the video call interface to the target region in the video call interface by the second user.
  • the displaying, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input may include: displaying the first object in the target region in the video call interface.
  • the target region may be any region selected by the second user in the video call interface.
  • the second input may be a drag input of moving the first object to the target region by the second user.
  • the target region is a region 1
  • the first object is an object a.
  • the electronic device receives a drag input of moving the object a to the region 1 by the second user, and displays, in response to the drag input, the object a in the region 1 .
  • the displaying the first object in the target region may include: displaying the first object and the second object in the target region.
  • For example, the target region is a region 2, the first object is an object a, the second object is an object b, and the object b is displayed in the region 2. The electronic device receives a drag input of moving the object a to the region 2 by the second user, and displays, in response to the drag input, the object a and the object b in the region 2.
  • a user can perform a moving operation on a video call object in a video call interface and move the video call object to any region in the video call interface, and can also move the video call object to a region in which the user is located, thereby shortening a distance between two parties of a video call, and bringing new video call experience for the user.
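The drag-to-region behavior described above amounts to hit-testing the drop point against regions of the call interface and placing the dragged object into whichever region contains it. A minimal sketch, assuming rectangular regions and hypothetical region/object names:

```python
def region_at(regions, x, y):
    """regions: {name: (left, top, right, bottom)} in interface coordinates.
    Returns the name of the region containing (x, y), or None if the point
    falls outside every region."""
    for name, (left, top, right, bottom) in regions.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def drop_object(layout, obj, regions, x, y):
    """Place obj into the region under the drop point. Objects already in the
    region are kept, so both call parties can share one region."""
    target = region_at(regions, x, y)
    if target is not None:
        layout.setdefault(target, []).append(obj)
    return layout
```

For instance, dropping "object_a" into a region already holding "object_b" leaves both displayed together, matching the shared-region example above.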
  • a first control or a preset region may be displayed in the video call interface.
  • FIG. 7 is a schematic flowchart of another embodiment of an interaction method for a video call according to this application.
  • An execution subject of the interaction method for a video call may be an electronic device having at least one display screen.
  • the interaction method for a video call provided in this embodiment of this application may include S 701 to S 704 .
  • the preset region may be set according to a specific requirement, and may be any region in a video call interface.
  • the first control and the preset region may be used for enabling a “movement mode” of the video call interface. In the “movement mode”, a second user may move a first object.
  • the third input may be a click input, a sliding input, a double-click input, or the like by the second user for the first control or the preset region.
  • For example, the first object is an object a, the first control is a “start to move!” control, the preset region is a region 3, and the third input is a sliding input by the second user for the region 3. The electronic device receives the sliding input and enables a “movement mode” of the video call interface; in the “movement mode”, the second user may freely move the object a.
  • the second control may be a tool for moving the first object, for example, a “drag tool” shown in FIG. 9 .
  • the target region may be any region selected by the second user in the video call interface.
  • the fourth input may be a drag input, a sliding input, a pull input, or the like by the second user for the second control.
  • the second user may move the second control to any body portion of the first object, and then drag the second control to drag the first object to the target region.
  • For example, the first object is an object a, the second control is a “drag tool”, and the target region is a region 4. The second user moves the “drag tool” to a head location of the object a, and then drags the object a to the region 4 through a drag input for the “drag tool”, so that the object a is displayed in the region 4.
  • an electronic device can move a video call object only when the electronic device receives an input by a user for a first control or a preset region, to avoid some misoperations caused by the user accidentally touching a video playback interface, for example, avoid the user accidentally moving a first object to a corner region of the video playback interface, thereby improving use experience of the user.
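The misoperation safeguard above is essentially a mode gate: moves are honored only after the third input on the first control or preset region has enabled the "movement mode". An illustrative sketch (class and method names are hypothetical, not from the patent):

```python
class CallInterface:
    """Sketch of gating object moves behind an explicit 'movement mode'."""

    def __init__(self):
        self.movement_mode = False
        self.positions = {}

    def on_third_input(self):
        # A tap/slide on the first control or preset region enables moving.
        self.movement_mode = True

    def move_object(self, obj, region):
        if not self.movement_mode:
            return False  # accidental touches outside movement mode are ignored
        self.positions[obj] = region
        return True
```

Before the third input, any drag is discarded; afterwards the same drag succeeds, which is the behavior the paragraph above attributes to the first control / preset region.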
  • a user can change a display manner of a video call object in a video call interface, and can also change a display manner of an object corresponding to the user.
  • the interaction method for a video call provided in this embodiment of this application may further include: receiving a seventh input by a second user for the second object corresponding to the second user in the video call interface; and displaying, in response to the seventh input, the second object in the video call interface according to a second preset display manner corresponding to the seventh input.
  • the electronic device receives the seventh input by the second user for the second object corresponding to the second user in the video call interface, where the seventh input may be a pull input, a click input, a double-click input, a sliding input, a combination of two or more inputs, or the like of the second user for the second object.
  • display manners corresponding to different seventh inputs may be set according to specific requirements.
  • the second preset display manner corresponding to the seventh input may include: displaying the second object that jumps.
  • the second preset display manner corresponding to the seventh input may include: enlarging or reducing the second object.
  • when the second object is displayed in the video call interface according to the second preset display manner corresponding to the seventh input, the second object may also be displayed in a video call interface of an electronic device corresponding to the first user according to the second preset display manner, to realize video call experience in which the two parties of the video call synchronously watch the interactive effect.
  • a user can change a display manner of a video call object in a video call interface, and can also change a display manner of an object corresponding to the user, thereby further enriching the interactive function during a video call, and providing comprehensive video call interactive experience for the user.
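The seventh-input handling above is a gesture-to-display-manner mapping: different inputs on the user's own object select different preset display manners (jump, enlarge, reduce). A sketch under the assumption that gestures arrive as named events; the gesture and manner names are hypothetical:

```python
# Hypothetical mapping from a seventh-input gesture on the second object to
# its second preset display manner, settable per specific requirements.
SECOND_DISPLAY_MANNERS = {
    "double_click": "jump",
    "pull_out": "enlarge",
    "pull_in": "reduce",
}

def display_manner_for(seventh_input):
    """Return the preset display manner for a gesture, or None if unmapped."""
    return SECOND_DISPLAY_MANNERS.get(seventh_input)
```

Because the mapping is plain data, "display manners corresponding to different seventh inputs may be set according to specific requirements" reduces to editing this table.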
  • a user can freely replace a video call background, and a third control and the second object corresponding to the second user may be displayed in the video call interface.
  • FIG. 10 is a schematic flowchart of still another embodiment of an interaction method for a video call according to this application.
  • An execution subject of the interaction method for a video call may be an electronic device having at least one display screen.
  • the interaction method for a video call provided in this embodiment of this application may include S 1001 to S 1004 .
  • the third control may be a tool for replacing a video call background
  • the fifth input may be a click input, a double-click input, a sliding input, or the like by the second user for the third control.
  • For example, the third control is a “background replacement” control, and the fifth input is a double-click input by the second user for the “background replacement” control.
  • the scene information includes at least one of a current video call background of a first user, a current video call background of the second user, or preset virtual scene information.
  • the preset virtual scene information may be set according to a specific requirement.
  • an electronic device may directly obtain the current video call background of the second user through a depth of field effect and 360° panoramic imaging of a camera, and the current video call background may realistically represent a real environment where the second user is located.
  • the electronic device may directly obtain the current video call background of the first user displayed in a video call interface.
  • the preset virtual scene information may include game scene information, scenery scene information, and the like, and may also include image information locally stored by the electronic device, for example, information about a picture in a gallery.
  • the sixth input may be a click input, a double-click input, a sliding input, or the like by the second user for the target scene information from the plurality of pieces of scene information displayed in the interface.
  • For example, a scene 1 to a scene 4 are displayed in the video call interface, the sixth input is a click input by the second user for the scene 2, and the second object is an object b. The electronic device replaces a video call background of the object b with the scene 2, and displays the object b in the scene 2.
  • an electronic device can replace a video call background of the user with the target scene information, thereby providing a function of independently replacing the video call background for the user during a video call, and effectively improving user experience.
  • the displaying a second object in a scene corresponding to the target scene information may include: displaying a first object and the second object in the scene corresponding to the target scene information.
  • For example, the first object is an object a, and the second object is an object b. A current call background of the object a is a “call scene of the other party”, a current call background of the object b is a “current call scene”, and a “park scene” and a “super Mario scene” are displayed in a video playback interface. The electronic device receives a click input by the second user for the “park scene”, replaces the video call backgrounds of the object a and the object b with the “park scene”, and simultaneously displays the objects a and b in the “park scene”.
  • a user can select a scene corresponding to target scene information as a public scene of two parties of a video call, and the two parties of the video call can simultaneously appear in the public scene, to simulate video call experience in which two parties of the video call communicate face-to-face.
  • intimacy between the two parties of the video call can be deepened.
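The scene-replacement step above can be sketched as a small state update: the selected target scene becomes the background, and in a shared (public) scene both parties' objects are composited into it. The state keys and object identifiers are hypothetical, used only for illustration:

```python
def apply_scene(state, target_scene, shared=False):
    """Replace the call background with the selected target scene.

    When shared is True, both parties appear in the same scene (the
    'public scene' case); otherwise only the selecting user's object does.
    """
    state["background"] = target_scene
    if shared:
        state["visible"] = ["object_a", "object_b"]  # both call parties
    else:
        state["visible"] = ["object_b"]              # selecting user only
    return state
```

Selecting the "park scene" as a public scene, for example, would set the background once and list both objects as visible, mirroring the face-to-face example above.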
  • the method may further include: controlling, in a case that the target scene information is game scene information, the first object and the second object to perform a preset game action.
  • a game mode may be enabled in the video call interface
  • a game control may be displayed in the video call interface in the game mode
  • the first user and the second user may control, by operating the game control, the first object and the second object to perform the preset game action, and display a corresponding game effect in the video call interface.
  • For example, the second user enables, by operating the game control, the second object to launch a hidden weapon to hit the first object; the first object may then perform an action of falling to the ground, and a scar effect appears at the hit position.
  • the electronic device may perform synchronous video recording in an interaction process between the first object and the second object in the game mode, and generate a short video for saving or sharing.
  • an electronic device when receiving an input of selecting game scene information by a user, can enable, in response to the input, a game mode of a video call interface, and can control, in the game mode, a first object and a second object to perform a preset game action, and effectively combine a video call with a game, thereby providing new video call experience for the user through the interesting interaction manner.
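The game-mode behavior above can be sketched as a dispatch that is active only when the target scene is a game scene: a game-control input maps to a preset action by one object plus an effect shown on the other. Action and effect names are hypothetical, taken loosely from the hidden-weapon example:

```python
# Hypothetical game-mode dispatch: action -> (preset action performed by the
# hit object, effect rendered at the hit position).
GAME_EFFECTS = {
    "launch_hidden_weapon": ("fall_to_ground", "scar_at_hit_position"),
}

def on_game_input(scene_kind, action):
    """Handle a game-control input; returns None outside game scenes."""
    if scene_kind != "game":
        return None  # game controls are only active in the game mode
    return GAME_EFFECTS.get(action)
```

Gating on the scene kind matches the description that the game mode (and its controls) exists only when game scene information was selected as the target scene.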
  • the electronic device in this embodiment of this application may have a single screen, a double screen, a multi-screen, a foldable screen, a retractable screen, or the like, which is not limited herein.
  • the displaying the first object in the video call interface according to a first preset display manner corresponding to the first input may include: displaying the first object in the first display screen according to the first preset display manner.
  • For example, the electronic device includes a first display screen 1301 and a second display screen 1302. The first object (namely, an object a) is displayed in the first display screen 1301, and the second object corresponding to the second user (namely, an object b) is displayed in the second display screen 1302. When receiving a pull input by the second user for a face of the object a, the electronic device may display, in response to the pull input, the object a whose face is correspondingly deformed in the first display screen 1301.
  • an electronic device includes more than one screen
  • two parties of a video call can be displayed on two screens respectively, and a user can perform some touch screen operations on a screen on which a video call object is displayed to change a display manner of the video call object in a video call interface, thereby realizing an interesting interactive effect, and effectively improving video call experience of the user.
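The multi-screen arrangement above boils down to assigning each call party's object to its own screen, so that touch input on a screen targets the object displayed there. A sketch with hypothetical object and screen identifiers:

```python
def assign_screens(objects, screens):
    """Round-robin the call objects onto the available display screens.

    With two objects and two screens, each party gets its own screen; with a
    single screen, all objects share it.
    """
    return {obj: screens[i % len(screens)] for i, obj in enumerate(objects)}
```

On a dual-screen device this yields one object per screen, matching the FIG. 13-style example where the object a occupies screen 1301 and the object b occupies screen 1302.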
  • an execution subject of the interaction method for a video call may be an electronic device having at least one display screen, and may also be an interaction apparatus for a video call or a module configured to perform the interaction method for a video call in the interaction apparatus for a video call.
  • FIG. 15 is a schematic structural diagram of an interaction apparatus for a video call according to this application.
  • the interaction apparatus for a video call may be applied to an electronic device having at least one display screen.
  • an interaction apparatus 1500 for a video call may include: a receiving module 1501 and a display module 1502 .
  • the receiving module 1501 is configured to receive, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface.
  • the display module 1502 is configured to display, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input, where the first input includes a touch input for a target portion of the first object, and the first preset display manner includes that the target portion corresponding to the touch input is deformed.
  • the second user may perform a first input for a first object that corresponds to the first user and that is displayed in a video call interface.
  • an electronic device may display, in response to the first input, the first object according to a first preset display manner corresponding to the first input. For example, a user may perform a touch operation on a target portion of a video call object displayed in a video call interface. In response to the touch operation, the target portion of the video call object displayed in the interface may be correspondingly deformed.
  • a diversified interaction manner can be provided for the user during a video call, and the user can change a display manner of the video call object in the video call interface through some operations implemented on the video call interface, thereby realizing an interesting interactive effect, and effectively improving video call experience of the user.
  • the apparatus further includes: a determining module 1503 , configured to determine a behavioral feature of the first input; and the display module 1502 is configured to: display, in a case that the behavioral feature of the first input is consistent with a preset behavioral feature, the first object in the video call interface according to the first preset display manner corresponding to the first input.
  • a determining module 1503 configured to determine a behavioral feature of the first input
  • the display module 1502 is configured to: display, in a case that the behavioral feature of the first input is consistent with a preset behavioral feature, the first object in the video call interface according to the first preset display manner corresponding to the first input.
  • the behavioral feature of the first input is determined, and only a safe behavior corresponding to a behavioral feature consistent with the preset behavioral feature is displayed in the video call interface, thereby effectively filtering out a behavior corresponding to a behavioral feature inconsistent with the preset behavioral feature, for example, an indecent behavior.
  • the safety of the interaction behavior during the video call can be ensured, thereby improving safety of the video call.
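The behavioral-feature check above is an allow-list comparison: an input is rendered only if its determined feature is consistent with a preset behavioral feature. A sketch; the feature names are hypothetical placeholders for whatever features the determining module extracts:

```python
# Hypothetical preset behavioral features considered safe to display.
PRESET_BEHAVIORAL_FEATURES = {"pinch_cheek", "pat_head", "pull_ear"}

def should_display(behavioral_feature):
    """Display the first object per the preset display manner only when the
    input's behavioral feature matches a preset (safe) behavioral feature."""
    return behavioral_feature in PRESET_BEHAVIORAL_FEATURES
```

Inputs whose feature falls outside the set are simply never rendered, which is how the filtering of, e.g., indecent behaviors is achieved in this sketch.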
  • the first input further includes a second input of moving the first object to a target region in the video call interface; and the display module 1502 is further configured to display the first object in the target region.
  • the display module 1502 is configured to display the first object and the second object in the target region.
  • a user can perform a moving operation on a video call object in a video call interface and move the video call object to any region in the video call interface, and can also move the video call object to a region in which the user is located, thereby shortening a distance between two parties of a video call, and bringing new video call experience for the user.
  • a first control or a preset region is displayed in the video call interface; the receiving module 1501 is further configured to receive a third input for the first control or the preset region; the display module 1502 is further configured to display a second control in response to the third input; the receiving module 1501 is further configured to receive a fourth input of dragging the first object to a target region in the video call interface by the second user by using the second control; and the display module 1502 is further configured to display, in response to the fourth input, the first object in the target region.
  • an electronic device can move a video call object only when the electronic device receives an input by a user for a first control or a preset region, to avoid some misoperations caused by the user accidentally touching a video playback interface, for example, avoid the user accidentally moving a first object to a corner region of the video playback interface, thereby improving use experience of the user.
  • a third control and a second object corresponding to the second user are displayed in the video call interface; the receiving module 1501 is further configured to receive a fifth input by the second user for the third control; the display module 1502 is further configured to display a plurality of pieces of scene information in response to the fifth input; the receiving module 1501 is further configured to receive a sixth input of selecting target scene information from the plurality of pieces of scene information by the second user; and the display module 1502 is further configured to display, in response to the sixth input, the second object in a scene corresponding to the target scene information.
  • an electronic device can replace a video call background of the user with the target scene information, thereby providing a function of independently replacing the video call background for the user during a video call, and effectively improving user experience.
  • the scene information includes at least one of a current video call background of a first user, a current video call background of the second user, or preset virtual scene information; and the display module 1502 is configured to display the first object and the second object in the scene corresponding to the target scene information.
  • a user can select a scene corresponding to target scene information as a public scene of two parties of a video call, and the two parties of the video call can simultaneously appear in the public scene, to simulate video call experience in which two parties of the video call communicate face-to-face.
  • intimacy between the two parties of the video call can be deepened.
  • the interaction apparatus for a video call in this embodiment of this application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like
  • the non-mobile electronic device may be a server, a network attached storage (NAS) device, a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like. This is not specifically limited in this embodiment of this application.
  • the interaction apparatus for a video call in this embodiment of this application may be an apparatus having an operating system.
  • the operating system may be an Android operating system, may be an iOS operating system, or may be another possible operating system, which is not specifically limited in this embodiment of this application.
  • the interaction apparatus for a video call provided in this embodiment of this application can implement all processes implemented by the method embodiments of FIG. 1 , FIG. 7 , and FIG. 10 . To avoid repetition, details are not described herein again.
  • FIG. 16 is a schematic diagram of a hardware structure of an example of an electronic device according to an embodiment of this application.
  • an electronic device 1600 includes: a processor 1601 , a memory 1602 , and a program or instruction stored on the memory 1602 and executable on the processor 1601 .
  • the program or instruction when executed by the processor 1601 , implements all processes of the embodiments of the foregoing interaction method for a video call, and can achieve the same technical effects. To avoid repetition, details are not described herein again.
  • the electronic device in this embodiment of this application includes the foregoing mobile electronic device and non-mobile electronic device.
  • FIG. 17 is a schematic diagram of a hardware structure of another example of an electronic device according to an embodiment of this application.
  • an electronic device 1700 includes, but is not limited to, components such as a radio frequency unit 1701 , a network module 1702 , an audio output unit 1703 , an input unit 1704 , a sensor 1705 , a display unit 1706 , a user input unit 1707 , an interface unit 1708 , a memory 1709 , and a processor 1710 .
  • the electronic device 1700 further includes a power supply (such as a battery) for supplying power to the components.
  • the power supply may logically connect to the processor 1710 by using a power supply management system, thereby implementing functions, such as charging, discharging, and power consumption management, by using the power supply management system.
  • the structure of the electronic device shown in FIG. 17 constitutes no limitation on the electronic device, and the electronic device may include more or fewer components than those shown in the figure, or some components may be combined, or a different component deployment may be used. Details are not described herein again.
  • the user input unit 1707 is configured to receive, in a case that a first user performs a video call with a second user, a first input by the second user for a first object corresponding to the first user in a video call interface.
  • the display unit 1706 is configured to display, in response to the first input, the first object in the video call interface according to a first preset display manner corresponding to the first input, where the first input includes a touch input for a target portion of the first object, and the first preset display manner includes that the target portion corresponding to the touch input is deformed.
  • the second user may perform a first input for a first object that corresponds to the first user and that is displayed in a video call interface.
  • an electronic device may display, in response to the first input, the first object according to a first preset display manner corresponding to the first input. For example, a user may perform a touch operation on a target portion of a video call object displayed in a video call interface. In response to the touch operation, the target portion of the video call object displayed in the interface may be correspondingly deformed.
  • a diversified interaction manner can be provided for the user during a video call, and the user can change a display manner of the video call object in the video call interface through some operations implemented on the video call interface, thereby realizing an interesting interactive effect, and effectively improving video call experience of the user.
  • the processor 1710 is configured to determine a behavioral feature of the first input; and the display unit 1706 is configured to: display, in a case that the behavioral feature of the first input is consistent with a preset behavioral feature, the first object in the video call interface according to the first preset display manner corresponding to the first input.
  • the behavioral feature of the first input is determined, and only a safe behavior corresponding to a behavioral feature consistent with the preset behavioral feature is displayed in the video call interface, thereby effectively filtering out a behavior corresponding to a behavioral feature inconsistent with the preset behavioral feature, for example, an indecent behavior.
  • the safety of the interaction behavior during the video call can be ensured, thereby improving safety of the video call.
  • the first input further includes a second input of moving the first object to a target region in the video call interface; and the display unit 1706 is further configured to display the first object in the target region.
  • the display unit 1706 is configured to display the first object and the second object in the target region.
  • a user can perform a moving operation on a video call object in a video call interface and move the video call object to any region in the video call interface, and can also move the video call object to a region in which the user is located, thereby shortening a distance between two parties of a video call, and bringing new video call experience for the user.
  • a first control or a preset region is displayed in the video call interface; the user input unit 1707 is further configured to receive a third input for the first control or the preset region; the display unit 1706 is further configured to display a second control in response to the third input; the user input unit 1707 is further configured to receive a fourth input of dragging the first object to a target region in the video call interface by the second user by using the second control; and the display unit 1706 is further configured to display, in response to the fourth input, the first object in the target region.
  • an electronic device can move a video call object only when the electronic device receives an input by a user for a first control or a preset region, to avoid some misoperations caused by the user accidentally touching a video playback interface, for example, avoid the user accidentally moving a first object to a corner region of the video playback interface, thereby improving use experience of the user.
  • a third control and a second object corresponding to the second user are displayed in the video call interface; the user input unit 1707 is further configured to receive a fifth input by the second user for the third control; the display unit 1706 is further configured to display a plurality of pieces of scene information in response to the fifth input; the user input unit 1707 is further configured to receive a sixth input of selecting target scene information from the plurality of pieces of scene information by the second user; and the display unit 1706 is further configured to display, in response to the sixth input, the second object in a scene corresponding to the target scene information.
  • an electronic device can replace a video call background of the user with the target scene information, thereby providing a function of independently replacing the video call background for the user during a video call, and effectively improving user experience.
  • the scene information includes at least one of a current video call background of a first user, a current video call background of the second user, or preset virtual scene information; and the display unit 1706 is configured to display the first object and the second object in the scene corresponding to the target scene information.
  • a user can select a scene corresponding to target scene information as a public scene of two parties of a video call, and the two parties of the video call can simultaneously appear in the public scene, to simulate video call experience in which two parties of the video call communicate face-to-face.
  • intimacy between the two parties of the video call can be deepened.
  • An embodiment of this application further provides a readable storage medium, storing a program or instruction.
  • the program or instruction when executed by a processor, implements all processes of the embodiments of the foregoing interaction method for a video call, and can achieve the same technical effects. To avoid repetition, details are not described herein again.
  • the processor is the processor in the foregoing electronic device in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, and an example of the computer-readable storage medium includes a non-transitory computer-readable storage medium, for example, a computer Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
  • An embodiment of this application further provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a program or instruction, to implement all processes of the embodiments of the foregoing interaction method for a video call, and can achieve the same technical effects. To avoid repetition, details are not described herein again.
  • the chip mentioned in this embodiment of this application may also be referred to as a system-level chip, a system chip, a chip system, a system on chip, or the like.
  • the term “include”, “comprise”, or any other variation thereof in this specification is intended to cover a non-exclusive inclusion, which specifies the presence of stated processes, methods, objects, or apparatuses, but does not preclude the presence or addition of one or more other processes, methods, objects, or apparatuses. Without more limitations, elements defined by the sentence “including one . . . ” do not exclude that there are still other same elements in the processes, methods, objects, or apparatuses.
  • the scope of the methods and apparatuses in the implementations of this application is not limited to performing the functions in the order shown or discussed, but may also include performing, according to involved functions, the functions basically simultaneously or in a reverse order. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined.
  • features described with reference to some examples may be combined in other examples.
  • Such a processor may be, but not limited to, a general-purpose processor, a special-purpose processor, an application-specific processor or a field-programmable logic circuit. It should be further noted that, each box in a block diagram and/or a flowchart and a combination of boxes in the block diagram and/or the flowchart may be implemented by using a dedicated hardware configured to perform a specified function or action, or may be implemented by using a combination of dedicated hardware and a computer instruction.
  • the method according to the foregoing embodiments may be implemented by means of software and a necessary general hardware platform, or may be implemented by hardware; but in many cases, the former is the better implementation.
  • the technical solutions in this application essentially or the part contributing to the existing technologies may be implemented in the form of a software product.
  • the computer software product is stored in a storage medium (for example, a ROM/RAM, a magnetic disk, or an optical disc), and includes several instructions for instructing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the method described in the embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
US18/138,076 2020-10-27 2023-04-22 Interaction method and apparatus for video call Pending US20230259260A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202011167360.2A CN112363658B (zh) 2020-10-27 2020-10-27 视频通话的互动方法和装置
CN202011167360.2 2020-10-27
PCT/CN2021/124942 WO2022089273A1 (zh) 2020-10-27 2021-10-20 视频通话的互动方法和装置

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/124942 Continuation WO2022089273A1 (zh) 2020-10-27 2021-10-20 视频通话的互动方法和装置

Publications (1)

Publication Number Publication Date
US20230259260A1 true US20230259260A1 (en) 2023-08-17

Family

ID=74510961

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/138,076 Pending US20230259260A1 (en) 2020-10-27 2023-04-22 Interaction method and apparatus for video call

Country Status (5)

Country Link
US (1) US20230259260A1 (zh)
EP (1) EP4220369A4 (zh)
KR (1) KR20230047172A (zh)
CN (1) CN112363658B (zh)
WO (1) WO2022089273A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112363658B (zh) * 2020-10-27 2022-08-12 维沃移动通信有限公司 视频通话的互动方法和装置
CN113286082B (zh) * 2021-05-19 2023-06-30 Oppo广东移动通信有限公司 目标对象跟踪方法、装置、电子设备及存储介质
CN115396390A (zh) * 2021-05-25 2022-11-25 Oppo广东移动通信有限公司 基于视频聊天的互动方法、系统、装置及电子设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298827A1 (en) * 2010-06-02 2011-12-08 Microsoft Corporation Limiting avatar gesture display
US20130141515A1 (en) * 2011-12-01 2013-06-06 Eric Setton Augmenting a video conference
US20140267546A1 (en) * 2013-03-15 2014-09-18 Yunmi Kwon Mobile terminal and controlling method thereof
US20200359893A1 (en) * 2019-05-15 2020-11-19 Rollins Enterprises, Llc. Virtual consultation methods

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101449751B1 (ko) * 2008-03-27 2014-10-13 주식회사 케이티 영상 통화 중 터치 피드백 제공 기능을 구비한 이동 단말기및 영상 통화 중 터치 피드백 제공 방법
WO2012007034A1 (en) * 2010-07-13 2012-01-19 Nokia Corporation Sending and receiving information
US8730294B2 (en) * 2010-10-05 2014-05-20 At&T Intellectual Property I, Lp Internet protocol television audio and video calling
CN105554429A (zh) * 2015-11-19 2016-05-04 掌赢信息科技(上海)有限公司 一种视频通话显示方法及视频通话设备
CN105872438A (zh) * 2015-12-15 2016-08-17 乐视致新电子科技(天津)有限公司 一种视频通话方法、装置及终端
CN106067960A (zh) * 2016-06-20 2016-11-02 努比亚技术有限公司 一种处理视频数据的移动终端和方法
CN107071330A (zh) * 2017-02-28 2017-08-18 维沃移动通信有限公司 一种视频通话互动的方法及移动终端
CN107197194A (zh) * 2017-06-27 2017-09-22 维沃移动通信有限公司 一种视频通话方法及移动终端
CN107529096A (zh) * 2017-09-11 2017-12-29 广东欧珀移动通信有限公司 图像处理方法及装置
CN108259810A (zh) * 2018-03-29 2018-07-06 上海掌门科技有限公司 一种视频通话的方法、设备和计算机存储介质
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
CN109873971A (zh) * 2019-02-27 2019-06-11 上海游卉网络科技有限公司 一种美妆通话系统及其方法
CN109862434A (zh) * 2019-02-27 2019-06-07 上海游卉网络科技有限公司 一种美妆通话系统及其方法
CN111010526A (zh) * 2019-11-11 2020-04-14 珠海格力电器股份有限公司 一种视频通讯中的互动方法及装置
CN112363658B (zh) * 2020-10-27 2022-08-12 维沃移动通信有限公司 视频通话的互动方法和装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298827A1 (en) * 2010-06-02 2011-12-08 Microsoft Corporation Limiting avatar gesture display
US20130141515A1 (en) * 2011-12-01 2013-06-06 Eric Setton Augmenting a video conference
US20140267546A1 (en) * 2013-03-15 2014-09-18 Yunmi Kwon Mobile terminal and controlling method thereof
US20200359893A1 (en) * 2019-05-15 2020-11-19 Rollins Enterprises, Llc. Virtual consultation methods

Also Published As

Publication number Publication date
WO2022089273A1 (zh) 2022-05-05
EP4220369A1 (en) 2023-08-02
CN112363658B (zh) 2022-08-12
KR20230047172A (ko) 2023-04-06
EP4220369A4 (en) 2024-03-20
CN112363658A (zh) 2021-02-12

Similar Documents

Publication Publication Date Title
US20230259260A1 (en) Interaction method and apparatus for video call
EP3835933A1 (en) Product browsing method and apparatus, device and storage medium
WO2020034747A1 (zh) 图片生成方法、装置、设备及存储介质
US20150309678A1 (en) Methods and apparatus for rendering a collection of widgets on a mobile device display
CN107977141B (zh) 交互控制方法、装置、电子设备及存储介质
WO2022063022A1 (zh) 视频预览方法、装置及电子设备
CN112907760B (zh) 三维对象的标注方法及装置、工具、电子设备和存储介质
CN113163050B (zh) 会话界面显示方法及装置
US9495064B2 (en) Information processing method and electronic device
WO2022063045A1 (zh) 消息显示方法、装置及电子设备
CN111467791A (zh) 目标对象的控制方法及装置、系统
CN109947506B (zh) 界面切换方法、装置及电子设备
CN112148167A (zh) 控件设置方法、装置和电子设备
WO2022135290A1 (zh) 截屏方法、装置及电子设备
WO2022022729A1 (zh) 渲染控制方法、设备以及系统
CN112911052A (zh) 信息分享方法和装置
WO2024021635A1 (zh) 移动控制的方法、装置、存储介质及电子设备
US10102395B2 (en) System and method for creating and transitioning to multiple facets of a social media object in a social network
US11935172B2 (en) Method, system, and non-transitory computer readable record medium for expressing emotion in conversation message using gesture
WO2022135219A1 (zh) 图像显示方法、装置和电子设备
CN113304477A (zh) 一种游戏界面显示的方法、装置、设备及介质
CN110853643A (zh) 快应用中进行语音识别的方法、装置、设备及存储介质
US20240196082A1 (en) Image Processing Method and Apparatus, and Electronic Device
WO2024114571A1 (zh) 信息显示方法、装置、电子设备和存储介质
CN117899488A (zh) 虚拟角色移动的碰撞检测方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
AS Assignment

Owner name: VIVO MOBILE COMMUNICATION CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HU, SHUANGSHUANG;REEL/FRAME:063422/0750

Effective date: 20230314

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED