CN106484114B - Interaction control method and device based on virtual reality - Google Patents

Interaction control method and device based on virtual reality

Info

Publication number
CN106484114B
CN106484114B · CN201610888660.7A
Authority
CN
China
Prior art keywords
user
target content
angle range
body part
virtual reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610888660.7A
Other languages
Chinese (zh)
Other versions
CN106484114A (en)
Inventor
余盈亿
郭元
徐昊
李捷
潘柏宇
王冀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba (China) Co., Ltd.
Original Assignee
Wiring Network Technology (Shanghai) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wiring Network Technology (Shanghai) Co., Ltd.
Priority to CN201610888660.7A (granted as CN106484114B)
Publication of CN106484114A
Application granted
Publication of CN106484114B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Abstract

The invention relates to a virtual reality-based interaction control method and device. The method comprises the following steps: acquiring the direction in which a designated body part of a user moves; and controlling target content to move in the direction opposite to the direction in which the designated body part of the user moves. With the interaction control method and device based on virtual reality, the movement of target content can be driven by the movement of the user's own body part, which improves the operating convenience and the user experience of virtual reality-based interaction control.

Description

Interaction control method and device based on virtual reality
Technical Field
The invention relates to the technical field of virtual reality, in particular to an interaction control method and device based on virtual reality.
Background
Virtual Reality (VR) technology is a computer simulation technique for creating and experiencing a virtual world: a computer is used to generate a simulated environment. Virtual reality technology mainly involves the simulated environment, perception, natural skills, and sensing devices. The simulated environment consists of real-time, dynamic, three-dimensional realistic images generated by a computer. Perception means that an ideal VR system should offer every kind of perception a person has: besides the visual perception generated by computer graphics technology, it includes hearing, touch, force feedback, and motion, and may even include smell and taste; this is also called multi-perception. Natural skills refer to human actions such as head rotation, eye movement, gestures, and other body behaviors; the computer processes the data corresponding to the participant's actions, responds to the user's input in real time, and feeds the responses back to the user's senses. Sensing devices are the three-dimensional interaction devices.
In the related art, if a user wants to call out hidden content in a VR interface, the user must operate a control in the VR interface; if the user then wants to hide the content that has been called out, the user must operate a control in the VR interface again. This way of interacting with a VR interface, by operating controls inside the interface itself, offers poor operating convenience.
Disclosure of Invention
Technical problem
In view of this, the technical problem to be solved by the present invention is the poor operating convenience of virtual reality-based interaction control techniques.
Solution
In order to solve the above technical problem, according to an embodiment of the present invention, there is provided an interaction control method based on virtual reality, including:
acquiring the direction in which a designated body part of a user moves;
controlling target content to move in the direction opposite to the direction in which the designated body part of the user moves.
For the above method, in one possible implementation, controlling the target content to move in the direction opposite to the movement of the designated body part of the user includes:
when the target content is not completely within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward the non-front view-angle range in which the target content is located, controlling the target content to move into the front view-angle range.
For the above method, in one possible implementation, controlling the target content to move in the direction opposite to the movement of the designated body part of the user includes:
when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, controlling the target content to move into that non-front view-angle range.
For the above method, in one possible implementation, the method further includes:
acquiring the distance that the designated body part of the user moves;
and determining the distance that the target content moves according to the distance that the designated body part of the user moves.
For the above method, in one possible implementation, the designated body part of the user is the head of the user, the torso of the user, or the eyeball of the user.
In order to solve the above technical problem, according to another embodiment of the present invention, there is provided a virtual reality-based interaction control apparatus including:
a direction acquisition module, configured to acquire the direction in which a designated body part of the user moves;
and a movement control module, configured to control target content to move in the direction opposite to the direction in which the designated body part of the user moves.
For the above apparatus, in one possible implementation manner, the movement control module includes:
a first movement control submodule, configured to: when the target content is not completely within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward the non-front view-angle range in which the target content is located, control the target content to move into the front view-angle range.
For the above apparatus, in one possible implementation manner, the movement control module includes:
and a second movement control submodule, configured to: when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, control the target content to move into that non-front view-angle range.
For the above apparatus, in one possible implementation manner, the apparatus further includes:
a distance acquisition module, configured to acquire the distance that the designated body part of the user moves;
and a distance determination module, configured to determine the distance that the target content moves according to the distance that the designated body part of the user moves.
With regard to the above apparatus, in one possible implementation, the designated body part of the user is the head of the user, the torso of the user, or the eyeball of the user.
Advantageous effects
By acquiring the direction in which a designated body part of the user moves and controlling the target content to move in the opposite direction, the virtual reality-based interaction control method and device can drive the movement of target content with the movement of the user's own body part, thereby improving the operating convenience and the user experience of virtual reality-based interaction control.
Other features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 illustrates a flow diagram of a virtual reality-based interaction control method according to an embodiment of the invention;
FIG. 2 is a schematic diagram illustrating target content that is not completely within the front view-angle range of a virtual reality interface in a virtual reality-based interaction control method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating target content that is within the front view-angle range of a virtual reality interface in a virtual reality-based interaction control method according to an embodiment of the present invention;
FIG. 4 illustrates an exemplary flow diagram of a virtual reality-based interaction control method according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating a virtual reality-based interaction control apparatus according to another embodiment of the present invention;
FIG. 6 is a block diagram illustrating an exemplary structure of a virtual reality-based interaction control apparatus according to another embodiment of the present invention;
FIG. 7 is a block diagram of a virtual reality-based interaction control device according to another embodiment of the present invention.
Detailed Description
Various exemplary embodiments, features and aspects of the present invention will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present invention. It will be understood by those skilled in the art that the present invention may be practiced without some of these specific details. In some instances, methods, procedures, components, and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present invention.
Example 1
Fig. 1 shows a flowchart of a virtual reality-based interaction control method according to an embodiment of the present invention. As shown in Fig. 1, the method mainly includes:
in step S11, the direction in which the specified body part of the user moves is acquired.
The execution subject of this embodiment may be any type of VR device having screen display capability and interaction control capability: for example, a split-type VR device formed by mounting a terminal device such as a mobile phone on a supporting device such as a VR frame, or an integrated VR device such as VR glasses (also called a VR head-mounted display).
In this embodiment, the movement of the designated body part of the user may be detected in real time; the direction of the movement may be acquired either whenever movement is detected, or only when the detected movement distance exceeds a first preset value.
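As an illustration of this detection step, the sketch below polls a hypothetical tracker and reports a movement only once its magnitude exceeds the first preset value. The `tracker` interface, the threshold number, and the 2-D position model are all assumptions for the sketch; the patent does not specify them.

```python
import time

FIRST_PRESET_VALUE = 0.05  # assumed threshold; the patent leaves the value open

def watch_body_part(tracker, on_moved, poll_hz=60.0):
    """Step S11 (sketch): detect movement of the designated body part in
    real time and report its direction once the distance moved exceeds
    the first preset value."""
    last_x, last_y = tracker.position()      # hypothetical tracker API
    while True:
        x, y = tracker.position()
        dx, dy = x - last_x, y - last_y
        if (dx * dx + dy * dy) ** 0.5 > FIRST_PRESET_VALUE:
            on_moved((dx, dy))               # movement direction as a 2-D vector
            last_x, last_y = x, y
        time.sleep(1.0 / poll_hz)
```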
In step S12, the target content is controlled to move in the direction opposite to the direction in which the user's designated body part moves.
For example, suppose the designated body part is the user's head. When head movement is detected, the direction of the movement is acquired, and the target content is controlled to move the opposite way: if the user is detected lowering the head, the target content may be controlled to move upward; if the user is detected raising the head, the target content may be controlled to move downward.
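Continuing the sketch above, step S12 then reduces to negating the detected direction vector before applying it to the target content; `content.move_by` is again a hypothetical interface, not an API from the patent.

```python
def on_body_part_moved(direction, content):
    """Step S12 (sketch): move the target content opposite to the body
    part, e.g. head moves down -> content moves up."""
    dx, dy = direction
    content.move_by((-dx, -dy))  # exact opposite of the detected movement

# Example wiring with the watcher sketched above:
# watch_body_part(head_tracker, lambda d: on_body_part_moved(d, content))
```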
With the virtual reality-based interaction control method of this embodiment, the movement of target content can be driven by the movement of the user's body part, which improves the operating convenience and the user experience of virtual reality-based interaction control.
In one possible implementation, controlling the target content to move in the direction opposite to the movement of the designated body part of the user includes: when the target content is not completely within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward the non-front view-angle range in which the target content is located, controlling the target content to move into the front view-angle range. "Not completely within the front view-angle range" covers two cases: the target content may lie entirely outside the front view-angle range, so that the user cannot see it at all; or part of the target content may lie inside the front view-angle range while the rest lies outside, so that the user can glimpse only that part, for example out of the lower edge of peripheral vision, while the rest remains invisible.
In one possible implementation, controlling the target content to move in the direction opposite to the movement of the designated body part of the user includes: when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, controlling the target content to move into that non-front view-angle range.
Fig. 2 is a schematic diagram illustrating target content that is not completely within the front view-angle range of a virtual reality interface in a virtual reality-based interaction control method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram illustrating target content that is within the front view-angle range of a virtual reality interface in a virtual reality-based interaction control method according to an embodiment of the present invention.
For example, as shown in Fig. 2, part of the target content 21 is within the front view-angle range 22 of the virtual reality interface, and the rest of the target content 21 is within the non-front view-angle range 23 below it. In this case, if the user is detected lowering the head, the target content 21 may be controlled to slide upward into the front view-angle range 22, as shown in Fig. 3, thereby calling out the target content 21.
For another example, as shown in Fig. 3, the target content 21 is within the front view-angle range 22 of the virtual reality interface. In this case, if the user is detected raising the head, the target content 21 may be controlled to retract downward into the non-front view-angle range 23 below the front view-angle range 22, i.e., to move out of the front view-angle range 22, as shown in Fig. 2, thereby hiding the target content 21.
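The call-out and hide behavior of Figs. 2 and 3 can be summarized as a two-state sketch. The `TargetContent` class and the string-valued head directions are illustrative assumptions; the patent describes this behavior only in terms of view-angle ranges.

```python
class TargetContent:
    """Sketch of the Fig. 2 / Fig. 3 behavior for target content 21."""

    def __init__(self):
        # Fig. 2 state: partly below the front view-angle range 22.
        self.in_front_range = False

    def on_head_moved(self, head_direction):
        if not self.in_front_range and head_direction == "down":
            # Head moves toward the content below: slide the content up
            # into the front view-angle range (call out, Fig. 2 -> Fig. 3).
            self.in_front_range = True
        elif self.in_front_range and head_direction == "up":
            # Head moves away from the content: retract the content down
            # into the non-front range 23 (hide, Fig. 3 -> Fig. 2).
            self.in_front_range = False
```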
Fig. 4 shows an exemplary flowchart of a virtual reality-based interaction control method according to an embodiment of the present invention. As shown in Fig. 4, the method includes:
in step S41, the direction in which the specified body part of the user moves is acquired.
In step S42, the distance the specified body part of the user moves is acquired.
In step S43, the control target content is moved in the direction opposite to the direction in which the user' S designated body part is moved.
In step S44, the distance that the target content moves is determined according to the distance that the specified body part of the user moves.
As an example of the embodiment, the distance that the target content moves may be positively correlated with the distance that the designated body part of the user moves, that is, the larger the distance that the designated body part of the user moves, the larger the distance that the target content moves.
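A linear gain is one simple way to realize this positive correlation; the constant below is an assumed value, since the patent only requires that a larger body-part distance yield a larger content distance.

```python
CONTENT_GAIN = 1.5  # assumed proportionality constant

def content_move_distance(body_part_distance):
    """Steps S42/S44 (sketch): map the body part's movement distance to
    the target content's movement distance, monotonically increasing."""
    return CONTENT_GAIN * body_part_distance
```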
In one possible implementation, the designated body part of the user is the user's head, the user's torso, or the user's eyeball. Movement of the user's eyeball may refer to movement of the user's gaze point, i.e., the eye-gaze position recognized by the VR device.
The virtual reality-based interaction control method of this embodiment thus supports a more natural, body-based interaction style and can improve the user experience of virtual reality-based interaction control.
Example 2
Fig. 5 is a block diagram illustrating a virtual reality-based interactive control apparatus according to another embodiment of the present invention. For convenience of explanation, only a part related to the present embodiment is shown in fig. 5.
As shown in Fig. 5, the virtual reality-based interaction control apparatus includes: a direction acquisition module 51, configured to acquire the direction in which a designated body part of the user moves; and a movement control module 52, configured to control target content to move in the direction opposite to the direction in which the designated body part of the user moves.
Fig. 6 is a block diagram illustrating an exemplary structure of a virtual reality-based interaction control apparatus according to another embodiment of the present invention. For convenience of explanation, only the parts related to the present embodiment are shown in Fig. 6. Components in Fig. 6 with the same reference numbers as in Fig. 5 have the same functions; detailed descriptions of them are omitted for brevity. As shown in Fig. 6:
in one possible implementation, the movement control module 52 includes: the first movement control submodule 521 is configured to, under the condition that the target content is not completely within the front view angle range of the virtual reality interface, control the target content to move into the front view angle range if the direction in which the specified body part of the user moves is from the front view angle range to the non-front view angle range in which the target content is located.
In one possible implementation, the movement control module 52 includes: a second movement control submodule 522, configured to: when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, control the target content to move into that non-front view-angle range.
In one possible implementation, the apparatus further includes: a distance acquisition module 53, configured to acquire the distance that the designated body part of the user moves; and a distance determination module 54, configured to determine the distance that the target content moves according to the distance that the designated body part of the user moves.
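For illustration, the module structure of Figs. 5 and 6 could be composed as below; the class and method names are assumptions that merely mirror the modules named in the description, not an API defined by the patent.

```python
class DirectionAcquisitionModule:          # module 51
    def get_direction(self, body_part): ...

class MovementControlModule:               # module 52 (submodules 521/522)
    def move_opposite(self, direction, content): ...

class DistanceAcquisitionModule:           # module 53
    def get_distance(self, body_part): ...

class DistanceDeterminationModule:         # module 54
    def content_distance(self, body_distance): ...

class InteractionControlApparatus:
    """Sketch of the apparatus of Fig. 6, wiring the four modules together."""
    def __init__(self):
        self.direction_module = DirectionAcquisitionModule()
        self.movement_module = MovementControlModule()
        self.distance_module = DistanceAcquisitionModule()
        self.determination_module = DistanceDeterminationModule()
```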
In one possible implementation, the designated body part of the user is the head of the user, the torso of the user, or the eyeball of the user.
With the virtual reality-based interaction control apparatus of this embodiment, the movement of target content can be driven by the movement of the user's body part, which improves the operating convenience and the user experience of virtual reality-based interaction control.
Example 3
Fig. 7 shows a block diagram of a virtual reality-based interaction control device according to another embodiment of the present invention. The virtual reality-based interaction control device 1100 may be a host server with computing capability, a personal computer (PC), a portable computer, a terminal, or the like. The specific embodiments of the present invention place no limit on the concrete implementation of the compute node.
The virtual reality-based interaction control device 1100 includes a processor 1110, a communications interface 1120, a memory 1130, and a bus 1140. The processor 1110, the communications interface 1120, and the memory 1130 communicate with one another via the bus 1140.
The communications interface 1120 is used to communicate with network devices, for example virtual machine management centers and shared storage.
The processor 1110 is configured to execute a program. The processor 1110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention.
The memory 1130 is used to store files. The memory 1130 may comprise high-speed RAM and may also include non-volatile memory, such as at least one disk memory. The memory 1130 may also be a memory array, or be partitioned into blocks that can be combined into virtual volumes according to certain rules.
In one possible embodiment, the program may be program code including computer operation instructions. The program is specifically configured to perform the operations of the steps in Example 1.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may select different ways to implement the described functionality for specific applications, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
If the described functionality is implemented as computer software and sold or used as a stand-alone product, all or part of the technical solution of the invention (for example, the part contributing over the prior art) can be regarded as embodied in a computer software product. The computer software product is generally stored in a computer-readable non-volatile storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the methods according to the embodiments of the present invention. The storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. A virtual reality-based interaction control method, characterized by comprising the following steps:
acquiring the direction in which a designated body part of a user moves;
controlling target content to move in a direction opposite to the direction in which the designated body part of the user moves;
wherein controlling the target content to move in the direction opposite to the direction in which the designated body part of the user moves comprises:
when the target content is not completely within a front view-angle range of a virtual reality interface, if the designated body part of the user moves from the front view-angle range toward the non-front view-angle range in which the target content is located, controlling the target content to move into the front view-angle range so as to call out the target content, wherein, when the target content is not completely within the front view-angle range, the user can see only part of the target content, or cannot see the target content at all;
and, when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, controlling the target content to move into the non-front view-angle range so as to hide the target content.
2. The method of claim 1, further comprising:
acquiring the distance that the designated body part of the user moves;
and determining the distance that the target content moves according to the distance that the designated body part of the user moves.
3. The method of claim 1, wherein the designated body part of the user is the user's head, the user's torso, or the user's eyeball.
4. An interaction control device based on virtual reality, comprising:
a direction acquisition module, configured to acquire the direction in which a designated body part of a user moves;
a movement control module, configured to control target content to move in a direction opposite to the direction in which the designated body part of the user moves;
wherein the movement control module comprises:
a first movement control submodule, configured to: when the target content is not completely within a front view-angle range of a virtual reality interface, if the designated body part of the user moves from the front view-angle range toward the non-front view-angle range in which the target content is located, control the target content to move into the front view-angle range so as to call out the target content, wherein, when the target content is not completely within the front view-angle range, the user can see only part of the target content, or cannot see the target content at all;
and a second movement control submodule, configured to: when the target content is within the front view-angle range of the virtual reality interface, if the designated body part of the user moves from the front view-angle range toward a non-front view-angle range of the virtual reality interface, control the target content to move into the non-front view-angle range so as to hide the target content.
5. The apparatus of claim 4, further comprising:
a distance acquisition module, configured to acquire the distance that the designated body part of the user moves;
and a distance determination module, configured to determine the distance that the target content moves according to the distance that the designated body part of the user moves.
6. The apparatus of claim 4, wherein the designated body part of the user is the user's head, the user's torso, or the user's eyeball.
CN201610888660.7A · Priority date: 2016-10-11 · Filing date: 2016-10-11 · Interaction control method and device based on virtual reality · Status: Active · Granted as CN106484114B

Priority Applications (1)

Application Number: CN201610888660.7A (granted as CN106484114B) · Priority Date: 2016-10-11 · Filing Date: 2016-10-11 · Title: Interaction control method and device based on virtual reality

Applications Claiming Priority (1)

Application Number: CN201610888660.7A (granted as CN106484114B) · Priority Date: 2016-10-11 · Filing Date: 2016-10-11 · Title: Interaction control method and device based on virtual reality

Publications (2)

CN106484114A, published 2017-03-08
CN106484114B (grant publication), published 2019-12-31

Family

ID=58270607

Family Applications (1)

Application Number: CN201610888660.7A (Active, granted as CN106484114B) · Priority Date: 2016-10-11 · Filing Date: 2016-10-11 · Title: Interaction control method and device based on virtual reality

Country Status (1)

Country: CN · Publication: CN106484114B

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN101995944A * · 2009-08-24 · 2011-03-30 · 仇锐铿 · Electrooculogram control system and method for controlling a cursor using the electrooculogram control system
KR20140055047A * · 2012-10-30 · 2014-05-09 · Samsung Electronics Co., Ltd. · Display apparatus and control method thereof
CN106663183B * · 2013-11-27 · 2020-04-24 · Shenzhen Goodix Technology Co., Ltd. · Eye tracking and user response detection
CN105739525B * · 2016-02-14 · 2019-09-03 · Puzhou Aircraft Technology (Shenzhen) Co., Ltd. · A system that realizes virtual flight in cooperation with somatosensory operation
CN105847540A * · 2016-03-16 · 2016-08-10 · Huizhou TCL Mobile Communication Co., Ltd. · Method and mobile phone for controlling picture movement of VR (virtual reality) glasses based on eyeball tracking, and VR glasses

Also Published As

CN106484114A, published 2017-03-08


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 2020-05-08

Address after: Room 508, Floor 5, Building 4, No. 699 Wangshang Road, Changhe Street, Binjiang District, Hangzhou, Zhejiang, 310052

Patentee after: Alibaba (China) Co., Ltd.

Address before: Room 02, Floor 2, Building E, No. 555 Dongchuan Road, Minhang District, Shanghai, 200241

Patentee before: Wiring Network Technology (Shanghai) Co., Ltd.