CN110688000B - Virtual reality human-computer interaction method

Virtual reality human-computer interaction method

Info

Publication number
CN110688000B
CN110688000B (application CN201910712220.XA)
Authority
CN
China
Prior art keywords
state
interactive
interaction
aiming
request
Prior art date
Legal status
Active
Application number
CN201910712220.XA
Other languages
Chinese (zh)
Other versions
CN110688000A (en)
Inventor
赵岩
孟令卫
吴哲
程烨
Current Assignee
Hangzhou Ruisheng Oceanic Instrument Co., Ltd.
Original Assignee
Hangzhou Ruisheng Oceanic Instrument Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Hangzhou Ruisheng Oceanic Instrument Co., Ltd.
Priority to CN201910712220.XA
Publication of CN110688000A
Application granted
Publication of CN110688000B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/024 Multi-user, collaborative environment

Abstract

The invention discloses a virtual reality human-computer interaction method. The system comprises a virtual reality operation input device, a virtual reality display output device, input management software, virtual scene rendering software and interaction process management software; the interaction process management software is functionally divided into an interaction request module, an interaction response module and an interaction logic control module. The interaction request module uniformly manages the active interactive objects in the virtual scene and sends out interaction requests according to user operation input; the active interactive objects mainly comprise the human torso, limb joints and an operating handle with a spatial positioning function. The interaction response module uniformly manages the passive interactive objects in the virtual scene. The beneficial effects of the invention are: it decouples the virtual reality hardware from the human-computer interaction program and decouples interaction input from interaction response, while defining a basic structure for human-computer interaction in a virtual environment, on which virtual reality human-computer interaction engineering applications can be developed.

Description

Virtual reality human-computer interaction method
Technical Field
The invention relates to virtual reality, human-computer interaction, motion capture and visual simulation technologies, and in particular to a virtual reality human-computer interaction method.
Background
In recent years, with the rapid development of virtual reality technology, its combination with motion capture technology has opened a brand-new application scenario for human-computer interaction: an operator wearing virtual reality equipment can interact with objects in a virtual environment. Since it was first proposed, virtual reality human-computer interaction has been widely applied across industries, for example virtual surgery in medicine and virtual combat training in the military. Virtual reality applications have the following characteristics: 1) they provide near-real visual perception, so that the user perceives the virtual environment as if physically present; 2) virtual scenes whose real-world conditions are harsh or hard to realize can be constructed in software, reducing the cost of the actual process while safeguarding personnel.
At present, many companies have developed series of applications based on virtual reality technology. The human-computer interaction operations in these applications generally target fixed virtual reality hardware and application scenes and have not formed standard industry specifications, so the following problems exist: 1) virtual reality developers lack effective means of communication, and collaborative work efficiency is low; 2) designs aimed at specific requirements lack generality and extensibility, and an overall architecture for virtual reality human-computer interaction systems is missing.
Disclosure of Invention
The invention aims to overcome the above defects in the prior art and provide a virtual reality human-computer interaction method which, based on interaction process management software, can accommodate changes in virtual reality hardware and in human-computer interaction content without changing the interaction process management software framework.
The object of the present invention is achieved by the following technical means. A virtual reality human-computer interaction method mainly comprises a virtual reality operation input device, a virtual reality display output device, input management software, virtual scene rendering software and interaction process management software. The interaction process management software is the key to realizing human-computer interaction in the virtual environment and is functionally divided into an interaction request module, an interaction response module and an interaction logic control module. The interaction request module uniformly manages the objects in the virtual scene that can actively initiate interactive operations (hereinafter, active interactive objects) and sends out interaction requests according to user operation input; the active interactive objects mainly comprise the human torso, limb joints and an operating handle with a spatial positioning function, and for each freely movable active interactive object, its spatial pose data, hierarchy level and current-state data are recorded. The interaction response module uniformly manages the objects in the virtual scene that can accept interactive operations (hereinafter, passive interactive objects) and makes an appropriate interactive response to each received interaction request; the passive interactive objects mainly comprise triggering devices (such as buttons and levers) and directly graspable objects (such as equipment, parts and tools), and for each passive interactive object, its spatial pose data, hierarchy level and current-state data are recorded, corresponding to the active interactive objects. The interaction logic control module calculates the spatial positions and hierarchy levels of the active and passive interactive objects, and, when requests and responses are exchanged, decides whether to advance to the next state according to the states of both parties.
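As a minimal sketch of the per-object bookkeeping described above (the identifiers, fields and types are illustrative assumptions, not taken from the patent), each interactive object can be represented by a small record:

```python
from dataclasses import dataclass
from enum import Enum, auto

class State(Enum):
    FREE = auto()       # no interaction in progress
    AIMING = auto()     # aiming (active) / aimed (passive)
    TRIGGERED = auto()  # triggering (active) / triggered (passive)
    RELEASED = auto()   # releasing (active) / released (passive)

@dataclass
class InteractiveObject:
    """Record kept for every active or passive interactive object:
    spatial pose, hierarchy level and current state."""
    name: str
    pose: tuple[float, ...]    # spatial pose, e.g. (x, y, z, rx, ry, rz)
    hierarchy: int             # hierarchy level within the scene
    state: State = State.FREE
```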
Preferably, the active interactive object has the following four states: 1) free state: no interaction is in progress, and the object may interact with other objects; 2) aiming state: an object that can accept an interaction request is within range, an 'aim' request is sent out, and the aiming state is entered after a successful response; 3) triggering state: when the user performs an interactive operation while an aimed object exists, a 'trigger' request is initiated, and the triggering state is entered after a successful response; 4) releasing state: the user actively releases the currently triggered object and sends out an 'abandon' request.
Preferably, the passive interactive object likewise has four states: 1) free state: no interaction is in progress, and interaction requests from active interactive objects can be accepted; 2) aimed state: an 'aim' request is received, the object's own state allows a response, and the aimed state is entered after a successful response; 3) triggered state: a 'trigger' request is received, the object's own state allows a response, and the triggered state is entered after a successful response; 4) released state: an 'abandon' request is received, and the object's state is restored to the free state.
Preferably, the four basic states of the active and passive interactive objects correspond one to one, and after an interaction request is successfully responded to, both parties change state simultaneously. Five transition mechanisms exist among the four basic states: 1) free state to aiming/aimed state: the state changes after an 'aim' request is successfully responded to; 2) aiming/aimed state to free state: the interaction logic control module periodically and cyclically checks, and the state changes when the interactive objects no longer satisfy the aiming condition; 3) aiming/aimed state to triggering/triggered state: the state changes after a 'trigger' request is successfully responded to; 4) triggering/triggered state to releasing/released state: the state changes after an 'abandon' request is sent or received; 5) releasing/released state to free state: the state changes automatically after the recovery process finishes.
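These five mechanisms form a small guarded state machine in which both parties move together. A sketch under the same illustrative assumptions as the record above (not the patent's own notation):

```python
# The only permitted transitions; anything else is rejected, since
# states may not change arbitrarily.
ALLOWED = {
    (State.FREE, State.AIMING),         # 1) "aim" request answered
    (State.AIMING, State.FREE),         # 2) aiming condition no longer met
    (State.AIMING, State.TRIGGERED),    # 3) "trigger" request answered
    (State.TRIGGERED, State.RELEASED),  # 4) "abandon" request sent/received
    (State.RELEASED, State.FREE),       # 5) recovery process finished
}

def transition(active: InteractiveObject, passive: InteractiveObject,
               new_state: State) -> bool:
    """Move both parties to new_state simultaneously, if permitted."""
    if active.state == passive.state and (active.state, new_state) in ALLOWED:
        active.state = passive.state = new_state
        return True
    return False
```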
Furthermore, the specific working process of the interaction process management software is as follows: 1) active interactive objects in the free state continuously initiate 'aim' requests; the interaction logic control module screens candidates by object hierarchy, the aiming range of the active interactive object and the spatial position of the passive interactive object, stores matched active/passive pairs in an aiming relation table, and changes the objects' states to aiming/aimed; 2) the interaction logic control module periodically traverses the aiming relation table, removes active/passive pairs that no longer satisfy the conditions and changes their states back to free; 3) based on user operation input, it judges whether an active interactive object in the aiming relation table initiates a 'trigger' request; once the request is responded to, the active/passive pair is moved from the aiming relation table to the trigger relation table and the objects' states are changed to triggering/triggered; 4) based on user operation input, it judges whether an active interactive object in the trigger relation table initiates an abandon operation; once an 'abandon' request is initiated, the active/passive pair is removed from the trigger relation table and the objects' states are changed to releasing/released; an object in the releasing/released state completes the release process autonomously and then changes to the free state.
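Step 1's screening can be pictured as a predicate over the recorded data. The hierarchy-equality rule and the Euclidean distance test below are assumptions about how "within aiming range" might be evaluated, not the patent's prescribed test:

```python
import math

def within_aim(active: InteractiveObject, passive: InteractiveObject,
               aim_range: float) -> bool:
    """Screening for an 'aim' request: hierarchy levels must match and
    the passive object must lie within the active object's aiming range."""
    return (active.hierarchy == passive.hierarchy
            and math.dist(active.pose[:3], passive.pose[:3]) <= aim_range)
```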
Preferably, the virtual reality operation input device mainly comprises a wearable motion capture device and an operating handle. The wearable motion capture device acquires the operator's current posture and synchronizes it to the three-dimensional scene in real time. The operating handle enables special operations through its buttons; when a body motion cannot achieve the intended operation, the buttons offer a more convenient way to reach the target.
Preferably, the virtual reality display output device is mainly an immersive head-mounted display, in which the image on a micro display screen is magnified by a set of optics and projected onto the retina, giving the user immersive visual perception.
Preferably, the input management software mainly realizes decoupling between software and hardware, converting user input from different hardware into fixed, standardized operation input instructions, so that the design of the virtual reality human-computer interaction method need not account for changes in external input devices.
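For illustration, this decoupling can be as simple as a lookup from hardware-specific events to standard instructions. The device names and event strings below are invented examples, not identifiers from the patent:

```python
from typing import Optional

# Hardware-specific events in, standardized instructions out; the
# interaction method only ever sees the standardized side.
STANDARD_INSTRUCTIONS = {
    ("handle",     "trigger_pull"): "TRIGGER",
    ("handle",     "grip_release"): "ABANDON",
    ("mocap_suit", "hand_reach"):   "AIM",
}

def translate(device: str, raw_event: str) -> Optional[str]:
    """Map a raw hardware event to a standard operation instruction."""
    return STANDARD_INSTRUCTIONS.get((device, raw_event))
```

Swapping in new hardware then means editing the table, not touching the interaction logic.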
Preferably, the virtual scene rendering software contains three-dimensional models of all objects and environments participating in the interaction process, constructs the virtual scene in real time according to the models' poses in space, and renders a realistic scene image.
The beneficial effects of the invention are: it decouples the virtual reality hardware from the human-computer interaction program and decouples interaction input from interaction response, while defining a basic structure for human-computer interaction in a virtual environment, on which virtual reality human-computer interaction engineering applications can be developed.
Drawings
FIG. 1 is a schematic diagram of a basic structure of a virtual reality human-computer interaction method;
FIG. 2 is a diagram illustrating a basic state transition mechanism of an interactive object;
FIG. 3 is a schematic diagram of the specific working process of the interaction process management software.
Detailed Description
The invention will be described in detail below with reference to the accompanying drawings:
FIG. 1 shows the basic structure of the virtual reality human-computer interaction method. The system mainly comprises a virtual reality display output device 2, a virtual reality operation input device 3, virtual scene rendering software 4, input management software 5 and interaction process management software 6.
The virtual reality display output device 2 is mainly a virtual reality head-mounted display that provides the user with immersive visual perception; the virtual reality operation input device 3 is mainly a wearable motion capture device and an operating handle that provide the user with an operation input interface; the virtual scene rendering software 4 contains three-dimensional models of all objects and environments participating in the interaction process, constructs the virtual scene in real time according to the models' poses in space, and renders a realistic scene image; the input management software 5 mainly converts user operation input into uniform, standardized interactive input data, so that changes in virtual reality input/output hardware do not affect the interaction method. The interaction process management software 6 mainly comprises an interaction response module 7, an interaction request module 8 and an interaction logic control module 9. The interaction response module 7 manages the passive interactive objects in the virtual scene and makes an appropriate response to each received interaction request; the interaction request module 8 manages the active interactive objects in the virtual scene and decides whether to send an interaction request according to user operation input; the interaction logic control module 9 makes logic judgments over both sides of a request-response pair and decides whether to advance to the next interaction state.
By wearing the virtual reality display output device 2, an operator 1 observes, from a first-person perspective, the virtual three-dimensional scene constructed by the virtual scene rendering software 4; by wearing the virtual reality operation input device 3, the operator can walk, reach out and perform other body motions that are mirrored in the scene, and can perform special controls through the handle buttons. The interaction process management software 6 defines the interaction relations between the person and the objects in the virtual scene, so that the operator can interact with virtual objects in a natural and intuitive way in the virtual environment.
FIG. 2 shows the basic state transition mechanism of an interactive object. Both active and passive interactive objects have four basic states: the free state 10, the aiming/aimed state 11, the triggering/triggered state 12 and the releasing/released state 13. The four basic states of the active and passive interactive objects correspond one to one, and after an interaction request is successfully responded to, both parties change state simultaneously.
There are five transition mechanisms between the four basic states: 1) free state to aiming/aimed state 14: the state changes after an 'aim' request is successfully responded to; 2) aiming/aimed state to free state 15: the interaction logic control module periodically and cyclically checks, and the state changes when the interactive objects no longer satisfy the aiming condition; 3) aiming/aimed state to triggering/triggered state 16: the state changes after a 'trigger' request is successfully responded to; 4) triggering/triggered state to releasing/released state 17: the state changes after an 'abandon' request is sent or received; 5) releasing/released state to free state 18: the state changes automatically after the recovery process finishes. State changes of interactive objects follow these mechanisms and cannot occur arbitrarily.
FIG. 3 shows the specific working process of the interaction process management software, which is the key to enforcing the state transition mechanism. 1) Active interactive objects in the free state continuously initiate 'aim' requests; the interaction logic control module performs screening 19 by object hierarchy, the aiming range of the active interactive object and the spatial position of the passive interactive object, stores matched active/passive pairs in the aiming relation table, and changes their states to aiming/aimed. 2) The interaction logic control module periodically traverses the aiming relation table 20, removes active/passive pairs that no longer satisfy the conditions and changes their states back to free. 3) Based on user operation input, it judges whether an active interactive object in the aiming relation table initiates a 'trigger' request 21; because every active interactive object stored in the aiming relation table is in the aiming state, no object can skip the aiming state and enter the triggered state directly. Once the 'trigger' request is responded to, the active/passive pair is moved from the aiming relation table to the trigger relation table and the objects' states are changed to triggering/triggered. 4) Based on user operation input, it judges whether an active interactive object in the trigger relation table initiates an abandon operation 22; once an 'abandon' request is initiated, the active/passive pair is removed from the trigger relation table and the objects' states are changed to releasing/released; an object in the releasing/released state completes the release process autonomously and then changes to the free state.
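Putting the pieces together, one periodic cycle of the interaction logic control module might look like the sketch below, reusing the illustrative State, InteractiveObject, transition and within_aim definitions from earlier. The aim_range default and the user_input dictionary keyed by object name are placeholders, not details from the patent:

```python
def control_cycle(actives, passives, aim_table, trigger_table,
                  user_input, aim_range=0.5):
    """One periodic cycle over the aiming and trigger relation tables."""
    # 1) free active objects raise "aim" requests; matched pairs are
    #    stored in the aiming relation table
    for a in [o for o in actives if o.state is State.FREE]:
        for p in [o for o in passives if o.state is State.FREE]:
            if within_aim(a, p, aim_range) and transition(a, p, State.AIMING):
                aim_table.append((a, p))
    # 2) drop pairs that no longer satisfy the aiming condition
    for a, p in list(aim_table):
        if not within_aim(a, p, aim_range):
            aim_table.remove((a, p))
            transition(a, p, State.FREE)
    # 3) a responded "trigger" request moves the pair to the trigger table
    for a, p in list(aim_table):
        if user_input.get(a.name) == "TRIGGER":
            aim_table.remove((a, p))
            trigger_table.append((a, p))
            transition(a, p, State.TRIGGERED)
    # 4) an "abandon" request releases the pair; the return to the free
    #    state (autonomous recovery) is modelled here as immediate
    for a, p in list(trigger_table):
        if user_input.get(a.name) == "ABANDON":
            trigger_table.remove((a, p))
            transition(a, p, State.RELEASED)
            transition(a, p, State.FREE)
```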
It should be understood that equivalent substitutions and modifications made by those skilled in the art to the technical solution and the inventive concept of the present invention shall fall within the protection scope of the appended claims.

Claims (9)

1. A virtual reality human-computer interaction method is characterized in that: the interaction system mainly comprises a virtual reality operation input device, a virtual reality display output device, input management software, virtual scene rendering software and interaction process management software, wherein the interaction process management software is functionally divided into an interaction request module, an interaction response module and an interaction logic control module; the interaction request module uniformly manages the active interactive objects in the virtual scene and sends out interaction requests according to user operation input; the active interactive objects mainly comprise the human torso, limb joints and an operating handle with a spatial positioning function, and for each freely movable active interactive object, its spatial pose data, hierarchy level and current-state data are recorded; the interaction response module uniformly manages the passive interactive objects in the virtual scene and makes an appropriate interactive response according to the received interaction request; the passive interactive objects mainly comprise triggering devices and directly graspable objects, and for each passive interactive object, its spatial pose data, hierarchy level and current-state data are recorded, corresponding to the active interactive objects; the interaction logic control module calculates the spatial positions and hierarchy levels of the active and passive interactive objects, and, when requests and responses are exchanged, decides whether to advance to the next state according to the states of both parties.
2. The virtual reality human-computer interaction method of claim 1, wherein: the active interactive object has the following four states: 1) free state: no interaction is in progress, and the object may interact with other objects; 2) aiming state: an object that can accept an interaction request is within range, an 'aim' request is sent out, and the aiming state is entered after a successful response; 3) triggering state: when the user performs an interactive operation while an aimed object exists, a 'trigger' request is initiated, and the triggering state is entered after a successful response; 4) releasing state: the user actively releases the currently triggered object and sends out an 'abandon' request.
3. The virtual reality human-computer interaction method according to claim 1, wherein: the passive interactive object also has four states: 1) free state: no interaction is in progress, and interaction requests from active interactive objects can be accepted; 2) aimed state: an 'aim' request is received, the object's own state allows a response, and the aimed state is entered after a successful response; 3) triggered state: a 'trigger' request is received, the object's own state allows a response, and the triggered state is entered after a successful response; 4) released state: an 'abandon' request is received, and the object's state is restored to the free state.
4. The virtual reality human-computer interaction method according to claim 2 or 3, characterized in that: the four basic states of the active and passive interactive objects correspond one to one, and after an interaction request is successfully responded to, both parties change state simultaneously; five transition mechanisms exist among the four basic states: 1) free state to aiming/aimed state: the state changes after an 'aim' request is successfully responded to; 2) aiming/aimed state to free state: the interaction logic control module periodically and cyclically checks, and the state changes when the interactive objects no longer satisfy the aiming condition; 3) aiming/aimed state to triggering/triggered state: the state changes after a 'trigger' request is successfully responded to; 4) triggering/triggered state to releasing/released state: the state changes after an 'abandon' request is sent or received; 5) releasing/released state to free state: the state changes automatically after the recovery process finishes.
5. The virtual reality human-computer interaction method according to claim 1, wherein: the specific working process of the interaction process management software is as follows: 1) active interactive objects in the free state continuously initiate 'aim' requests; the interaction logic control module screens candidates by object hierarchy, the aiming range of the active interactive object and the spatial position of the passive interactive object, stores matched active/passive pairs in an aiming relation table, and changes the objects' states to aiming/aimed; 2) the interaction logic control module periodically traverses the aiming relation table, removes active/passive pairs that no longer satisfy the conditions and changes their states back to free; 3) based on user operation input, it judges whether an active interactive object in the aiming relation table initiates a 'trigger' request; once the request is responded to, the active/passive pair is moved from the aiming relation table to the trigger relation table and the objects' states are changed to triggering/triggered; 4) based on user operation input, it judges whether an active interactive object in the trigger relation table initiates an abandon operation; once an 'abandon' request is initiated, the active/passive pair is removed from the trigger relation table and the objects' states are changed to releasing/released; an object in the releasing/released state completes the release process autonomously and then changes to the free state.
6. The virtual reality human-computer interaction method of claim 1, wherein: the virtual reality operation input device mainly comprises a wearable motion capture device and an operating handle, wherein the wearable motion capture device acquires the operator's current posture and synchronizes it to the three-dimensional scene in real time.
7. The virtual reality human-computer interaction method of claim 1, wherein: the virtual reality display output device is mainly an immersive head-mounted display, in which the image on a micro display screen is magnified by a set of optics and projected onto the retina, giving the user immersive visual perception.
8. The virtual reality human-computer interaction method according to claim 1, wherein: the input management software mainly realizes decoupling between software and hardware, converting user input from different hardware into fixed, standardized operation input instructions.
9. The virtual reality human-computer interaction method according to claim 1, wherein: the virtual scene rendering software contains three-dimensional models of all objects and environments participating in the interaction process, constructs the virtual scene in real time according to the models' poses in space, and renders a realistic scene image.
CN201910712220.XA 2019-08-02 2019-08-02 Virtual reality human-computer interaction method Active CN110688000B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910712220.XA CN110688000B (en) 2019-08-02 2019-08-02 Virtual reality human-computer interaction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910712220.XA CN110688000B (en) 2019-08-02 2019-08-02 Virtual reality human-computer interaction method

Publications (2)

Publication Number Publication Date
CN110688000A CN110688000A (en) 2020-01-14
CN110688000B (en) 2023-01-10

Family

ID=69108175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910712220.XA Active CN110688000B (en) 2019-08-02 2019-08-02 Virtual reality human-computer interaction method

Country Status (1)

Country Link
CN (1) CN110688000B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111947650A (en) * 2020-07-14 2020-11-17 杭州瑞声海洋仪器有限公司 Fusion positioning system and method based on optical tracking and inertial tracking
CN112491618B (en) * 2020-11-27 2021-08-31 东北大学 Universal virtual reality interaction management method based on four-ring model


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IN2014DE00332A (en) * 2014-02-05 2015-08-07 Nitin Vats

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103207677A (en) * 2013-04-22 2013-07-17 北京工业大学 System and method for realizing virtual-real somatosensory interaction of digital Zenghouyi bells
CN107193371A (en) * 2017-04-28 2017-09-22 上海交通大学 A kind of real time human-machine interaction system and method based on virtual reality

Also Published As

Publication number Publication date
CN110688000A (en) 2020-01-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200713

Address after: Room 3002, No. 136, Yuanpu Street, Shuangpu Town, Xihu District, Hangzhou City, Zhejiang Province

Applicant after: HANGZHOU RUISHENG OCEANIC INSTRUMENT CO.,LTD.

Address before: Pingfeng, Liuxia Subdistrict, Xihu District, Hangzhou, Zhejiang 311499

Applicant before: The 715th Research Institute of China Shipbuilding Industry Corporation

GR01 Patent grant