CN110355750A - Interaction control method for hand-eye coordination of teleoperation - Google Patents

Interaction control method for hand-eye coordination of teleoperation

Info

Publication number
CN110355750A
CN110355750A (application CN201811270020.5A)
Authority
CN
China
Prior art keywords
coordinate system
camera
world coordinate
teleoperation
camera coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811270020.5A
Other languages
Chinese (zh)
Other versions
CN110355750B (en)
Inventor
刘正雄
司继康
黄攀峰
任瑾力
孟中杰
董刚奇
张夷斋
张帆
鹿振宇
常海涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN201811270020.5A priority Critical patent/CN110355750B/en
Publication of CN110355750A publication Critical patent/CN110355750A/en
Application granted granted Critical
Publication of CN110355750B publication Critical patent/CN110355750B/en
Legal status: Active (granted)


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J3/00: Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an interaction control method for hand-eye coordination in teleoperation, addressing the poor practicability of existing human-machine interaction control methods. The technical solution is based on coordinate-system conversion: when the operator controls the motion of a scene object through an interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding a new pose in camera coordinates; finally, the new pose is transformed back from the camera coordinate system into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the operator's viewing angle and remains consistent with the motion of the interaction device, which reduces teleoperation difficulty and gives good practicability.

Description

Interaction control method for hand-eye coordination of teleoperation
Technical field
The present invention relates to human-machine interaction control methods, and in particular to an interaction control method for hand-eye coordination in teleoperation.
Background technique
Document " Teleoperation Systems master-slave control strategy, Jiangsu University of Science and Technology's journal (natural science edition), 2013, Vol27 (8), p643-647 " disclose a kind of control method of master-slave mode Teleoperation Systems.This method uses increment type Position control mode controls the movement from hand with the increment of main hand, relatively efficiently avoids the cumbersome of initial Aligning control.Work as behaviour Operator directly observe by when operating environment information with operator by vision facilities observation by operating environment information when, principal and subordinate Between position it is corresponding different, control gain matrix by adjusting ratio, carry out main hand and actual environment or main hand and vision facilities Coordinate matching, the working space for establishing main and slave terminal are mapped with preferable approach sense of vision.Document the method is based only on The direct environment of observation information of operator and the difference for passing through vision facilities environment of observation information control gain coefficient letter by ratio The working space for singly establishing main hand and actual environment or main hand and vision facilities maps, and works as operator's observation quilt without providing When operating environment information observes visual angle change, it is empty to be not carried out the work of principal and subordinate end for the method for establishing the mapping of principal and subordinate end working space Between map and do not influenced by observation visual angle variation, movement is consistent always, and the scope of application is not wide, and operation difficulty is larger.
Summary of the invention
To overcome the poor practicability of existing human-machine interaction control methods, the present invention provides an interaction control method for hand-eye coordination in teleoperation. The method is based on coordinate-system conversion: when the operator controls the motion of a scene object through an interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding a new pose in camera coordinates; finally, the new pose is transformed back from the camera coordinate system into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the observed scene view during interaction and remains consistent with the motion of the interaction device, reducing teleoperation difficulty with good practicability.
The technical solution adopted by the present invention to solve the technical problem is an interaction control method for hand-eye coordination in teleoperation, characterized by comprising the following steps:
Step 1: interaction-device data acquisition. During hand-controller motion, the real-time position P_c^n of the hand controller is sampled at equal time intervals. The hand-controller increment ΔP is obtained as the difference between the current position P_c^n and the previous position P_c^{n-1}. The increment ΔP is then mapped to the initial motion command ΔP0 of the manipulator end-effector according to ΔP0 = k·ΔP, where k is the operating proportionality coefficient.
ΔP0 = k·ΔP (1)
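As a minimal sketch (not the patent's implementation; the function names and the sample displacement are assumptions, with k = 1000 taken from the embodiment below), Step 1 can be written as:

```python
def motion_increment(p_curr, p_prev, k=1000.0):
    """Map a hand-controller displacement to the initial end-effector
    command: Delta P0 = k * (P_c^n - P_c^(n-1)), as in Eq. (1)."""
    return [k * (c - p) for c, p in zip(p_curr, p_prev)]

# Assumed sample: the controller moved 0.1 m upward between two samples,
# giving a 100 mm command in the virtual scene.
dp0 = motion_increment([0.0, 0.0, 0.1], [0.0, 0.0, 0.0])
```

With k = 1000 the metre-scale controller motion maps to millimetre-scale scene motion, matching the unit conversion described in the embodiment.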
Step 2: using the transformation matrix Rx from the interaction coordinate system to the camera coordinate system, convert the initial motion command ΔP0 to obtain the motion command ΔP1 in the camera coordinate system.
ΔP1 = ΔP0·Rx (2)
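Equation (2) can be sketched as a row-vector times matrix product. The 90-degree rotation about X used for Rx below follows the embodiment; the helper names are assumptions:

```python
import math

def rot_x(deg):
    """Rotation matrix about the X axis (right-handed convention)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0],
            [0, c, -s],
            [0, s, c]]

def row_vec_mat(v, m):
    """Row-vector convention: Delta P1 = Delta P0 * Rx, as in Eq. (2)."""
    return [sum(v[i] * m[i][j] for i in range(3)) for j in range(3)]

# A purely upward command [0, 0, 100] becomes [0, 100, 0] in the camera frame.
dp1 = row_vec_mat([0, 0, 100], rot_x(90))
```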
Step 3: using the pose matrix C of the camera coordinate system in the world coordinate system, convert the camera-frame motion command ΔP1 to obtain the motion command ΔP2 in the world coordinate system.
During interaction, the view observed by the operator is determined by the position and attitude of the camera in the world coordinate system. The camera pose matrix depends on three elements: the camera's line-of-sight direction, the position of the camera center, and the camera's up direction; these three elements fully determine the camera pose in the world coordinate system. When the observation viewpoint changes, it is in essence the camera pose in the world coordinate system that changes, and the orientation of the camera coordinate system in the world coordinate system changes with it.
To transform the motion command obtained in step 2 from the camera coordinate system into the world coordinate system, the inverse C^-1 of the camera matrix is needed. Further, since the interaction-device motion vector is a 1x3 matrix, a 1x4 matrix D must be constructed.
D = [ΔP1 0] (4)
ΔP2 = D·C^-1 (5)
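Equations (4) and (5) can be illustrated as below. The patent gives no numeric values for C, so a pure 90-degree rotation about X is assumed here purely for demonstration; for a rotation-only pose matrix the inverse equals the transpose:

```python
import math

def rot_x4(deg):
    """4x4 homogeneous rotation about X (assumed stand-in for the camera pose C)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0, 0],
            [0, c, -s, 0],
            [0, s, c, 0],
            [0, 0, 0, 1]]

def transpose(m):
    return [list(col) for col in zip(*m)]

def row_vec_mat4(v, m):
    return [sum(v[i] * m[i][j] for i in range(4)) for j in range(4)]

C = rot_x4(90)                       # assumed camera pose matrix
D = [0, 100, 0, 0]                   # Eq. (4): camera-frame command, homogenized
dp2 = row_vec_mat4(D, transpose(C))  # Eq. (5): Delta P2 = D * C^-1
```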
Step 4: map the world-frame motion command ΔP2 to the final motion increment ΔP3 of the manipulator end-effector by motion mapping, generating the teleoperation command of the whole interaction.
Expanding the world-frame motion command ΔP2 obtained in step 3 shows that ΔP2 is a 1x4 matrix whose first three elements are the final motion increment of the scene-object end-effector. The motion mapping takes the first three elements of ΔP2 to generate ΔP3, written in simplified form as:
ΔP3 = [Δx Δy Δz] (6)
where Δx, Δy and Δz denote the motion increments of the scene-object end-effector along the three coordinate axes of the world coordinate system. This completes generation of the hand-eye-coordinated teleoperation command.
Step 5: drive the manipulator with the teleoperation command generated in step 4 to achieve hand-eye coordination.
The real-time position of the scene-object end-effector is denoted P_j^n and its position at the next instant P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP3.
The scene object is driven by its real-time end-effector position, realizing hand-eye coordination of the teleoperation process.
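Steps 4 and 5 then reduce to dropping the homogeneous component and integrating the increment into the object position; a minimal sketch (the starting position is an arbitrary assumed value):

```python
def final_increment(dp2):
    """Eq. (6): Delta P3 keeps the first three elements of the 1x4 Delta P2."""
    return dp2[:3]

def step_position(p_curr, dp3):
    """Position update P_j^(n+1) = P_j^n + Delta P3 driving the scene object."""
    return [p + d for p, d in zip(p_curr, dp3)]

dp3 = final_increment([0.0, 0.0, 100.0, 0.0])
p_next = step_position([300.0, 0.0, 500.0], dp3)   # assumed start position (mm)
```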
The beneficial effect of the present invention is as follows. The method is based on coordinate-system conversion: when the operator controls the motion of a scene object through an interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding a new camera-frame pose; finally, the new pose is transformed back from the camera coordinate system into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the observed scene view during interaction and remains consistent with the motion of the interaction device, which reduces teleoperation difficulty and gives good practicability.
The present invention is described in detail below with reference to a specific embodiment.
Specific embodiment
To present the contents of the invention in detail, three common coordinate systems are first defined:
(1) World coordinate system O_world: the reference frame of the entire system, used to describe the actual motion of object models in the scene. It serves as the base frame; all scene-object motion is described with respect to the world coordinate system.
(2) Camera coordinate system O_camera: used to describe the teleoperation view. When the operator's viewpoint changes, it is in essence the camera pose in the world coordinate system that changes: the pose of the camera coordinate system in the world coordinate system varies, but its orientation relative to the computer screen remains fixed.
(3) Interaction coordinate system (i.e. hand-controller coordinate system) O_interaction: used to describe the motion of the interaction device during teleoperation. When the viewpoint changes, the pose of the interaction device in the world coordinate system changes, but its orientation relative to the computer screen remains fixed.
To verify the effectiveness of the proposed hand-eye-coordinated teleoperation technique, the invention combines the 3D graphics development environment OSG (OpenSceneGraph) with a Novint Falcon hand controller, and performs a simulation demonstration based on controlling the motion of an IRB120 manipulator in a virtual view. The specific embodiment is as follows:
Step 1: interaction-device data acquisition. Taking upward hand-controller motion as an example, the real-time pose P_c^n of the hand controller is sampled at equal intervals during its motion; the increment ΔP is the difference between P_c^n and the previous pose P_c^{n-1}; ΔP is mapped to the initial end-effector motion command ΔP0 according to ΔP0 = k·ΔP, where k is the operating proportionality coefficient. The hand-controller motion is measured in metres and the manipulator end-effector motion in the virtual scene in millimetres, so the operating proportionality coefficient is k = 1000. When the hand-controller end moves upward by 0.1 m, the initial motion command is
ΔP0 = k·ΔP = [0 0 100] (1)
Step 2: using the transformation matrix Rx from the interaction coordinate system to the camera coordinate system, convert the initial motion command ΔP0 to obtain the camera-frame motion command ΔP1.
Analysis of the relative position of the interaction-device coordinate system and the camera coordinate system shows that their relative orientation in the world coordinate system does not vary when the viewpoint changes; rotating the interaction coordinate system by 90° about its X axis aligns it with the camera coordinate system.
Step 3: using the pose matrix C of the camera coordinate system in the world coordinate system, convert the camera-frame motion command ΔP1 to obtain the world-frame motion command ΔP2.
When the scene is viewed head-on, the pose matrix of the camera in the world coordinate system is C.
Since the interaction-device motion vector is a 1x3 matrix, a 1x4 matrix D must be constructed.
D = [ΔP1 0] = [0 100 0 0] (4)
ΔP2 = D·C^-1 = [0 0 100 0] (5)
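The worked example can be checked end to end in a few lines. As before, Rx is the 90-degree rotation about X, and C is assumed (for illustration only, since the patent leaves its numeric values implicit) to be a 90-degree rotation about X as well, so that multiplying by C^-1 amounts to rotating back by -90 degrees:

```python
import math

def rx(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def vm(v, m):
    # row vector times 3x3 matrix
    return [sum(v[i] * m[i][j] for i in range(3)) for j in range(3)]

dp0 = [1000 * d for d in (0.0, 0.0, 0.1)]   # Eq. (1): 0.1 m up -> [0, 0, 100] mm
dp1 = vm(dp0, rx(90))                        # Eq. (2): camera frame, ~[0, 100, 0]
dp2 = vm(dp1, rx(-90))                       # Eq. (5): world frame, ~[0, 0, 100]
```

The increment returns to [0 0 100] in the world frame, matching Eqs. (1) to (5) of the embodiment.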
Step 4: map the world-frame motion command ΔP2 to the final motion increment ΔP3 of the manipulator end-effector by motion mapping, generating the teleoperation command of the whole interaction.
ΔP3 = [Δx Δy Δz] = [0 0 100] (6)
where Δx, Δy and Δz denote the motion increments of the scene-object end-effector along the three coordinate axes of the world coordinate system. This completes generation of the hand-eye-coordinated teleoperation command.
Step 5: drive the manipulator with the teleoperation command generated in step 4 to achieve hand-eye coordination. The real-time position of the scene-object end-effector is denoted P_j^n and its position at the next instant P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP3.
This embodiment is based on interaction control of an IRB120 manipulator. To carry out step 5, a 3D model of the IRB120 manipulator is first built with tools such as 3ds Max to construct the virtual view, and a kinematic model of the manipulator is established on the basis of D-H coordinate frames.
For the forward kinematic analysis, each pair of adjacent coordinate frames can be related by four successive homogeneous transformations, whose product gives the link transformation matrix T_i.
Here a_i, α_i, d_i and θ_i are the link length, link twist, link offset and joint angle, respectively.
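The link transform built from these four parameters has the standard Denavit-Hartenberg form, sketched below (textbook convention; the patent's own notation may differ):

```python
import math

def dh_transform(a, alpha, d, theta):
    """Standard D-H link transform: rotation theta and offset d about/along z,
    then length a and twist alpha about/along x."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0, sa,       ca,      d],
            [0.0, 0.0,      0.0,     1.0]]

def mat_mul(a4, b4):
    """4x4 product; the end pose of a 6-DOF arm chains T = T1 * ... * T6."""
    return [[sum(a4[i][k] * b4[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]
```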
For a six-degree-of-freedom manipulator, once every joint angle is determined, the end-effector pose is determined as well. Denoting the end pose by T, it is the chained product T = T_1 T_2 T_3 T_4 T_5 T_6 of the link transforms.
Next, the inverse kinematics is analysed with the analytic method, solving each joint angle from the end-effector pose.
θ_1 can be obtained from the above formula, and the remaining joint angles are solved in turn by the analytic method.
Given the initial position of the manipulator end-effector and the head-on camera pose, applying the above hand-eye-coordinated interaction control method with motion mapping shows that, when the hand controller moves upward, the manipulator moves along the Z axis when the virtual view is seen head-on, and along the Y axis when the virtual view is seen from above. Thus, when the operator's viewpoint changes, the proposed hand-eye coordination method keeps the motion of the scene object consistent with the motion of the interaction device, verifying the effectiveness of the proposed method.

Claims (1)

1. An interaction control method for hand-eye coordination in teleoperation, characterized by comprising the following steps:
Step 1: interaction-device data acquisition: during hand-controller motion, sample the real-time position P_c^n of the hand controller at equal time intervals; obtain the hand-controller increment ΔP as the difference between P_c^n and the previous position P_c^{n-1}; map ΔP to the initial motion command ΔP0 of the manipulator end-effector according to ΔP0 = k·ΔP, where k is the operating proportionality coefficient;
ΔP0 = k·ΔP (1)
Step 2: using the transformation matrix Rx from the interaction coordinate system to the camera coordinate system, convert the initial motion command ΔP0 to obtain the motion command ΔP1 in the camera coordinate system;
ΔP1 = ΔP0·Rx (2)
Step 3: using the pose matrix C of the camera coordinate system in the world coordinate system, convert the camera-frame motion command ΔP1 to obtain the motion command ΔP2 in the world coordinate system;
during interaction, the view observed by the operator is determined by the position and attitude of the camera in the world coordinate system, and the camera pose matrix depends on three elements: the camera's line-of-sight direction, the position of the camera center, and the camera's up direction; these three elements determine the camera pose in the world coordinate system; when the observation viewpoint changes, it is in essence the camera pose in the world coordinate system that changes, and the orientation of the camera coordinate system in the world coordinate system changes with it;
to transform the motion command obtained in step 2 from the camera coordinate system into the world coordinate system, the inverse C^-1 of the camera matrix is used; further, since the interaction-device motion vector is a 1x3 matrix, a 1x4 matrix D is constructed;
D = [ΔP1 0] (4)
ΔP2 = D·C^-1 (5)
Step 4: map the world-frame motion command ΔP2 to the final motion increment ΔP3 of the manipulator end-effector by motion mapping, generating the teleoperation command of the whole interaction;
expanding the world-frame motion command ΔP2 obtained in step 3 shows that ΔP2 is a 1x4 matrix whose first three elements are the final motion increment of the scene-object end-effector; the motion mapping takes the first three elements of ΔP2 to generate ΔP3, written in simplified form as:
ΔP3 = [Δx Δy Δz] (6)
where Δx, Δy and Δz denote the motion increments of the scene-object end-effector along the three coordinate axes of the world coordinate system; this completes generation of the hand-eye-coordinated teleoperation command;
Step 5: drive the manipulator with the teleoperation command generated in step 4 to achieve hand-eye coordination;
the real-time position of the scene-object end-effector is denoted P_j^n and its position at the next instant P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP3;
the scene object is driven by its real-time end-effector position, realizing hand-eye coordination of the teleoperation process.
CN201811270020.5A 2018-10-29 2018-10-29 Interaction control method for hand-eye coordination of teleoperation Active CN110355750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811270020.5A CN110355750B (en) 2018-10-29 2018-10-29 Interaction control method for hand-eye coordination of teleoperation


Publications (2)

Publication Number Publication Date
CN110355750A true CN110355750A (en) 2019-10-22
CN110355750B CN110355750B (en) 2022-05-10

Family

ID=68214781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811270020.5A Active CN110355750B (en) 2018-10-29 2018-10-29 Interaction control method for hand-eye coordination of teleoperation

Country Status (1)

Country Link
CN (1) CN110355750B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111590537A (en) * 2020-05-23 2020-08-28 西北工业大学 Teleoperation interactive operation method based on force position feedback
CN111640189A (en) * 2020-05-15 2020-09-08 西北工业大学 Teleoperation enhanced display method based on artificial mark points
WO2022002159A1 (en) * 2020-07-01 2022-01-06 北京术锐技术有限公司 Master-slave motion control method, robot system, device, and storage medium
CN115570558A (en) * 2022-10-28 2023-01-06 武汉恒新动力科技有限公司 Somatosensory cooperative teleoperation system and method for controlled object cluster
CN115639910A (en) * 2022-10-28 2023-01-24 武汉恒新动力科技有限公司 All-dimensional somatosensory interaction method facing operation space of operation object and operation device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010046313A1 (en) * 1992-01-21 2001-11-29 Green Philip S. Method and apparatus for transforming coordinate systems in a telemanipulation system
CN103991077A (en) * 2014-02-19 2014-08-20 吉林大学 Robot hand controller shared control method based on force fusion
CN104950885A (en) * 2015-06-10 2015-09-30 东南大学 UAV (unmanned aerial vehicle) fleet bilateral remote control system and method thereof based on vision and force sense feedback
CN106444861A (en) * 2016-11-21 2017-02-22 清华大学深圳研究生院 Space robot teleoperation system based on three-dimensional gestures
CN107662195A (en) * 2017-09-22 2018-02-06 中国东方电气集团有限公司 A kind of mechanical hand principal and subordinate isomery remote operating control system and control method with telepresenc
CN107748496A (en) * 2017-09-25 2018-03-02 北京邮电大学 Impedance controller algorithm based on parameter adaptive regulation


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
汤卿, 等.: "基于KUKA工业机器人的遥操作控制系统设计与异构主从控制方法研究" [Tang Qing, et al.: Design of a teleoperation control system based on a KUKA industrial robot and research on a heterogeneous master-slave control method], 《四川大学学报(工程科学版)》 [Journal of Sichuan University (Engineering Science Edition)] *


Also Published As

Publication number Publication date
CN110355750B (en) 2022-05-10

Similar Documents

Publication Publication Date Title
CN110355750A (en) Interaction control method towards remote operating hand eye coordination
CN110480634B (en) Arm guide motion control method for mechanical arm motion control
EP2728548B1 (en) Automated frame of reference calibration for augmented reality
US10751877B2 (en) Industrial robot training using mixed reality
CN102848389B (en) Realization method for mechanical arm calibrating and tracking system based on visual motion capture
CN108015766B (en) Nonlinear constrained primal-dual neural network robot action planning method
CN108214445B (en) ROS-based master-slave heterogeneous teleoperation control system
CN108436909A (en) A kind of hand and eye calibrating method of camera and robot based on ROS
CN108284436B (en) Remote mechanical double-arm system with simulation learning mechanism and method
CN103302668A (en) Kinect-based space teleoperation robot control system and method thereof
CN106142092A (en) A kind of method robot being carried out teaching based on stereovision technique
CN110815189B (en) Robot rapid teaching system and method based on mixed reality
CN104002296A (en) Robot simulator, robot teaching apparatus and robot teaching method
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN110385694A (en) Action teaching device, robot system and the robot controller of robot
CN111590567B (en) Space manipulator teleoperation planning method based on Omega handle
CN113858217B (en) Multi-robot interaction three-dimensional visual pose perception method and system
CN111113414B (en) Robot three-dimensional space scale prompting method and system based on screen identification
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN210361314U (en) Robot teaching device based on augmented reality technology
Frank et al. Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet
CN111283664B (en) Registration system and method for robot augmented reality teaching
CN106903665A (en) A kind of master-slave mode telesurgery robot control system based on stereoscopic vision
CN115357851A (en) Master-slave end hybrid mapping method for man-machine interaction system and application thereof
CN112947117A (en) Multi-ROV underwater cooperative operation simulation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant