CN108098780A - A novel robot humanoid motion system - Google Patents

A novel robot humanoid motion system

Info

Publication number
CN108098780A
CN108098780A (application CN201611038652.XA)
Authority
CN
China
Prior art keywords
robot
humanoid
human
human action
action
Prior art date
Legal status
Pending
Application number
CN201611038652.XA
Other languages
Chinese (zh)
Inventor
陈墩金
覃争鸣
Current Assignee
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Original Assignee
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority to CN201611038652.XA
Publication of CN108098780A (en)
Legal status: Pending


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1679 Programme controls characterised by the tasks executed

Abstract

The invention discloses a novel robot humanoid motion system composed of a human motion capture part, a human-robot motion mapping part, and an articulated robot part. The human motion capture part is connected to the human-robot motion mapping part, and the mapping part communicates with the articulated robot part over a wireless local area network. The scheme acquires images and depth information of human motion with a Kinect sensor to obtain human motion data, recognizes human motion postures with the relevant techniques, establishes a human-robot motion mapping relation, and finally realizes humanoid motion of the robot through motion control. This greatly relaxes the strict efficiency and precision requirements that traditional robots impose on motion control, and makes the human-robot interaction process simpler and more natural.

Description

A novel robot humanoid motion system
Technical field
The invention belongs to the field of human-robot interaction and relates to a novel robot humanoid motion system.
Background art
Science and technology advance rapidly. Robots, which have matured over the past century, are now popular worldwide, permeating all trades and professions and affecting our daily lives.
As robot applications extend from industry into fields such as medical care, service, entertainment, and education, new demands are placed on robot motion control. Traditional industrial robots impose strict requirements on the efficiency and precision of motion control and generally need professionals to perform complex programming and calibration before the final requirements can be met. For domestic service robots and similar fields, however, the users and service targets are the general public, who often lack the relevant professional background; ease of use, interaction capability, and motion safety therefore become especially prominent requirements.
To make human-robot interaction simpler, more natural, and friendlier, researchers at home and abroad have studied interaction modalities in depth, proposing diverse and natural modes such as voice, gesture, eye movement, and body posture.
Invention patent application CN105997097A discloses a "human lower-limb motion posture reproduction system and reproduction method". The system comprises a power module for normal system operation; a data acquisition module for collecting human motion signals; a signal conditioning module for processing the data signals from the acquisition module; a microcontroller module for controlling and coordinating the whole system; a wireless communication module for receiving the microcontroller data and forwarding it to a motion posture analyzer; and a motion posture analyzer for reproducing the lower-limb motion trajectory. That invention acquires the relevant data through sensors and performs vector analysis, and can fully reproduce the live effect of lower-limb motion, but the imitated person must wear the corresponding sensor devices and a dedicated power module, which is inconvenient in use.
Invention patent application CN103605375A discloses a "control method and device for a bionic robot". The method collects human joint motion information, including the joint angles; generates motion instructions containing those angles; and sends the instructions to the robot so that it performs the corresponding joint motions. That invention can store motion instructions, making the robot imitate human joint motion and perform pipelined work from the stored instructions, ensuring high product accuracy; however, it establishes no human-robot mapping between the coordinate systems and the rotation angles of each part of the upper arm, so it is difficult to guarantee consistency and similarity between the motions of the human arm and each joint of the robot arm.
Summary of the invention
The present invention aims to provide a novel robot humanoid motion system that acquires images and depth information of human motion with a Kinect sensor, obtains human motion data, recognizes human motion postures with the relevant techniques, establishes a human-robot motion mapping relation, and finally realizes humanoid motion of the robot through motion control. This greatly relaxes the strict efficiency and precision requirements of traditional robot motion control and makes the human-robot interaction process simpler and more natural.
To solve the above technical problem, the present invention adopts the following technical scheme: a novel robot humanoid motion system comprising a human motion capture part, a human-robot motion mapping part, and an articulated robot part; wherein the human motion capture part is connected to the human-robot motion mapping part, and the human-robot motion mapping part is connected to the articulated robot part by wireless local area network communication.
Further, the human motion capture part acquires images and depth information of human motion to obtain human motion data, and uses joint-point recognition and skeleton tracking techniques to recognize human motion postures.
Further, the human-robot motion mapping part performs human-robot motion mapping on the basis of an analysis of the differences between human joints and the robot mechanism, realizing robot motion control.
Further, the articulated robot part receives motion control instructions and performs the corresponding joint motions, realizing humanoid motion.
Compared with the prior art, the present invention has the following advantageous effects:
The scheme combines human motion posture recognition, a human-robot motion mapping algorithm, and a robot motion control strategy to realize humanoid motion of the robot, greatly relaxing the strict efficiency and precision requirements of traditional robot motion control and making the human-robot interaction process simpler and more natural.
Description of the drawings
Fig. 1 is a structural diagram of the novel robot humanoid motion system.
Fig. 2 shows the Kinect device coordinate system used to acquire human motion information.
Fig. 3 is a schematic diagram of the human joints in the body reference frame.
Fig. 4 shows the arm motion posture features in the body reference frame.
Fig. 5 shows the range of motion of the human shoulder joint.
Fig. 6 shows the range of motion of the human elbow joint.
Fig. 7 shows the robot link coordinate frames.
Detailed description of the embodiments
The present invention is explained below in further detail and completeness with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and do not limit it.
Referring to Fig. 1, the novel robot humanoid motion system of the invention includes a human motion capture part, a human-robot motion mapping part, and an articulated robot part. The human motion capture part is connected to the human-robot motion mapping part, and the human-robot motion mapping part is connected to the articulated robot part by wireless local area network communication.
Wherein,
(1) The human motion capture part acquires images and depth information of human motion to obtain human motion data, and uses joint-point recognition and skeleton tracking to recognize human motion postures.
The present invention uses a Microsoft Kinect sensor as a 3D camera for human motion acquisition. Kinect combines visual and infrared sensing: color and depth image sequences of human motion are obtained through OpenNI, the human skeleton is tracked with a development kit based on the NiTE middleware, and the positions of the joint points of the current human posture are computed in real time in the Kinect coordinate system. The Kinect coordinate system is shown in Fig. 2.
Referring to Fig. 3, the joint points of the head (H), torso (T), left and right shoulders (LS, RS), left and right elbows (LE, RE), and left and right hands (LH, RH) are selected to analyze the motion of the human upper limbs. Changes in arm motion posture are analyzed through the change process of these joint points. In addition, Kinect can detect the human palms and recognize their open/grip state.
To prevent changes in body position and stance from interfering with arm motion detection, and to improve the flexibility and adaptability of the motion capture system, the human body reference frame C_B must first be corrected in real time before the hand and arm joint angles are computed. For example, in the analysis of left-arm motion, the body reference frame C_B takes the left shoulder joint point P_LS as its origin O_B; the positive X axis is the direction vector P_RS P_LS from the right shoulder to the left shoulder; the positive Y axis is the direction vector P_T P_N from the torso to the midpoint of the shoulders; and the positive Z axis is the cross product of the X and Y direction vectors, i.e., P_RS P_LS × P_T P_N, so that the three axes form a right-handed frame.
The three-dimensional coordinates ^W P_i of each left-arm joint point in the Kinect coordinate system C_W can be converted into position coordinates ^B P_i in the body reference frame C_B by expressing the point in the axes of C_B: ^B P_i = R_B^T ( ^W P_i - ^W P_LS ), where R_B is the matrix whose columns are the unit axis vectors of C_B expressed in C_W. For example, the elbow-joint coordinate in C_B is ^B P_LE = R_B^T ( ^W P_LE - ^W P_LS ).
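As an illustration of the frame correction and coordinate transform just described, the following is a minimal numpy sketch; the joint positions are illustrative stand-ins for real Kinect measurements, and the helper names are ours rather than part of any Kinect SDK.

```python
import numpy as np

def body_frame(p_ls, p_rs, p_t, p_n):
    """Build the body reference frame C_B from shoulder and torso joints.

    X: right shoulder -> left shoulder; Y: torso -> shoulder midpoint;
    Z: X x Y, giving a right-handed frame with origin at the left shoulder.
    """
    x = p_ls - p_rs
    x /= np.linalg.norm(x)
    y = p_n - p_t
    y /= np.linalg.norm(y)
    z = np.cross(x, y)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)  # re-orthogonalize so the axes are exactly perpendicular
    R = np.column_stack((x, y, z))  # columns are the C_B axes expressed in C_W
    return R, p_ls                  # rotation matrix and origin O_B

def to_body_frame(p_w, R, origin):
    """Convert a point from Kinect coordinates C_W to body coordinates C_B."""
    return R.T @ (p_w - origin)

# Example joint positions in the Kinect frame (metres, illustrative only).
P_LS = np.array([0.20, 0.50, 2.00])   # left shoulder
P_RS = np.array([-0.20, 0.50, 2.00])  # right shoulder
P_T  = np.array([0.00, 0.20, 2.00])   # torso
P_N  = (P_LS + P_RS) / 2              # midpoint of the shoulders
P_LE = np.array([0.30, 0.25, 2.05])   # left elbow

R_B, O_B = body_frame(P_LS, P_RS, P_T, P_N)
print(to_body_frame(P_LE, R_B, O_B))  # elbow coordinates in C_B
```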
The joint angles of the human arm intuitively reflect the human motion posture and are convenient for the human-robot motion mapping and robot motion control of humanoid motion. The acquired joint-point data are therefore processed by the space vector method to obtain the joint rotation angles and thereby recognize human motion postures.
Referring to Fig. 4, the joint angles characterizing shoulder motion are α, β, and μ: α reflects the swing angle of the upper arm in the horizontal direction, β the pitch motion of the upper arm in the vertical direction, and μ the rotation angle of the upper arm about its own axis. The joint angle characterizing elbow motion is γ, the angle between the upper arm and the forearm.

The swing angle α of the upper arm in the horizontal direction and the pitch angle β of the upper arm in the vertical direction are computed from the upper-arm vector in the body reference frame. The rotation angle μ of the arm about the upper-arm axis is jointly determined by the normal vector of the vertical plane through the upper arm and the normal vector of the plane containing the upper arm and the forearm; in the formulas, · denotes the dot product and × the cross product. The angle γ between the upper arm and the forearm is computed from the upper-arm vector O_B ^B P_LE and the forearm vector ^B P_LE ^B P_LH.
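A reconstruction of these formulas, consistent with the vector definitions above but with axis and sign conventions that are our assumptions: writing $u = {}^{B}P_{LE}$ for the upper-arm vector (from $O_B$ to the elbow) and $f = {}^{B}P_{LH} - {}^{B}P_{LE}$ for the forearm vector,

$$\alpha = \arctan\frac{u_z}{u_x}, \qquad \beta = \arctan\frac{u_y}{\sqrt{u_x^2 + u_z^2}}$$

$$\mu = \arccos\frac{n_1 \cdot n_2}{|n_1|\,|n_2|}, \quad n_1 = u \times y_B, \quad n_2 = u \times f$$

$$\gamma = \arccos\frac{u \cdot f}{|u|\,|f|}$$

where $y_B$ is the vertical (Y) axis of $C_B$. Under these conventions a fully extended arm gives $\gamma = 0°$, matching the 0° to 120° elbow flexion range used below.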
The joint angles α, β, μ, and γ characterizing shoulder and elbow motion effectively identify the motion posture of the human arm and serve as the basis for humanoid motion control.
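A minimal numpy sketch of this angle extraction, following the reconstructed formulas above (the conventions, like the formulas themselves, are assumptions):

```python
import numpy as np

def arm_angles(p_le_b, p_lh_b):
    """Compute arm posture angles from body-frame joint positions.

    p_le_b, p_lh_b: left elbow and left hand expressed in the body frame C_B
    (origin at the left shoulder). Returns (alpha, beta, mu, gamma) in degrees.
    """
    u = p_le_b                       # upper-arm vector O_B -> elbow
    f = p_lh_b - p_le_b              # forearm vector elbow -> hand
    y_b = np.array([0.0, 1.0, 0.0])  # vertical axis of C_B

    alpha = np.degrees(np.arctan2(u[2], u[0]))                 # horizontal swing
    beta = np.degrees(np.arctan2(u[1], np.hypot(u[0], u[2])))  # vertical pitch

    n1 = np.cross(u, y_b)  # normal of the vertical plane through the upper arm
    n2 = np.cross(u, f)    # normal of the upper-arm/forearm plane
    cos_mu = n1 @ n2 / (np.linalg.norm(n1) * np.linalg.norm(n2))
    mu = np.degrees(np.arccos(np.clip(cos_mu, -1.0, 1.0)))

    cos_gamma = u @ f / (np.linalg.norm(u) * np.linalg.norm(f))
    gamma = np.degrees(np.arccos(np.clip(cos_gamma, -1.0, 1.0)))
    return alpha, beta, mu, gamma

# Elbow out to the side and slightly down, hand bent forward (illustrative).
print(arm_angles(np.array([0.25, -0.10, 0.05]),
                 np.array([0.25, -0.10, 0.30])))
```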
(2) The human-robot motion mapping part performs the human-robot mapping on the basis of an analysis of the differences between human joints and the robot mechanism, realizing robot motion control.
Human-robot motion mapping establishes a mapping relation on the basis of the differences between human joints and robot joints, so that the robot arm imitates human motion and the motion control of the robot arm is completed.
Human joints are mostly ball joints, each with two or three degrees of freedom, and the range of motion of each joint differs. Referring to Figs. 5 and 6, the motion modes of the shoulder joint include flexion/extension, abduction/adduction, horizontal flexion/horizontal extension, and external/internal rotation; the motion modes of the elbow joint include flexion/extension and pronation/supination. The angular ranges of motion are as shown in the figures, with the basic reference axes taken in the standing position; owing to individual differences and to differences in test and representation methods, the joint angles may carry an error of 10° to 15°.
When recognizing arm motion postures, and in accordance with the sensing capability, the horizontal flexion/extension of the shoulder is simplified to the swing angle α of the upper arm in the horizontal direction; shoulder flexion/extension and abduction/adduction are simplified to the pitch angle β of the upper arm in the vertical direction; shoulder internal/external rotation is simplified to the rotation angle μ of the arm about the upper-arm axis; elbow flexion/extension is represented by the angle γ; and elbow pronation/supination would have to be distinguished from the palm direction and cannot yet be recognized.
Unlike human joints, robot joints are generally either revolute or prismatic, each with a single degree of freedom. Referring to Fig. 7, the present invention uses a four-degree-of-freedom manipulator arm with four revolute joints and a two-finger gripper. Link coordinate frames are assigned according to the actual dimensions; the D-H parameters of the arm are listed in the following table:
The D-H transformation matrices between adjacent links of the manipulator arm are:
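The numerical matrices follow from the D-H parameter table; the standard form each link transform takes, assuming the classic Denavit-Hartenberg convention, is, for link $i$ with joint angle $\theta_i$, offset $d_i$, length $a_i$, and twist $\alpha_i$:

$${}^{i-1}A_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}$$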
The forward kinematics equation of the manipulator arm is then:
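Reconstructed as the standard product of the four link transforms, consistent with the [n, o, a, p] notation explained next:

$${}^{0}T_4 = {}^{0}A_1\,{}^{1}A_2\,{}^{2}A_3\,{}^{3}A_4 = \begin{bmatrix} n & o & a & p \\ 0 & 0 & 0 & 1 \end{bmatrix}$$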
In the formula, p is the position vector of the gripper, pointing from the origin of the base coordinate frame to the gripper center; n is the normal vector of the gripper, whose magnitude and direction are given by o × a; o is the orientation vector of the gripper, pointing from one fingertip toward the other; and a is the approach vector of the gripper, pointing in the direction in which the gripper approaches the object. The pose of the gripper relative to the base frame can accordingly be described by [n, o, a, p].
To keep the motion of the robot arm as similar as possible to that of the human arm, and thereby realize robot motion control by naturally imitating human motion, the horizontal flexion/extension of the shoulder, i.e. the rotation angle α of the upper arm in the horizontal direction, is mapped to the angle θ1 of robot joint 1; the shoulder flexion/extension and abduction/adduction, through the rotation angle β of the shoulder in the vertical plane, are mapped to the angle θ2 of robot joint 2; and the elbow flexion/extension, through the rotation angle γ, is mapped to the angle θ3 of robot joint 3. Elbow pronation/supination would map to robot joint 4; to ease object grasping, joint 4 can instead be determined by the orientation of the manipulated object, and in the present invention θ4 = 0° is used for the time being.
The human-robot mapping rules between the robot joint angles θ1, θ2, θ3 and the human joint angles α, β, γ are therefore as follows:

Human joint angle                          Robot joint
α (upper-arm horizontal swing)             θ1 (joint 1)
β (upper-arm vertical pitch)               θ2 (joint 2)
γ (elbow flexion)                          θ3 (joint 3)
elbow pronation/supination (not sensed)    θ4 (joint 4), fixed at 0°
It should be pointed out that, limited by the motion capability of the four-degree-of-freedom manipulator arm, the robot's upper and lower arm segments can only move in a vertical plane. Consequently, during humanoid motion of the robot arm, the internal/external rotation of the shoulder cannot be reproduced, and the rotation angle μ of the human upper arm about its own axis is not included in the human-robot joint mapping rules.
(3) The articulated robot part receives the motion control instructions and performs the corresponding joint motions, realizing humanoid motion.
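As an illustration of this part's interface, a minimal sketch of delivering a joint command over the wireless LAN; the transport, address, port, and message format are all our assumptions, since the patent specifies only that a WLAN link is used:

```python
import json
import socket

def send_joint_command(thetas, host="192.168.1.50", port=9000):
    """Send target joint angles (degrees) to the articulated robot over WLAN.

    The robot side is assumed to parse the JSON payload and execute the
    corresponding joint motions; both the endpoint and the format are
    hypothetical.
    """
    payload = json.dumps({"joints": {"theta1": thetas[0],
                                     "theta2": thetas[1],
                                     "theta3": thetas[2],
                                     "theta4": thetas[3]}}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

send_joint_command((72.0, 54.0, -45.0, 0.0))  # example target angles in degrees
```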
According to the human-robot mapping relations, and to realize the motion control of each joint of the robot arm, a control relation is established between the robot joint angles θ1, θ2, θ3 and the human arm joint angles α, β, γ:
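From the description of $k_i$ and $b_i$ that follows, the relation is linear; a consistent reconstruction is:

$$\theta_1 = k_1\,\alpha + b_1, \qquad \theta_2 = k_2\,\beta + b_2, \qquad \theta_3 = k_3\,\gamma + b_3$$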
In the formulas, k_i denotes a scaling factor and b_i a compensation offset. α, β, and γ are the human joint angles computed by the vector method during human motion posture recognition.
The human arm and the robot arm differ in structure, range of motion, direction of motion, and so on. Considering these factors together, the scaling factor k_i is introduced to adjust for differences in actual motion range and direction, and the offset b_i to adjust for differences in initial position caused by differing joint coordinate frames, thereby ensuring consistency and similarity between the motions of the human arm and the robot arm.
Depending on the evaluation standard adopted, the scaling factor k_i and the offset b_i can follow different selection principles.
When practical robot operation is the goal, the target is to make full use of the workspaces of the human arm and the robot arm while ensuring consistency in motion direction and motion tendency. The present invention therefore proposes a motion-range-maximizing reproduction control strategy: on the premise of guaranteed motion consistency, motion similarity is sacrificed in order to exploit the full workspace of both the human arm and the robot arm.
To maximize the range of motion, the control relation is determined by the following rules. For joint angle α, when the upper arm swings effectively through -30° to 120° in the horizontal direction, robot joint 1 rotates through -120° to 120°. For joint angle β, when the upper arm swings through 30° to 180° in the vertical direction, robot joint 2 pitches through 90° to 0°. For joint angle γ, when the elbow flexes through 0° to 120°, robot joint 3 pitches through -60° to 0°. The comparison table of human arm control ranges and robot arm motion ranges is:

Human joint angle (effective range)    Robot joint angle (range)
α: -30° to 120°                        θ1: -120° to 120°
β: 30° to 180°                         θ2: 90° to 0°
γ: 0° to 120°                          θ3: -60° to 0°
The human-robot arm joint control relation is then obtained by fitting the linear map to these ranges.
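Solving the linear relation $\theta_i = k_i x + b_i$ against these range endpoints, under the assumption that the interval endpoints correspond in the order listed, gives the reconstruction:

$$\theta_1 = 1.6\,\alpha - 72°, \qquad \theta_2 = -0.6\,\beta + 108°, \qquad \theta_3 = -0.5\,\gamma$$

For example, $k_1 = (120° - (-120°)) / (120° - (-30°)) = 1.6$ and $b_1 = 120° - 1.6 \times 120° = -72°$; an arm hanging straight down at $\beta = 180°$ then yields $\theta_2 = 0°$. These coefficients are derived by us from the stated ranges, not quoted from the published formula.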
In motion-range-maximizing reproduction control, neither the motion modes nor the workspaces of the human arm and the robot arm are restricted. Although some motion similarity is lost, actions such as horizontal rotation of the upper arm, vertical pitch of the upper arm, and elbow flexion/extension maintain good consistency with joints 1, 2, and 3 in motion direction and motion tendency, and the control relation has the virtue of simplicity.
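A compact sketch of this control relation in code, using the reconstructed coefficients above and clamping inputs to the effective human ranges (the coefficients and the clamping are assumptions, per the derivation):

```python
def humanoid_arm_command(alpha, beta, gamma):
    """Map human arm angles (degrees) to robot joint angles (degrees).

    Implements the range-maximizing linear mapping: alpha -> theta1,
    beta -> theta2, gamma -> theta3; theta4 is fixed at 0 in this scheme.
    """
    def clamp(x, lo, hi):
        return max(lo, min(hi, x))

    alpha = clamp(alpha, -30.0, 120.0)  # effective horizontal swing range
    beta = clamp(beta, 30.0, 180.0)     # effective vertical pitch range
    gamma = clamp(gamma, 0.0, 120.0)    # effective elbow flexion range

    theta1 = 1.6 * alpha - 72.0   # joint 1: -120 deg .. 120 deg
    theta2 = -0.6 * beta + 108.0  # joint 2:   90 deg ..   0 deg
    theta3 = -0.5 * gamma         # joint 3:  -60 deg ..   0 deg
    theta4 = 0.0                  # joint 4 fixed: pronation not recognized
    return theta1, theta2, theta3, theta4

# Arm raised horizontally to the side, elbow bent 90 degrees (illustrative).
print(humanoid_arm_command(90.0, 90.0, 90.0))  # -> (72.0, 54.0, -45.0, 0.0)
```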
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit it; for those skilled in the art, the present invention may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (4)

1. A novel robot humanoid motion system, characterized in that the humanoid motion system includes: a human motion capture part, a human-robot motion mapping part, and an articulated robot part; wherein the human motion capture part is connected to the human-robot motion mapping part, and the human-robot motion mapping part is connected to the articulated robot part by wireless local area network communication.
2. The novel robot humanoid motion system according to claim 1, characterized in that the human motion capture part acquires images and depth information of human motion to obtain human motion data, and uses techniques such as joint-point recognition and skeleton tracking to recognize human motion postures.
3. The novel robot humanoid motion system according to claim 1, characterized in that the human-robot motion mapping part performs human-robot motion mapping on the basis of an analysis of the differences between human joints and the robot mechanism, realizing robot motion control.
4. The novel robot humanoid motion system according to claim 1, characterized in that the articulated robot part receives motion control instructions and performs the corresponding joint motions, realizing humanoid motion.
CN201611038652.XA 2016-11-24 2016-11-24 A novel robot humanoid motion system Pending CN108098780A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611038652.XA CN108098780A (en) 2016-11-24 2016-11-24 A novel robot humanoid motion system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611038652.XA CN108098780A (en) 2016-11-24 2016-11-24 A novel robot humanoid motion system

Publications (1)

Publication Number Publication Date
CN108098780A 2018-06-01

Family

ID=62203639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611038652.XA Pending CN108098780A (en) A novel robot humanoid motion system

Country Status (1)

Country Link
CN (1) CN108098780A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109333527A * 2018-08-30 2019-02-15 苏州博众机器人有限公司 Interaction method with a robot, device, electronic equipment and storage medium
CN109397286A (en) * 2018-09-29 2019-03-01 Oppo广东移动通信有限公司 Robot control method, device, electronic equipment and computer readable storage medium
CN109693251A * 2018-12-18 2019-04-30 航天时代电子技术股份有限公司 Robot control system and method based on motion capture
CN109693251B (en) * 2018-12-18 2020-12-08 航天时代电子技术股份有限公司 Robot control system and method based on motion capture
CN112894794A (en) * 2019-11-19 2021-06-04 深圳市优必选科技股份有限公司 Human body arm action simulation method and device, terminal equipment and storage medium
CN112894794B (en) * 2019-11-19 2022-08-05 深圳市优必选科技股份有限公司 Human body arm action simulation method and device, terminal equipment and storage medium
CN111113429A (en) * 2019-12-31 2020-05-08 深圳市优必选科技股份有限公司 Action simulation method, action simulation device and terminal equipment
CN111113429B (en) * 2019-12-31 2021-06-25 深圳市优必选科技股份有限公司 Action simulation method, action simulation device and terminal equipment
CN112276947A (en) * 2020-10-21 2021-01-29 乐聚(深圳)机器人技术有限公司 Robot motion simulation method, device, equipment and storage medium
WO2022142078A1 (en) * 2020-12-28 2022-07-07 达闼机器人股份有限公司 Method and apparatus for action learning, medium, and electronic device
CN113492404A (en) * 2021-04-21 2021-10-12 北京科技大学 Humanoid robot action mapping control method based on machine vision
CN113492404B (en) * 2021-04-21 2022-09-30 北京科技大学 Humanoid robot action mapping control method based on machine vision
CN113146634A (en) * 2021-04-25 2021-07-23 达闼机器人有限公司 Robot attitude control method, robot and storage medium


Legal Events

Date Code Title Description
2018-06-01 PB01 Publication
WD01 Invention patent application deemed withdrawn after publication