CN112591571A - Intelligent robot taking elevator autonomously and control method thereof

Info

Publication number
CN112591571A
Authority
CN
China
Prior art keywords
elevator
robot
key
target
floor
Prior art date
Legal status
Granted
Application number
CN202011528398.8A
Other languages
Chinese (zh)
Other versions
CN112591571B (en)
Inventor
楼云江
赵均鑫
孟雨皞
陈雨景
赵真
Current Assignee
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Shenzhen Graduate School Harbin Institute of Technology
Priority date: 2020-12-22
Filing date: 2020-12-22
Application filed by Shenzhen Graduate School Harbin Institute of Technology
Priority to CN202011528398.8A
Publication of CN112591571A (2021-04-02)
Application granted
Publication of CN112591571B (2022-12-13)
Legal status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/24 - Control systems with regulation, i.e. with retroactive action, for influencing travelling speed, acceleration, or deceleration
    • B66B1/28 - Control systems with regulation, i.e. with retroactive action, for influencing travelling speed, acceleration, or deceleration electrical
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/34 - Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/3415 - Control system configuration and the data transmission or communication within the control system
    • B66B1/3446 - Data transmission or communication within the control system
    • B66B1/3461 - Data transmission or communication within the control system between the elevator control system and remote or mobile stations
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B2201/00 - Aspects of control systems of elevators
    • B66B2201/40 - Details of the change of control mode
    • B66B2201/46 - Switches or switchgear
    • B66B2201/4607 - Call registering systems
    • B66B2201/4638 - Wherein the call is registered without making physical contact with the elevator system

Abstract

The invention relates to the technical field of robot control and discloses an intelligent robot that rides an elevator autonomously, together with its control method. The control method comprises the following steps: receiving a navigation command that contains the target floor the robot needs to reach, and calling an elevator according to the target floor; if the running direction of the elevator is consistent with the direction towards the target floor, detecting whether the door opening width of the elevator is larger than the width of the robot and, if so, navigating the robot into the elevator; detecting the position of the target floor key and controlling a mechanical arm to press it; and detecting through an air pressure sensor whether the target floor has been reached and, if so, navigating the robot out of the elevator. The invention has at least the following beneficial effects: the original elevator control system and equipment do not need to be changed, the robot can ride the elevator smoothly, the method is compatible with elevators of different brands, and its universality is good.

Description

Intelligent robot taking elevator autonomously and control method thereof
Technical Field
The invention relates to the technical field of robot control, in particular to an intelligent robot taking an elevator autonomously and a control method thereof.
Background
In autonomous intelligent technology for service robots, multi-floor autonomous navigation is a key point that needs attention. In recent years, with the urbanization of China, the number of cities and their populations have increased markedly, and multi-storey buildings have become the spaces where most urban residents live and work. Multi-storey buildings have therefore also become an important application scenario for service robots, giving rise to a series of practical applications such as express and take-out delivery, building cleaning, document distribution and guest reception, all of which require the robot to navigate across floors over large areas inside a building. In cross-floor navigation, riding the elevator is one of the most critical links for the robot.
At present, a commercial robot can ride an elevator smoothly only through communication between the robot and the elevator via external control and communication units, or with manual assistance. However, enabling such communication usually requires the elevator manufacturer to cooperate by opening its control protocol and modifying the existing elevator equipment, and different communication protocols have to be designed for elevators of different brands, so the universality is poor. Meanwhile, modifying the elevator equipment cannot guarantee the operational safety of the elevator, and once a safety accident occurs, the division of responsibility is difficult to define.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. To this end, the invention provides a robot control method that enables the robot to ride an elevator and achieve cross-floor navigation relying entirely on the robot's own sensors and actuators, without modifying the existing elevator equipment.
The invention also provides a robot that implements the robot control method.
A robot control method according to an embodiment of the first aspect of the present invention includes: s100, separating a navigation sub-command and target floor information of the robot moving to the elevator of the floor where the robot is located from the navigation command, sending a first moving command to a moving platform of the robot according to the navigation sub-command, enabling the robot to reach the position in front of the lifting elevator of the floor where the robot is located, then sending a first motion control command to trigger a mechanical arm of the robot to reach the position of a key of the elevator, and pressing a corresponding key for going upstairs or downstairs according to whether the target floor is located above or below the floor where the robot is located, so as to call the elevator; s200, identifying display screen information in front of an elevator door through a visual sensor of the robot, determining that the running direction of the elevator is consistent with the direction of reaching a target floor, detecting the opening width of the elevator door, and waiting for the opening width of the elevator door to be larger than or equal to the width of the robot; s300, sending a second movement instruction to a moving platform of the robot, enabling the robot to enter the elevator and enabling the mechanical arm to move to a preset position in front of a target floor key panel, detecting the position of the target floor key through a visual sensor, then sending a second movement control instruction to trigger the mechanical arm to press the target floor key in the elevator, detecting whether an overweight alarm exists in the elevator through a sound sensor, and if yes, navigating the robot to leave the space of the elevator car; s400, identifying display screen information in the elevator through a visual sensor and/or comparing calibrated floor air pressure data through an air pressure sensor to determine that the floor where the robot is located reaches a target floor; s500, when the robot is about to ascend and descend to reach a target floor or reach the target floor, sound and light reminding information is sent, then the opening width of the elevator is detected, a third moving instruction is sent at the same time, and when the opening width of the elevator is larger than or equal to the width of the robot, the robot leaves the elevator.
According to some embodiments of the invention, said step S100 comprises: s110, controlling the robot to reach the front of a lifting elevator of the floor of the robot through a navigation command; s120, controlling the mechanical arm to enable the elevator outer key panel to be in a sensing range of the visual sensor, wherein the visual sensor is located at the front end of the mechanical arm; s130, obtaining the position of a target key of the elevator through the visual sensor, and controlling the mechanical arm to press the target key.
According to some embodiments of the invention, the vision sensor is a depth camera, the step S130 comprises: s131, acquiring a depth image of the elevator outer key panel through the depth camera, and detecting a first position of the target key in a depth image pixel coordinate system; s132, acquiring a second position of the target key in a space coordinate system and a normal vector of a key plane according to the first position; s133, planning a first target position of the tail end of the mechanical arm according to the second position and the normal vector direction of the key plane, wherein the first target position is the target key position, controlling the tail end of the mechanical arm to reach the first target position, and pressing the target key to call an elevator.
According to some embodiments of the invention, said step S200 further comprises: s210, transforming a robot navigation map coordinate system to a robot two-dimensional laser radar coordinate system based on a homogeneous transformation matrix, and obtaining coordinate points of two ends of an elevator door in the two-dimensional laser radar coordinate system; s220, establishing a two-dimensional hyperplane through coordinate points at two ends of the elevator door; s230, obtaining the maximum angle range and the minimum angle range detected by the laser radar according to the two-dimensional hyperplane and the thickness of the elevator door; s240, obtaining an opening amplitude value of the elevator door based on the maximum angle ranging and the minimum angle ranging in combination with coordinate points at two ends of the elevator door; s250, detecting the change trend of the opening amplitude value of the elevator door to obtain the state of the current elevator door, wherein the state is an opening state or a closing state; and S260, determining that the state is an opening state, and navigating the robot to prepare to enter the elevator if the opening amplitude value of the elevator door is larger than the width of the robot.
According to some embodiments of the invention, the calculation of the opening amplitude value of the elevator door comprises the following steps: a two-dimensional hyperplane is established through the coordinate points at the two ends of the elevator door,

$w^T x + b = 0$

where the coefficients $w$ and $b$ are determined from the coordinates of the two door endpoints, and a lidar ranging point is written in Cartesian form as

$x = (x_1, x_2)^T = (l_\theta \cos\theta,\ l_\theta \sin\theta)^T$

where $l_\theta$ is the ranging result when the radar laser ray makes an angle $\theta$ with the $x_1$ axis; from the coordinates of the two ends of the elevator door, the effective angle range $[\theta_{\min}, \theta_{\max}]$ of the laser radar is obtained.

Points within the lidar measurement range may be grouped into the following set, where $\Delta\theta$ is the lidar angular resolution:

$S = \{\,(l_\theta \cos\theta,\ l_\theta \sin\theta)^T \mid \theta = \theta_{\min} + k\Delta\theta,\ k = 0, 1, \dots,\ \theta \le \theta_{\max}\,\}$

Through the two-dimensional hyperplane and the thickness $w_e$ of the elevator door, the set of ranging points located at the other end of the elevator door, namely the points whose distance beyond the hyperplane exceeds the door thickness $w_e$, is obtained from $S$.

From the ranging points at the other end of the elevator door, the ranging point with the largest angle $\theta_l$ and the ranging point with the smallest angle $\theta_r$ are found, and the opening amplitude of the elevator door is calculated from the maximum-angle ranging point and the minimum-angle ranging point as

$w_o = |v \tan\theta_{l3}| + |v \tan\theta_{r3}|$

where $v$ is the perpendicular distance from the lidar to the elevator door plane, $\theta_{l3}$ and $\theta_{r3}$ are the angles between that perpendicular ray and the rays towards $\theta_l$ and $\theta_r$ respectively, and $w_o$ is the opening amplitude of the elevator door.
According to some embodiments of the invention, the vision sensor is a depth camera, the step S400 further comprises: s410, acquiring a depth image of a key panel in the elevator through the depth camera, and detecting a third position of the target floor key in a depth image pixel coordinate system; s420, acquiring a fourth position of the target floor key in a space coordinate system and a normal vector of a key plane according to the third position; s430, planning a second target position of the tail end of the mechanical arm according to the fourth position and the normal vector direction of the key plane, wherein the second target position is the target floor key position, controlling the tail end of the mechanical arm to reach the second target position, and pressing the target floor key to call an elevator.
According to some embodiments of the invention, the step S500 further comprises: s510, acquiring a first air pressure value, wherein the first air pressure value is acquired when the robot is ready to enter an elevator; s520, acquiring a second air pressure value, wherein the second air pressure value is acquired after the robot enters the elevator; s530, obtaining a target floor air pressure value based on the first air pressure value, the air pressure difference value between adjacent floors, and the floor difference value between the target floor and the initial floor when the robot is ready to enter the elevator; and S540, comparing whether the difference between the second air pressure value and the target floor air pressure value is smaller than a preset error value, if so, enabling the robot to reach the target floor.
According to some embodiments of the invention, said step S300 further comprises: before the robot enters the elevator, the continuous vacant space of the elevator car is detected through at least one sensor, the area of the continuous vacant space projected to the bottom of the elevator car is determined to be larger than or equal to the area of the robot projected to the ground by the volume, and then the navigation robot enters the elevator car.
A robot control method according to an embodiment of the second aspect of the present invention includes the steps of: s1, controlling the robot to arrive at the elevator hall according to the navigation command and calling the elevator according to the target floor command key; s2, waiting for the opening of the elevator door, detecting whether the running direction of the elevator is consistent with the target floor, and if the running direction of the elevator is consistent with the target floor, detecting whether the width of the opening of the elevator door is larger than the width of the robot so that the robot can enter the elevator; s3, after the robot enters the elevator, detecting the key position of the target floor through a visual sensor, and pressing down the key through a mechanical arm; s4, detecting whether the robot reaches a target floor through at least one sensor; and S5, detecting the door opening state of the elevator after the robot is determined to reach the target floor, so that the robot leaves the elevator.
A robot according to an embodiment of the third aspect of the invention comprises a controller, a robot arm, a moving platform and sensors, the sensors comprising a lidar, a depth camera and a barometer, the depth camera being located at a front end of the robot arm, the controller comprising a processor and a memory, the memory having stored thereon a computer program, the processor performing the steps of the robot control method when executing the computer program.
The robot control method according to the embodiments of the invention has at least the following beneficial effects: the robot can autonomously operate the elevator keys and ride the elevator using its own sensors and actuators, so that smooth elevator riding is achieved without any change to the original elevator control system and equipment; at the same time, because the robot can detect and operate autonomously, it can ride elevators of different brands autonomously, the universality is good, and no brand-specific configuration is needed.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for obtaining and pressing a target key according to an embodiment of the present invention;
FIG. 3 is a second flowchart illustrating the method for obtaining and pressing a target key according to an embodiment of the present invention;
fig. 4 is a flow chart of a method for detecting the opening amplitude of an elevator door according to an embodiment of the invention;
fig. 5 is a flowchart illustrating a method for obtaining and pressing a target floor key according to an embodiment of the present invention;
fig. 6 is a flowchart illustrating a method for detecting a target floor according to an embodiment of the present invention;
FIG. 7 is a block diagram of a robot component module according to an embodiment of the invention;
fig. 8 is a schematic diagram of the calculation of the door opening amplitude of the elevator according to the embodiment of the invention;
FIG. 9 is a second flowchart of a method according to an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, "several" means one or more and "a plurality of" means two or more; terms such as "greater than", "less than" and "exceeding" are understood to exclude the stated number, while terms such as "above", "below" and "within" are understood to include the stated number. Where "first" and "second" are used, they serve only to distinguish technical features and are not to be understood as indicating or implying relative importance, the number of the indicated technical features, or the order of the indicated technical features.
All the components of the robot's elevator-riding control system are arranged on the robot body and connected to the robot control center in a wired or wireless manner, including but not limited to infrared, Bluetooth, WiFi, cellular mobile network, Ethernet, serial and USB connections, and are powered by the battery carried by the robot. The existing elevator equipment therefore does not need to be modified, and the control system of the elevator is not affected.
In a specific embodiment, as shown in fig. 7, the mobile platform is connected to the controller through Ethernet and internally provides low-level motor control and dead-reckoning functions. After receiving a motion command from the controller, the mobile platform converts it into motor speed commands and controls the motors to realize the expected motion; at the same time, it estimates its own pose during motion and sends the pose information back to the controller over Ethernet.
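As an illustration of the two functions attributed to the mobile platform, the sketch below converts a commanded linear and angular velocity into wheel speeds and integrates a dead-reckoned pose. It assumes a differential-drive platform with made-up wheel parameters; the patent does not specify the drive type.

```python
import math

WHEEL_RADIUS = 0.08   # m, assumed
WHEEL_BASE = 0.40     # m, assumed

def velocity_to_wheel_speeds(v, w):
    """Linear velocity v [m/s] and yaw rate w [rad/s] -> left/right wheel speeds [rad/s]."""
    v_left = v - w * WHEEL_BASE / 2.0
    v_right = v + w * WHEEL_BASE / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

def integrate_pose(pose, v, w, dt):
    """Dead reckoning: advance the (x, y, theta) pose by one control period dt."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta
```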
The mechanical arm is connected with the controller through USB communication. The mechanical arm has the functions of controlling the joint motor and detecting the state, and after the controller sends a motion command to the mechanical arm, the mechanical arm can execute corresponding motion and feed the motion state back to the controller.
The laser radar is connected with the controller through the Ethernet, and can provide distance information between the robot and surrounding objects. The depth camera is arranged at the tail end of the mechanical arm, is connected with the controller through a USB and can return color images and depth images in the field of view of the camera. The barometer is connected with the controller through a USB, and can detect the atmospheric pressure of the current position of the robot.
The method of riding the elevator by the robot is as follows.
Referring to fig. 1, fig. 1 shows one of the flow diagrams of the method of the embodiment of the present invention, including:
s100, separating a navigation sub-command and target floor information of the robot moving to the elevator on the floor where the robot is located from the navigation command, sending a first moving command to a moving platform of the robot according to the navigation sub-command to enable the robot to reach the position in front of the lifting elevator on the floor where the robot is located, then sending a first motion control command to trigger a mechanical arm of the robot to reach the position of an elevator key, and pressing a corresponding upstairs key or downstairs key according to whether the target floor is located above or below the floor where the robot is located to call the elevator;
s200, identifying display screen information in front of an elevator door through a visual sensor of the robot, determining that the running direction of the elevator is consistent with the direction of reaching a target floor, detecting the opening width of the elevator door, and waiting for the opening width of the elevator door to be larger than or equal to the width of the robot;
s300, sending a second movement instruction to a moving platform of the robot, enabling the robot to enter the elevator and enabling the mechanical arm to move to a preset position in front of a target floor key panel, detecting the position of the target floor key through a visual sensor, then sending a second movement control instruction to trigger the mechanical arm to press the target floor key in the elevator, detecting whether an overweight alarm exists in the elevator through a sound sensor, and if yes, navigating the robot to leave the space of the elevator car;
s400, identifying display screen information in the elevator through a visual sensor and/or comparing calibrated floor air pressure data through an air pressure sensor to determine that the floor where the robot is located reaches a target floor;
s500, when the robot is about to reach or has reached the target floor, sound and light reminder information is sent; the opening width of the elevator door is then detected and a third movement instruction is sent at the same time, and when the opening width of the elevator door is greater than or equal to the width of the robot, the robot leaves the elevator.
It should be noted that, through the above steps and methods, the robot can operate the elevator keys and ride the elevator by itself using its own sensors and actuators, without any change to the original elevator control system and equipment. The method is not sensitive to the elevator brand: as long as the elevator follows the common arrangement, with an outer key panel, an inner key panel and a floor display that conform to general conventions, the robot can ride it autonomously, so the universality is good.
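To make the flow concrete, the following is a minimal Python sketch of the S100 to S500 sequence described above; the robot interface names (navigate_to, arm.press, lidar.door_opening_width and so on) are hypothetical placeholders, not part of the patent's disclosure.

```python
# Hypothetical robot interface; every method name below is an illustrative assumption.
def ride_elevator(robot, target_floor):
    # S100: navigate to the elevator hall and call the elevator with the arm
    robot.navigate_to(robot.map.elevator_hall(robot.current_floor))
    direction = "up" if target_floor > robot.current_floor else "down"
    robot.arm.press(robot.vision.locate_hall_button(direction))

    # S200: wait until the car travels the right way and the door is wide enough
    while not (robot.vision.car_direction() == direction and
               robot.lidar.door_opening_width() >= robot.width):
        robot.sleep(0.1)

    # S300: enter, press the target-floor key, back out on an overload alarm
    robot.navigate_to(robot.map.car_interior())
    robot.arm.press(robot.vision.locate_car_button(target_floor))
    if robot.audio.overload_alarm():
        robot.navigate_to(robot.map.elevator_hall(robot.current_floor))
        return False

    # S400: confirm arrival from the in-car display and/or the barometer
    while not (robot.vision.displayed_floor() == target_floor or
               robot.barometer.at_floor(target_floor)):
        robot.sleep(0.1)

    # S500: announce, wait for the door, and leave the car
    robot.announce("arriving")
    while robot.lidar.door_opening_width() < robot.width:
        robot.sleep(0.1)
    robot.navigate_to(robot.map.elevator_hall(target_floor))
    return True
```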
As shown in fig. 2, fig. 2 is a flowchart illustrating a method for acquiring and pressing a target key according to an embodiment of the present invention, including:
s110, controlling the robot to reach the front of a lifting elevator of the floor of the robot through a navigation command;
s120, controlling the mechanical arm to enable the outer key panel of the elevator to be in a sensing range of a visual sensor, wherein the visual sensor is positioned at the front end of the mechanical arm;
and S130, acquiring the position of the target key of the elevator through the visual sensor, and controlling the mechanical arm to press the target key.
Referring to fig. 3, fig. 3 is a second schematic flowchart of a method for obtaining and pressing a target key according to an embodiment of the present invention, including:
s131, acquiring a depth image of an elevator outer key panel through a depth camera, and detecting a first position of a target key in a depth image pixel coordinate system;
s132, acquiring a second position of the target key in a space coordinate system and a normal vector of a key plane according to the first position;
and S133, planning a first target position at the tail end of the mechanical arm according to the second position and the normal vector direction of the key plane, wherein the first target position is a target key position, controlling the tail end of the mechanical arm to reach the first target position, and pressing a target key to call the elevator.
Specifically, when the robot receives a navigation command that requires cross-floor navigation, the robot is controlled, based on its in-plane navigation algorithm, to travel to the elevator on its current floor and to call the elevator. Calling the elevator is realized as follows: after the robot reaches the front of the elevator, the mechanical arm is extended so that the elevator key panel is within the field of view of the depth camera on the arm; the position of the elevator key in the image pixel coordinate system is then detected from the color image returned by the depth camera, and from this position the position of the key in the spatial coordinate system and the normal vector of the key plane are obtained from the depth image. The target position of the end of the mechanical arm is defined according to this position and the normal vector direction, the end of the mechanical arm is controlled to reach the key position according to an arm planning and control method, and the key is pressed to call the elevator.
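A minimal sketch of this pixel-to-space step under the standard pinhole camera model is given below; the intrinsic parameters (fx, fy, cx, cy) and the plane-fitting window are assumptions for illustration, not values from the patent.

```python
import numpy as np

def pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth [m] into the camera frame."""
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def key_position_and_normal(depth_image, u, v, fx, fy, cx, cy, win=10):
    """3D key position plus the key-plane normal, estimated by fitting a plane
    (via SVD) to the depth points in a window around the detected key pixel."""
    pts = []
    h, w = depth_image.shape
    for du in range(-win, win + 1):
        for dv in range(-win, win + 1):
            uu, vv = u + du, v + dv
            if 0 <= vv < h and 0 <= uu < w and depth_image[vv, uu] > 0:
                pts.append(pixel_to_point(uu, vv, depth_image[vv, uu], fx, fy, cx, cy))
    pts = np.asarray(pts)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]              # direction of least variance = plane normal
    if normal[2] > 0:            # orient the normal back toward the camera
        normal = -normal
    key_point = pixel_to_point(u, v, depth_image[v, u], fx, fy, cx, cy)
    return key_point, normal
```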
Fig. 4 shows a flow chart of a method for detecting the opening amplitude of an elevator door, which comprises the following steps:
s210, transforming a robot navigation map coordinate system to a robot two-dimensional laser radar coordinate system based on a homogeneous transformation matrix, and obtaining coordinate points of two ends of an elevator door in the two-dimensional laser radar coordinate system;
s220, establishing a two-dimensional hyperplane through coordinate points at two ends of the elevator door;
s230, obtaining the maximum angle range and the minimum angle range detected by the laser radar according to the two-dimensional hyperplane and the thickness of the elevator door;
s240, obtaining an opening amplitude value of the elevator door based on the maximum angle ranging and the minimum angle ranging in combination with coordinate points at two ends of the elevator door;
s250, detecting the change trend of the opening amplitude value of the elevator door to obtain the state of the current elevator door, wherein the state is an opening state or a closing state;
and S260, when the state is determined to be the opening state and the opening amplitude value of the elevator door is greater than the width of the robot, navigating the robot to prepare to enter the elevator.
Specifically, in the map coordinate system, the coordinates of the two ends of the elevator door of the i-th elevator in the building are recorded, and the pose of the robot in the map is likewise recorded. Because the targets marked on the navigation map all lie in one plane, the z-axis is ignored, and the three-dimensional translation vector and rotation matrix are simplified to two dimensions. In the two-dimensional homogeneous coordinate system, the position of the i-th elevator door can be transformed into the robot's two-dimensional laser radar coordinate system through the homogeneous transformation matrix of the robot relative to the map origin.

After the position of the elevator door has been transformed into the laser radar coordinate system, the opening and closing state of the elevator door can be detected from the position of the elevator door in the laser radar coordinate system combined with the laser radar ranging points; a schematic diagram of the detection is shown in fig. 8.

The coordinates of the two ends of the elevator door in the laser radar coordinate system are obtained through the coordinate transformation. Through these two endpoints, a two-dimensional hyperplane can be established:

$w^T x + b = 0$

where the coefficients $w$ and $b$ are determined from the two endpoint coordinates, and a lidar ranging point is written in Cartesian form as

$x = (x_1, x_2)^T = (l_\theta \cos\theta,\ l_\theta \sin\theta)^T$

where $l_\theta$ is the ranging result when the radar laser ray makes an angle $\theta$ with the $x_1$ axis. Meanwhile, the effective angle range $[\theta_{\min}, \theta_{\max}]$ of the laser radar is obtained from the angles of the two door endpoints.

Points within the lidar measurement range may be grouped into the following set, where $\Delta\theta$ is the lidar angular resolution:

$S = \{\,(l_\theta \cos\theta,\ l_\theta \sin\theta)^T \mid \theta = \theta_{\min} + k\Delta\theta,\ k = 0, 1, \dots,\ \theta \le \theta_{\max}\,\}$

According to the established two-dimensional hyperplane and the thickness $w_e$ of the elevator door, the subset of ranging points located at the other end of the elevator door, i.e. the points whose distance beyond the hyperplane exceeds the door thickness $w_e$, is obtained from $S$.

From the ranging points at the other end of the elevator door, the ranging point with the largest angle $\theta_l$ and the ranging point with the smallest angle $\theta_r$ are found, and the opening amplitude of the elevator door is calculated from these two points as

$w_o = |v \tan\theta_{l3}| + |v \tan\theta_{r3}|$

where $v$ is the perpendicular distance from the lidar to the elevator door plane, $\theta_{l3}$ and $\theta_{r3}$ are the angles between that perpendicular ray and the rays towards $\theta_l$ and $\theta_r$ respectively, and $w_o$ is the width of the elevator door opening. If the opening width of the elevator door is larger than the width of the robot, the depth camera on the mechanical arm is used to detect the up/down direction indicator displayed on the elevator panel, and if the running direction of the elevator is consistent with the target direction, the robot is controlled to enter the elevator.
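The following sketch pulls the above together: it transforms the door endpoints from the map frame into the lidar frame, builds the door line, keeps the rays that pass beyond it by more than the door thickness, and evaluates the opening-width formula. The helper names and the exact normalisation are assumptions; the patent's equation images may define some intermediate quantities differently.

```python
import math

def to_lidar_frame(p_map, robot_pose):
    """Transform a 2D map point into the robot/lidar frame (x forward)."""
    xr, yr, th = robot_pose
    dx, dy = p_map[0] - xr, p_map[1] - yr
    return (math.cos(th) * dx + math.sin(th) * dy,
            -math.sin(th) * dx + math.cos(th) * dy)

def door_opening_width(scan, angle_min, angle_inc, door_ends_map, robot_pose,
                       door_thickness=0.05):
    """Estimate the elevator door opening width [m] from one lidar scan."""
    p1 = to_lidar_frame(door_ends_map[0], robot_pose)
    p2 = to_lidar_frame(door_ends_map[1], robot_pose)

    # normalised line w.x + b = 0 through the two door endpoints
    w = (p2[1] - p1[1], p1[0] - p2[0])
    norm = math.hypot(w[0], w[1])
    w = (w[0] / norm, w[1] / norm)
    b = -(w[0] * p1[0] + w[1] * p1[1])
    if b > 0:                      # orient so the lidar sits on the negative side
        w, b = (-w[0], -w[1]), -b

    th1, th2 = math.atan2(p1[1], p1[0]), math.atan2(p2[1], p2[0])
    th_lo, th_hi = min(th1, th2), max(th1, th2)

    # keep the rays that pass beyond the door plane by more than its thickness
    open_angles = []
    for i, r in enumerate(scan):
        th = angle_min + i * angle_inc
        if th_lo <= th <= th_hi and r > 0:
            px, py = r * math.cos(th), r * math.sin(th)
            if w[0] * px + w[1] * py + b > door_thickness:
                open_angles.append(th)
    if not open_angles:
        return 0.0

    v = abs(b)                                 # perpendicular distance to the door
    th3 = math.atan2(-b * w[1], -b * w[0])     # angle of the perpendicular ray
    th_l, th_r = max(open_angles), min(open_angles)
    return abs(v * math.tan(th_l - th3)) + abs(v * math.tan(th_r - th3))
```

The returned width would then be compared with the robot width, as in step S260, before the robot is navigated into the car.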
In a specific embodiment of the invention, the method further comprises detecting, through the sound sensor, whether the elevator issues an overweight alarm and, if so, navigating the robot out of the elevator car space. Detecting the overweight alarm enriches the robot's judgment and handling of abnormal conditions, making its operation more flexible and intelligent.
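The patent does not describe how the sound sensor recognises the overweight alarm; one simple possibility, sketched below under assumed frequency-band and threshold values, is to watch for sustained signal energy in the alarm tone's band.

```python
import numpy as np

ALARM_BAND_HZ = (800.0, 2500.0)   # assumed alarm tone band
ENERGY_RATIO_THRESHOLD = 0.6      # assumed

def overload_alarm_detected(samples, sample_rate):
    """Return True if most of the signal energy falls inside the alarm band."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band = (freqs >= ALARM_BAND_HZ[0]) & (freqs <= ALARM_BAND_HZ[1])
    total = spectrum.sum()
    return total > 0 and spectrum[band].sum() / total > ENERGY_RATIO_THRESHOLD
```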
Fig. 5 is a flowchart illustrating a method for obtaining and pressing a target floor button according to an embodiment of the present invention, including:
s410, acquiring a depth image of a key panel in the elevator through a depth camera, and detecting a third position of a target floor key in a depth image pixel coordinate system;
s420, acquiring a fourth position of the target floor key in a space coordinate system and a normal vector of a key plane according to the third position;
and S430, planning a second target position at the tail end of the mechanical arm according to the fourth position and the normal vector direction of the key plane, wherein the second target position is a target floor key position, controlling the tail end of the mechanical arm to reach the second target position, and pressing the target floor key.
Specifically, according to the design principles of common elevators, the key panel inside the elevator generally has a relatively fixed position. After the robot enters the elevator, the mechanical arm is controlled to move so that the elevator key panel is within the field of view of the depth camera. The position of the elevator key in the image pixel coordinate system is then detected from the color image returned by the depth camera, and from this position the position of the key in the spatial coordinate system and the normal vector of the key plane are obtained from the depth image. The target position of the end of the mechanical arm is defined according to this position and the normal vector direction, the end of the mechanical arm is controlled to reach the target floor key position according to an arm planning and control method, and the corresponding floor key is pressed.
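A small sketch of turning the detected key position and key-plane normal into arm targets follows; the standoff distance, press depth and frame conventions are illustrative assumptions.

```python
import numpy as np

def pressing_poses(key_point, plane_normal, standoff=0.05, press_depth=0.005):
    """Return (approach_position, press_position, press_direction) for the arm tip.

    The pressing direction is the direction opposite to the key-plane normal
    that faces the camera, i.e. pushing into the panel.
    """
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    press_dir = -n                                         # push into the panel
    approach = np.asarray(key_point) - press_dir * standoff   # hover in front of the key
    press = np.asarray(key_point) + press_dir * press_depth   # go slightly past the surface
    return approach, press, press_dir
```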
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for detecting a target floor according to an embodiment of the present invention, including:
s510, acquiring a first air pressure value, wherein the first air pressure value is an air pressure value acquired when the robot is ready to enter an elevator;
s520, acquiring a second air pressure value, wherein the second air pressure value is acquired after the robot enters the elevator;
s530, obtaining a target floor air pressure value based on the first air pressure value, the air pressure difference value between adjacent floors, and the floor difference value between a target floor and an initial floor when the robot is ready to enter the elevator;
and S540, comparing whether the difference between the second air pressure value and the target floor air pressure value is smaller than a preset error value, if so, enabling the robot to reach the target floor.
Specifically, when the robot enters the elevator, the current air pressure value is recorded as P0. After the target floor key is pressed, the robot starts to detect the current air pressure value P. The air pressure difference between adjacent floors, determined by the floor height, is denoted PL, and the initial floor and the target floor are Ns and Nt respectively. When the difference between the measured air pressure change ΔP = P - P0 and the calculated target floor air pressure change Δpt = (Nt - Ns) × PL is less than a given error threshold, the robot has reached the given floor.
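The check can be written directly from the quantities above; in the sketch below the per-floor pressure difference and the error threshold are assumed calibration values, not figures from the patent.

```python
# Assumed calibration: pressure at floor n+1 minus pressure at floor n (drops as the car rises).
P_PER_FLOOR_PA = -38.0
ERROR_THRESHOLD_PA = 15.0

def reached_target_floor(p_now, p_enter, start_floor, target_floor,
                         p_per_floor=P_PER_FLOOR_PA, threshold=ERROR_THRESHOLD_PA):
    """True when |dP - dPt| < threshold, with dP = P - P0 and dPt = (Nt - Ns) * PL."""
    delta_p = p_now - p_enter
    delta_pt = (target_floor - start_floor) * p_per_floor
    return abs(delta_p - delta_pt) < threshold
```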
In some embodiments of the present invention, the method further comprises: before the robot enters the elevator, detecting the continuous vacant space inside the elevator car through at least one sensor and confirming that the area of the continuous vacant space projected onto the car floor is greater than or equal to the area of the robot's own volume projected onto the ground, and then navigating the robot into the elevator car. The at least one sensor may include a laser radar sensor, an infrared sensor, a vision sensor and the like; the continuous vacant space inside the elevator car refers to a single connected space inside the car that can accommodate the volume of the robot.
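One possible realisation of this free-space check, sketched below with an assumed car size, grid resolution and doorway position, projects the lidar scan onto a small occupancy grid over the car floor and measures the connected free region reachable from the doorway.

```python
import math

def free_area_in_car(scan, angle_min, angle_inc, car_depth=1.6, car_width=1.4,
                     resolution=0.05):
    """Return the connected free area (m^2) reachable from the doorway inside the car."""
    cols = int(car_width / resolution)
    rows = int(car_depth / resolution)
    occupied = [[False] * cols for _ in range(rows)]
    for i, r in enumerate(scan):
        if r <= 0:
            continue
        th = angle_min + i * angle_inc
        x, y = r * math.cos(th), r * math.sin(th)       # x forward into the car
        col = int((y + car_width / 2) / resolution)
        row = int(x / resolution)
        if 0 <= row < rows and 0 <= col < cols:
            occupied[row][col] = True

    # flood-fill from the doorway cell to measure the connected free region
    seen, stack, free_cells = set(), [(0, cols // 2)], 0
    while stack:
        r_, c_ = stack.pop()
        if (r_, c_) in seen or not (0 <= r_ < rows and 0 <= c_ < cols) or occupied[r_][c_]:
            continue
        seen.add((r_, c_))
        free_cells += 1
        stack.extend([(r_ + 1, c_), (r_ - 1, c_), (r_, c_ + 1), (r_, c_ - 1)])
    return free_cells * resolution * resolution

# The robot may enter when free_area_in_car(...) >= its own footprint area.
```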
Referring to fig. 9, fig. 9 shows a second flowchart of the method according to the embodiment of the invention, which includes:
s1, controlling the robot to arrive at the elevator hall according to the navigation command and calling the elevator according to the target floor command key;
s2, waiting for the opening of the elevator door, detecting whether the running direction of the elevator is consistent with the target floor, and if the running direction of the elevator is consistent with the target floor, detecting whether the width of the opening of the elevator door is larger than the width of the robot so that the robot can enter the elevator;
s3, after the robot enters the elevator, detecting the key position of the target floor through a visual sensor, and pressing down the key through a mechanical arm;
s4, detecting whether the robot reaches a target floor through at least one sensor;
and S5, detecting the door opening state of the elevator after the robot is determined to reach the target floor, so that the robot leaves the elevator.
It should be noted that, through the above steps and methods, the robot can operate the elevator keys and ride the elevator by itself using its own sensors and actuators, without any change to the original elevator control system and equipment. The method is not sensitive to the elevator brand: as long as the elevator follows the common arrangement, with an outer key panel, an inner key panel and a floor display that conform to general conventions, the robot can ride it autonomously, so the universality is good.
The embodiment of the invention also includes a robot, as shown in fig. 7, the robot includes a controller, a robot arm, a moving platform and a sensor, the sensor includes a laser radar, a depth camera and a barometer, the depth camera is located at the front end of the robot arm, the controller includes a processor and a memory, a computer program capable of running on the processor is stored on the memory, and when the processor executes the computer program, the control method for the robot to take the elevator is implemented.
Although specific embodiments have been described herein, those of ordinary skill in the art will recognize that many other modifications or alternative embodiments are equally within the scope of this disclosure. For example, any of the functions and/or processing capabilities described in connection with a particular device or component may be performed by any other device or component. In addition, while various illustrative implementations and architectures have been described in accordance with embodiments of the present disclosure, those of ordinary skill in the art will recognize that many other modifications of the illustrative implementations and architectures herein are also within the scope of the present disclosure.
Certain aspects of the present disclosure are described above with reference to block diagrams and flowchart illustrations of systems, methods, systems, and/or computer program products according to example embodiments. It will be understood that one or more blocks of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by executing computer-executable program instructions. Also, according to some embodiments, some blocks of the block diagrams and flow diagrams may not necessarily be performed in the order shown, or may not necessarily be performed in their entirety. In addition, additional components and/or operations beyond those shown in the block diagrams and flow diagrams may be present in certain embodiments.
Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of elements or steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions, elements or steps, or combinations of special purpose hardware and computer instructions.
Program modules, applications, etc. described herein may include one or more software components, including, for example, software objects, methods, data structures, etc. Each such software component may include computer-executable instructions that, in response to execution, cause at least a portion of the functionality described herein (e.g., one or more operations of the illustrative methods described herein) to be performed.
The software components may be encoded in any of a variety of programming languages. An illustrative programming language may be a low-level programming language, such as assembly language associated with a particular hardware architecture and/or operating system platform. Software components that include assembly language instructions may need to be converted by an assembler program into executable machine code prior to execution by a hardware architecture and/or platform. Another exemplary programming language may be a higher level programming language, which may be portable across a variety of architectures. Software components that include higher level programming languages may need to be converted to an intermediate representation by an interpreter or compiler before execution. Other examples of programming languages include, but are not limited to, a macro language, a shell or command language, a job control language, a scripting language, a database query or search language, or a report writing language. In one or more exemplary embodiments, a software component containing instructions of one of the above programming language examples may be executed directly by an operating system or other software component without first being converted to another form.
The software components may be stored as files or other data storage constructs. Software components of similar types or related functionality may be stored together, such as in a particular directory, folder, or library. Software components may be static (e.g., preset or fixed) or dynamic (e.g., created or modified at execution time).
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the gist of the present invention.

Claims (10)

1. A robot control method is characterized by comprising the following steps:
s100, separating a navigation sub-command and target floor information of the robot moving to the elevator of the floor where the robot is located from the navigation command, sending a first moving command to a moving platform of the robot according to the navigation sub-command, enabling the robot to reach the position in front of the lifting elevator of the floor where the robot is located, then sending a first motion control command to trigger a mechanical arm of the robot to reach the position of a key of the elevator, and pressing a corresponding key for going upstairs or downstairs according to whether the target floor is located above or below the floor where the robot is located, so as to call the elevator;
s200, identifying display screen information in front of an elevator door through a visual sensor of the robot, determining that the running direction of the elevator is consistent with the direction of reaching a target floor, detecting the opening width of the elevator door, and waiting for the opening width of the elevator door to be larger than or equal to the width of the robot;
s300, sending a second movement instruction to a moving platform of the robot, enabling the robot to enter the elevator and enabling the mechanical arm to move to a preset position in front of a target floor key panel, detecting the position of the target floor key through a visual sensor, then sending a second movement control instruction to trigger the mechanical arm to press the target floor key in the elevator, detecting whether an overweight alarm exists in the elevator through a sound sensor, and if yes, navigating the robot to leave the space of the elevator car;
s400, identifying display screen information in the elevator through a visual sensor and/or comparing calibrated floor air pressure data through an air pressure sensor to determine that the floor where the robot is located reaches a target floor;
s500, when the robot is about to ascend and descend to reach a target floor or reach the target floor, sound and light reminding information is sent, then the opening width of the elevator is detected, a third moving instruction is sent at the same time, and when the opening width of the elevator is larger than or equal to the width of the robot, the robot leaves the elevator.
2. The robot control method according to claim 1, wherein the step S100 includes:
s110, controlling the robot to reach the front of a lifting elevator of the floor of the robot through a navigation command;
s120, controlling the mechanical arm to enable the elevator outer key panel to be in a sensing range of the visual sensor, wherein the visual sensor is located at the front end of the mechanical arm;
s130, obtaining the position of a target key of the elevator through the visual sensor, and controlling the mechanical arm to press the target key.
3. The robot control method according to claim 2, wherein the vision sensor is a depth camera, and the step S130 includes:
s131, acquiring a depth image of the elevator outer key panel through the depth camera, and detecting a first position of the target key in a depth image pixel coordinate system;
s132, acquiring a second position of the target key in a space coordinate system and a normal vector of a key plane according to the first position;
s133, planning a first target position of the tail end of the mechanical arm according to the second position and the normal vector direction of the key plane, wherein the first target position is the target key position, controlling the tail end of the mechanical arm to reach the first target position, and pressing the target key to call an elevator.
4. The robot control method according to claim 1, wherein the step S200 further includes:
s210, transforming a robot navigation map coordinate system to a robot two-dimensional laser radar coordinate system based on a homogeneous transformation matrix, and obtaining coordinate points of two ends of an elevator door in the two-dimensional laser radar coordinate system;
s220, establishing a two-dimensional hyperplane through coordinate points at two ends of the elevator door;
s230, obtaining the maximum angle range and the minimum angle range detected by the laser radar according to the two-dimensional hyperplane and the thickness of the elevator door;
s240, obtaining an opening amplitude value of the elevator door based on the maximum angle ranging and the minimum angle ranging in combination with coordinate points at two ends of the elevator door;
s250, detecting the change trend of the opening amplitude value of the elevator door to obtain the state of the current elevator door, wherein the state is an opening state or a closing state;
and S260, determining that the state is an opening state, and navigating the robot to prepare to enter the elevator if the opening amplitude value of the elevator door is larger than the width of the robot.
5. The robot control method according to claim 4, characterized in that the calculation of the opening amplitude value of the elevator door is carried out by:
a two-dimensional hyperplane is established through the coordinate points at the two ends of the elevator door,

$w^T x + b = 0$

wherein the coefficients $w$ and $b$ are determined from the coordinates of the two door endpoints, and a lidar ranging point is written in Cartesian form as

$x = (x_1, x_2)^T = (l_\theta \cos\theta,\ l_\theta \sin\theta)^T$

wherein $l_\theta$ is the ranging result when the radar laser ray makes an angle $\theta$ with the $x_1$ axis, and the coordinates of the two ends of the elevator door give the effective angle range $[\theta_{\min}, \theta_{\max}]$ of the laser radar;

points within the lidar measurement range are grouped into the following set, wherein $\Delta\theta$ is the lidar angular resolution:

$S = \{\,(l_\theta \cos\theta,\ l_\theta \sin\theta)^T \mid \theta = \theta_{\min} + k\Delta\theta,\ k = 0, 1, \dots,\ \theta \le \theta_{\max}\,\}$

through the two-dimensional hyperplane and the thickness $w_e$ of the elevator door, the set of ranging points located at the other end of the elevator door, namely the points whose distance beyond the hyperplane exceeds the door thickness $w_e$, is obtained from the set;

the ranging point with the largest angle $\theta_l$ and the ranging point with the smallest angle $\theta_r$ are found from the ranging points at the other end of the elevator door, and the opening amplitude of the elevator door is calculated from the maximum-angle ranging point and the minimum-angle ranging point as

$w_o = |v \tan\theta_{l3}| + |v \tan\theta_{r3}|$

wherein $v$ is the perpendicular distance from the lidar to the elevator door plane, $\theta_{l3}$ and $\theta_{r3}$ are the angles between that perpendicular ray and the rays towards $\theta_l$ and $\theta_r$ respectively, and $w_o$ is the opening amplitude of the elevator door.
6. The robot control method according to claim 1, wherein the vision sensor is a depth camera, and the step S400 further comprises:
s410, acquiring a depth image of a key panel in the elevator through the depth camera, and detecting a third position of the target floor key in a depth image pixel coordinate system;
s420, acquiring a fourth position of the target floor key in a space coordinate system and a normal vector of a key plane according to the third position;
s430, planning a second target position of the tail end of the mechanical arm according to the fourth position and the normal vector direction of the key plane, wherein the second target position is the target floor key position, controlling the tail end of the mechanical arm to reach the second target position, and pressing the target floor key to call an elevator.
7. The robot control method according to claim 1, wherein the step S500 further includes:
s510, acquiring a first air pressure value, wherein the first air pressure value is acquired when the robot is ready to enter an elevator;
s520, acquiring a second air pressure value, wherein the second air pressure value is acquired after the robot enters the elevator;
s530, obtaining a target floor air pressure value based on the first air pressure value, the air pressure difference value between adjacent floors, and the floor difference value between the target floor and the initial floor when the robot is ready to enter the elevator;
and S540, comparing whether the difference between the second air pressure value and the target floor air pressure value is smaller than a preset error value, if so, enabling the robot to reach the target floor.
8. The robot control method according to claim 1, wherein the step S300 further includes:
before the robot enters the elevator, the continuous vacant space of the elevator car is detected through at least one sensor, the area of the continuous vacant space projected to the bottom of the elevator car is determined to be larger than or equal to the area of the robot projected to the ground by the volume, and then the navigation robot enters the elevator car.
9. A robot control method is characterized by comprising the following steps:
s1, controlling the robot to arrive at the elevator hall according to the navigation command and calling the elevator according to the target floor command key;
s2, waiting for the opening of the elevator door, detecting whether the running direction of the elevator is consistent with the target floor, and if the running direction of the elevator is consistent with the target floor, detecting whether the width of the opening of the elevator door is larger than the width of the robot so that the robot can enter the elevator;
s3, after the robot enters the elevator, detecting the key position of the target floor through a visual sensor, and pressing down the key through a mechanical arm;
s4, detecting whether the robot reaches a target floor through at least one sensor;
and S5, detecting the door opening state of the elevator after the robot is determined to reach the target floor, so that the robot leaves the elevator.
10. A robot, characterized in that it comprises a controller, a robot arm, a moving platform and sensors, said sensors comprising a lidar, a depth camera and a barometer, said depth camera being located at the front end of the robot arm, said controller comprising a processor and a memory, said memory having stored thereon a computer program, which when executed by said processor performs the steps of the method according to any of claims 1 to 9.
CN202011528398.8A 2020-12-22 2020-12-22 Intelligent robot taking elevator autonomously and control method thereof Active CN112591571B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011528398.8A CN112591571B (en) 2020-12-22 2020-12-22 Intelligent robot taking elevator autonomously and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011528398.8A CN112591571B (en) 2020-12-22 2020-12-22 Intelligent robot taking elevator autonomously and control method thereof

Publications (2)

Publication Number Publication Date
CN112591571A (en) 2021-04-02
CN112591571B (en) 2022-12-13

Family

ID=75200189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011528398.8A Active CN112591571B (en) 2020-12-22 2020-12-22 Intelligent robot taking elevator autonomously and control method thereof

Country Status (1)

Country Link
CN (1) CN112591571B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113734917A (en) * 2021-08-25 2021-12-03 上海思岚科技有限公司 Hotel distribution robot terraced equipment
CN114084758A (en) * 2021-11-23 2022-02-25 江苏有熊安全科技有限公司 Robot and method for automatically getting on and off elevator and polling robot
CN114104881A (en) * 2021-11-15 2022-03-01 北京云迹科技有限公司 Robot control method and device, electronic equipment and readable storage medium
CN114148836A (en) * 2021-11-08 2022-03-08 中国科学院自动化研究所 Robot autonomous ladder taking method and device
CN114348811A (en) * 2021-12-06 2022-04-15 深圳市普渡科技有限公司 Robot, robot elevator taking method, device and storage medium
CN114536404A (en) * 2022-03-01 2022-05-27 乐聚(深圳)机器人技术有限公司 Robot-based test method, device and storage medium
CN114671309A (en) * 2022-04-14 2022-06-28 广东铃木电梯有限公司 Elevator robot control system and method
CN114905528A (en) * 2022-06-01 2022-08-16 武汉盛恒达智能科技有限公司 Automatic adjusting robot applied to stair movement and using method
CN114988237A (en) * 2022-06-16 2022-09-02 深圳优地科技有限公司 Robot interactive ladder taking method and device, electronic equipment and readable storage medium
CN115352974A (en) * 2022-08-29 2022-11-18 广州鲁邦通物联网科技股份有限公司 Robot elevator taking number-eliminating prevention method and system
CN115432525A (en) * 2022-10-21 2022-12-06 北京云迹科技股份有限公司 Robot floor positioning method and related equipment
CN115432524A (en) * 2022-10-21 2022-12-06 北京云迹科技股份有限公司 Robot floor positioning method and related equipment

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105858383A (en) * 2016-05-31 2016-08-17 北京云迹科技有限公司 Automatic elevator-in/out system and method
CN105867390A (en) * 2016-06-16 2016-08-17 中南大学 Method for controlling transport robot to autonomously enter elevator
CN107021390A (en) * 2017-04-18 2017-08-08 上海木爷机器人技术有限公司 The control method and system of elevator are taken by robot
CN107055241A (en) * 2017-04-18 2017-08-18 上海木爷机器人技术有限公司 The control method and system of elevator are taken by robot
US10857679B1 (en) * 2017-08-31 2020-12-08 Savioke, Inc. Apparatus and method for auxiliary mobile robot functionality
CN108002154A (en) * 2017-11-22 2018-05-08 上海思岚科技有限公司 The method that control robot is moved across floor
US20190204844A1 (en) * 2017-12-28 2019-07-04 Tessa Lau Apparatus, System, and Method for Mobile Robot Relocalization
JP2020111394A (en) * 2019-01-08 2020-07-27 東芝エレベータ株式会社 Elevator apparatus, elevator system, and control method for elevator apparatus
CN109748164A (en) * 2019-01-30 2019-05-14 苏州优智达机器人有限公司 A kind of robot and elevator exchange method and system
CN110713087A (en) * 2019-10-21 2020-01-21 北京猎户星空科技有限公司 Elevator door state detection method and device
CN111728533A (en) * 2020-06-01 2020-10-02 珠海市一微半导体有限公司 Movement control method for robot to get in and out of elevator, laser robot and chip

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113734917A (en) * 2021-08-25 2021-12-03 上海思岚科技有限公司 Hotel distribution robot terraced equipment
CN114148836A (en) * 2021-11-08 2022-03-08 中国科学院自动化研究所 Robot autonomous ladder taking method and device
CN114104881A (en) * 2021-11-15 2022-03-01 北京云迹科技有限公司 Robot control method and device, electronic equipment and readable storage medium
CN114084758A (en) * 2021-11-23 2022-02-25 江苏有熊安全科技有限公司 Robot and method for automatically getting on and off elevator and polling robot
CN114348811B (en) * 2021-12-06 2024-04-09 深圳市普渡科技有限公司 Robot, robot boarding method, robot boarding device, and storage medium
CN114348811A (en) * 2021-12-06 2022-04-15 深圳市普渡科技有限公司 Robot, robot elevator taking method, device and storage medium
CN114536404A (en) * 2022-03-01 2022-05-27 乐聚(深圳)机器人技术有限公司 Robot-based test method, device and storage medium
CN114671309A (en) * 2022-04-14 2022-06-28 广东铃木电梯有限公司 Elevator robot control system and method
CN114905528A (en) * 2022-06-01 2022-08-16 武汉盛恒达智能科技有限公司 Automatic adjusting robot applied to stair movement and using method
CN114988237A (en) * 2022-06-16 2022-09-02 深圳优地科技有限公司 Robot interactive ladder taking method and device, electronic equipment and readable storage medium
CN114988237B (en) * 2022-06-16 2024-05-07 深圳优地科技有限公司 Robot interactive elevator taking method and device, electronic equipment and readable storage medium
CN115352974B (en) * 2022-08-29 2024-01-02 广州鲁邦通物联网科技股份有限公司 Number eliminating prevention method and system for robot elevator taking
CN115352974A (en) * 2022-08-29 2022-11-18 广州鲁邦通物联网科技股份有限公司 Robot elevator taking number-eliminating prevention method and system
CN115432525A (en) * 2022-10-21 2022-12-06 北京云迹科技股份有限公司 Robot floor positioning method and related equipment
CN115432524A (en) * 2022-10-21 2022-12-06 北京云迹科技股份有限公司 Robot floor positioning method and related equipment
CN115432524B (en) * 2022-10-21 2023-12-26 北京云迹科技股份有限公司 Robot floor positioning method and related equipment
CN115432525B (en) * 2022-10-21 2024-03-01 北京云迹科技股份有限公司 Robot floor positioning method and related equipment

Also Published As

Publication number Publication date
CN112591571B (en) 2022-12-13

Similar Documents

Publication Publication Date Title
CN112591571B (en) Intelligent robot taking elevator autonomously and control method thereof
US20200333789A1 (en) Information processing apparatus, information processing method, and medium
US20230168686A1 (en) Information processing apparatus, information processing method, information processing system, and storage medium
US5938710A (en) Selectively operable industrial truck
KR100922494B1 (en) Method for measuring pose of a mobile robot and method and apparatus for measuring position of the mobile robot using the method
US20190070730A1 (en) Robot system
CN110928291B (en) Information processing apparatus, information processing method, information processing system, and storage medium
US9902061B1 (en) Robot to human feedback
KR20090009172A (en) Method and apparatus for measuring position of the mobile robot
US20210271262A1 (en) Autonomous Mobile Robot And Method For Controlling An Autonomous Mobile Robot
US20210197369A1 (en) Robot system and supplemental learning method
CN114800535B (en) Robot control method, mechanical arm control method, robot and control terminal
CN114148836B (en) Robot autonomous ladder taking method and device
CN112549043A (en) Collision prediction method and device for construction operation equipment and construction operation equipment
JP6609588B2 (en) Autonomous mobility system and autonomous mobility control method
US20240085916A1 (en) Systems and methods for robotic detection of escalators and moving walkways
JP7412634B2 (en) Method and system for contactless elevator control
KR102500684B1 (en) Robot cleaner and method for controlling robot cleaner
CN114084758A (en) Robot and method for automatically getting on and off elevator and polling robot
KR20220084991A (en) Building with system for detecting abnormality in sensor of robot using elevator
CN113064425A (en) AGV equipment and navigation control method thereof
CN111399518A (en) Multi-sensor-based cooperative robot obstacle avoidance system and control method thereof
JP2021136009A (en) Information processing apparatus, information processing method, and program
EP4092576A2 (en) Learning device, learning method, and computer program product for training
EP4170388A1 (en) Method and system for spatial static map construction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant