CN115167395A - In-plane robot movement control method, device, robot, and storage medium - Google Patents

In-plane robot movement control method, device, robot, and storage medium Download PDF

Info

Publication number
CN115167395A
CN115167395A
Authority
CN
China
Prior art keywords
robot
image
point
axis
monocular camera
Prior art date
2022-06-16
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210680975.8A
Other languages
Chinese (zh)
Inventor
张斌
魏建超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Boya Ruishi Technology Co., Ltd.
Original Assignee
Beijing Boya Ruishi Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-06-16
Filing date
2022-06-16
Publication date
2022-10-11
Application filed by Beijing Boya Ruishi Technology Co., Ltd.
Priority to CN202210680975.8A
Publication of CN115167395A
Legal status: Pending


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The application provides a method and a device for controlling the movement of a robot in a plane, a robot, and a storage medium. The method comprises the following steps: shooting the advancing direction of the robot with a monocular camera; sending the image captured by the monocular camera to a user terminal so that a user can mark, on the image, a target point for the robot to move to; generating a movement control command from the image, marked with the target point, returned by the user terminal; and controlling the robot to travel to the position in physical space corresponding to the target point according to the movement control command. Because the method only captures an image of the robot's direction of travel with a monocular camera and steers the robot to the target position marked on the returned image, the movement control process is computationally simple; moreover, the monocular camera can be replaced by a night-vision camera, so the movement control process works normally even in low light.

Description

In-plane robot movement control method, device, robot, and storage medium
Technical Field
The application relates to the technical field of robots, and in particular to a method and a device for controlling the movement of a robot in a plane, a robot, and a storage medium.
Background
With the continuous development of science and technology, robots have gradually entered people's lives. In practical applications, a robot can not only accept human commands but also run pre-programmed programs or act according to principles formulated with artificial intelligence technology. Robots can be used indoors or outdoors, in industry or in the home; they can replace security patrols, clean floors in place of people, and serve as family companions or office assistants.
Generally, robot movement control falls into two modes. In the first, the robot moves directly according to set signals such as speed and angle. In the second, a target point is given; the robot plans a path autonomously, generates a series of movement control commands, converts them into signals such as speed and angle, and executes them in sequence to reach the final target point. The first mode has the advantage of simple, direct control, but it is unfriendly for the user to set commands, cannot correspond to a specific target position, and requires continuous manual adjustment. The second mode is friendly to operate, since the robot autonomously generates the movement control commands once a target point is set; however, generating those commands is computationally complex, a model of the robot's environment must be built in advance before target-point coordinates can be set, more sensors are required, and the computing-power demands on the robot are high.
Disclosure of Invention
The application provides a method and a device for controlling the movement of a robot in a plane, a robot, and a storage medium, which achieve a computationally simple movement control process and steer the robot to a marked target position.
In a first aspect, the present application provides an in-plane robot movement control method, including:
shooting the advancing direction of the robot with a monocular camera, wherein the monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles are all 0 degrees;
sending the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to;
generating a movement control command from the image, marked with the target point, returned by the user terminal;
and controlling the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
In a second aspect, the present application provides an in-plane robot movement control apparatus comprising:
the acquisition module, used for shooting the advancing direction of the robot with a monocular camera, wherein the monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles are all 0 degrees;
the transceiving module, used for sending the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to;
the movement control module, used for generating a movement control command from the image, marked with the target point, returned by the user terminal, and for controlling the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
A third aspect of the present application provides a robot comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method of the first aspect of the present application.
A fourth aspect of the present application provides a computer readable storage medium having computer readable instructions stored thereon which are executable by a processor to implement the method of the first aspect of the present application.
Compared with the prior art, the in-plane robot movement control method and apparatus, robot, and storage medium provided by the embodiments of the present application shoot the advancing direction of the robot with a monocular camera; send the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to; generate a movement control command from the image, marked with the target point, returned by the user terminal; and control the robot to travel to the target position in physical space corresponding to the target point according to the movement control command. Because the application only captures an image of the robot's direction of travel with a monocular camera and steers the robot to the target position marked on the returned image, the movement control process is computationally simple; moreover, the monocular camera can be replaced by a night-vision camera, so the movement control process works normally even in low light.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart of an in-plane robot movement control method provided by the present application;
FIG. 2 is a schematic diagram of a user terminal and a robot provided by the present application;
FIG. 3 is a flowchart of a robot movement control process provided by the present application;
FIG. 4 is a schematic diagram of an image taken by a monocular camera provided by the present application;
FIG. 5 is a top view of a first cross-section provided by the present application;
FIG. 6 is a side view of a second cross-section provided by the present application;
FIG. 7 is a top view of the combination of the robot chassis 10 and the monocular camera 20 provided by the present application;
FIG. 8 is a schematic diagram of an in-plane robot movement control apparatus provided by the present application;
FIG. 9 is a schematic diagram of a robot provided by the present application;
FIG. 10 is a schematic diagram of a computer-readable storage medium provided by the present application.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
It is to be noted that, unless otherwise specified, technical or scientific terms used herein shall have the ordinary meaning as understood by those skilled in the art to which this application belongs.
In addition, the terms "first" and "second", etc. are used to distinguish different objects, rather than to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Robots can be classified by application scenario into industrial robots, service robots, educational robots, agricultural robots, and so on, and movement control is an important part of robot control. Generally, movement control is converted into signals such as speed and angle, which are applied to actuating components such as motors. Movement control commands fall into directly set and indirectly set commands. In the direct mode, an upper computer, such as a joystick, handle, keyboard, or gesture-recognition device, sends signals such as speed and angle straight to the robot. In the indirect mode, the transmitted movement control command contains no parameters such as speed and angle and is generally the coordinates of a target point.
The indirect mode usually relies on a navigation method, which builds a model of the surrounding environment, provides pose information to the robot, and helps it reach the set target point. Navigation methods include lidar, ultrasonic radar, satellite navigation, and visual navigation. Lidar is relatively expensive; ultrasonic radar has a limited ranging range and poor directivity; satellite navigation has limited coverage, and satellite signals received indoors are weak, so its navigation performance is poor; visual navigation has no such usage restrictions, works both indoors and outdoors, and has strong anti-interference capability.
In visual navigation, an optical camera acquires image data, feature points for pose estimation are extracted from it, the robot's pose is solved from several groups of feature points, and the robot is controlled to execute the next movement control command according to the obtained pose information. This calculation process is complex and places high computing-power demands on the robot.
In view of the above, the present application provides an in-plane robot movement control method, an in-plane robot movement control apparatus, a robot, and a storage medium.
To further illustrate aspects of the embodiments of the present application, reference is made to the following description taken in conjunction with the accompanying drawings. It is to be understood that, in the following embodiments, the same or corresponding content may be cross-referenced; for simplicity and convenience of description, repeated description is omitted.
Referring to fig. 1, which is a flowchart of an in-plane robot movement control method provided in an embodiment of the present application, the execution subject of the method may be a robot, such as a floor-sweeping robot. As shown in fig. 1, the in-plane robot movement control method may include the following steps S101 to S104:
s101, shooting the advancing direction of the robot by using a monocular camera.
The monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles, namely pitch, yaw, and roll, are all 0 degrees.
Specifically, the robot moves in a plane, and the monocular camera shoots the advancing direction to obtain an image of it, in which the lower half is the ground and the upper half may be a wall surface.
S102, sending the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to.
S103, generating a movement control command from the image, marked with the target point, returned by the user terminal.
Specifically, step S103 may be implemented as: converting the target point marked on the image into a pixel offset relative to the center of the image; calculating, from the pixel offset, the relative displacement between the robot's current position and the target position in physical space corresponding to the target point; and generating a movement control command according to the relative displacement.
Specifically, the movement control command may be generated according to the relative displacement based on a shortest path algorithm.
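Because the robot moves in a plane toward a single marked ground point, the shortest path in open space reduces to turning in place and then driving straight. A minimal Python sketch of this decomposition follows (the function name and the sign convention are illustrative assumptions, not taken from this application):

```python
import math

def plan_shortest_path(r_x, r_y):
    # r_x, r_y: offsets of the target from the robot centre, with the
    # y-axis along the robot's current heading.
    turn_angle = -math.atan2(r_x, r_y)  # steering angle, counterclockwise positive
    distance = math.hypot(r_x, r_y)     # straight-line run after turning
    return turn_angle, distance
```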
S104, controlling the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
Specifically, the movement control process of the application mainly relates to two devices, namely a remote control user terminal and a robot.
In some embodiments, the functional modules of the two devices are as shown in fig. 2. The user terminal is used for setting the target position and comprises a wireless transceiving module and a touch display screen: the wireless transceiving module receives images returned by the robot, the touch display screen displays the returned image, the user sets a target point on the displayed image, and the image containing the target point is finally sent back to the robot through the wireless transceiving module. The robot comprises the following modules: a wireless transceiving module, a monocular camera, an obstacle detection module, and a movement control module. The monocular camera shoots pictures of the robot's advancing direction and transmits the image data to the user terminal through the wireless transceiving module. If a fisheye camera is used as the monocular camera, the captured picture is distorted, so the image must undergo distortion removal before being returned; otherwise the accuracy of the subsequent movement control is affected. The obstacle detection module detects obstacles around the robot; when an obstacle is detected, the robot executes an obstacle avoidance algorithm. The movement control module controls the movement of the robot; its input is the position of the marked point on the image, its output is a movement control command, and the specific calculation method is detailed later.
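If a fisheye camera is used, the distortion removal step could look like the following minimal sketch using OpenCV; the calibration inputs camera_matrix and dist_coeffs are assumed to come from a prior calibration (e.g., cv2.calibrateCamera) and are not specified by this application:

```python
import cv2

def undistort_frame(frame, camera_matrix, dist_coeffs):
    # Compensate the lens distortion so that straight lines in the scene
    # appear straight in the returned image; for strongly fisheye lenses,
    # cv2.fisheye.undistortImage is the analogous call.
    return cv2.undistort(frame, camera_matrix, dist_coeffs)
```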
In some embodiments of the present application, the robot movement control process is shown in fig. 3. The robot in the present application is one that can move freely in a plane; the user terminal is a smartphone on which custom-developed application software is deployed to receive images, display them, mark target points, and send the marked-point data.
Specifically, the robot captures images within a certain viewing angle through the mounted monocular camera; the captured content includes the horizontal ground and various obstacles. The robot transmits the image back to the user terminal, which displays it on its screen. The user operating the terminal marks a target point on the touch screen; this point corresponds to an actual position (i.e., the target position) in physical space. Since the method only considers marking points in a planar environment, the selected target point must also lie on the plane; if the user selects a point that is not on the plane, the robot cannot generate a valid movement control command.
From the target point marked on the image, the robot computes a pixel offset. From the pixel offset it calculates the relative displacement in physical space between the target point and its current position. Once the movement direction and distance are determined, the robot can generate a movement control command that takes it to the target position along the shortest path. While approaching the target position it may encounter an obstacle; if one is detected, an obstacle avoidance algorithm is executed to go around it. Whether or not an obstacle is met, the robot continuously checks whether it has moved to the target point. If the target position has been reached, execution of the control command stops, the robot halts, and control is complete; if the target point has not yet been reached, the control command continues to be executed. The shortest-path and obstacle-avoidance algorithms use related art and are not detailed in this application; a schematic version of this loop is sketched below.
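For illustration only, the loop of fig. 3 might be organized as below; every callable is a hypothetical placeholder for the robot's own modules, not an interface defined by this application:

```python
def run_to_target(drive, stop, detect_obstacle, avoid_obstacle,
                  distance_remaining, tolerance=0.05):
    # Schematic version of the flow in fig. 3: keep executing the movement
    # command, detour around obstacles, and check the odometer until the
    # remaining distance falls within the tolerance.
    while distance_remaining() > tolerance:
        if detect_obstacle():
            avoid_obstacle()  # execute the obstacle avoidance algorithm
        else:
            drive()           # continue executing the movement control command
    stop()                    # target reached; stop and complete control
```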
This application mainly studies how to convert a target point marked on an image into the relative displacement between that target point and the robot's current position, and finally into a movement control command; specifically, the target point is a point on the horizontal ground in the scene.
As shown in fig. 4, the rectangle represents the image frame captured by the monocular camera; the height of the image frame is denoted h and its width w. The x-axis represents the horizontal direction and is perpendicular to the robot's direction of travel; the z-axis represents the direction perpendicular to the ground. Point O represents the center of the image, is the intersection of the x-axis and the z-axis, and lies on the same axis as the focal point of the monocular camera. Point C in the image is the marked target point: the number of pixels from C to the z-axis is denoted C_w and the projection of C onto the z-axis is C_z; the number of pixels from C to the x-axis is denoted C_h and the projection of C onto the x-axis is C_x.
For a given image sensor, the physical size of each pixel is known: the size of each pixel in the x-axis direction is denoted x_p and in the z-axis direction z_p. For a given camera lens, the focal length is also known; the focal length of the monocular camera is denoted f. The present application only handles target points that fall on the horizontal ground, so in fig. 4 the marked point C lies below the x-axis, indicating that the corresponding target point in physical space is on the horizontal ground.
The pixel offset of the marked point C in the z-axis direction of the image is OC_z = C_h * z_p.
The pixel offset of the marked point C in the x-axis direction of the image is OC_x = C_w * x_p.
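As a minimal sketch, the two offsets could be computed as follows; the image-coordinate convention (origin at the top-left corner, v increasing downward) is an assumption of this example rather than something stated in the application:

```python
def pixel_offsets(u, v, w, h, x_p, z_p):
    # (u, v): marked pixel; (w, h): image width and height in pixels;
    # x_p, z_p: physical pixel sizes of the sensor.
    c_w = u - w / 2.0  # pixel count from the z-axis (horizontal offset)
    c_h = v - h / 2.0  # pixel count below the x-axis (positive for ground points)
    return c_w * x_p, c_h * z_p  # OC_x = C_w * x_p, OC_z = C_h * z_p
```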
Fig. 5 shows a top view of a first cross-section through the focal point F, viewed along the z-axis; the first cross-section is the plane defined by the x-axis and the focal point F. In fig. 5, the thick segment on the x-axis represents the image frame, whose width is w, and the projection of point C onto the x-axis is C_x. T_z is the projection, onto this cross-section, of the corresponding target position in physical space, and F is the focal point of the monocular camera. The distance from T_z to the focal point F in the x-axis direction is T_zK, and the intersection of FK with the x-axis is the image center O.
Fig. 6 shows a side view of a second cross-section through the focal point F, viewed along the x-axis; the second cross-section is the plane defined by the z-axis and the focal point F. The thick segment on the z-axis represents the image frame, whose height is h, and the projection of the target point C onto the z-axis is C_z. T_y is the projection, onto the second cross-section, of the corresponding target position T in physical space, and F is the focal point of the monocular camera. The vertical from the focal point F to the ground is FN, and the intersection of T_yN with the z-axis is M.
Step one: calculate the vertical distance T_yM from the target position T in physical space to the image frame.
In fig. 6, by the theorem of similar triangles, the following first equation holds:
C_zM / FN = T_yM / T_yN
FN is the vertical distance from the camera's focal point to the horizontal ground and can be measured. C_zM = OM - OC_z = FN - OC_z, where OC_z is obtained from the pixel offset of the target point C in the z-axis direction of the image frame, OC_z = C_h * z_p. MN equals the focal length f of the monocular camera, so T_yN = T_yM + MN = T_yM + f, and the first equation can be equivalently converted into the following second equation:
(FN - OC_z) / FN = T_yM / (T_yM + f)
The second equation contains only one unknown quantity, T_yM, so the length of T_yM can be obtained.
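Rearranged explicitly, the second equation gives T_yM = f * (FN - OC_z) / OC_z (a direct algebraic consequence, stated here for clarity). Note that T_yM grows without bound as OC_z approaches 0, matching the intuition that marked points near the image center row, i.e., near the horizon, are far away.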
Step two: calculate the distance T_zK, in the x-axis direction, from the target position T in physical space to the image center O.
In fig. 5, by the theorem of similar triangles, the following third equation holds:
C_xO / FO = T_zK / FK
FO is the focal length f of the monocular camera, and OK is the vertical distance from the target position T in physical space to the image frame, which was found in step one, so OK = T_yM and FK = FO + OK = f + T_yM. C_xO is obtained from the pixel offset of the target point C in the x-axis direction of the image frame, C_xO = C_w * x_p. The third equation can be equivalently converted into the following fourth equation:
C_xO / f = T_zK / (f + T_yM)
The fourth equation contains only one unknown quantity, T_zK, so the length of T_zK can be obtained.
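Explicitly, the fourth equation gives T_zK = C_xO * (f + T_yM) / f, again a direct rearrangement.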
Step three: calculate the relative displacement from the robot center to the target position in physical space.
As shown in fig. 7, a top view of the robot chassis 10 and the monocular camera 20, point T is the target position. The lengths of T_yM and T_zK have been solved in the preceding steps. L_x and L_y are the offset distances of the monocular camera from the robot's center position in the x-axis and y-axis directions, respectively. Denoting the offsets of the target position T from the robot's center position in the x-axis and y-axis directions as R_x and R_y, we have R_x = T_zK + L_x and R_y = T_yM + L_y.
Taking the counterclockwise direction in fig. 7 as the positive direction of the turning angle, the steering angle the robot should turn through to reach the target position T in this embodiment is θ = -arctan(R_x / R_y).
Step four: generate the robot's movement control command.
With the displacements R_x and R_y of the robot center relative to the target position obtained in step three, a basic movement speed value is set, and the robot's angular speed value is set according to the calculated steering angle θ; while the robot moves at the set speed and angle, its completed displacement is recorded by its odometer. When the completed displacement indicates that the target position has been reached, the robot stops moving.
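Putting steps one through four together, a minimal end-to-end sketch might read as follows; variable names mirror the description, consistent units (e.g., meters) are assumed, and the point is assumed to be marked on the ground ahead of the robot so that OC_z > 0 and R_y > 0:

```python
import math

def displacement_and_heading(oc_x, oc_z, f, fn, l_x, l_y):
    # Step one: forward distance from the image frame to the target.
    t_y_m = f * (fn - oc_z) / oc_z
    # Step two: lateral distance from the image center to the target.
    t_z_k = oc_x * (f + t_y_m) / f
    # Step three: offsets relative to the robot center, and steering angle.
    r_x = t_z_k + l_x
    r_y = t_y_m + l_y
    theta = -math.atan(r_x / r_y)  # counterclockwise positive
    return r_x, r_y, theta
```

The returned R_x, R_y, and θ then feed the basic speed and angular-speed settings of step four.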
The solution from the robot's current position to the target position is divided into four steps, each step's calculation depending on the result of the previous one; this design keeps the calculation simple and convenient and reduces the amount of computation.
It is worth mentioning that the solution of the relative displacement from the robot's current position to the target position can be further illustrated by establishing a coordinate system. The monocular camera can be replaced by a night-vision camera, so that the movement control process runs normally even in low light. The user terminal is not limited to a mobile phone; it can also be computer software or a Web page. The robot's locomotion can be wheeled, tracked, or quadruped; the application is not limited in this respect.
Current robot movement control methods fall into two types. One directly sets signals such as speed and angle, and cannot intuitively reflect the target position to be reached. The other uses navigation: the coordinates of a target point are set and converted into signals such as speed and angle, but this requires building a model of the surrounding environment, a computationally complex process. To address these problems, the present application provides a new method: a monocular camera shoots a picture in the direction of travel, and a point is marked on the returned image, realizing a computationally simple movement control process that brings the robot to the marked target position. The method requires no modeling of the surroundings, the captured picture reflects the real-time state of the environment, and the marked point on the image intuitively indicates the target position. The calculation from the robot's current position to the target position is simple, yielding the relative distance and rotation angle, which are easily converted into a movement control command containing signals such as speed and angle.
In the above embodiments, an in-plane robot movement control method is provided; correspondingly, the present application also provides an in-plane robot movement control apparatus. Please refer to fig. 8, which is a schematic diagram of an in-plane robot movement control apparatus according to some embodiments of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described relatively simply; for relevant points, refer to the corresponding parts of the method embodiments. The apparatus embodiments described below are merely illustrative.
As shown in fig. 8, the in-plane robot movement control apparatus 100 may include:
an acquisition module 101, configured to shoot the advancing direction of the robot with a monocular camera, wherein the monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles are all 0 degrees;
a transceiving module 102, configured to send the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to;
a movement control module 103, configured to generate a movement control command from the image, marked with the target point, returned by the user terminal, and to control the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
In a possible implementation manner, the movement control module 103 is specifically configured to:
converting the target point marked on the image into a pixel offset relative to the center of the image;
calculating the relative displacement between the target position corresponding to the target point in the physical space and the current position of the robot according to the pixel offset;
and generating a movement control command according to the relative displacement.
In a possible implementation manner, the movement control module 103 is specifically configured to:
the height of the image is h and its width is w; the x-axis represents the horizontal direction and is perpendicular to the robot's advancing direction, and the z-axis represents the direction perpendicular to the ground; point O represents the center of the image, is the intersection of the x-axis and the z-axis, and lies on the same axis as the focal point of the monocular camera;
point C represents the target point marked in the image; the number of pixels from C to the z-axis is C_w, and the projection of C onto the z-axis is C_z; the number of pixels from C to the x-axis is C_h, and the projection of C onto the x-axis is C_x; the size of each pixel in the x-axis direction is x_p, and the size of each pixel in the z-axis direction is z_p; then,
the pixel offset of point C in the z-axis direction of the image is OC_z = C_h * z_p;
the pixel offset of point C in the x-axis direction of the image is OC_x = C_w * x_p.
In a possible implementation manner, the movement control module 103 is specifically configured to:
the focal point of the monocular camera is F and its focal length is f;
the first cross-section is the plane defined by the x-axis and the focal point F; T_z is the projection, onto the first cross-section, of the corresponding target position T in physical space; the distance from T_z to the focal point F in the x-axis direction is T_zK, and the intersection of FK with the x-axis is the image center O;
the second cross-section is the plane defined by the z-axis and the focal point F; T_y is the projection, onto the second cross-section, of the corresponding target position T in physical space; the vertical from the focal point F to the horizontal ground is FN, and the intersection of T_yN with the z-axis is M;
step one: calculate the vertical distance T_yM from the target position T in physical space to the image frame;
by the theorem of similar triangles, the following first equation holds:
C_zM / FN = T_yM / T_yN
where C_zM = OM - OC_z = FN - OC_z, and MN equals the focal length f of the monocular camera, so T_yN = T_yM + MN = T_yM + f; the first equation may be equivalently transformed into the following second equation:
(FN - OC_z) / FN = T_yM / (T_yM + f)
the second equation contains only one unknown quantity, T_yM, and the length of T_yM is solved from it;
step two: calculate the distance T_zK, in the x-axis direction, from the target position T in physical space to the image center O;
by the theorem of similar triangles, the following third equation holds:
C_xO / FO = T_zK / FK
where FO is the focal length f of the monocular camera and OK is the vertical distance from the target position T in physical space to the image frame, i.e., OK = T_yM and FK = FO + OK = f + T_yM; the third equation may be equivalently converted into the following fourth equation:
C_xO / f = T_zK / (f + T_yM)
the fourth equation contains only one unknown quantity, T_zK, and the length of T_zK is obtained from it;
step three: calculate the relative displacement from the robot center to the target position in physical space;
L_x and L_y are the offset distances of the monocular camera from the robot's center position in the x-axis and y-axis directions, respectively; the offsets of the target position T from the robot's center in the x-axis and y-axis directions are R_x and R_y, with R_x = T_zK + L_x and R_y = T_yM + L_y;
taking the counterclockwise direction in the top view of the first cross-section as the positive direction of the rotation angle, the steering angle for the robot to reach the target position T is θ = -arctan(R_x / R_y).
In a possible implementation manner, the movement control module 103 is specifically configured to:
and generating a movement control command according to the relative displacement based on a shortest-path algorithm.
In a possible implementation manner, the transceiver module 102 is specifically configured to:
and sending the image shot by the monocular camera to a user terminal after distortion removal.
In a possible implementation manner, the movement control module 103 is specifically configured to:
detecting obstacles in the process of controlling the robot to run to the target position according to the movement control command;
and if the obstacle is detected, executing an obstacle avoidance algorithm to avoid the obstacle until the robot runs to the target position.
The in-plane robot movement control apparatus provided by the embodiments of the present application is based on the same inventive concept as the in-plane robot movement control method provided by the embodiments of the present application, and has the same beneficial effects as the method it adopts, runs, or implements.
Embodiments of the present application also provide a robot, such as a cleaning robot, a service robot, etc., corresponding to the in-plane robot movement control method provided in the foregoing embodiments, so as to execute the in-plane robot movement control method.
Please refer to fig. 9, which illustrates a schematic diagram of a robot provided in some embodiments of the present application. As shown in fig. 9, the robot 30 comprises a processor 200, a memory 201, a bus 202, and a communication interface 203; the processor 200, the communication interface 203, and the memory 201 are connected through the bus 202. The memory 201 stores a computer program executable on the processor 200, and when executing it the processor 200 performs the in-plane robot movement control method provided in any of the foregoing embodiments.
The memory 201 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. Communication between this system's network element and at least one other network element is realized through at least one communication interface 203 (wired or wireless), which may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like.
The bus 202 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, etc. The memory 201 is used for storing a program; the processor 200 executes the program after receiving an execution instruction, and the in-plane robot movement control method disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 200.
The processor 200 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 200 or by instructions in the form of software. The processor 200 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly executed by a hardware decoding processor or by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium mature in the art, such as RAM, flash memory, ROM, PROM or EPROM, or registers. The storage medium is located in the memory 201, and the processor 200 reads the information in the memory 201 and completes the steps of the above method in combination with its hardware.
The robot provided by the embodiment of the application and the in-plane robot movement control method provided by the embodiment of the application have the same beneficial effects as the method adopted, operated or realized by the robot.
Referring to fig. 10, the computer readable storage medium is an optical disc 40, on which a computer program (i.e., a program product) is stored, and when the computer program is executed by a processor, the computer program performs the in-plane robot movement control method provided in any of the foregoing embodiments.
It should be noted that examples of the computer-readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memories (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical and magnetic storage media, which are not described in detail herein.
The computer-readable storage medium provided by the above-mentioned embodiments of the present application and the in-plane robot movement control method provided by the embodiments of the present application have the same advantages as the method adopted, run or implemented by the application program stored in the computer-readable storage medium.
It should be noted that the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present disclosure, and the present disclosure should be construed as being covered by the claims and the specification.

Claims (10)

1. An in-plane robot movement control method, comprising:
shooting the advancing direction of the robot with a monocular camera, wherein the monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles are all 0 degrees;
sending the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to;
generating a movement control command from the image, marked with the target point, returned by the user terminal;
and controlling the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
2. The method according to claim 1, wherein the generating of the movement control command according to the image marked with the target point returned by the user terminal comprises:
converting the target point marked on the image into a pixel offset relative to the center of the image;
calculating the relative displacement between the target position corresponding to the target point in the physical space and the current position of the robot according to the pixel offset;
and generating a movement control command according to the relative displacement.
3. The method of claim 2, wherein converting the target point marked on the image to a pixel offset relative to the center of the image comprises:
the height of the image is h and its width is w; the x-axis represents the horizontal direction and is perpendicular to the robot's advancing direction, and the z-axis represents the direction perpendicular to the ground; point O represents the center of the image, is the intersection of the x-axis and the z-axis, and lies on the same axis as the focal point of the monocular camera;
point C represents the target point marked in the image; the number of pixels from C to the z-axis is C_w, and the projection of C onto the z-axis is C_z; the number of pixels from C to the x-axis is C_h, and the projection of C onto the x-axis is C_x; the size of each pixel in the x-axis direction is x_p, and the size of each pixel in the z-axis direction is z_p; then,
the pixel offset of point C in the z-axis direction of the image is OC_z = C_h * z_p;
the pixel offset of point C in the x-axis direction of the image is OC_x = C_w * x_p.
4. The method of claim 3, wherein calculating a relative displacement of a target position corresponding to the target point in physical space from a current position of the robot according to the pixel offset comprises:
the focal point of the monocular camera is F and its focal length is f;
the first cross-section is the plane defined by the x-axis and the focal point F; T_z is the projection, onto the first cross-section, of the corresponding target position T in physical space; the distance from T_z to the focal point F in the x-axis direction is T_zK, and the intersection of FK with the x-axis is the image center O;
the second cross-section is the plane defined by the z-axis and the focal point F; T_y is the projection, onto the second cross-section, of the corresponding target position T in physical space; the vertical from the focal point F to the horizontal ground is FN, and the intersection of T_yN with the z-axis is M;
step one: calculate the vertical distance T_yM from the target position T in physical space to the image frame;
by the theorem of similar triangles, the following first equation holds:
C_zM / FN = T_yM / T_yN
where C_zM = OM - OC_z = FN - OC_z, and MN equals the focal length f of the monocular camera, so T_yN = T_yM + MN = T_yM + f; the first equation may be equivalently converted into the following second equation:
(FN - OC_z) / FN = T_yM / (T_yM + f)
the second equation contains only one unknown quantity, T_yM, and the length of T_yM is found from it;
step two: calculate the distance T_zK, in the x-axis direction, from the target position T in physical space to the image center O;
by the theorem of similar triangles, the following third equation holds:
C_xO / FO = T_zK / FK
where FO is the focal length f of the monocular camera and OK is the vertical distance from the target position T in physical space to the image frame, i.e., OK = T_yM and FK = FO + OK = f + T_yM; the third equation may be equivalently converted into the following fourth equation:
C_xO / f = T_zK / (f + T_yM)
the fourth equation contains only one unknown quantity, T_zK, and the length of T_zK is obtained from it;
step three: calculate the relative displacement from the robot center to the target position in physical space;
L_x and L_y are the offset distances of the monocular camera from the robot's center position in the x-axis and y-axis directions, respectively; the offsets of the target position T from the robot's center in the x-axis and y-axis directions are R_x and R_y, with R_x = T_zK + L_x and R_y = T_yM + L_y;
taking the counterclockwise direction in the top view of the first cross-section as the positive direction of the rotation angle, the steering angle for the robot to reach the target position T is θ = -arctan(R_x / R_y).
5. The method of claim 2, wherein generating movement control commands based on the relative displacement comprises:
and generating a movement control command according to the relative displacement based on a shortest-path algorithm.
6. The method according to claim 1, wherein the sending the image taken by the monocular camera to a user terminal comprises:
and sending the image shot by the monocular camera to a user terminal after distortion removal.
7. The method of claim 1, wherein the controlling the robot to travel to the target location in the physical space corresponding to the target point according to the movement control command comprises:
detecting obstacles in the process of controlling the robot to run to the target position according to the movement control command;
and if the obstacle is detected, executing an obstacle avoidance algorithm to avoid the obstacle until the robot runs to the target position.
8. An in-plane robotic movement control apparatus, comprising:
the acquisition module, configured to shoot the advancing direction of the robot with a monocular camera, wherein the monocular camera is fixedly installed on the robot, its optical axis is parallel to the ground, and its three attitude angles are all 0 degrees;
the transceiving module, configured to send the image captured by the monocular camera to a user terminal so that a user marks, on the image, a target point for the robot to move to;
the movement control module, configured to generate a movement control command from the image, marked with the target point, returned by the user terminal, and to control the robot to travel to the target position in physical space corresponding to the target point according to the movement control command.
9. A robot, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method according to any one of claims 1 to 7.
10. A computer readable storage medium having computer readable instructions stored thereon which are executable by a processor to implement the method of any one of claims 1 to 7.
CN202210680975.8A 2022-06-16 2022-06-16 In-plane robot movement control method, device, robot, and storage medium Pending CN115167395A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210680975.8A 2022-06-16 2022-06-16 CN115167395A (en) In-plane robot movement control method, device, robot, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210680975.8A 2022-06-16 2022-06-16 CN115167395A (en) In-plane robot movement control method, device, robot, and storage medium

Publications (1)

Publication Number Publication Date
CN115167395A 2022-10-11

Family

ID=83484758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210680975.8A Pending CN115167395A (en) 2022-06-16 2022-06-16 In-plane robot movement control method, device, robot, and storage medium

Country Status (1)

Country Link
CN (1) CN115167395A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination