CN116350113B - Robot control method, system and client device

Info

Publication number
CN116350113B (application CN202310636948.5A)
Authority
CN (China)
Prior art keywords
robot, control, area, draggable, determining
Legal status
Active (granted)
Other languages
Chinese (zh)
Other versions
CN116350113A
Inventors
刘哲, 李梦梦, 艾灵
Assignee (original and current)
Ecovacs Robotics Suzhou Co Ltd
Application filed by Ecovacs Robotics Suzhou Co Ltd
Priority to CN202310636948.5A
Publication of CN116350113A (application) and CN116350113B (grant)

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L1/00: Cleaning windows
    • A47L1/02: Power-driven machines or devices
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle


Abstract

An embodiment of the present application provides a robot control method, a robot control system and a client device. The method includes: determining a target direction in response to a sliding operation on the interface in any one of 360 degrees of directions; determining a moving speed according to the area of the interface to which the operation termination position of the sliding operation belongs; and controlling the robot to adjust its posture according to the target direction, and to move at the moving speed in the adjusted posture. With this technical solution, because the sliding operation can be performed in any direction on the interface, the user can control the robot to move in any direction, providing an omnidirectional movement control function for the robot. In addition, determining the moving speed according to the area to which the operation termination position belongs enables a variety of speed determination modes, meets the user's need for diverse control over the robot's moving speed, and improves the user's control experience.

Description

Robot control method, system and client device
Technical Field
The present disclosure relates to the field of robots, and in particular, to a method and system for controlling a robot, and a client device.
Background
A surface cleaning robot, such as a window cleaning robot, can help users living on higher floors to clean various smooth surfaces such as exterior facade glass windows and glass doors, reducing labor costs and the danger of cleaning windows and doors.
When using a surface cleaning robot, the user can manually remote-control it by means of a remote control tool to perform the work. At present, however, when a user remote-controls a surface cleaning robot through a remote control tool, such as a corresponding application installed on a terminal, there are problems: the user cannot remote-control the robot to move in the direction in which the user actually wants the surface cleaning robot to move for the cleaning work; and when remote-controlling the surface cleaning robot to clean a large-area surface such as glass, the user must continuously operate the surface cleaning robot on the corresponding application interface, and once the operation stops, the surface cleaning robot immediately stops working, which brings great inconvenience to the user.
Disclosure of Invention
In view of the foregoing, the present application provides a robot control method, system and client device that solve, or at least partially solve, the foregoing problems.
In one embodiment of the present application, a robot control method is provided. The method is suitable for the client, and comprises the following steps:
determining a target direction in response to a sliding operation on the interface in any one of the 360 degree directions;
determining a moving speed according to the area of the operation termination position of the sliding operation in the interface;
and controlling the robot to adjust its posture according to the target direction, and to move at the moving speed in the adjusted posture.
In another embodiment of the present application, a robot control method is also provided. The method is suitable for the client, and comprises the following steps:
determining a slip distance in response to a slip operation on the interface;
determining the area of the operation termination position of the sliding operation in an interface according to the sliding distance;
determining a moving speed according to the area of the operation termination position in the interface;
and controlling the robot to move according to the moving speed.
In yet another embodiment of the present application, a robot control method is also provided. The method is suitable for the client, and comprises the following steps:
determining a target direction in response to a sliding operation on the interface in any one of the 360 degree directions;
determining a moving speed in response to a release operation after the sliding operation reaches a target area;
and controlling the robot to adjust its posture according to the target direction, and to move continuously at the moving speed in the adjusted posture.
In yet another embodiment of the present application, a robot control method is also provided. The method is suitable for the client, and comprises the following steps:
displaying a control on an interface, wherein the control comprises a limit area and a draggable control which can be dragged to any direction on the limit area;
when the draggable control is detected to be triggered to move in any one of the 360 degrees of directions on the limit area, controlling the robot to adjust its moving direction to the target direction corresponding to the position of the draggable control when its movement stops, and to move in the adjusted moving direction;
when the draggable control is detected to be released while its movement has stopped in a first area of the limit area, restoring the draggable control to a set position and controlling the robot to stop moving;
when the draggable control is detected to be released while its movement has stopped in a second area of the limit area, locking the draggable control and controlling the robot to keep moving in the adjusted moving direction; wherein the second area is outside the first area.
A further embodiment of the present application provides a robot control system. The robot control system includes:
a robot for performing a job task on a surface of a work object;
and the client is in communication connection with the robot and is used for realizing the steps in the robot control method provided by the application.
Yet another embodiment of the present application provides a client device. The client device includes a memory and a processor; wherein:
the memory stores one or more computer instructions;
the processor is coupled to the memory and is configured to execute the one or more computer instructions to implement the steps in the robot control methods provided in the present application.
According to the technical solution provided by the embodiments of the present application, after the target direction is determined in response to a sliding operation on the interface in any one of 360 degrees of directions, the moving speed is determined according to the area of the interface to which the operation termination position of the sliding operation belongs; the robot can then be controlled to adjust its posture according to the target direction and to move at the moving speed in the adjusted posture. In addition, determining the moving speed according to the area to which the operation termination position belongs enables a variety of speed determination modes, meets the user's need for diverse control over the robot's moving speed, and improves the user's control experience.
According to the technical solution provided by another embodiment of the present application, based on the sliding distance determined in response to a sliding operation on the interface, the area of the interface to which the operation termination position of the sliding operation belongs can be determined; the moving speed is then determined according to that area, and the robot is controlled to move at the moving speed. Determining the moving speed based on the area to which the operation termination position belongs enables various controls over the robot's moving speed and provides users with diverse speed control functions.
According to the technical solution provided by a further embodiment of the present application, after the target direction is determined in response to a sliding operation on the interface in any one of 360 degrees of directions, and the moving speed is determined in response to a release operation after the sliding operation reaches the target area, the robot can be controlled to adjust its posture according to the target direction and to move continuously at the determined moving speed in the adjusted posture. With this solution, after the user slides to the target area, the robot can be controlled to keep moving even if the user performs no further operation; this helps reduce the user's operation burden, improves remote control (human-machine interaction) efficiency, and lowers remote control cost, and provides very friendly human-machine interaction especially when controlling the robot to work on a work object with a large surface area.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, a brief description will be given below of the drawings that are needed in the embodiments or the prior art descriptions, and it is obvious that the drawings in the following description are some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1a is a schematic illustration of an inner/outer surface of a work object according to one embodiment of the present application;
FIG. 1b is a schematic illustration of the direction difference between the inner and outer surfaces of an object according to one embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a robot control method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a manual remote control interface according to an embodiment of the present disclosure;
fig. 4a and fig. 4b are schematic diagrams of drag interaction control on a first area in a limit area included in a manual remote control interaction interface according to an embodiment of the present application;
fig. 5a and 5b are schematic diagrams of a draggable control being dragged to a second area in a limit area according to an embodiment of the present disclosure;
FIGS. 6a and 6b are schematic diagrams of a draggable control provided in an embodiment of the present application in a locked state;
FIG. 7 is a schematic diagram of an orientation gesture of a graphical element synchronization robot on a draggable control provided by an embodiment of the present application;
fig. 8 and fig. 9 are schematic flow diagrams of a robot control method according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a robot control system according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a robot control device according to an embodiment of the present disclosure.
Detailed Description
Before introducing the technical solutions provided by the embodiments of the present application, some terms or technical terms related to the present application will be briefly described.
The "inner surface", "outer surface" of an object such as a glass window, a glass door, or the like (i.e., a work object described below) referred to in the present application is, in particular, with respect to a user:
the inner surface is a window surface or door surface facing the user when facing a surface of an object such as a glass window or glass door. As shown in fig. 1a for the glazing 2, the glazing 2 is directed towards the window surface 21 on the side of the user 1, i.e. the inner window (inner surface) with respect to the user 1.
The outer surface refers to a window surface, a door surface or the like on the other side opposite to the inner surface, and the outer surface and the inner surface are opposite and never adjacent. Fig. 1b shows an exploded view of the surface of the glazing 2 of fig. 1a, as seen in fig. 1b, with the window surface 22 on the opposite side from the window surface 21 (being the inner window), i.e. the outer window (outer surface) with respect to the user 1.
In order to enable those skilled in the art to better understand the present application, the following description will make clear and complete descriptions of the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application.
In some of the flows described in the specification, claims, and drawings described above, a plurality of operations occurring in a particular order are included, and the operations may be performed out of order or concurrently with respect to the order in which they occur. The sequence numbers of operations such as 101, 102, etc. are merely used to distinguish between the various operations, and the sequence numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first" and "second" herein are used to distinguish different modes, devices, modules, etc., and do not represent a sequential order, and are not limited to the "first" and "second" being different types. Furthermore, the embodiments described below are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The following provides a detailed description of various embodiments provided herein.
Fig. 2 is a schematic flow chart of a robot control method according to an embodiment of the present application. The execution body of the method is a client. The client may be a terminal used by the user; more specifically, the execution body may be application software installed on the terminal, a piece of hardware with an embedded program integrated on the terminal, or tool software embedded in the terminal's operating system, which is not limited in this embodiment. Preferably, the execution body is application software installed on the terminal that can be used to remotely control the robot. The terminal may be a smart phone, a tablet computer, a personal computer, a smart wearable device (such as a smart band), a remote controller, or the like that can communicate with the robot, which is not limited in this embodiment. The robot may be a surface cleaning robot for cleaning a surface, such as a window cleaning robot or a sweeping robot. As shown in fig. 2, the robot control method provided in this embodiment includes the following steps:
101. determining a target direction in response to a sliding operation on the interface in any one of the 360 degree directions;
102. determining a moving speed according to the area of the operation termination position of the sliding operation in the interface;
103. controlling the robot to adjust its posture according to the target direction, and to move at the moving speed in the adjusted posture.
In step 101 above, the interface is provided by the client. A client (such as application software) often has a plurality of interfaces, each corresponding to a different function; here, the interface is a manual remote control interaction interface provided by the client for remotely controlling the robot, for example the manual remote control interaction interface 5 or 5' shown in fig. 3, which the user may enter by clicking the "manual remote control" control 31 on the main interface 3 provided by the client.
As seen in fig. 3, a manual remote control interaction interface 5 is shown, on which a manipulation control is displayed comprising a limit area 51 and a draggable control 511 that can be dragged in any one of 360 degrees of directions on the limit area 51. The limit area 51 may be used to limit the range within which the draggable control can be dragged, to lock the draggable control, and the like. The draggable control 511 is used at least for controlling the direction in which the robot moves on the surface of the work object; for example, the moving speed of the robot may be set to a fixed value, i.e., the robot moves at a constant speed on the surface of the work object, in which case the draggable control 511 may be used only for controlling the moving direction of the robot. The draggable control 511 can be moved in any direction within the range defined by the limit area 51, so that the user can drag it in any direction within the limit area 51 to achieve omnidirectional movement control of the robot. That is, the sliding operation in any one of the 360 degrees of directions on the interface described in step 101 above is, in other words, a touch drag operation performed by the user on the draggable control 511 in any one of the 360 degrees of directions within the limit area 51. Whether the draggable control 511 is touched can be detected by a corresponding sensor on the execution body, so as to determine whether the user has triggered a touch drag operation on the draggable control within the limit area. Specifically, when it is detected that the draggable control is touched and its position changes, it is determined that the user has triggered a drag operation on the draggable control within the limit area.
In other embodiments, the draggable control 511 can be used to control, in addition to the moving direction of the robot, other variables of the robot, for example the moving speed; specifically, the moving speed can be controlled in combination with at least one of the area in which the draggable control 511 is located within the limit area 51, its distance from the center of the limit area 51, and the like.
Specific implementations of controlling the direction, speed, etc. of movement of the robot by the draggable controls 511 will be described in detail below and will not be described in detail herein.
The limit area 51 and the draggable control 511 shown in fig. 3 are both circular, but this circular shape is only illustrative; in practice, the limit area 51 and the draggable control 511 may have other shapes, such as rectangular (e.g., rounded rectangular), elliptical, etc. In addition, the default initial display position of the draggable control 511 is the center (midpoint) of the limit area; that is, when the user has not performed any drag operation on the draggable control, it is displayed at the center of the limit area by default, and specifically, by default, the draggable control and the limit area are displayed concentrically.
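To make the drag constraint concrete, the following is a minimal TypeScript sketch, not part of the patented method itself: it clamps a raw touch point so the draggable control stays within a circular limit area whose center is its default position. All names and pixel values are illustrative assumptions.

```typescript
// Sketch (assumed names/values): keep the draggable control inside the limit area.
interface Point { x: number; y: number; }

const LIMIT_RADIUS = 120;             // radius of the limit area, in px (assumed)
const CENTER: Point = { x: 0, y: 0 }; // limit-area center; also the control's default position

// Clamp a raw touch position so the control never leaves the limit area.
function clampToLimitArea(touch: Point): Point {
  const dx = touch.x - CENTER.x;
  const dy = touch.y - CENTER.y;
  const dist = Math.hypot(dx, dy);
  if (dist <= LIMIT_RADIUS) return touch;  // inside: use the touch point as-is
  const scale = LIMIT_RADIUS / dist;       // outside: project back onto the boundary
  return { x: CENTER.x + dx * scale, y: CENTER.y + dy * scale };
}
```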
From the above, the method provided in this embodiment may further include the following steps:
100a, displaying a control on the interface, wherein the control comprises a limit area and a draggable control which can be dragged to any direction on the limit area;
100b, in response to the sliding operation, moving the draggable control within the limit area according to the sliding operation.
in the above 101, when the target direction is determined, it may be determined according to the operation end position of the slip operation. That is, one possible implementation of 101 "determining the target scenario in response to the slip operation" described above may specifically include:
1011. determining an operation termination position of the slip operation;
1012. the target direction is determined based on the operation termination position.
In a specific implementation, the operation termination position of the sliding operation is the position of the draggable control within the limit area when dragging of the draggable control stops, i.e., the position when the user stops dragging it; specifically, the operation termination position can be determined from the position of the center point of the draggable control within the limit area when dragging stops.
Stopping dragging the draggable control mainly means that the user pauses the drag while still keeping the draggable control touched. Of course, in other examples, stopping dragging may also mean that the user has finished the drag operation, i.e., no longer touches the draggable control (has released the drag operation described above); that is, the user releases the draggable control, as in the case described below where the user drags the draggable control into the second area of the limit area and releases it.
For example, as shown in fig. 4a, the draggable control 511 is at a position with a distance R from the center position of the limit area on the positive half axis of the x-axis when the movement is stopped, and this position is the operation end position of the sliding operation.
Based on the operation termination position of the sliding operation, the distance between the draggable control and the center of the limit area, the area in which the draggable control is located within the limit area, the deflection angle of the draggable control relative to the reference direction, and the like can be analyzed, so as to provide data support for determining the robot's moving speed and direction.
Based on the foregoing, in one implementation manner, the 1012 "determining the target direction based on the operation termination position" may specifically include:
10121. determining a yaw angle of the draggable control relative to a reference direction based on the operation termination position;
10122. and determining the target direction according to the deflection angle.
In practice, the positive direction of one coordinate axis in the image coordinate system corresponding to the limit area may be used as the reference direction. The deflection angle of the draggable control relative to the reference direction can be determined, in a preset measuring direction such as clockwise, as the angle between the reference direction and the vector formed by the line from the center of the limit area to the operation termination position (i.e., the position where the draggable control stopped moving).
See, for example, fig. 4a: the positive y-axis direction (due north) in the image coordinate system corresponding to the limit area can be used as the reference direction. When dragging stops, the deflection angle of the draggable control relative to the positive y-axis direction is 90°, so the determined target direction is the positive x-axis direction (due right, i.e., due east). Alternatively, if the draggable control 511 has a deflection angle of 45° relative to the reference direction as shown in fig. 5a, the determined target direction is the direction 45° east of north.
The determined target direction determines the direction in which the robot moves on the surface of the work object. In other words, the target direction indicated by the deflection angle of the draggable control relative to the reference direction is the moving direction determined for the robot.
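As an illustration of steps 10121-10122, the following TypeScript sketch computes the deflection angle clockwise from the +y (due north) reference direction, given the offsets of the operation termination position from the limit-area center; the function name is an assumption.

```typescript
// dx, dy: offsets of the termination position from the limit-area center,
// with +y pointing up (for screen coordinates, negate dy first).
function yawAngleDegrees(dx: number, dy: number): number {
  // Math.atan2(dx, dy) measures the angle clockwise from the +y axis.
  const deg = (Math.atan2(dx, dy) * 180) / Math.PI;
  return (deg + 360) % 360; // normalize to [0, 360)
}

// Matches the text: yawAngleDegrees(1, 0) === 90 (due east),
// yawAngleDegrees(1, 1) === 45 (45° east of north).
```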
The specific implementation of 102 will be described in detail below, and will not be repeated here.
In the step 103, the determined target direction may be sent to the robot, so that the robot adjusts its own orientation according to the received target direction, and after the adjustment is completed, moves on the surface of the work object according to the adjusted orientation at the movement speed determined in the step 102, thereby executing the corresponding work task.
For example, continuing fig. 4a and the example given for steps 10121-10122, assume the robot has moved to the position on the surface of the work object 2 shown by the dashed outline of robot 200, and the target direction received from the client is the positive x-axis direction (due right). The robot then adjusts its orientation posture to the due-right direction (the adjusted posture is that of the robot 200 shown in solid lines) and moves on the surface of the work object 2 in accordance with the adjusted orientation posture.
The work object may be a vertical glass window, a tiled wall surface, or the like, or may be the floor. For example, if the robot is an adsorption-type surface cleaning robot, the work object may be a vertical glass window, a tiled wall surface, or the like; if the robot is a sweeping robot, the work object may be the floor.
It should be noted here that, in addition to the determined target direction, the determined moving speed and the actual surface of the work object on which the robot is located may be sent to the robot, so as to ensure that the robot can move in the correct direction at the corresponding moving speed. The specific implementation of determining the moving speed, and the reason for and implementation of determining the actual surface on which the robot works, are described in detail below.
According to the technical solution provided by this embodiment, after the target direction is determined in response to a sliding operation on the interface in any one of 360 degrees of directions, the moving speed is determined according to the area of the interface to which the operation termination position of the sliding operation belongs; the robot can then be controlled to adjust its posture according to the target direction and to move at the moving speed in the adjusted posture. Because the user can perform the sliding operation in any direction on the interface, the user can control the robot to move in any direction, providing an omnidirectional movement control function for the robot. In addition, determining the moving speed according to the area to which the operation termination position belongs enables a variety of speed determination modes, meets the user's need for diverse control over the robot's moving speed, and improves the user's control experience.
Further, in addition to determining the target direction from the operation termination position of the sliding operation to control the robot's moving direction, this embodiment can also determine the moving speed from the operation termination position to control the robot's moving speed. The moving speed can be determined in combination with the sliding distance. Based on this, in one implementation, step 102, "determining the moving speed according to the area to which the operation termination position belongs on the interface", may specifically include the following steps:
S01, determining a sliding distance based on the operation termination position;
s02, determining the area of the operation termination position in the interface according to the sliding distance;
s03, determining the moving speed based on the area of the operation termination position in the interface.
In S01, the sliding distance refers to the distance between the operation termination position of the sliding operation and the center of the limit area. That is, S01 above, "determining the sliding distance based on the operation termination position", may specifically include:
S011, calculating the distance between the operation termination position and the center position of the limit area to determine the sliding distance.
Specifically, the distance between the operation termination position and the center position of the limit area may be calculated by a corresponding distance calculation algorithm, such as the Euclidean distance.
In S02 above, since the operation termination position is the position of the draggable control on the limit area when its movement stops, the area in which the draggable control is located within the limit area is determined according to the sliding distance; the area in which the draggable control is located within the limit area is the area of the interface to which the operation termination position belongs. That is, S02 above, "determining the area to which the operation termination position belongs in the interface according to the sliding distance", may specifically include:
S021, determining the area where the draggable control is located in the limit area according to the sliding distance;
s022, determining the area where the draggable control is located in the limit area as the area where the operation termination position belongs in the interface.
Accordingly, the implementation of S03 above, "determining the moving speed based on the area to which the operation termination position belongs in the interface", is as follows:
S031, determining the moving speed based on the area in which the draggable control is located within the limit area.
In this embodiment, the limit area is mainly divided into two areas, and the manner in which the moving speed is determined for the robot to be controlled varies with the area in which the draggable control is located. For example, as shown in fig. 4a, if the first area 512 is the disc of radius R centered at the center of the limit area, and the second area 513 is the outermost ring of the limit area, then: if the sliding distance is less than or equal to R, it can be determined that the draggable control is in the first area of the limit area; if the sliding distance is greater than or equal to R', it can be determined that the draggable control is in the second area of the limit area; where R < R', and R' denotes the distance from the center of the limit area to the inner edge of the second area 513.
If the draggable control is in the first area when dragging stops, the robot's moving speed can be determined from the distance (i.e., the sliding distance) between the draggable control and the center of the limit area, or a preset speed value can be directly used as the robot's moving speed. If the draggable control is in the second area when dragging stops, the robot's moving speed may be determined from the deflection angle of the draggable control relative to the reference direction, or from the rotation angle and rotation direction of the user's rotation of the second area; of course, another preset speed value may also be directly used as the robot's moving speed, which is not limited here.
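A minimal TypeScript sketch of this region classification (S021/S031) follows; R and R' are the quantities defined above, and the concrete values are assumptions.

```typescript
const R = 90;        // radius of the first area (assumed value)
const R_PRIME = 100; // distance from the center to the inner edge of the second area, R < R' (assumed)

type Region = "first" | "second" | "between";

// Classify the termination position by the sliding distance d.
function regionForDistance(d: number): Region {
  if (d <= R) return "first";        // inner disc: real-time drag control
  if (d >= R_PRIME) return "second"; // outer ring: lock / continuous movement
  return "between";                  // the gap between the two areas, if any
}
```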
Based on the above, in this embodiment, the limit area includes a first area and a second area outside the first area. In a specific implementation, step S031, "determining the moving speed based on the area in which the draggable control is located within the limit area", may be implemented in any one of the following ways:
Case one: if the draggable control is in the first area of the limit area, then: determining the moving speed according to the sliding distance (manner 11); or, determining a first preset speed as the moving speed (manner 12).
Case two: if the draggable control is in the second area of the limit area, then: determining the moving speed according to the deflection angle (manner 21); or, in response to a rotation operation on the second area, determining the moving speed according to the rotation angle and rotation direction corresponding to the rotation operation (manner 22); or, determining a second preset speed as the moving speed (manner 23).
In case one above, if manner 11 is adopted, determining the moving speed according to the sliding distance, then: the smaller the sliding distance, the smaller the moving speed; the greater the sliding distance, the greater the moving speed. That is, the sliding distance is proportional to the moving speed. Specifically, the moving speed may be determined based on a preset distance-speed correspondence, which may be characterized by a mathematical expression; the mathematical expression may be a continuous function or a piecewise function.
In the case where the expression of the preset distance-speed correspondence is a continuous function, the robot's moving speed can be determined from the ratio of the distance d of the draggable control from the center of the limit area (i.e., the sliding distance) to the maximum distance D over which the draggable control can be dragged in the first area. The maximum distance D may be, but is not limited to being, determined from the distance between a target pixel point in the first area and the center of the limit area, where the target pixel point is the pixel point in the first area farthest from the center of the limit area. That is, in this case, the expression of the preset distance-speed correspondence may be as in formula 1 below:
Formula 1: speed v = (d / D) × V
where V represents the maximum moving speed supported by the robot.
For example, referring to fig. 4a, assume the first area 512 is the disc of radius R centered at the center of the limit area. When dragging stops, the draggable control 511 is on the boundary of the first area, i.e., its distance from the center of the limit area equals the maximum draggable distance (d = R = D); based on the preset distance-speed correspondence characterized by formula 1 above, the robot's moving speed is determined to be V.
In the case where the expression of the preset distance-speed correspondence is a piecewise function, the maximum distance D over which the draggable control can be dragged in the first area can be divided into segments according to the several speed gears supported by the robot, each segment corresponding to one supported speed gear. For example, assume the moving speed supported by the robot is divided into three gears: high speed (corresponding moving speed V1), medium speed (V2) and low speed (V3); correspondingly, the maximum distance D can be divided into three segments, with ranges 0~D1, D1~D2 and D2~D. In this case, the expression of the preset distance-speed correspondence may be as in formula 2 below:
Formula 2: speed v = V3 when 0 ≤ d ≤ D1; v = V2 when D1 < d ≤ D2; v = V1 when D2 < d ≤ D; where V1 > V2 > V3
and V1, V2 and V3 represent moving speeds supported by the robot.
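The two distance-to-speed mappings above can be sketched in TypeScript as follows; V, D, D1 and D2 are the symbols from the text, the numeric values are illustrative assumptions, and the assignment of the three gears to the three segments (farther means faster) follows the proportionality stated for manner 11.

```typescript
const V = 0.2;          // maximum moving speed supported by the robot (assumed units)
const D = 90;           // maximum draggable distance in the first area (assumed)
const D1 = 30, D2 = 60; // segment boundaries, 0 < D1 < D2 < D (assumed)

// Formula 1: speed proportional to the sliding distance d.
function speedContinuous(d: number): number {
  return (Math.min(d, D) / D) * V;
}

// Formula 2: three gears V1 > V2 > V3 over the segments 0~D1, D1~D2, D2~D.
function speedPiecewise(d: number, V1 = V, V2 = V / 2, V3 = V / 4): number {
  if (d <= D1) return V3; // 0 ~ D1  -> low speed
  if (d <= D2) return V2; // D1 ~ D2 -> medium speed
  return V1;              // D2 ~ D  -> high speed
}
```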
If manner 12 is adopted, determining the first preset speed as the moving speed, the first preset speed can be set flexibly according to the actual situation, provided that it does not exceed the maximum moving speed supported by the robot.
In case two above, if manner 21 is adopted, determining the moving speed for the robot according to the deflection angle of the draggable control relative to the reference direction, the precondition is that the second area is designed to be fixed and non-rotatable; in this case the deflection angle (or target direction) corresponds one-to-one with the moving speed. Specifically, the moving speed may be determined based on a preset angle-speed correspondence. The preset angle-speed correspondence may be established based on the number of directions into which the second area is divided and the maximum moving speed supported by the robot. If the direction of the second area is divided into N directions, the circumferential angle corresponding to the second area (360°, viewed about the center of the limit area) is divided into N equal parts, so that every change of 360°/N over the second area corresponds to one direction. When N is infinite, or greater than a set threshold such as 20000, the direction of the second area can be considered to be designed as unlimited; when N is finite and less than or equal to a set threshold (e.g., 20000, 360, 24, etc.), the direction of the second area can be considered to be designed as limited.
For example, as shown in fig. 4a or fig. 5a, the second area 513 is the outermost ring of the limit area. If the circumferential angle (360°) is divided into 24 equal parts about the center of the limit area 51 (i.e., center point O), the direction of the second area 513 is divided into 24 directions, with every change of 15° over the second area 513 corresponding to one direction. A speed value can then be set for each direction based on the maximum moving speed V supported by the robot, that is: one speed value is set every 15° on the second area 513. Specifically, the expression of the direction-speed (i.e., angle-speed) correspondence is as in formula 3 below:
Formula 3: speed v = (V / 24) × n, where n = θ / 15°, n = 1, 2, ..., 24
where V is the maximum moving speed supported by the robot, and the angle θ represents the deflection angle of the draggable control relative to the reference direction.
Formula 3 above can also be characterized as a table (Table 1, mapping each of the 24 directions to its speed value; not reproduced here).
Based on formula 3 above, assuming the second area 513 shown in fig. 5a is fixed, if the specific position of the draggable control 511 in the second area 513 is as shown in fig. 5a, the deflection angle of the draggable control relative to the reference direction is 45°, and the robot's moving speed is determined to be (V / 24) × 3.
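Formula 3 can be sketched as follows, with the same 24-direction division; the rounding of θ to a direction index n is an assumption about how intermediate angles are handled.

```typescript
// Map a deflection angle θ (degrees) to a speed per formula 3: v = (V / 24) * n.
function speedForYaw(thetaDeg: number, V: number): number {
  const theta = ((thetaDeg % 360) + 360) % 360; // normalize to [0, 360)
  const n = Math.max(1, Math.ceil(theta / 15)); // direction index n = 1, 2, ..., 24
  return (V / 24) * n;
}

// Example from the text: θ = 45° gives n = 3, i.e., a moving speed of (V / 24) * 3.
```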
If manner 22 is adopted, determining the moving speed according to the rotation angle and rotation direction corresponding to a rotation operation triggered on the second area, the precondition is that the second area is designed to be rotatable. In this case, after the user drags the draggable control to a first position in the second area and stops, the user can release the draggable control; the draggable control is then fixed at the first position in the second area where it was released, i.e., its deflection angle relative to the reference direction remains fixed. The user can then rotate the second area, where the rotation direction determines whether the robot's speed is increased or decreased, and the rotation angle determines the magnitude of the increase or decrease. For example, when rotating clockwise, the larger the rotation angle, the larger the moving speed; when rotating counterclockwise, the larger the rotation angle, the smaller the moving speed. That is, the deflection angle (or target direction) corresponds one-to-many with the moving speed.
Specifically, in manner 22, a correspondence among rotation angle, rotation direction and speed is likewise established in advance based on the number of directions into which the second area is divided and the maximum moving speed supported by the robot; for related details, see the description in manner 21 above of establishing the preset angle-speed correspondence.
For example, assuming the second area 513 shown in fig. 5a is rotatable, a reference line of the second area (whose extension passes through the center of the limit area) is taken in advance as the dividing start position, and the circumferential angle (360°) is divided into 24 equal parts about the center of the limit area 51 (i.e., center point O); that is, the direction of the second area 513 is divided into 24 directions, and each time the user rotates the second area 513 by 15°, the direction changes by one. With the maximum moving speed V supported by the robot, the mathematical expressions of the correspondence among rotation angle, rotation direction and speed can be expressed as formulas 4 and 5 below:
Upon clockwise rotation, formula 4: speed v = v0 + (V / 24) × (φ / 15°)
Upon counterclockwise rotation, formula 5: speed v = v0 - (V / 24) × (φ / 15°)
where v0 indicates the initial speed corresponding to the first position of the draggable control before the user rotates the second area; φ indicates the rotation angle corresponding to the user's rotation operation on the second area; and the speed v represents the moving speed determined from the rotation angle and rotation direction.
It should be noted that before the user rotates the second area, i.e., when the second area is in its default state, the reference line of the second area may overlap the reference direction; for example, the reference line may be the line passing through the center of the "lock" pattern displayed on the second area and the center of the limit area. In addition, to let the user clearly know in which direction rotating the second area increases or decreases the robot's speed, the user can be prompted by the display color of the second area, for example: rotate clockwise to increase the robot's speed, rotate counterclockwise to decrease it.
For example, after the user drags the draggable control 511 to the position in the second area shown in fig. 5a and releases it, the draggable control 511 is fixed at the position shown in fig. 5a, and a prompt such as "please rotate the second area to select a moving speed for the robot" may be displayed. If the robot's initial speed is 0 and the user rotates the second area clockwise by 30° as shown in fig. 5b, the moving speed determined based on formula 4 is V/12, and the robot is controlled to move on the surface of the work object in the direction 45° east of north at moving speed V/12. Further, if the user feels the robot moves too slowly and rotates the second area clockwise by another 15°, the moving speed determined based on formula 4 is V/8 (i.e., V/12 + V/24); the robot is then controlled to move in the direction 45° east of north at moving speed V/8, and the robot's moving speed is increased. Further, if the user feels the robot moves too fast and the job execution effect is poor, and rotates the second area counterclockwise by 30°, the moving speed determined based on formula 5 is V/24 (i.e., V/8 - V/12); the robot is then controlled to move in the direction 45° east of north at moving speed V/24, and the robot's moving speed is reduced.
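The rotation-based adjustment of formulas 4 and 5 can be sketched as follows; the clamping of the result to [0, V] is an assumption, and the step size of (V / 24) per 15° reproduces the worked example above.

```typescript
// Adjust the current speed v0 by rotating the second area by phiDeg degrees.
function speedAfterRotation(v0: number, phiDeg: number, clockwise: boolean, V: number): number {
  const delta = (V / 24) * (phiDeg / 15);        // one speed step per 15° of rotation
  const v = clockwise ? v0 + delta : v0 - delta; // formula 4 (clockwise) / formula 5
  return Math.min(Math.max(v, 0), V);            // keep within [0, V] (assumed)
}

// Worked example from the text (v0 = 0): 30° clockwise -> V / 12;
// a further 15° clockwise -> V / 8; then 30° counterclockwise -> V / 24.
```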
If manner 23 is adopted, determining the second preset speed as the moving speed, the second preset speed can be set flexibly according to the actual situation, provided that it does not exceed the maximum moving speed supported by the robot. The second preset speed may be the same as or different from the first preset speed, which is not limited here. In manner 23, the second area can likewise be designed to be fixed and non-rotatable.
Determining the moving speed in manner 23, as with manner 12 above, results in the same moving speed regardless of the direction in which the robot is controlled to move.
After determining the movement speed, the determined movement speed may be transmitted to the robot so that the robot moves at the movement speed when moving on the surface of the work object in accordance with the adjusted orientation posture.
In view of the above, this embodiment determines the moving speed for the robot based on the area of the interface to which the operation termination position belongs, which enables various controls over the robot's moving speed and provides users with diverse speed control functions.
Further, if it is detected that the user triggers a release operation of the sliding operation at the operation termination position (simply put, the user releases the draggable control at the operation termination position and no longer touches it), whether the robot stops moving or keeps moving can also be controlled based on the sliding distance described above. Based on this, the method provided in this embodiment may further include the following step:
S04, in response to a release operation of the sliding operation at the operation termination position, controlling the robot to stop moving, or controlling the robot to keep moving in the target direction at the moving speed, according to the sliding distance.
In specific implementation, if it is determined from the sliding distance that the draggable control is in the first area of the limit area, the draggable control is restored upon release to its initial position, namely the center of the limit area, and the robot stops moving; if it is determined from the sliding distance that the draggable control is in the second area of the limit area, the draggable control can be locked upon release, so that the robot keeps moving.
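A minimal TypeScript sketch of this release handling follows; sendCommand stands in for the client-to-robot channel, and all names and threshold values are assumptions consistent with the region classification sketched earlier.

```typescript
const FIRST_RADIUS = 90;       // radius of the first area (assumed, as above)
const SECOND_INNER_EDGE = 100; // inner edge of the second area (assumed, as above)

function sendCommand(cmd: { type: "stop" | "keepMoving" }): void {
  console.log("to robot:", cmd.type); // placeholder for the real transport
}

function onRelease(slidingDistance: number): void {
  if (slidingDistance >= SECOND_INNER_EDGE) {
    // Second area: lock the control at its release position;
    // the robot keeps moving in the set direction at the set speed.
    sendCommand({ type: "keepMoving" });
  } else {
    // First area: the control snaps back to the center of the limit area
    // and the robot stops moving.
    sendCommand({ type: "stop" });
  }
}
```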
The specific implementation of locking the draggable control will be described in detail below, and will not be repeated here.
Further, consider that when the robot actually performs a job task on the surface of the work object, it may be on either the inner surface or the outer surface of that object. For different surfaces, the user is often required to input different control information (such as a moving direction) according to the surface on which the robot is currently working, so as to ensure that the robot executes the job correctly according to the control information input by the user. For example, referring to fig. 1b, if the robot (a window cleaning robot) is on the outer surface 22 of the glass window and the user wants the robot to clean toward the left, the user would need to input rightward control information to make the robot move in the correct direction, which does not match the user's operating habits at all and makes use extremely inconvenient. In this embodiment, to ensure that the user can operate the draggable control directly according to the habits of the inner-surface viewing angle and control the robot to move correctly, regardless of whether the robot is on the inner or outer surface of the work object, the solution provided in this embodiment displays on an interface graphical elements from which the user selects the current view of the robot; the graphical elements reflect whether the robot is on the outer surface or the inner surface of the work object. When controlling the robot's movement, the robot is controlled according to both the control information input by the user through the draggable control and the surface of the work object on which the robot is located. The surface on which the robot is located can be determined by monitoring the selection operation triggered by the user on a graphical element of the interface, and the selected graphical element can be displayed on the draggable control, so that whether the robot is on the outer surface or the inner surface of the work object is expressed by the graphical element displayed on the draggable control.
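One plausible reading of this surface-aware control, sketched in TypeScript below, is that the client mirrors the user's left/right input about the vertical axis before sending it to a robot on the outer surface; the mirroring rule and names are assumptions for illustration, not a statement of the patented implementation.

```typescript
type Surface = "inner" | "outer";

// Convert a direction entered from the user's inner-surface viewpoint
// (degrees clockwise from due north) into the direction sent to the robot.
function toRobotDirection(userYawDeg: number, surface: Surface): number {
  if (surface === "inner") return userYawDeg;
  return (360 - userYawDeg) % 360; // outer surface: mirror left/right (assumed)
}

// Example matching fig. 1b: on the outer surface, a user input of 270° ("left")
// becomes 90° ("right") in the robot's own frame.
```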
Based on the foregoing, the method provided in this embodiment further includes the following steps:
s11, determining the surface of an operation object where the robot is located;
s12, displaying a graph element reflecting the surface of the operation object where the robot is located on the draggable control;
wherein the surface of the job object includes at least one of: an inner surface facing the user, an outer surface opposite the inner surface.
The above step S11 may be performed before responding to the touch drag operation acting on the draggable control. In a specific implementation, step S11, "determining the surface of the work object on which the robot is located", may specifically include:
s111, displaying at least one picture element; one figure element reflects the robot on one surface of the work object;
s112, responding to the selection operation of the at least one picture element, and determining the selected picture element;
s113, determining the surface of the operation object where the robot is located according to the selected picture element.
For example, referring to fig. 3, after clicking the "manual remote control" control 31 on the main interface 3 provided by the client, the user may first switch into the selection interface 4, on which two graphical elements 42 are displayed, namely graphical element 421 and graphical element 422. Graphical element 421 represents that what the user sees is the grip surface (top surface) of the robot, reflecting that the robot is on the inner surface of the work object; graphical element 422 represents that what the user sees is the track surface (bottom surface) of the robot, reflecting that the robot is on the outer surface of the work object. The user can select the graphical element 42 matching the robot view seen from the current viewing angle. If the user triggers a click operation on graphical element 421, the execution body, in response, switches into the manual remote control interaction interface 5 and displays the two graphical elements 42 in the upper part of that interface, with graphical element 421 in the selected state; in addition, the execution body determines from the selected graphical element 421 that the robot is on the inner surface of the work object. If instead the user triggers a click operation on graphical element 422, the execution body, in response, switches into the manual remote control interaction interface 5' and displays the two graphical elements 42 in the upper part of that interface, with graphical element 422 in the selected state; the execution body determines from the selected graphical element 422 that the robot is on the outer surface of the work object.
Subsequently, if the user wants to change the robot view seen from the current viewing angle, the user can directly click the graphical element 42 on the manual remote control interaction interface, without needing to return to the selection interface 4.
Of course, in other embodiments, the selection interface 4 may be omitted, and after clicking the "manual remote control" control 31 on the main interface 3 provided by the client, the user directly switches to the manual remote control interactive interface 5, and displays the two graphical elements 42 on the manual remote control interactive interface 5, so that the user directly selects the graphical element 42 matching the robot view seen by the current viewing angle through the manual remote control interactive interface 5.
It should be noted here that instead of the jump to the selection interface 4 described in the above example being triggered, the client may switch directly to the matched manual remote control interaction interface according to a default robot view currently seen by the user. The default robot view may be the grip surface of the robot or the track surface of the robot, which is not limited in this embodiment; preferably, the default is that the user currently sees the grip surface of the robot, in other words, that the robot is on the inner surface of the work object. In addition, when the graphical elements 42 are displayed, prompt information reminding the user to select the robot view seen from the current viewing angle may be displayed near the graphical elements 42; for example, referring to fig. 3, "please select the robot view you currently see" may be displayed on the selection interface 4 in the area above the graphical elements 42.
In the above step S12, the graphical element that is displayed on the draggable control and reflects the surface of the work object where the robot is located is the selected graphical element determined by executing steps S111 to S113. For example, continuing the example given above for steps S111 to S113, the graphical element displayed on the draggable control 511 included on the manual remote control interactive interface 5 is the selected graphical element 421.
In addition, in order to let the user read the orientation posture of the robot directly from the graphical element on the draggable control, this embodiment may also update the posture of the graphical element on the draggable control by acquiring the orientation posture information of the robot. That is, the method provided in this embodiment may further include the following steps:
S13, acquiring orientation posture information of the robot;
S14, updating the posture of the graphical element on the draggable control according to the orientation posture information, so that the updated graphical element on the draggable control reflects the orientation posture of the robot.
In a specific implementation, the orientation posture information of the robot is sent by the robot, which can acquire its own orientation posture through a sensor arranged on its body.
For example, referring to FIG. 7, while the robot 200 works on the inner surface of the work object, it adjusts its posture according to the target direction received from the client 100; the adjusted posture is the direction indicated by the arrow in the figure. After the adjustment is completed, the robot sends the collected orientation posture information to the client 100, and the client 100 then updates the posture of the graphical element 421 on the draggable control according to the received information, so that the posture of the graphical element 421 stays synchronized with the robot's orientation posture and the updated graphical element reflects the robot's posture. One note: to show clearly that the posture of the graphical element 421 on the draggable control tracks the robot's orientation posture, an enlarged view of the draggable control 511 is also shown on the left side of the manual remote control interactive interface 5 in FIG. 7.
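Functionally, step S14 amounts to applying the heading reported by the robot as a rotation on the graphical element. A minimal sketch in Python (the JSON payload, the `ui_element` object, and its `set_rotation` method are hypothetical stand-ins for whatever message format and UI toolkit the client actually uses):

```python
import json

def on_posture_message(raw_message: str, ui_element) -> None:
    """Handle an orientation-posture report from the robot (step S14).

    Assumes a hypothetical payload such as {"heading_deg": 45.0}, where
    heading_deg is the robot's yaw measured clockwise from the reference
    direction (the positive y-axis, i.e. due north).
    """
    payload = json.loads(raw_message)
    heading_deg = float(payload["heading_deg"])
    # Rotate the graphical element on the draggable control by the same
    # angle, so it stays synchronized with the robot's orientation posture.
    ui_element.set_rotation(heading_deg % 360.0)
```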
Still further, consider that manually controlling the robot to move on the surface of the work object basically requires the user to operate the corresponding control continuously: for example, to make the robot move upward continuously on the surface of the work object, the user must keep holding the control provided on the application interface for upward movement, and as soon as the control is released the robot stops moving upward, which is very inconvenient. To solve this problem, this embodiment uses the limit area and the draggable control to give the user two ways of controlling the robot: operating the draggable control in real time, and locking the draggable control for constant-speed directional control. Specifically, as shown in FIG. 4a, when the robot only needs to clean a small surface, the user can operate the draggable control 511 in real time within the first area 512 of the limit area to control the robot's movement. For example, the draggable control 511 can be dragged in the positive x-axis direction (due right, i.e., due east) within the first area 512; when the dragging stops (with the user keeping a touch on the draggable control throughout), the robot moves due right on the surface where it is located. When the surface to be cleaned is relatively large, such as a floor-to-ceiling glass window whose height exceeds a first threshold (e.g., 3 meters) and/or whose width exceeds a second threshold (e.g., 1.5 meters), the user can drag the draggable control 511 into the second area 513 to lock it, thereby obtaining constant-speed directional autonomous movement of the robot. For example, referring to FIG. 5a, if the user wants the robot to keep moving in the direction 45° east of north on the surface of the glass window to which it is attached, the draggable control 511 can be dragged into the second area 513 at a deflection angle of 45° clockwise relative to the positive y-axis direction (the reference direction, i.e., due north). When the draggable control 511 enters the second area 513, the lock control function corresponding to the second area 513 is activated, and the graphical element originally displayed on the draggable control 511 to represent the robot view seen by the user (i.e., to reflect the surface of the work object where the robot is located) either disappears or remains displayed, which is not limited here. When the user then releases the draggable control 511, it enters a locked state, and the robot moves autonomously at a fixed speed in the direction 45° east of north on the surface of the glass window to which it is attached. For how the fixed speed is determined, see the related content above.
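The geometry implied by this example is simple: the drag vector from the center of the limit area yields the deflection angle, measured clockwise from the positive y-axis (the reference direction), and the drag distance decides whether the control sits in the first or the second area. A sketch, where the first-area radius is an assumed value rather than anything specified by this application:

```python
import math

FIRST_AREA_RADIUS = 100.0  # assumed radius of the first area, in pixels

def deflection_angle_deg(dx: float, dy: float) -> float:
    """Clockwise angle of the drag vector relative to the positive y-axis
    (the reference direction, due north). dx, dy are measured from the
    center of the limit area, with y pointing up: dragging due right (+x)
    gives 90 (due east); dragging midway between +y and +x gives 45
    (45 degrees east of north)."""
    return math.degrees(math.atan2(dx, dy)) % 360.0

def area_of(dx: float, dy: float) -> str:
    """Which part of the limit area the draggable control is in."""
    return "first" if math.hypot(dx, dy) <= FIRST_AREA_RADIUS else "second"
```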
Based on the above, when the draggable control is in the second area of the limit area, the method provided by this embodiment may further include the following steps:
S21, updating the display state of the second area to reflect that the lock control function corresponding to the second area is activated; the lock control function is used for locking the draggable control so as to control the robot to perform constant-speed directional autonomous movement;
S22, making the graphical element on the draggable control that reflects the surface of the work object where the robot is located disappear, or keeping it displayed.
In the above S21, updating the display state of the second area may be achieved by, but is not limited to, changing the color of the second area.
For example, referring to FIG. 4a, while the draggable control 511 is still within the first area 512 and has not entered the second area 513, the display color of the second area 513 may be a first color, reflecting that the lock control function corresponding to the second area is inactive; further, as shown in FIG. 5a, when the draggable control 511 is detected to enter the second area 513, the display color of the second area is updated from the first color to a second color, reflecting that the lock control function corresponding to the second area is activated.
It should be noted that, in addition to updating the display state of the second area, the display state of the first area may also be updated to reflect that the draggable control has been dragged out of it. Updating the display state of the first area may be achieved by, but is not limited to, changing the boundary color of the first area or changing the appearance of its boundary. For example, with continued reference to FIG. 4a and FIG. 5a, while the draggable control is in the first area, the boundary of the first area is shown and its color is a third color; when it is detected that the draggable control is no longer in the first area but has been dragged into the second area, the boundary of the first area may disappear, or its color may be updated from the third color to a fourth color.
In S22, for example, with continued reference to fig. 4a and fig. 5a, when the draggable control 511 enters the second area 513, the graphical element 421 originally displayed on the draggable control 511 for reflecting that the robot is on the inner surface of the work object disappears. Of course, as shown in fig. 5b, the graphical element 421 originally displayed on the draggable control 511 for reflecting that the robot is on the inner surface of the work object may also remain displayed, which is not limited herein.
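Taken together, S21 and S22 are a small piece of UI state handling triggered when the control crosses into the second area. A sketch (the colors and the `hide_robot_element` choice are illustrative assumptions, not values specified by this application):

```python
from dataclasses import dataclass

@dataclass
class LimitAreaView:
    second_area_color: str = "gray"       # first color: lock function inactive
    first_area_border_visible: bool = True
    robot_element_visible: bool = True    # element reflecting the work surface

def on_enter_second_area(view: LimitAreaView, hide_robot_element: bool) -> None:
    # S21: recolor the second area to show the lock control function is active.
    view.second_area_color = "orange"     # second color: lock function active
    # Optional: also update the first area's display state to show that the
    # control has been dragged out of it (e.g. hide its boundary).
    view.first_area_border_visible = False
    # S22: the graphical element may either disappear or remain displayed;
    # both variants are allowed by this embodiment.
    view.robot_element_visible = not hide_robot_element
```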
In addition, under the precondition described above (i.e. when the draggable control is in the second area), the method provided in this embodiment may further include the following steps:
S23, when a release operation for the sliding operation is detected at the operation termination position, locking the draggable control and updating the display state of the draggable control to reflect that the draggable control is in a locked state;
for details of the release operation, reference is made to the above-mentioned step S04.
For example, referring to FIG. 5a and FIG. 6a, in the case where the second area is fixed and non-rotatable, when it is detected that the user releases the draggable control (i.e., the touch ends), the draggable control may be locked onto the boundary of the first area according to the operation termination position. Specifically, the draggable control 511 is fixed at the position on the boundary of the first area that is closest to the operation termination position and lies on the line from the center of the limit area through that position (see FIG. 6a), which ensures that the deflection angle of the locked draggable control relative to the reference direction is consistent with the deflection angle the draggable control 511 had relative to the reference direction at the moment it was released (FIG. 5a). During locking, the display state of the draggable control may be updated, as shown in FIG. 6a: a lock graphical element (for example a "lock" icon) may be added to the draggable control 511, the boundary color of the draggable control may be updated from a fifth color to a sixth color, an identifier indicating the deflection angle direction of the draggable control (such as the arrows 6 shown in FIG. 6a) may be displayed near it, and so on. This is not limited here, as long as the updated display state of the draggable control reflects that it is in the locked state. In addition, locking the draggable control causes the second area 513 to disappear.
As another example, as shown in FIG. 5b and FIG. 6b, in the case where the second area is rotatable, when it is detected that the user releases the draggable control, the draggable control may be fixed at the position within the second area where it was located when released. The display state of the draggable control may also be updated, as shown in FIG. 6b: a lock graphical element may be added to the boundary of the draggable control 511, the graphical element originally displayed on the draggable control 511 may be replaced by a lock icon, the boundary color of the draggable control 511 may be changed, and so on. This is not limited here, as long as the updated display state of the draggable control reflects that it is in the locked state.
Based on the above examples, in one specific implementation, the "locking the draggable control" in the above step S23 may be implemented by the following steps:
S231, fixing the draggable control at a target position on the boundary of the first area according to the operation termination position, the target position being the point on the boundary that is closest to the operation termination position and lies on the line from the center of the limit area through that position;
S232, making the second area disappear.
In another specific implementation, the "locking the draggable control" in S23 may be implemented by the following step:
S231', fixing the draggable control at the operation termination position.
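For the first implementation (S231-S232), the target position is simply where the ray from the center of the limit area through the operation termination position crosses the boundary of the first area, which preserves the deflection angle at release. A sketch assuming a circular first area centered on the limit area (the circular shape is an assumption for illustration):

```python
import math

def snap_to_first_area_boundary(end_x: float, end_y: float,
                                center_x: float, center_y: float,
                                first_area_radius: float) -> tuple[float, float]:
    """Return the lock position for the draggable control (step S231): the
    point on the first-area boundary closest to the operation termination
    position and collinear with it and the center, so the deflection angle
    at release is preserved."""
    dx, dy = end_x - center_x, end_y - center_y
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return center_x, center_y  # degenerate release exactly at the center
    scale = first_area_radius / dist
    return center_x + dx * scale, center_y + dy * scale
```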
It should be noted that, after the draggable control is locked, a locking prompt may be displayed on the interface, for example the text "locked and running" shown near the draggable control; alternatively, the system may issue a vibration prompt indicating that the draggable control has been fixed (locked). The user can thus perceive intuitively that the draggable control is fixed, which makes the human-machine interaction for robot control more efficient.
While the draggable control is in the locked state, the user can unlock it by clicking (or a similar operation) at any position on the limit area. One note: when the second area is rotatable, a user intending to rotate the second area could otherwise be mistaken for unlocking; the unlocking method provided in this embodiment is therefore that the user clicks (or performs a similar operation) at any position on the limit area other than the second area to unlock the draggable control.
Alternatively, the draggable control may be unlocked automatically when it is determined that the robot has stopped moving. The unlocked draggable control is restored to its default position, namely the center position of the limit area. In addition, after the draggable control is unlocked, if the second area is in a disappeared state, the previously disappeared second area reappears.
That is, the method provided in this embodiment may further include the following steps:
S24, when it is detected that an unlocking condition is satisfied, unlocking the draggable control and restoring it to the center position of the limit area;
S25, if the second area is in a disappeared state, making the second area reappear;
wherein satisfying the unlocking condition includes at least one of: receiving movement stop information sent by the robot, and detecting an operation triggered on the limit area (more precisely, an operation triggered on the part of the limit area other than the second area).
In practice, the reasons for the robot stopping may include, but are not limited to: the robot colliding with an obstacle on the surface of the work object, a failure of a sensor (e.g., an infrared sensor) on the robot, a short circuit in the robot's wiring, and so on. The obstacle on the surface of the work object may be static or dynamic. For example, if the robot is a window cleaning robot and the work object is a vertical glass window, the obstacle may be the frame of the glass window, a hook adhered to the window, a thick sticker, and the like. Taking instead a floor sweeping robot whose work object is a horizontal floor, the obstacle may be a wall, tables and chairs, a walking person, and so on.
The operation triggered on the limit area may be clicking, long pressing, sliding, etc., and is not limited herein.
After the draggable control is unlocked because an unlocking condition is satisfied, it is restored to the center position of the limit area, the graphical element reflecting the surface of the work object where the robot is located is displayed on it again, and the posture of that graphical element is synchronized with the orientation posture of the robot, so that the graphical element on the draggable control reflects the robot's posture. For the specific implementation of synchronizing the robot's orientation posture with the posture of the graphical element on the draggable control, see the related content above, which is not repeated here. In addition, if the second area was in a disappeared state, it also reappears.
As can be seen from the above, after the draggable control is unlocked, the interface is restored to the original state, such as to the manual remote control interactive interface 5 shown in fig. 3.
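The unlock path thus has two triggers, matching steps S24-S25: a movement-stop report from the robot, or a tap outside the second area. A sketch (the `control` and `second_area` objects and their attributes are hypothetical):

```python
def try_unlock(control, second_area, tap_position=None, robot_stopped=False) -> bool:
    """Unlock the draggable control if an unlocking condition is satisfied.

    Either condition suffices:
      - robot_stopped: movement stop information received from the robot;
      - tap_position: an operation on the limit area outside the second area.
    """
    tapped_outside = (tap_position is not None
                      and not second_area.contains(tap_position))
    if not (robot_stopped or tapped_outside):
        return False
    control.locked = False
    control.move_to(control.limit_area_center)  # restore the default position
    if not second_area.visible:                 # reappear if it had disappeared
        second_area.visible = True
    return True
```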
In summary, this embodiment can lock the draggable control by means of the second area of the limit area. After the draggable control is locked, even if the user performs no operation at all, the execution body of this embodiment can still control the robot to keep moving on the surface of the work object in the direction and at the speed corresponding to the position where the draggable control is locked, so as to execute the work task.
The foregoing mainly describes the technical solution of the present application from the perspective of providing an omnidirectional control function for the robot; the following describes the present application from other perspectives in conjunction with further embodiments.
Fig. 8 is a flowchart of a robot control method according to another embodiment of the present application. The execution subject of the method is a client. Details regarding clients can be found in the other embodiments above. As shown in fig. 8, the robot control method provided in this embodiment includes the following steps:
201. determining a sliding distance in response to a sliding operation on the interface;
202. determining, according to the sliding distance, the area of the interface to which the operation termination position of the sliding operation belongs;
203. determining a moving speed according to the area to which the operation termination position belongs;
204. controlling the robot to move at the moving speed.
For the implementation details of the foregoing steps 201 to 204, reference may be made to the related content in other embodiments, which is not repeated here.
According to the technical solution provided by this embodiment, the moving speed can be determined based on the sliding distance obtained in response to a sliding operation on the interface, and the robot can then be controlled to move at that speed. This sliding-distance-based speed control can realize multiple kinds of control over the robot's moving speed, providing the user with a variety of speed control functions.
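Concretely, steps 201-203 reduce to mapping the sliding distance to an area and the area to a speed. A sketch under the same assumed first-area radius as before, pairing a distance-proportional speed in the first area with a preset speed in the second (the numeric values are assumptions, not values given by this application):

```python
def movement_speed(slide_distance: float,
                   first_area_radius: float = 100.0,  # assumed, pixels
                   max_speed_first: float = 0.3,      # assumed, m/s
                   preset_speed_second: float = 0.2   # assumed, m/s
                   ) -> float:
    """Map the sliding distance (from the center of the limit area) to a
    moving speed, per steps 202-203: the area the operation termination
    position falls in decides how the speed is computed."""
    if slide_distance <= first_area_radius:
        # First area: speed scales with how far the control was dragged.
        return max_speed_first * (slide_distance / first_area_radius)
    # Second area: a preset constant speed. Other variants described above
    # are possible, e.g. deriving the speed from the deflection angle or
    # from a rotation operation on the second area.
    return preset_speed_second
```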
It should be noted that, in addition to the steps shown above, the robot control method provided in this embodiment may further include other steps, and for the specific steps that may be included, reference may be made to each step in the other robot control methods provided in this application, which is not described herein.
Fig. 9 is a flowchart of a robot control method according to another embodiment of the present application. The execution subject of the method is a client. Details regarding clients can be found in the other embodiments above. As shown in fig. 9, the robot control method provided in this embodiment includes the following steps:
301. determining a target direction in response to a sliding operation on the interface in any one of the 360-degree directions;
302. determining a moving speed in response to a release operation after the sliding operation reaches a target area;
303. controlling the robot to adjust its posture according to the target direction and to move continuously at the moving speed in the adjusted posture.
In the above, the sliding operation performed on the interface in any one of the 360-degree directions may be a touch drag operation that the user triggers, in any one of the 360-degree directions, on the draggable control 511 within the limit area included on the interface.
The target area is the second area within the limit area and is associated with a lock control function. The lock control function is used for locking the operation termination position of the sliding operation when the release operation is detected. In implementation, the operation termination position of the sliding operation may refer to the position of the draggable control within the target area when the draggable control is released after being dragged into the target area. Accordingly, locking the operation termination position of the sliding operation can be achieved by locking the draggable control.
For details of the implementation of 301 to 303, reference may be made to the related content in the other embodiments, and detailed descriptions thereof are omitted herein.
According to the technical solution provided by this embodiment, after the target direction is determined in response to a sliding operation on the interface in any one of the 360-degree directions and the moving speed is determined in response to a release operation after the sliding operation reaches the target area, the robot can be controlled to adjust its posture according to the target direction and to move continuously at the determined speed in the adjusted posture. With this solution, once the user's sliding operation has reached the target area, the robot can be controlled to keep moving even if the user performs no further operation. This reduces the user's operating burden, improves the efficiency of remote control (human-machine interaction), and lowers the cost of remote control; in particular, when the robot is controlled to work on a work object with a large surface area, it offers the user very friendly human-machine interaction.
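In implementation terms, the "keep moving" behavior only requires that the last motion command the client sent remain in force after the release. A minimal sketch of a hypothetical command payload (the field names are illustrative assumptions, not a protocol defined by this application):

```python
import json

def make_motion_command(direction_deg: float, speed: float, continuous: bool) -> str:
    """Build a motion command for the robot.

    direction_deg: target direction, clockwise from the reference direction.
    continuous=True corresponds to a locked draggable control: the robot
    keeps moving at `speed` in `direction_deg` until told otherwise.
    """
    return json.dumps({
        "direction_deg": direction_deg % 360.0,
        "speed": speed,
        "continuous": continuous,
    })

# E.g. the locked-control case of FIG. 5a: 45 degrees east of north.
command = make_motion_command(45.0, 0.2, continuous=True)
```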
It should be noted that, in addition to the steps shown above, the robot control method provided in this embodiment may further include other steps, and for the specific steps that may be included, reference may be made to each step in the other robot control methods provided in this application, which is not described herein.
The further embodiment of the application also provides a robot control method which is suitable for the client. Specifically, the robot control method provided in the present embodiment includes the following steps:
401. displaying a control on an interface, wherein the control comprises a limit area and a draggable control that can be dragged in any direction within the limit area;
402. when it is detected that the draggable control is triggered to move in any one of the 360-degree directions on the limit area, controlling the robot to adjust its moving direction according to the target direction corresponding to the draggable control when its movement stops, and to move in the adjusted moving direction;
403. when it is detected that the draggable control is released while its movement stops in a first area of the limit area, restoring the draggable control to a set position and controlling the robot to stop moving;
404. when it is detected that the draggable control is released while its movement stops in a second area of the limit area, locking the draggable control and controlling the robot to keep moving in the adjusted moving direction; wherein the second area is outside the first area.
In the above, the draggable control being triggered means that the draggable control is touch-dragged, as described in the other embodiments above. When the robot is controlled to adjust its moving direction, the robot may be controlled to adjust its posture according to the target direction corresponding to the draggable control when the latter stops moving, thereby adjusting the moving direction. The set position may be the center position of the limit area.
For details regarding the specific implementation of 401-404, see the description of other embodiments above.
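Steps 403 and 404 amount to a small release-handling branch: where the control stops when it is released decides whether the robot stops or keeps going. A sketch (the `control` and `robot` objects are hypothetical; only the attributes used below are assumed):

```python
import math

def on_release(control, robot, first_area_radius: float) -> None:
    """Handle release of the draggable control per steps 403-404."""
    cx, cy = control.limit_area_center
    if math.hypot(control.x - cx, control.y - cy) <= first_area_radius:
        # 403: released in the first area -> restore the control to the set
        # position (here the center of the limit area) and stop the robot.
        control.move_to((cx, cy))
        robot.stop()
    else:
        # 404: released in the second area -> lock the control and let the
        # robot keep moving in the already-adjusted direction.
        control.locked = True
        robot.keep_moving()
```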
It should be noted that, in addition to the steps shown above, the robot control method provided in this embodiment may further include other steps, and for the specific steps that may be included, reference may be made to each step in the other robot control methods provided in this application, which is not described herein.
The technical scheme provided by the application is described below by taking a robot as an example of a window cleaning machine for cleaning the surface of a glass window in combination with a specific application scene.
After the user attaches the window cleaning robot by suction to a position on the inner surface of the glass window and turns on its switch, the user clicks the "manual remote control" control 31 on the main interface 3 provided by the client, as shown in FIG. 3, to turn on the manual remote control function for the window cleaning robot. The client first enters the robot state selection interface 4 shown in FIG. 3; according to the robot view seen from the user's current viewing angle, the user clicks the graphical element 421 reflecting that the window cleaning robot is on the inner surface of the glass window, and the client then switches to the manual remote control interactive interface 5 shown in FIG. 3.
When the user remotely controls the window cleaning robot through the manual remote control interactive interface 5, the draggable control 511 can be dragged in any direction within the limit area 51 to remotely control the robot to move on the inner surface of the glass window and perform the cleaning task. Specifically, when the draggable control 511 is dragged in the positive x-axis direction to the position shown in FIG. 4a, the draggable control 511 is in the first area of the limit area; the window cleaning robot adjusts its posture and moves rightward on the inner surface of the glass window at a fixed speed, while the graphical element 421 on the draggable control 511 rotates in synchrony with the robot's orientation posture. Further, when the draggable control 511 is dragged to the position shown in FIG. 4b, it is still in the first area, with a deflection angle of -45° relative to the reference direction (i.e., 45° west of north); the robot adjusts its posture on the inner surface of the glass window to face 45° west of north and moves at a fixed speed along the adjusted orientation, and the posture of the graphical element 421 on the draggable control 511 is updated in synchrony with the robot's orientation posture. Still further, when the user drags the draggable control 511 to the position shown in FIG. 5a, it enters the second area of the limit area with a deflection angle of 45° relative to the reference direction; at this point the lock control function of the second area is activated. When the user then releases the draggable control 511, it is locked in the state shown in FIG. 6a, and the window cleaning robot adjusts its posture on the inner surface of the glass window to face 45° east of north and begins moving autonomously at a fixed speed along the adjusted orientation, executing the window cleaning task without the user having to keep touching the draggable control. Finally, when the user performs a click operation at any position on the limit area shown in FIG. 6a, the manual remote control interactive interface is restored to the interactive interface 5 shown in FIG. 3.
The embodiment of the application also provides a robot control system. As shown in fig. 10, the system includes:
a robot 200 for performing a job task on the surface of a work object;
The client 100 is in communication connection with the robot, and is configured to implement the steps or functions in the robot control method provided in the embodiments of the present application.
For the specific implementation of the corresponding functions of the client and the robot, see the above. In addition to the above functions, the client may also have the functions corresponding to the other steps in the other embodiments, for which reference may likewise be made to the above; this is not repeated here.
The robot may be an adsorption-type surface cleaning robot for cleaning the surfaces of various work objects, for example a window cleaning robot; the work object may be a vertical glass window, a tiled wall surface, and the like.
Fig. 11 shows a schematic structural diagram of a robot control device according to an embodiment of the present application. The robot control device is deployed on a client. As shown in fig. 11, the robot control device includes: a determining module 61 and a control module 62; wherein:
a determining module 61 for determining a target direction in response to a sliding operation in any one of the 360-degree directions on the interface, and further configured to determine a moving speed according to the area of the interface to which the operation termination position of the sliding operation belongs;
and a control module 62 for controlling the robot to adjust its posture according to the target direction and to move at the moving speed in the adjusted posture.
Further, the determining module 61, when configured to determine the target direction in response to the sliding operation, is specifically configured to: determine an operation termination position of the sliding operation, and determine the target direction based on the operation termination position.
Further, the above-mentioned determining module 61, when determining the moving speed according to the area of the interface to which the operation termination position of the sliding operation belongs, is specifically configured to: determine a sliding distance based on the operation termination position; determine, according to the sliding distance, the area of the interface to which the operation termination position belongs; and determine the moving speed based on that area.
Further, the control module 62 is further configured to: in response to a release operation for the sliding operation at the operation termination position, control the robot to stop moving, or control the robot, according to the sliding distance, to keep moving at the moving speed in the target direction.
Further, the device provided in this embodiment further includes: a display module for displaying a control on the interface, wherein the control comprises a limit area and a draggable control that can be dragged in any direction on the limit area; and a response module for moving, in response to the sliding operation, the draggable control within the limit area according to the sliding operation; wherein the operation termination position is the position on the limit area at which the draggable control stops moving.
Further, the determining module 61, when configured to determine the target direction based on the operation termination position, is specifically configured to: determine a deflection angle of the draggable control relative to a reference direction based on the operation termination position, and determine the target direction according to the deflection angle.
Further, the above-mentioned determining module 61, when used for determining the sliding distance based on the operation termination position, is specifically configured to: calculate the distance between the operation termination position and the center position of the limit area to determine the sliding distance. When used for determining, according to the sliding distance, the area of the interface to which the operation termination position belongs, it is specifically configured to: determine the area of the limit area in which the draggable control is located according to the sliding distance, and determine that area as the area of the interface to which the operation termination position belongs. And when used for determining the moving speed based on the area of the interface to which the operation termination position belongs, it is specifically configured to: determine the moving speed based on the area of the limit area in which the draggable control is located.
Further, the determining module 61, when determining the moving speed based on the area of the limit area in which the draggable control is located, is specifically configured to:
determine the moving speed according to the sliding distance if the draggable control is in the first area of the limit area, or determine a first preset speed as the moving speed;
determine the moving speed according to the deflection angle if the draggable control is in the second area of the limit area; or, in response to a rotation operation for the second area, determine the moving speed according to a rotation angle and a rotation direction corresponding to the rotation operation; or, determine a second preset speed as the moving speed; wherein the second area is outside the first area.
Further, when the draggable control is in the second area of the limit area, the device provided in this embodiment may further include: an updating module for updating the display state of the second area to reflect that the lock control function corresponding to the second area is activated, the lock control function being used for locking the draggable control to control the robot to perform constant-speed directional autonomous movement; and a disappearance/maintenance control module for making the graphical element on the draggable control that reflects the surface of the work object where the robot is located disappear or remain displayed.
Further, the device provided in this embodiment further includes: a locking module for locking the draggable control when a touch-end operation on the draggable control is detected; the updating module is further configured to update the display state of the draggable control to reflect that the draggable control is in a locked state.
Further, the locking module, when configured to lock the draggable control, is specifically configured to: fix the draggable control at a target position on the boundary of the first area according to the operation termination position, the target position being the point on the boundary that is closest to the operation termination position and lies on the line from the center of the limit area through that position, and make the second area disappear; or fix the draggable control at the operation termination position.
Still further, the device provided in this embodiment further includes: an unlocking module for unlocking the draggable control and restoring it to the center position of the limit area when it is detected that an unlocking condition is satisfied; the display module is further configured to make the second area appear if it is in a disappeared state; wherein satisfying the unlocking condition includes at least one of: receiving movement stop information sent by the robot, and detecting an operation triggered on the limit area.
Further, the determining module 61 is further configured to determine the surface of the work object where the robot is located; the display module is further configured to display, on the draggable control, a graphical element reflecting the surface of the work object where the robot is located; wherein the surface of the work object includes at least one of: an inner surface facing the user, and an outer surface opposite the inner surface.
Further, the determining module 61, when used for determining the surface of the work object where the robot is located, is specifically configured to: display at least one graphical element, each graphical element reflecting the robot on a surface of the work object; determine a selected graphical element in response to a selection operation for the at least one graphical element; and determine the surface of the work object where the robot is located according to the selected graphical element; wherein the graphical element displayed on the draggable control is the selected graphical element.
Further, the device provided in this embodiment further includes: an acquisition module for acquiring posture information of the robot, and for updating the posture of the graphical element on the draggable control according to the posture information, so that the graphical element on the draggable control reflects the posture of the robot.
It should be noted that: for the details of each step in the robot control device provided in this embodiment, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. In addition, the robot control device provided in this embodiment may further execute some or all of the other steps in the foregoing embodiments; for details, reference may likewise be made to the corresponding content of those embodiments, which is not repeated here.
Another embodiment of the present application further provides a robot control device, likewise deployed on a client. Its specific structure is similar to the structure shown in fig. 11; specifically, the robot control device includes: a determining module and a control module; wherein:
a determining module for determining a sliding distance in response to a sliding operation on the interface; determining, according to the sliding distance, the area of the interface to which the operation termination position of the sliding operation belongs; and determining a moving speed according to the area to which the operation termination position belongs;
and the control module is used for controlling the robot to move according to the moving speed.
It should be noted that: for the details of each step in the robot control device provided in this embodiment, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. In addition, the robot control device provided in this embodiment may further execute some or all of the other steps in the foregoing embodiments; for details, reference may likewise be made to the corresponding content of those embodiments, which is not repeated here.
Still another embodiment of the present application provides a robot control device, deployed on a client. Its specific structure is similar to the structure shown in fig. 11; specifically, the robot control device includes: a determining module and a control module; wherein:
a determining module for determining a target direction in response to a sliding operation on the interface in any one of the 360-degree directions, and for determining a moving speed in response to a release operation after the sliding operation reaches a target area;
and a control module for controlling the robot to adjust its posture according to the target direction and to move continuously at the moving speed in the adjusted posture.
In the above, the target area is an area associated with a lock control function; the lock control function is used for locking the operation termination position of the sliding operation when the release operation is detected.
It should be noted that: for the details of each step in the robot control device provided in this embodiment, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. In addition, the robot control device provided in this embodiment may further execute some or all of the other steps in the foregoing embodiments; for details, reference may likewise be made to the corresponding content of those embodiments, which is not repeated here.
Still another embodiment of the present application provides a robot control device, deployed on a client. Specifically, the robot control device includes: a display module and a detection control module; wherein:
the display module is used for displaying a control on the interface, wherein the control comprises a limit area and a draggable control that can be dragged in any direction on the limit area;
the detection control module is used for: when it is detected that the draggable control is triggered to move in any one of the 360-degree directions on the limit area, controlling the robot to adjust its moving direction according to the target direction corresponding to the draggable control when its movement stops, and to move in the adjusted moving direction; when it is detected that the draggable control is released while its movement stops in a first area of the limit area, restoring the draggable control to a set position and controlling the robot to stop moving; and when it is detected that the draggable control is released while its movement stops in a second area of the limit area, locking the draggable control and controlling the robot to keep moving in the adjusted moving direction; wherein the second area is outside the first area.
It should be noted that: for the details of each step in the robot control device provided in this embodiment, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. In addition, the robot control device provided in this embodiment may further execute some or all of the other steps in the foregoing embodiments; for details, reference may likewise be made to the corresponding content of those embodiments, which is not repeated here.
In addition, another embodiment of the present application provides a client device. The client device may include a memory and a processor; wherein:
the memory stores one or more computer instructions;
the processor is coupled to the memory and is configured to execute the one or more computer instructions to implement the steps in the robot control method embodiments provided in the present application.
The client device may be a smart phone, a computer, a tablet computer, or a smart wearable device (such as a smart watch, smart glasses, etc.).
The memory may be configured to store various other data to support operations on the client. Examples of such data include instructions for any application or method operating on a client. The memory may be implemented by any type of volatile or nonvolatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
Further, the client device includes other components, such as a power supply component, a communication component, a display, etc., in addition to the memory and the processor.
Yet another embodiment of the present application provides a computer program product (not shown in the drawings of the specification). The computer program product comprises a computer program or instructions which, when executed by a processor, cause the processor to carry out the steps of the method embodiments described above.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program capable of implementing the method steps or functions provided by the above embodiments when executed by a computer.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (22)

1. A method of controlling a robot, adapted to a client, the method comprising:
determining a target direction in response to a sliding operation on the interface in any one of the 360-degree directions;
determining a moving speed according to the area of the interface to which the operation termination position of the sliding operation belongs;
and controlling the robot to adjust its posture according to the target direction and to move at the moving speed in the adjusted posture.
2. The method of claim 1, wherein determining a target direction in response to the sliding operation comprises:
determining an operation termination position of the sliding operation;
the target direction is determined based on the operation termination position.
3. The method according to claim 2, wherein determining a movement speed from an area in the interface to which the operation termination position belongs comprises:
determining a sliding distance based on the operation termination position;
determining, according to the sliding distance, the area of the interface to which the operation termination position belongs;
and determining the moving speed based on the area of the interface to which the operation termination position belongs.
4. The method according to claim 3, further comprising:
in response to a release operation for the sliding operation at the operation termination position, controlling the robot to stop moving, or controlling the robot, according to the sliding distance, to keep moving at the moving speed in the target direction.
5. The method according to any one of claims 2 to 4, further comprising:
displaying a control on the interface, wherein the control comprises a limit area and a draggable control that can be dragged in any direction on the limit area;
in response to the sliding operation, moving the draggable control within the limit area according to the sliding operation;
wherein the operation termination position is the position on the limit area at which the draggable control stops moving.
6. The method of claim 5, wherein determining the target direction based on the operation termination position comprises:
determining a deflection angle of the draggable control relative to a reference direction based on the operation termination position;
and determining the target direction according to the deflection angle.
7. The method of claim 5, wherein determining a sliding distance based on the operation termination position comprises:
calculating the distance between the operation termination position and the center position of the limit area to determine the sliding distance; and determining, according to the sliding distance, the area of the interface to which the operation termination position belongs comprises:
determining the area where the draggable control is located in the limit area according to the sliding distance;
determining the area of the limit area in which the draggable control is located as the area of the interface to which the operation termination position belongs;
and determining the moving speed based on the area of the interface to which the operation termination position belongs comprises:
determining the moving speed based on the area of the limit area in which the draggable control is located.
8. The method of claim 7, wherein determining the moving speed based on the area of the limit area in which the draggable control is located comprises:
if the draggable control is in a first area of the limit area, determining the moving speed according to the sliding distance, or determining a first preset speed as the moving speed;
if the draggable control is in a second area of the limit area, determining the moving speed according to the deflection angle of the draggable control relative to the reference direction; or, in response to a rotation operation for the second area, determining the moving speed according to a rotation angle and a rotation direction corresponding to the rotation operation; or, determining a second preset speed as the moving speed;
wherein the second area is outside the first area.
9. The method of claim 8, wherein, when the draggable control is in the second area of the limit area, the method further comprises:
updating the display state of the second area to reflect that the lock control function corresponding to the second area is activated; the lock control function is used for locking the draggable control so as to control the robot to perform constant-speed directional autonomous movement;
and making the graphical element on the draggable control that reflects the surface of the work object where the robot is located disappear or remain displayed.
10. The method as recited in claim 9, further comprising:
when a release operation for the sliding operation at the operation termination position is detected, locking the draggable control, and updating the display state of the draggable control to reflect that the draggable control is in a locked state.
11. The method of claim 10, wherein locking the draggable control comprises:
fixing the draggable control at a target position on the boundary of the first area according to the operation termination position, the target position being the point on the boundary that is closest to the operation termination position and lies on the line from the center of the limit area through that position, and making the second area disappear; or
fixing the draggable control at the operation termination position.
12. The method as recited in claim 11, further comprising:
when it is detected that an unlocking condition is satisfied, unlocking the draggable control and restoring the draggable control to the center position of the limit area;
if the second area is in a disappeared state, making the second area reappear;
wherein satisfying the unlocking condition includes at least one of: and receiving movement stopping information sent by the robot, and monitoring the operation triggered on the limit area.
13. The method according to any one of claims 6 to 12, further comprising:
determining the surface of the work object where the robot is located;
displaying, on the draggable control, a graphical element reflecting the surface of the work object where the robot is located;
wherein the surface of the work object includes at least one of: an inner surface facing the user, and an outer surface opposite the inner surface.
14. The method of claim 13, wherein determining the surface of the work object on which the robot is located comprises:
displaying at least one graphical element, each graphical element reflecting the robot on a surface of the work object;
determining a selected graphical element in response to a selection operation for the at least one graphical element;
determining the surface of the work object where the robot is located according to the selected graphical element;
wherein the graphical element displayed on the draggable control is the selected graphical element.
15. The method as recited in claim 14, further comprising:
acquiring posture information of the robot;
and updating the posture of the graphical element on the draggable control according to the posture information, so that the graphical element on the draggable control reflects the posture of the robot.
16. A method of controlling a robot, adapted to a client, the method comprising:
determining a sliding distance in response to a sliding operation on the interface;
determining, according to the sliding distance, the area of the interface to which the operation termination position of the sliding operation belongs;
determining a moving speed according to the area to which the operation termination position belongs;
and controlling the robot to move according to the moving speed.
17. A method of controlling a robot, adapted to a client, the method comprising:
determining a target direction in response to a sliding operation on the interface in any one of the 360-degree directions;
determining a moving speed in response to a release operation after the sliding operation reaches a target area;
and controlling the robot to adjust its posture according to the target direction and to move continuously at the moving speed in the adjusted posture.
18. The method of claim 17, wherein the target area is an area associated with a lock control function; the lock control function is used for locking the operation termination position of the sliding operation when the release operation is detected.
19. A method of controlling a robot, adapted to a client, the method comprising:
displaying a control on an interface, wherein the control comprises a limit area and a draggable control that can be dragged in any direction on the limit area;
when it is detected that the draggable control is triggered to move in any one of the 360-degree directions on the limit area, controlling a robot to adjust its moving direction according to the target direction corresponding to the draggable control when its movement stops, and to move in the adjusted moving direction;
when it is detected that the draggable control is released while its movement stops in a first area of the limit area, restoring the draggable control to a set position and controlling the robot to stop moving;
when it is detected that the draggable control is released while its movement stops in a second area of the limit area, locking the draggable control and controlling the robot to keep moving in the adjusted moving direction; wherein the second area is outside the first area.
20. A robot control system, comprising:
a robot for performing a job task on a surface of a job object;
a client communicatively connected to the robot for implementing the steps of the robot control method according to any one of the preceding claims 1 to 15, or the robot control method according to claim 16, or the robot control method according to claim 17 or 18, or the robot control method according to claim 19.
21. The system of claim 20, wherein the robot is an adsorption type surface cleaning robot.
22. A client device comprising a memory and a processor; wherein:
the memory stores one or more computer instructions;
the processor, coupled to the memory, is configured to execute the one or more computer instructions for implementing the steps in the robot control method of any one of the preceding claims 1 to 15, or implementing the steps in the robot control method of claim 16, or implementing the steps in the robot control method of claim 17 or 18, or implementing the steps in the robot control method of claim 19.
CN202310636948.5A 2023-05-31 2023-05-31 Robot control method, system and client device Active CN116350113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310636948.5A 2023-05-31 2023-05-31 Robot control method, system and client device (CN116350113B)


Publications (2)

Publication Number Publication Date
CN116350113A (en) 2023-06-30
CN116350113B (en) 2023-08-08

Family

ID=86905341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310636948.5A Active CN116350113B (en) 2023-05-31 2023-05-31 Robot control method, system and client device

Country Status (1)

Country Link
CN (1) CN116350113B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130107405A (en) * 2012-03-22 2013-10-02 허홍 User interface displaying and operating method of touch screen smart media device to control direction and speed according to the touch direction and distance
CN104364723A (en) * 2012-04-05 2015-02-18 里斯集团控股有限责任两合公司 Method for operating an industrial robot
CN106484182A (en) * 2016-09-29 2017-03-08 北京小米移动软件有限公司 Touch control identification method and device
CN108241379A (en) * 2016-12-27 2018-07-03 三星电子株式会社 Electronic equipment for the method for controlling unmanned plane and for controlling unmanned plane
CN110764497A (en) * 2019-09-09 2020-02-07 深圳市无限动力发展有限公司 Mobile robot remote control method, device, storage medium and remote control terminal
CN111061412A (en) * 2019-10-31 2020-04-24 北京奇艺世纪科技有限公司 Playing control method and device for multimedia resources
CN306662864S (en) * 2021-07-06

Also Published As

Publication number Publication date
CN116350113A (en) 2023-06-30

Similar Documents

Publication Publication Date Title
EP3184014B1 (en) Control apparatus for a cleaning robot
US9789612B2 (en) Remotely operating a mobile robot
JP5876607B1 (en) Floating graphical user interface
US10222792B2 (en) Drone piloting device adapted to hold piloting commands and associated control method
WO2018053824A1 (en) Unmanned aerial vehicle control method, head-mounted display glasses, and system
US20100152897A1 (en) Method & apparatus for controlling the attitude of a camera associated with a robotic device
US8095239B2 (en) Method and apparatus for controlling the motion of a robotic device
CN103294024A (en) Intelligent home system control method
US11209937B2 (en) Error correction for seamless transition between hover and touch sensing
CN102541085A (en) Spherical-bottom device and method for tracking video target and controlling posture of video target
CN109920320A Method and control system for controlling a planetarium to demonstrate astronomical phenomena
US11360642B2 (en) Method and apparatus for setting parameter
CN116350113B (en) Robot control method, system and client device
JP4640191B2 (en) Surveillance camera system and surveillance camera direction control method
CN110313177B (en) Holder control method and device
US10969899B2 (en) Dynamically adaptive sensing for remote hover touch
US10250813B2 (en) Methods and systems for sharing views
US10791278B2 (en) Monitoring apparatus and system
EP3764212A1 (en) Electronic device for recognizing user input using sensor and method thereof
JP2017004491A (en) Floating graphical user interface
CN116048281A (en) Interaction method, device, equipment and storage medium in virtual reality scene

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant