CN114305255A - Robot operation control method and device - Google Patents

Robot operation control method and device

Info

Publication number
CN114305255A
CN114305255A (application CN202011056813.4A)
Authority
CN
China
Prior art keywords
medium
robot
target
attribute information
working mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011056813.4A
Other languages
Chinese (zh)
Inventor
郭盖华
李少海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen LD Robot Co Ltd
Original Assignee
Shenzhen LD Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen LD Robot Co Ltd filed Critical Shenzhen LD Robot Co Ltd
Priority to CN202011056813.4A priority Critical patent/CN114305255A/en
Publication of CN114305255A publication Critical patent/CN114305255A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The application relates to the technical field of intelligent hardware equipment and provides a robot operation control method and device. The method comprises: acquiring a detection signal from the surface of the medium on which the robot is to operate, using a contact-type detection assembly mounted at the bottom or side of the robot; determining target medium attribute information corresponding to the medium surface according to the detection signal; and controlling the robot to operate on the medium surface in a corresponding target working mode according to the target medium attribute information. In this way, the robot's working mode can be matched to the medium surface, preventing the robot from damaging the surface during operation.

Description

Robot operation control method and device
Technical Field
The application belongs to the technical field of intelligent hardware equipment, and particularly relates to a robot operation control method and device.
Background
With the continuous development of intelligent technology, robots can take over some tasks from humans, bringing convenience to production and daily life. For example, a floor-sweeping robot is a smart appliance that automatically cleans an area to be swept; it can clean floors in place of a person, reduces the burden of housework, and is increasingly widely accepted.
Floor-sweeping robots are now becoming common, and many homes have carpets. Because carpet material differs markedly from an ordinary floor, the robot's sweeping force and suction need to be adjusted during a cleaning job to improve sweeping efficiency. Furthermore, many sweeping robots already offer a mopping mode, but using the mopping mode on a carpet will damage the carpet.
For the above problems, the industry currently lacks a good solution.
Disclosure of Invention
In view of the above, embodiments of the present disclosure provide a robot operation control method and apparatus, to at least solve the prior-art problem that the robot's working mode may not match the medium surface and may therefore damage it.
A first aspect of the embodiments of the present application provides a robot operation control method, including: acquiring a detection signal from the surface of the medium on which the robot is to operate, using a contact-type detection assembly mounted at the bottom or side of the robot; determining target medium attribute information corresponding to the medium surface according to the detection signal; and controlling the robot to operate on the medium surface in a corresponding target working mode according to the target medium attribute information.
A second aspect of the embodiments of the present application provides a robot operation control apparatus, including: a surface detection signal acquisition unit configured to acquire a detection signal from the surface of the medium on which the robot is to operate, using a contact-type detection assembly mounted at the bottom or side of the robot; a medium attribute determining unit configured to determine target medium attribute information corresponding to the medium surface according to the detection signal; and a working mode control unit configured to control the robot to operate on the medium surface in a corresponding target working mode according to the target medium attribute information.
A third aspect of embodiments of the present application provides a mobile device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method as described above when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, implements the steps of the method as described above.
A fifth aspect of embodiments of the present application provides a computer program product, which, when run on a mobile terminal, causes the mobile terminal to implement the steps of the method as described above.
Compared with the prior art, the embodiment of the application has the advantages that:
when the robot is about to perform a job, the contact-type detection assembly mounted at its bottom or side can detect the surface of the medium to be worked on, determine the target medium attribute information corresponding to that surface, and then control the robot to operate on the surface in the target working mode corresponding to that attribute information. In this way, the medium attribute of the robot's working surface is identified by signal detection and the target working mode is selected adaptively, so the robot's working mode matches the medium surface, a better working effect on the surface is ensured, and damage to the surface is avoided.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments or the prior-art description are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application;
fig. 2 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application;
fig. 3 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application;
FIG. 4A is a schematic diagram illustrating an example level-signal variation corresponding to a floor-type medium surface;
FIG. 4B is a schematic diagram illustrating an example level-signal variation corresponding to a cable-type medium surface;
FIG. 4C is a schematic diagram illustrating an example level-signal variation corresponding to a threshold-type medium surface;
FIG. 4D is a schematic diagram illustrating an example level-signal variation corresponding to a carpet-type medium surface;
fig. 5 is a schematic overall structural diagram of an example of the sweeping robot provided by the embodiment of the present application;
fig. 6 shows an exploded view of an example of the sweeping robot provided in the embodiment of the present application;
fig. 7 is a block diagram showing a configuration of an example of a work control apparatus of a robot according to an embodiment of the present application;
fig. 8 is a schematic diagram of an example of a mobile device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Fig. 1 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application. The method of this embodiment may be executed by a robot or by a processor configured on a robot.
As shown in fig. 1, in step 110, a detection signal from the surface of the medium on which the robot is to operate is acquired using a contact-type detection assembly mounted at the bottom or side of the robot. The medium surface can be detected by various surface-detection methods; for example, the attributes of obstacles protruding from the ground can be determined using a contact detection assembly arranged at the bottom of the machine body.
In step 120, target medium attribute information corresponding to the medium surface is determined according to the detection signal. Here, the medium attribute information may be information such as a type, material, hardness, and the like of the medium to be robotically worked.
In step 130, the robot is controlled to perform work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
Through this embodiment of the application, the robot can determine the corresponding medium attribute information from the detection signal of the medium surface to be worked on and adaptively select the corresponding working mode, so that its working process matches the medium surface, damage to the surface is avoided, and the working effect on the surface is improved.
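The three-step flow above can be sketched as a simple control loop. The sketch below is only illustrative: the patent does not specify any programming interface, so the injected callables, signal values, and mode names are all assumptions.

```python
def run_job_control(read_detection_signal, classify_media, select_mode, apply_mode):
    """One pass of the control flow of Fig. 1 (steps 110-130).

    All four callables are injected because the patent fixes only the
    flow, not the implementations; this is a structural sketch.
    """
    signal = read_detection_signal()          # step 110: contact-type detection assembly
    media_attribute = classify_media(signal)  # step 120: target medium attribute info
    mode = select_mode(media_attribute)       # step 130: pick target working mode
    apply_mode(mode)                          # step 130: operate in that mode
    return media_attribute, mode

# Illustrative wiring with stand-in functions:
attr, mode = run_job_control(
    read_detection_signal=lambda: "fluctuating",  # pretend the sensor saw level fluctuation
    classify_media=lambda s: "carpet" if s == "fluctuating" else "floor",
    select_mode=lambda a: {"carpet": "sweep", "floor": "mop"}[a],
    apply_mode=lambda m: None,
)
```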
With respect to step 130 above, in some embodiments, the target operation mode of the robot may be determined by means of a table lookup. Specifically, the target operating mode corresponding to the target medium attribute information may be determined based on a preset medium attribute mode table. Further, the robot may be controlled to perform work on the surface of the medium in the target operation mode.
It should be noted that the medium attribute mode table may include multiple entries of medium attribute information and their corresponding working modes. For example, the table may contain the mappings "floor attribute - mopping mode" and "carpet attribute - sweeping mode"; when the target medium attribute information corresponds to the carpet attribute, the robot is controlled to run the sweeping mode.
It should be understood that the correspondence between medium attribute information and working modes in the table is not limited here; it may be preconfigured or adjusted according to service requirements to meet the personalized needs of different scenarios or users.
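A minimal sketch of the table lookup described above. Only the floor-to-mopping and carpet-to-sweeping mappings come from the text; the remaining entries, all names, and the fallback default are our assumptions.

```python
# Hypothetical medium-attribute mode table. The patent gives only the
# "floor -> mopping" and "carpet -> sweeping" examples; the other
# entries are illustrative.
MEDIA_ATTRIBUTE_MODE_TABLE = {
    "floor": "mopping",
    "carpet": "sweeping",
    "cable": "side_brush_stopped",
    "threshold": "climb_over",
}

def target_mode_for(media_attribute, table=MEDIA_ATTRIBUTE_MODE_TABLE,
                    default="sweeping"):
    """Look up the target working mode for a medium attribute.

    Falling back to a safe default for unknown attributes is our
    assumption, not something the patent specifies.
    """
    return table.get(media_attribute, default)
```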
Fig. 2 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application.
In some application scenarios, the robot's working environment may contain multiple types of medium surface, for example a room with both carpet and floor, so the robot is expected to use different working modes on different medium surfaces.
As shown in fig. 2, in step 210, the robot's current working mode is acquired. At this point the robot is already in a working state and its current working mode needs to be obtained. In some embodiments, the current working mode may be acquired or captured periodically so as to continuously monitor the robot's working process.
In step 220, it is detected whether the current operating mode matches the target operating mode.
If the detection result in step 220 indicates that the current working mode matches the target working mode, go to step 240; otherwise, go to step 230.
In step 230, a target operating mode is switched.
In step 240, the robot is controlled to perform a job on the media surface in the target operational mode.
Through this embodiment of the application, the robot's current working mode can be compared with the target working mode during operation, and the robot switches to the target working mode when the two do not match. The robot can thus apply the appropriate working mode to each type of medium surface and adapt to complex working environments, for example stopping the mopping function on a carpet and enabling it on a floor.
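Steps 210-240 can be sketched as follows; the `switch_mode` callback and the mode names are hypothetical, since the patent only describes the compare-and-switch flow.

```python
def update_working_mode(current_mode, target_mode, switch_mode):
    """Steps 210-240 of Fig. 2: switch only when the current and
    target working modes differ.

    `switch_mode` is a hypothetical actuator callback. Returns the
    mode the robot should now operate in.
    """
    if current_mode != target_mode:   # step 220: modes do not match
        switch_mode(target_mode)      # step 230: switch working mode
    return target_mode                # step 240: operate in target mode

switches = []
mode = update_working_mode("mopping", "sweeping", switches.append)
```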
Fig. 3 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application.
In the related art, objects such as carpets can be detected by optical-flow, acoustic, and similar methods, but such apparatuses suffer from complicated structure, high cost, and low sensitivity.
As shown in fig. 3, in step 310, a detection signal from the medium surface is acquired over a preset time period via the contact detection assembly. An indication signal among the detection signals can indicate that a medium surface with concave and/or convex objects has come into contact with the contact detection assembly. For example, the contact detection assembly may be arranged at a location between the robot and the working surface (e.g., the bottom). If the medium surface is flat, the assembly does not touch it and remains in its initial state (e.g., a low level); if a protruding object is present on the medium surface, the assembly makes contact and generates an indication signal (e.g., a high level) accordingly.
It should be appreciated that the touch detection assembly may employ various detection principles to detect the media surface to determine whether a media surface having concave and/or convex objects is present, and that various types of touch detection modules may be configured in the touch detection assembly. In some examples of embodiments of the present application, the contact detection assembly may have at least one of the following mounted therein: strain sensors, magnetic sensors and photosensors.
In step 320, a target signal change rule satisfied by the acquired detection signal within a preset time period is determined among a plurality of preset signal change rules. Here, each of the plurality of signal variation rules has corresponding medium attribute information, respectively.
For example, when the contact detection assembly is shaken or bent, the surface to be worked on can be determined to be a medium surface with a protruding obstacle, and the corresponding medium attribute information (or obstacle attribute) can be determined from the signal produced by the shaking or bending. Here, the obstacle attribute may include one or more of: obstacle shape, type, material, hardness, size, and height.
Fig. 4A shows an example level-signal variation for a floor-type medium surface: because the floor is flat it never touches the contact detection assembly, and the signal stays at a continuous low level. Fig. 4B shows an example for a cable-type medium surface: a cable-like obstacle makes brief contact with the assembly and produces a transient high-level signal. Fig. 4C shows an example for a threshold-type medium surface: a threshold-like obstacle stays in contact with the assembly for a relatively long time and produces a sustained high-level signal. Fig. 4D shows an example for a carpet-type medium surface: because a carpet is soft and elastic, contact with the assembly produces a continuously fluctuating level signal.
It should be understood that the above types of protruding obstacles and corresponding signal variation rules are only used as examples, and the embodiments of the present application may also configure corresponding signal variation rules for more types of protruding objects.
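One way to read Figs. 4A-4D is as four signal-variation rules over a window of sampled levels. The classifier below is a sketch of that reading; the pulse-length threshold and the run-counting heuristic are our assumptions, not values given in the patent.

```python
def classify_level_signal(samples, high=1):
    """Classify a window of level samples per the rules of Figs. 4A-4D.

    Rules (our reading of the figures; thresholds are assumptions):
      all low                        -> floor     (Fig. 4A)
      one short high pulse           -> cable     (Fig. 4B)
      one long sustained high run    -> threshold (Fig. 4C)
      repeated high/low fluctuation  -> carpet    (Fig. 4D)
    """
    highs = [i for i, s in enumerate(samples) if s == high]
    if not highs:
        return "floor"                 # continuous low level
    # Count contiguous runs of high samples in the window.
    runs = 1 + sum(1 for a, b in zip(highs, highs[1:]) if b - a > 1)
    if runs > 1:
        return "carpet"                # repeated level fluctuation
    # Single run: short pulse vs. sustained high level.
    return "cable" if len(highs) <= 2 else "threshold"
```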
In step 330, corresponding target media attribute information is determined according to the target signal variation rule. For example, the correspondence between the respective signal variation rules and the medium attribute information may be utilized to determine the corresponding target medium attribute information.
In step 340, the robot is controlled to operate on the medium surface in the corresponding target working mode according to the target medium attribute information. Here, the medium attribute information may include one or more of: medium shape (linear, planar, strip-like, etc.), medium type (cable, carpet, threshold, toy, etc.), medium material (wood, porcelain, etc.), medium hardness (soft or hard material), medium size (large, medium, or small), and medium height (e.g., low media).
Illustratively, when the medium surface is a carpet surface and the robot is in a mopping or sweep-and-mop mode, the carpet is not cleaned. When the medium surface is a carpet surface and the robot is in a sweeping mode, the carpet is cleaned with increased pressure to guarantee cleaning strength. When the medium surface is a cable surface, the robot's side brush slows down or stops rotating during cleaning, preventing the robot from becoming trapped. When the medium surface is a threshold or toy surface, the robot can climb over the medium (or obstacle) and perform the corresponding cleaning operation.
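The behavior decisions in the preceding paragraph can be summarized as a mapping from medium attribute (and current working mode) to a cleaning action; all action and mode names below are illustrative stand-ins, not terms from the patent.

```python
def cleaning_action(media_attribute, current_mode):
    """Hypothetical mapping of the behaviors described above.

    Mode and action names are illustrative assumptions.
    """
    if media_attribute == "carpet":
        if current_mode in ("mopping", "sweep_mop"):
            return "skip"                  # do not mop a carpet
        return "pressurized_sweep"         # boost cleaning strength on carpet
    if media_attribute == "cable":
        return "slow_or_stop_side_brush"   # avoid getting tangled or trapped
    if media_attribute in ("threshold", "toy"):
        return "climb_over_and_clean"      # cross the obstacle, then clean
    return "normal_clean"
```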
In this embodiment of the application, using the principle that contact between the contact detection assembly and a concave and/or convex object generates an indication signal, signal-variation rules and working modes corresponding to multiple kinds of protruding obstacles are preset. By matching the signal-variation rule of the actual detection signal against these presets, the medium attribute can be detected effectively in real time and the job carried out in the corresponding working mode.
In some application scenarios, the robot may be a sweeping robot. Fig. 5 shows an overall structural schematic diagram of the sweeping robot provided by the embodiment of the application, and fig. 6 shows an exploded structural schematic diagram of the same robot.
Referring to figs. 5-6, the sweeping robot includes a robot body 50 and a contact-type detection assembly 60 arranged on the robot body 50. Arranging the contact-type detection assembly 60 on the robot body simplifies the robot's structure, saves cost, and yields high sensitivity.
In the illustrated area a, the contact detecting assembly 60 includes a detecting member 601 and a sensing member 602, and at least a portion of the detecting member 601 protrudes from the bottom of the robot body 50. The detecting member 601 is movably connected to the robot body 50.
The detecting member 601 is used to detect objects with uneven depressions or protrusions, such as a carpet or floor mat. Specifically, when the robot body 50 travels over such a target object, the depressions or protrusions of the carpet or mat obstruct the part of the detecting member 601 that protrudes from the bottom of the robot body 50. Because the detecting member 601 is movably connected to the body, it can swing in different directions (left-right, front-back, etc.) under this external force; that is, the detecting member 601 is disturbed by objects such as carpets or floor mats.
The sensing member 602 is used to sense the swinging of the detecting member 601 in these directions and to generate a characteristic signal whose level changes regularly, as shown in fig. 4D. It can be understood that when the robot body 50 travels from an ordinary floor (which is flat and has no protrusion or depression large enough to obstruct the detecting member 601) onto a target object such as a carpet or floor mat, the obstruction causes the detecting member 601 to swing widely, and the sensing signal generated by the sensing member 602 then changes abruptly.
According to this embodiment of the application, the contact-type detection assembly mounted at the bottom of the sweeping robot's body determines the attributes of obstacles protruding from the ground, and the robot can decide its cleaning behavior according to the determined obstacle attributes.
Fig. 7 is a block diagram showing a configuration of an example of a work control apparatus of a robot according to an embodiment of the present application.
As shown in fig. 7, the work control apparatus 700 of the robot includes a surface detection signal acquisition unit 710, a medium attribute determining unit 720, and a working mode control unit 730.
The surface detection signal acquisition unit 710 is configured to acquire a detection signal of a surface of a medium to be robotically worked, using a contact type detection member mounted at the bottom or side of the robot;
the medium attribute determining unit 720 is configured to determine target medium attribute information corresponding to the medium surface according to the detection signal;
the operation mode control unit 730 is configured to control the robot to perform a job on the medium surface in a corresponding target operation mode according to the target medium attribute information.
It should be noted that the information exchange and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application; for their specific functions and technical effects, refer to the method embodiment sections, which are not repeated here.
Fig. 8 is a schematic diagram of an example of a mobile device according to an embodiment of the present application. As shown in fig. 8, the mobile device 800 of this embodiment includes: a processor 810, a memory 820, and a computer program 830 stored in the memory 820 and executable on the processor 810. The processor 810, when executing the computer program 830, implements the steps in the above-described method for controlling the operation of the robot, such as the steps 110 to 130 shown in fig. 1. Alternatively, the processor 810, when executing the computer program 830, implements the functions of the modules/units in the above-described device embodiments, such as the functions of the units 710 to 730 shown in fig. 7.
Illustratively, the computer program 830 may be partitioned into one or more modules/units that are stored in the memory 820 and executed by the processor 810 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing certain functions, which are used to describe the execution of the computer program 830 in the mobile device 800. For example, the computer program 830 may be divided into a surface detection signal acquisition module, a medium surface attribute determination module, and an operation mode control module, and the specific functions of each module are as follows:
and the surface detection signal acquisition module is configured to acquire a detection signal of the surface of the medium to be operated by the robot by utilizing a contact type detection assembly arranged at the bottom or the side of the robot.
And the medium attribute determining module is configured to determine target medium attribute information corresponding to the medium surface according to the detection signal.
And the working mode control module is configured to control the robot to work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
The mobile device 800 may be a desktop computer, a notebook, a palm top computer, a cloud server, or other computing devices. The mobile device may include, but is not limited to, a processor 810, a memory 820. Those skilled in the art will appreciate that fig. 8 is merely an example of a mobile device 800 and does not constitute a limitation of the mobile device 800 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the mobile device may also include input output devices, network access devices, buses, etc.
In some examples of embodiments of the present application, the mobile device may be a cleaning robot, thereby enabling an efficient cleaning process.
The Processor 810 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 820 may be an internal storage unit of the mobile device 800, such as a hard disk or memory of the mobile device 800. The memory 820 may also be an external storage device of the mobile device 800, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the mobile device 800. Further, the memory 820 may include both an internal storage unit and an external storage device of the mobile device 800. The memory 820 is used to store the computer program as well as other programs and data required by the mobile device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile device and method may be implemented in other ways. For example, the apparatus/mobile device embodiments described above are merely illustrative: the division into modules or units is only one logical division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The above units can be implemented in the form of hardware, and also can be implemented in the form of software.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the embodiments described above may be realized by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random-access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not cause the corresponding technical solutions to depart in essence from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the protection scope of the present application.

Claims (10)

1. A method for controlling a work of a robot, comprising:
acquiring a detection signal of the surface of a medium to be operated by the robot by using a contact type detection assembly arranged at the bottom or the side of the robot;
determining target medium attribute information corresponding to the medium surface according to the detection signal;
and controlling the robot to work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
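The three steps of claim 1 form a simple sense-classify-act chain. The sketch below is purely illustrative: every function name, medium label, and threshold is an invented assumption, not taken from the application.

```python
# Hypothetical end-to-end sketch of the three claimed steps; all names and
# the 0.5 threshold are illustrative assumptions, not part of the claim.

def classify_medium(signal_amplitude: float) -> str:
    """Step 2: detection signal -> target medium attribute information."""
    # Assumed rule: a soft medium deflects the contact probe more strongly.
    return "carpet" if signal_amplitude > 0.5 else "hard_floor"

def choose_mode(medium: str) -> str:
    """Step 3: medium attribute information -> target working mode."""
    return {"carpet": "vacuum_only", "hard_floor": "vacuum_and_mop"}[medium]

def control_step(signal_amplitude: float) -> str:
    """Steps 1-3 chained for a single contact-sensor reading (step 1, the
    sensor read itself, is represented by the amplitude argument)."""
    return choose_mode(classify_medium(signal_amplitude))
```

A strong probe deflection would thus select a carpet-appropriate mode, while a weak one would select a hard-floor mode.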
2. The method of claim 1, wherein said controlling the robot to perform work on the media surface in a corresponding target mode of operation based on the target media attribute information comprises:
determining a target working mode corresponding to the target medium attribute information based on a preset medium attribute mode table, wherein the medium attribute mode table comprises a plurality of medium attribute information and corresponding working modes;
and controlling the robot to work on the surface of the medium in the target working mode.
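The "medium attribute mode table" of claim 2 is naturally a key-value mapping. The entries below are invented examples of medium attribute information paired with working modes; the claim does not specify any particular media or modes.

```python
# Hypothetical preset medium attribute mode table (claim 2); every entry is
# an invented example of a medium-attribute -> working-mode pair.
MEDIUM_ATTRIBUTE_MODE_TABLE = {
    "hard_floor":   "mop_and_vacuum",
    "short_carpet": "vacuum_high_power",
    "long_carpet":  "vacuum_max_power",
}

def target_working_mode(attribute: str) -> str:
    """Look up the working mode for the detected medium attribute
    information; the fallback mode is an assumption for unknown media."""
    return MEDIUM_ATTRIBUTE_MODE_TABLE.get(attribute, "mop_and_vacuum")
```

A dictionary keeps the attribute-to-mode association declarative, so new media can be supported by adding table entries rather than control logic.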
3. The method of claim 1, wherein said acquiring detection signals of a surface of a medium to be robotically worked comprises:
acquiring a detection signal of the medium surface within a preset time period based on the contact detection assembly, wherein an indication signal in the detection signal is used for indicating a medium surface with concave and/or convex objects that is in contact with the contact detection assembly.
4. The method of claim 3, wherein the contact detection assembly comprises at least one of: a strain sensor, a magnetic sensor, and a photosensor.
5. The method of claim 3, wherein determining target media property information corresponding to the media surface based on the detection signal comprises:
determining a target signal change rule which is met by the acquired detection signal in the preset time period in a plurality of preset signal change rules, wherein each signal change rule in the plurality of signal change rules has corresponding medium attribute information;
and determining corresponding target medium attribute information according to the target signal change rule.
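Claim 5 matches the signal sequence collected over the preset time period against preset "signal change rules", each associated with medium attribute information. One plausible realization, sketched below with entirely invented rules and statistics, tests the window against ordered predicates: a probe bouncing over carpet pile toggles its contact state more often than one gliding over hard floor.

```python
# Hypothetical sketch of claim 5; the toggle-rate statistic, the rule
# thresholds, and the medium labels are all invented for illustration.

def toggle_rate(samples):
    """Fraction of consecutive samples whose contact state changes."""
    changes = sum(a != b for a, b in zip(samples, samples[1:]))
    return changes / max(len(samples) - 1, 1)

# Each preset signal change rule: (predicate over the sample window,
# corresponding medium attribute information). Order matters: the first
# rule the window satisfies wins.
SIGNAL_CHANGE_RULES = [
    (lambda s: toggle_rate(s) > 0.4, "long_carpet"),   # probe bounces often
    (lambda s: toggle_rate(s) > 0.1, "short_carpet"),
    (lambda s: True, "hard_floor"),                    # default: steady signal
]

def medium_from_signal(samples):
    """Return the attribute information of the first matching rule."""
    for rule, attribute in SIGNAL_CHANGE_RULES:
        if rule(samples):
            return attribute
```

Evaluating the rules in order from most to least demanding ensures that exactly one "target signal change rule" is selected per window, as the claim requires.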
6. The method of claim 1, wherein said controlling the robot to perform the job on the media surface in the corresponding target operational mode based on the target media property information comprises:
acquiring a current working mode of the robot;
when the current working mode is not matched with the target working mode, switching to the target working mode;
controlling the robot to perform work on the media surface in the target operating mode.
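The mode-switching logic of claim 6 amounts to a guarded state change: switch only when the current working mode does not already match the target. A minimal sketch, with all names assumed:

```python
# Hypothetical sketch of claim 6's guarded mode switch; the Robot class
# and its attributes are invented for illustration.

class Robot:
    def __init__(self, mode: str = "standard"):
        self.mode = mode
        self.switch_count = 0  # counts actual switches, for illustration

    def ensure_mode(self, target_mode: str) -> None:
        """Switch to the target working mode only when it differs from
        the current working mode, avoiding redundant switching."""
        if self.mode != target_mode:
            self.mode = target_mode
            self.switch_count += 1
```

Checking the current mode first keeps the actuation idempotent: repeated calls with the same target mode leave the robot's state untouched.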
7. A work control device for a robot, comprising:
a surface detection signal acquisition unit configured to acquire a detection signal of a surface of a medium to be robotically worked, using a contact type detection member mounted at a bottom or a side of the robot;
a medium attribute determining unit configured to determine target medium attribute information corresponding to the medium surface according to the detection signal;
and the working mode control unit is configured to control the robot to work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
8. A mobile device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of claims 1 to 6 when executing the computer program.
9. The mobile device of claim 8, wherein the mobile device is a cleaning robot.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202011056813.4A 2020-09-29 2020-09-29 Robot operation control method and device Pending CN114305255A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011056813.4A CN114305255A (en) 2020-09-29 2020-09-29 Robot operation control method and device

Publications (1)

Publication Number Publication Date
CN114305255A true CN114305255A (en) 2022-04-12

Family

ID=81011159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011056813.4A Pending CN114305255A (en) 2020-09-29 2020-09-29 Robot operation control method and device

Country Status (1)

Country Link
CN (1) CN114305255A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115464645A (en) * 2022-09-13 2022-12-13 达闼机器人股份有限公司 Task execution method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107397505A (en) * 2016-05-20 2017-11-28 德国福维克控股公司 Ground suction nozzle for vacuum cleaning unit
CN107397502A (en) * 2016-05-20 2017-11-28 德国福维克控股公司 Suction nozzle for vacuum cleaning unit
CN107578038A (en) * 2017-09-30 2018-01-12 深圳拓邦股份有限公司 Ground identification device and cleaning equipment
CN110403529A (en) * 2019-06-21 2019-11-05 安克创新科技股份有限公司 Self-moving device and ground Material Identification method
CN110604520A (en) * 2018-06-14 2019-12-24 东芝生活电器株式会社 Autonomous walking type sweeping machine and sweeping system
CN111227723A (en) * 2020-03-14 2020-06-05 珠海市一微半导体有限公司 Soft floor surface detection device and cleaning robot
CN111443695A (en) * 2018-12-28 2020-07-24 珠海市一微半导体有限公司 Sweeping robot control method and device, storage medium and sweeping robot


Similar Documents

Publication Publication Date Title
KR102668070B1 (en) Robot cleaner and control method thereof
CN106264357B (en) The carpet determination method and system of sweeping robot
JP5469746B2 (en) Touch detection device ground detection
JP2021011016A (en) Robot touch perception
CN110448225B (en) Cleaning strategy adjusting method and system and cleaning equipment
CN109846427A (en) Control method for a cleaning robot, and cleaning robot
CN112137509A (en) Virtual forbidden zone setting method and device and cleaning robot
CN111374603A (en) Control method and chip for partitioned cleaning of vision robot and intelligent sweeping robot
CN108836195A (en) Escape method for a sweeping robot, and sweeping robot
CN110680253A (en) Robot edge cleaning method and robot
CN105373273A (en) Systems and Methods for Capacitive Touch Detection
CN110908378A (en) Robot edge method and robot
CN114305255A (en) Robot operation control method and device
CN109448002A (en) Sweeping robot control method and system, mobile terminal, and storage medium
CN112842184A (en) Cleaning method and cleaning robot
WO2024140376A9 (en) Slip state detection method, device and storage medium
CN114431785A (en) Mopping humidity control method and device, robot and computer readable storage medium
CN116863102A (en) Grid cell division method, device and equipment for robot
US12064069B2 (en) Robot cleaner and method for controlling same
CN114631760A (en) Electric quantity prompting method and device for sweeper, electronic equipment and storage medium
CN113303707A (en) Carpet identification method for cleaning robot
US20240065508A1 (en) Water shortage detection and water supply system for robot cleaner
CN111419117B (en) Returning control method of visual floor sweeping robot and visual floor sweeping robot
CN112704437B (en) Sweeping robot control method, equipment and storage medium
CN113341981A (en) Sweeping control method and device of sweeping robot and sweeping robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Applicant after: Shenzhen Ledong robot Co.,Ltd.

Address before: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Applicant before: SHENZHEN LD ROBOT Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20220412