Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
To explain the technical solutions described in the present application, specific examples are given below.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Fig. 1 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application. The method of this embodiment may be executed by a robot or by a processor configured on the robot.
As shown in Fig. 1, in step 110, a detection signal of the surface of a medium on which the robot is to work is acquired using a contact type detection assembly mounted at the bottom or side of the robot. The medium surface can be detected by various surface detection methods; for example, the properties of obstacles protruding from the ground can be determined using a contact detection assembly arranged at the bottom of the machine body.
In step 120, target medium attribute information corresponding to the medium surface is determined according to the detection signal. Here, the medium attribute information may be information such as the type, material, or hardness of the medium on which the robot is to work.
In step 130, the robot is controlled to perform work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
Through the embodiments of the present application, the robot can determine the corresponding medium attribute information from the detection signal of the medium surface to be worked and adaptively select the corresponding working mode. The working process of the robot is thus matched to the medium surface, which both prevents damage to the medium surface and improves the working effect on it.
With respect to step 130 above, in some embodiments, the target operation mode of the robot may be determined by means of a table lookup. Specifically, the target operating mode corresponding to the target medium attribute information may be determined based on a preset medium attribute mode table. Further, the robot may be controlled to perform work on the surface of the medium in the target operation mode.
It should be noted that the medium attribute mode table may include a plurality of medium attribute information entries and their corresponding operation modes. For example, the medium attribute mode table may contain the mappings "floor attribute-mopping mode" and "carpet attribute-sweeping mode"; when the target medium attribute information corresponds to the carpet attribute, the robot may be controlled to execute the sweeping mode.
It should be understood that the correspondence between the medium attribute information and the operation modes in the medium attribute mode table is not limited here; the correspondence may also be configured or adjusted in advance according to service requirements to meet the personalized needs of different service scenarios or users.
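The table lookup of step 130 can be sketched as follows. This is a minimal illustration only: the table entries and mode names ("mopping", "sweeping") are assumptions taken from the example mappings above, not a specification from the application.

```python
# Hypothetical medium attribute mode table, following the example
# "floor attribute-mopping mode" / "carpet attribute-sweeping mode" mappings.
MEDIUM_ATTRIBUTE_MODE_TABLE = {
    "floor": "mopping",
    "carpet": "sweeping",
}

def target_mode_for(medium_attribute: str, default: str = "sweeping") -> str:
    """Look up the working mode configured for a medium attribute,
    falling back to a default mode for unknown attributes."""
    return MEDIUM_ATTRIBUTE_MODE_TABLE.get(medium_attribute, default)
```

As noted above, the table contents could equally be configured or adjusted per service scenario, e.g. by loading the mapping from a user-editable configuration.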
Fig. 2 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application.
In some application scenarios, multiple types of media surfaces may be present in the working environment of the robot, such as both carpet and floor in a room; it is therefore desirable for the robot to work in different working modes on different media surfaces.
As shown in Fig. 2, in step 210, the current working mode of the robot is acquired. At this time, the robot is already in a working state, and its current working mode needs to be acquired. In some embodiments, the current operating mode of the robot may be acquired periodically to continuously monitor the working process of the robot.
In step 220, it is detected whether the current operating mode matches the target operating mode.
If the detection result in step 220 indicates that the current operation mode matches the target operation mode, the process proceeds to step 240. If the detection result indicates that the current operation mode does not match the target operation mode, the process proceeds to step 230.
In step 230, the robot is switched to the target operating mode.
In step 240, the robot is controlled to perform a job on the media surface in the target operational mode.
Through the embodiments of the present application, the current working mode of the robot can be compared with the target working mode during operation, and when the two do not match, the robot can switch to the target working mode. The robot can thereby perform work in the appropriate mode on various types of media surfaces and adapt to complicated working environments; for example, the mopping function is stopped on a carpet and started on a floor.
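The compare-and-switch logic of steps 210-240 can be sketched as a single control step. The mode names are illustrative assumptions; in a real robot the "switch" would drive actuators rather than return a string.

```python
def control_work_mode(current_mode: str, target_mode: str) -> str:
    """Steps 210-240: compare the current working mode with the target
    working mode, switch when they do not match, and return the mode in
    which the robot then performs the job."""
    # Step 220: detect whether the current mode matches the target mode.
    if current_mode != target_mode:
        # Step 230: switch to the target working mode.
        current_mode = target_mode
    # Step 240: perform the job on the media surface in current_mode.
    return current_mode
```

Calling this periodically (per step 210's periodic acquisition) keeps the robot's behavior matched to the surface it is currently on.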
Fig. 3 is a flowchart showing an example of a work control method of a robot according to an embodiment of the present application.
In the related art, objects such as carpets can be detected by methods such as optical flow and acoustic wave detection, but such apparatuses suffer from complicated structure, high cost, and low sensitivity.
As shown in Fig. 3, in step 310, a detection signal of the medium surface is acquired over a preset time period based on the contact detection assembly. Here, an indication signal in the detection signal may indicate a medium surface having concave and/or convex objects that come into contact with the contact detection assembly. For example, the contact detection assembly may be disposed at a location (e.g., the bottom) between the robot and the work surface. If the media surface is flat, the contact detection assembly does not make contact with it and remains in an initial state (e.g., a low-level state); conversely, if a protruding object is present on the media surface, the contact detection assembly contacts it and accordingly generates an indication signal (e.g., a high-level signal).
It should be appreciated that the contact detection assembly may employ various detection principles to determine whether a media surface having concave and/or convex objects is present, and various types of contact detection modules may be configured in the contact detection assembly. In some examples of embodiments of the present application, at least one of the following may be mounted in the contact detection assembly: a strain sensor, a magnetic sensor, and a photoelectric sensor.
In step 320, a target signal change rule satisfied by the acquired detection signal within a preset time period is determined among a plurality of preset signal change rules. Here, each of the plurality of signal variation rules has corresponding medium attribute information, respectively.
For example, when the contact detection assembly is shaken or bent, the medium surface to be worked may be determined as a medium surface having a protruding obstacle, and corresponding medium attribute information (or obstacle attribute) may be determined according to a signal generated by the shaking or bending. Here, the obstacle attribute may include one or more of: obstacle shape, obstacle type, obstacle material, obstacle hardness, obstacle size, and obstacle height.
Fig. 4A is a schematic diagram showing an example of the level signal change corresponding to a floor-type medium surface: because the floor surface is flat, it does not contact the contact detection assembly, and the signal remains in a continuously low level state. Fig. 4B is a schematic diagram showing an example of the level signal change corresponding to a cable-type medium surface: a cable-type obstacle makes only short contact with the contact detection assembly and may thus produce a transient high level signal. Fig. 4C is a schematic diagram showing an example of the level signal change corresponding to a threshold-type medium surface: a threshold-type obstacle contacts the contact detection assembly for a relatively long time, resulting in a continuous high level signal. Fig. 4D is a schematic diagram showing an example of the level signal change corresponding to a carpet-type medium surface: because a carpet-type obstacle is soft and elastic, its contact with the contact detection assembly may cause a continuously fluctuating level signal.
It should be understood that the above types of protruding obstacles and corresponding signal variation rules are only used as examples, and the embodiments of the present application may also configure corresponding signal variation rules for more types of protruding objects.
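One way to match a sampled detection signal against the four example signal change rules of Figs. 4A-4D is to look at how often the level toggles and what fraction of the window is high. The transition and duty-cycle thresholds below are illustrative assumptions, not values given in the application.

```python
def classify_level_signal(samples, high=1):
    """Classify a window of binary level samples against the example
    signal change rules of Figs. 4A-4D (illustrative thresholds)."""
    transitions = sum(1 for a, b in zip(samples, samples[1:]) if a != b)
    high_ratio = sum(1 for s in samples if s == high) / len(samples)
    if transitions >= 4:
        return "carpet"      # continuous level fluctuation (Fig. 4D)
    if high_ratio == 0:
        return "floor"       # continuously low level (Fig. 4A)
    if high_ratio >= 0.5:
        return "threshold"   # continuous high level (Fig. 4C)
    return "cable"           # transient high pulse (Fig. 4B)
```

Each return value corresponds to one preset signal change rule, so the classifier's output plays the role of the "target signal change rule" of step 320.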
In step 330, corresponding target media attribute information is determined according to the target signal variation rule. For example, the correspondence between the respective signal variation rules and the medium attribute information may be utilized to determine the corresponding target medium attribute information.
In step 340, the robot is controlled, according to the target medium attribute information, to perform work on the medium surface in the corresponding target working mode. Here, the medium attribute information may contain one or more of the following: medium shape (e.g., linear, planar, strip), medium type (e.g., cable, carpet, doorsill, toy), medium material (e.g., wood, porcelain), medium hardness (e.g., soft, hard), medium size (e.g., large, medium, small), medium height (e.g., short), and the like.
Illustratively, when the media surface is a carpet surface and the robot is in a mopping mode or a combined sweeping-and-mopping mode, the carpet is not cleaned. When the media surface is a carpet surface and the robot is in a floor sweeping mode, the carpet is cleaned with increased pressure so that the cleaning strength on the carpet is guaranteed. When the media surface is a cable surface, the side brush of the robot decelerates or stops rotating while cleaning, preventing the robot from becoming trapped. When the media surface is a threshold or toy surface, the robot can cross the medium (or obstacle) and perform the corresponding cleaning operation.
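The example behaviors above can be collected into a simple attribute-to-behavior mapping. All keys and values here are illustrative assumptions following the examples in the text, not an API of any real robot.

```python
# Hypothetical mapping from detected medium type to cleaning behavior.
CLEANING_BEHAVIOR = {
    "carpet":    {"mop": False, "suction": "boost"},       # no mopping; pressurized sweep
    "cable":     {"mop": False, "side_brush": "slow_or_stop"},  # avoid entanglement
    "threshold": {"action": "climb_over"},                 # cross the obstacle, then clean
    "floor":     {"mop": True, "suction": "normal"},       # ordinary mopping/sweeping
}

def behavior_for(medium_type: str) -> dict:
    """Return the configured cleaning behavior for a medium type,
    with a conservative fallback for unknown media."""
    return CLEANING_BEHAVIOR.get(medium_type, {"mop": False, "suction": "normal"})
```

Keeping the policy in a table mirrors the medium attribute mode table of step 130 and makes it easy to extend for additional obstacle types.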
In the embodiments of the present application, the principle that contact between the contact detection assembly and a concave and/or convex object generates an indication signal is exploited: signal change rules and working modes corresponding to a plurality of protruding obstacles are preset, and by matching the actual detection signal against these signal change rules, the real-time medium attribute can be detected effectively and work can be performed in the corresponding working mode.
In some application scenarios, the robot may be a sweeping robot. Fig. 5 shows an overall structural schematic diagram of the sweeping robot provided by an embodiment of the application, and Fig. 6 shows an exploded structural schematic diagram of the sweeping robot provided by an embodiment of the application.
Referring to Figs. 5-6, the sweeping robot includes a robot body 50 and a contact type detection assembly 60 disposed on the robot body 50. By providing the contact type detection assembly 60 on the robot body, the structure of the robot can be simplified, cost can be saved, and high sensitivity can be obtained.
In the illustrated area A, the contact detection assembly 60 includes a detecting member 601 and a sensing member 602, with at least a portion of the detecting member 601 protruding from the bottom of the robot body 50. The detecting member 601 is movably connected to the robot body 50.
The detecting member 601 is used to detect an object having uneven depressions or protrusions, such as a carpet or a floor mat. Specifically, when the robot body 50 travels over a target object such as a carpet or floor mat, the depressions or protrusions of the carpet or mat can block the part of the detecting member 601 that protrudes from the bottom of the robot body 50. Because the detecting member 601 is movably connected to the robot body, it can swing in different directions, such as left-right or front-back, under the action of this external force; that is, the detecting member 601 may be disturbed by an object such as a carpet or floor mat.
The sensing member 602 is used to sense the swing of the detecting member 601 in different directions, such as left-right or front-back, and to generate a characteristic signal in which the sensing signal changes regularly, as shown in Fig. 4D. It can be understood that when the robot body 50 travels from a common ground or floor (which is flat and has no protrusion or recess large enough to block the detecting member 601) onto a target object such as a carpet or floor mat, the detecting member 601 may swing to a large extent due to the blocking by the carpet or mat, and the sensing signal generated by the sensing member 602 may then exhibit a large abrupt change.
According to the embodiments of the present application, the contact type detection assembly installed at the bottom of the body of the sweeping robot is used to determine the attributes of obstacles protruding from the ground, and the robot can decide its cleaning behavior according to the determined obstacle attributes.
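The "large abrupt change" in the sensing signal when the robot crosses onto a carpet can be detected with a simple sample-to-sample difference check. The threshold value below is an illustrative assumption; a real implementation would calibrate it against the sensing member's signal range and noise floor.

```python
def first_abrupt_change(signal, threshold=0.5):
    """Return the index of the first sample where the sensing signal
    jumps by more than `threshold` relative to the previous sample,
    or None if the signal stays smooth (e.g., on flat floor)."""
    for i in range(1, len(signal)):
        if abs(signal[i] - signal[i - 1]) > threshold:
            return i
    return None
```

A detected jump marks the moment the detecting member begins to swing, i.e., the likely transition from flat floor to a carpet-like surface.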
Fig. 7 is a block diagram showing a configuration of an example of a work control apparatus of a robot according to an embodiment of the present application.
As shown in Fig. 7, the work control apparatus 700 of the robot includes a surface detection signal acquisition unit 710, a medium attribute determination unit 720, and an operation mode control unit 730.
The surface detection signal acquisition unit 710 is configured to acquire a detection signal of the surface of the medium on which the robot is to work, using a contact type detection assembly mounted at the bottom or side of the robot;
the medium attribute determination unit 720 is configured to determine target medium attribute information corresponding to the medium surface according to the detection signal; and
the operation mode control unit 730 is configured to control the robot to perform work on the medium surface in the corresponding target operation mode according to the target medium attribute information.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Fig. 8 is a schematic diagram of an example of a mobile device according to an embodiment of the present application. As shown in fig. 8, the mobile device 800 of this embodiment includes: a processor 810, a memory 820, and a computer program 830 stored in the memory 820 and executable on the processor 810. The processor 810, when executing the computer program 830, implements the steps in the above-described method for controlling the operation of the robot, such as the steps 110 to 130 shown in fig. 1. Alternatively, the processor 810, when executing the computer program 830, implements the functions of the modules/units in the above-described device embodiments, such as the functions of the units 710 to 730 shown in fig. 7.
Illustratively, the computer program 830 may be partitioned into one or more modules/units that are stored in the memory 820 and executed by the processor 810 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing certain functions, which are used to describe the execution of the computer program 830 in the mobile device 800. For example, the computer program 830 may be divided into a surface detection signal acquisition module, a medium surface attribute determination module, and an operation mode control module, and the specific functions of each module are as follows:
and the surface detection signal acquisition module is configured to acquire a detection signal of the surface of the medium to be operated by the robot by utilizing a contact type detection assembly arranged at the bottom or the side of the robot.
And the medium attribute determining module is configured to determine target medium attribute information corresponding to the medium surface according to the detection signal.
And the working mode control module is configured to control the robot to work on the surface of the medium in a corresponding target working mode according to the target medium attribute information.
The mobile device 800 may be a desktop computer, a notebook, a palmtop computer, a cloud server, or another computing device. The mobile device may include, but is not limited to, a processor 810 and a memory 820. Those skilled in the art will appreciate that Fig. 8 is merely an example of the mobile device 800 and does not constitute a limitation of it; the mobile device may include more or fewer components than shown, some components may be combined, or different components may be used. For example, the mobile device may also include input/output devices, network access devices, buses, and the like.
In some examples of embodiments of the present application, the mobile device may be a cleaning robot, thereby enabling an efficient cleaning process.
The Processor 810 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 820 may be an internal storage unit of the mobile device 800, such as a hard disk or internal memory of the mobile device 800. The memory 820 may also be an external storage device of the mobile device 800, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the mobile device 800. Further, the memory 820 may include both an internal storage unit and an external storage device of the mobile device 800. The memory 820 is used to store the computer program and other programs and data required by the mobile device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile apparatus and method may be implemented in other ways. For example, the above-described device/mobile device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The above units can be implemented in the form of hardware, and also can be implemented in the form of software.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), electrical carrier signals, telecommunication signals, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, legislation and patent practice dictate that computer-readable media do not include electrical carrier signals and telecommunication signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.