CN111061259A - Event-driven method, system, device and storage medium for a walking robot - Google Patents

Event-driven method, system, device and storage medium for a walking robot

Info

Publication number
CN111061259A
CN111061259A (application CN201811197403.4A)
Authority
CN
China
Prior art keywords
module
event
walking robot
main thread
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811197403.4A
Other languages
Chinese (zh)
Inventor
梁浩 (Liang Hao)
周骥 (Zhou Ji)
冯歆鹏 (Feng Xinpeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NextVPU Shanghai Co Ltd
Original Assignee
NextVPU Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NextVPU Shanghai Co Ltd filed Critical NextVPU Shanghai Co Ltd
Priority to CN201811197403.4A priority Critical patent/CN111061259A/en
Publication of CN111061259A publication Critical patent/CN111061259A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an event-driven method, system, device and storage medium for a walking robot. The method includes the following steps: at least one policy module in the policy module group sends an event to the main thread module, where the event includes sender-module information, target-module information and event information, and the target module is at least one policy module in the policy module group or at least one hardware module in the hardware module group; an event collection module of the main thread module receives the event and adds it to an event pool; an event scheduling module of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information; and the hardware module sends the feedback produced after executing the event information back to the main thread module as a new event. The invention makes reasonable use of hardware resources and greatly improves running speed, keeping the algorithm in step with the camera's data processing so that the whole process runs in real time.

Description

Event-driven method, system, device and storage medium for a walking robot
Technical Field
The present invention relates to the field of robot driving, and more particularly to an event-driven method, system, device and storage medium for a walking robot.
Background
At present, to complete cleaning tasks and more advanced tasks, a mobile robot (such as a sweeping robot) needs to localize itself with high precision and to perceive the external environment (typically by building a map of it). Existing sweeping robots generally use non-visual sensors (such as lidar) for localization and mapping. Lidar-based solutions offer high precision and high resolution but are too costly, while solutions based on other non-visual sensors, such as infrared, have unsatisfactory accuracy. What these schemes have in common is a small data volume, so a simple real-time system operating in a serial fashion is sufficient.
In short, existing lidar solutions are cost-prohibitive, and other non-visual sensor solutions are less accurate. A mobile robot using a vision-based scheme can satisfy both high precision and low cost. However, a vision scheme demands more computing resources, including memory and processing power. With a serial software system the result is slow operation and slow response: visual data cannot be processed in real time, so real-time localization and mapping cannot be achieved.
Accordingly, the present invention provides an event-driven method, system, device and storage medium for a walking robot.
Disclosure of Invention
In view of the problems in the prior art, the object of the present invention is to provide an event-driven method, system, device and storage medium for a walking robot that make reasonable use of hardware resources, balance load, have low coupling, facilitate modification and debugging, and greatly improve running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
An embodiment of the present invention provides an event-driven method for a walking robot, including the following steps:
at least one policy module in the policy module group sends an event to the main thread module, where the event includes sender-module information, target-module information and event information, and the target module is at least one policy module in the policy module group or at least one hardware module in the hardware module group;
an event collection module of the main thread module receives the event and adds it to an event pool;
an event scheduling module of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information; and
the hardware module sends the feedback produced after executing the event information back to the main thread module as a new event.
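The event structure and dispatch loop described in the steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: the names `Event`, `MainThreadModule`, and the module identifiers are hypothetical.

```python
from dataclasses import dataclass, field
import time

# Hypothetical sketch of the event described above: each event carries
# the sender module, the target module, and the event information payload.
@dataclass
class Event:
    sender: str        # sender-module information
    target: str        # target-module information
    payload: str       # event information to be executed by the target
    timestamp: float = field(default_factory=time.time)

class MainThreadModule:
    """Collects events into a pool and dispatches them to target modules."""
    def __init__(self, modules):
        self.modules = modules   # name -> callable that executes event info
        self.pool = []           # the event pool

    def collect(self, event):
        # event collection module: receive the event, add it to the pool
        self.pool.append(event)

    def dispatch_all(self):
        # event scheduling module: process current events, calling the
        # target module named by each event's target-module information
        results = []
        while self.pool:
            ev = self.pool.pop(0)
            handler = self.modules[ev.target]
            results.append(handler(ev.payload))
        return results
```

For instance, a path-planning policy module could post a walking event targeted at the walking-wheel hardware module, and the main thread would route it without the sender ever calling the hardware directly.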
Preferably, the event further includes a generation timestamp, and the event collection module of the main thread module adds events to the event pool in chronological order of their timestamps.
Preferably, the policy module group comprises a data fusion module, a visual algorithm module, a path planning module and a communication service module.
Preferably, the set of hardware modules includes a camera module, a navigation module, a sensor module, and a motion control module.
Preferably, the walking robot is a sweeping robot or a mopping robot, the camera module includes a camera of the walking robot, and the motion control module includes a walking wheel of the walking robot.
Preferably, the motion control module comprises a motion instruction module and a motion response module.
An embodiment of the present invention further provides an event-driven system for a walking robot, which implements the event-driven method described above. The system includes a policy module group, a main thread module and a hardware module group.
At least one policy module in the policy module group sends an event to the main thread module, where the event includes sender-module information, target-module information and event information, and the target module is at least one policy module in the policy module group or at least one hardware module in the hardware module group. An event collection module of the main thread module receives the event and adds it to an event pool. An event scheduling module of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information. The hardware module sends the feedback produced after executing the event information back to the main thread module as a new event.
Preferably, the event further includes a generation timestamp, and the event collection module of the main thread module adds events to the event pool in chronological order of their timestamps.
Preferably, the policy module group comprises a data fusion module, a visual algorithm module, a path planning module and a communication service module.
Preferably, the set of hardware modules includes a camera module, a navigation module, a sensor module, and a motion control module.
Preferably, the walking robot is a sweeping robot or a mopping robot, the camera module includes a camera of the walking robot, and the motion control module includes a walking wheel of the walking robot.
Preferably, the motion control module comprises a motion instruction module and a motion response module.
An embodiment of the present invention also provides an event-driven device for a walking robot, including:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the above event-driven method for a walking robot by executing the executable instructions.
An embodiment of the present invention also provides a computer-readable storage medium storing a program that, when executed, implements the steps of the above event-driven method for a walking robot.
The event-driven method, system, device and storage medium for a walking robot of the present invention make reasonable use of hardware resources, balance load, have low coupling, facilitate modification and debugging, and greatly improve running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
Fig. 1 is a flowchart of the event-driven method for a walking robot of the present invention;
fig. 2 is a block diagram of the event-driven system for a walking robot of the present invention;
fig. 3 is a schematic view of a sweeping robot equipped with the event-driven system for a walking robot of the present invention;
fig. 4 is a schematic view illustrating an implementation of the event-driven method for a walking robot of the present invention;
fig. 5 is a schematic structural diagram of the event-driven device for a walking robot of the present invention; and
fig. 6 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
Fig. 1 is a flowchart of the event-driven method for a walking robot according to the present invention. As shown in fig. 1, the event-driven method for a walking robot of the present invention includes the following steps:
S101, at least one policy module in the policy module group sends an event to the main thread module (event handler), where the event includes sender-module information, target-module information and event information, and the target module is at least one policy module in the policy module group or at least one hardware module in the hardware module group.
S102, an event collection module (event collector) of the main thread module receives the event and adds it to an event pool.
S103, an event scheduling module (event dispatcher) of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information.
S104, the hardware module sends the feedback produced after executing the event information back to the main thread module as a new event.
The invention makes better use of the multi-core resources of an embedded system-on-chip, responds normally to various events, processes visual data and the vision algorithm in real time, and thus provides good support for vision-based localization and mapping. At the same time, the event-driven concurrency framework makes reasonable use of multi-core resources through multiple events, reduces the coupling between events (and between modules), and facilitates the development and debugging of the software system.
In a preferred embodiment, the main thread module may include a finite-state machine (FSM), a mathematical model representing a finite set of states together with the transitions and actions between those states. The policy modules in the policy module group and the hardware modules in the hardware module group communicate with the event collection module of the main thread module only through preset events. An exception is likewise reported to the event collection module of the main thread module in the form of an event. When a new event enters the main thread module, it is first stored by the event collection module and then passed to the event scheduling module in order. The event scheduling module dispatches events to the corresponding modules according to the finite-state machine, so that the modules can run in parallel.
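A finite-state machine of the kind described above can be sketched minimally as a transition table. The states and event types below (`idle`, `dispatching`, `error`, etc.) are illustrative assumptions; the patent does not enumerate them.

```python
# Minimal finite-state-machine sketch for the event scheduling module.
# States and transitions are assumed for illustration only.
class DispatcherFSM:
    def __init__(self):
        self.state = "idle"
        # (current state, event type) -> next state
        self.transitions = {
            ("idle", "event_arrived"): "dispatching",
            ("dispatching", "pool_empty"): "idle",
            ("dispatching", "exception"): "error",   # exceptions arrive as events too
            ("error", "recovered"): "idle",
        }

    def on_event(self, event_type):
        # Apply a transition if one is defined; otherwise stay in place.
        key = (self.state, event_type)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state
```

Because every module, including exception reporting, talks to the main thread only through events, a table like this is enough for the scheduler to decide how each incoming event moves the system between states.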
In a preferred embodiment, the event further includes a generation timestamp, and the event collection module of the main thread module adds events to the event pool in chronological order of their timestamps, though the invention is not limited thereto.
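One straightforward way to keep the pool ordered by generation timestamp is a min-heap, as sketched below. This is an assumed illustration of the ordering rule, not the patented data structure.

```python
import heapq

# Sketch of a timestamp-ordered event pool: events sit in a min-heap keyed
# on their generation timestamp, so the scheduling module always pops the
# earliest event first. A sequence counter breaks ties between equal
# timestamps while keeping insertion order.
class TimestampedEventPool:
    def __init__(self):
        self._heap = []
        self._seq = 0

    def add(self, timestamp, event):
        heapq.heappush(self._heap, (timestamp, self._seq, event))
        self._seq += 1

    def pop_earliest(self):
        return heapq.heappop(self._heap)[2]
```

With this layout, feedback events arriving late from hardware still slot into their correct chronological position rather than simply appending to the end.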
In a preferred embodiment, a message-loop macro (Mainloop) may be added to an event, that is, the event's data-profiling function is turned on. Node macros can be added after the event's Mainloop macro to obtain the running-time information between nodes.
In a preferred embodiment, the policy module group includes a data fusion module, a visual algorithm module, a path planning module and a communication service module. For a primarily vision-based mobile robot, each frame of visual data and the algorithm execution time per frame are the main indicators of the running state of the whole system. To process vision in real time, these two parts must first run in parallel. Furthermore, according to the analysis obtained with the profiling framework, once these two parts are turned into events, the main bottleneck for real-time processing becomes the data fusion part, so data fusion is likewise turned into events. The whole visual algorithm is also turned into events according to the profiling results, so that multi-core resources are fully utilized concurrently and a real-time effect is achieved.
In a preferred embodiment, the set of hardware modules includes a camera module, a navigation module, a sensor module, and a motion control module.
In a preferred embodiment, the walking robot is a sweeping robot or a mopping robot, the camera module comprises a camera of the sweeping robot or the mopping robot, and the motion control module comprises a walking wheel of the sweeping robot or the mopping robot.
In a preferred embodiment, the motion control module includes a motion command module and a motion response module.
In a preferred embodiment, the main thread module handles all events. Its state machine is modified to respond to changes in other tasks/events, modes, exceptions, and the like.
Fig. 2 is a block diagram of the event-driven system for a walking robot of the present invention. As shown in fig. 2, an embodiment of the present invention further provides an event-driven system for a walking robot, which implements the event-driven method described above and includes a policy module group 1, a main thread module 2 and a hardware module group 3. At least one policy module in the policy module group 1 sends an event to the main thread module 2, where the event includes sender-module information, target-module information and event information, and the target module is at least one policy module in the policy module group 1 or at least one hardware module in the hardware module group 3. The event collection module 21 of the main thread module 2 receives the event and adds it to the event pool; the event scheduling module 22 of the main thread module 2 processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information; and the hardware module sends the feedback produced after executing the event information back to the main thread module 2 as a new event.
In a preferred embodiment, the event further includes a generation timestamp, and the event collection module 21 of the main thread module 2 adds events to the event pool in chronological order of their timestamps.
In a preferred embodiment, the policy module group 1 includes a data fusion module 11, a visual algorithm module 12, a path planning module 13, and a communication service module 14.
In a preferred embodiment, the set of hardware modules 3 comprises a camera module 31, a navigation module 32, a sensor module 33 and a motion control module 34.
In a preferred embodiment, the motion control module 34 includes a motion command module and a motion response module.
The event-driven system for a walking robot of the present invention makes reasonable use of hardware resources, balances load, has low coupling, facilitates modification and debugging, and greatly improves running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
Fig. 3 is a schematic view of a sweeping robot 4 equipped with the event-driven system for a walking robot of the present invention. Fig. 4 is a schematic diagram illustrating an implementation of the event-driven method for a walking robot according to the present invention. As shown in figs. 3 and 4, in this embodiment the sweeping robot 4 includes the event-driven system for a walking robot of the present invention, where the camera module 31 includes the camera 311 of the sweeping robot 4, and the motion control module includes the walking wheels 341 of the sweeping robot 4. The following illustrates an implementation of the event-driven method on the walking robot of the present invention:
S201, the path planning module 13 generates a time-T walking event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool;
S202, the vision algorithm module 12 generates a time-T shooting event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool;
S203, the path planning module 13 generates a time-T+1 walking event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool;
S204, the vision algorithm module 12 generates a time-T+1 shooting event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool;
S205, the event scheduling module 22 of the main thread module 2 processes the time-T walking event in the current event pool, and sends the walking event to the walking wheels 341 of the sweeping robot 4.
S206, the event scheduling module 22 of the main thread module 2 processes the time-T shooting event in the current event pool, and sends the shooting event to the camera 311 of the sweeping robot 4.
S207, the event scheduling module 22 of the main thread module 2 processes the time-T+1 walking event in the current event pool, and sends the walking event to the walking wheels 341 of the sweeping robot 4.
S208, the event scheduling module 22 of the main thread module 2 processes the time-T+1 shooting event in the current event pool, and sends the shooting event to the camera 311 of the sweeping robot 4.
In the event pool of the present invention, the event collection module 21 adds events to the pool in parallel with the event scheduling module 22 processing them. Whenever the event collection module 21 receives a new event, it adds it to the event pool. Whenever the event pool is non-empty, the event scheduling module 22 processes the current event, continuing until all events in the pool have been processed. Therefore step S205 need not follow step S204 in time; it may execute as soon as step S201 has completed, so that the events sent by the various modules are processed in parallel.
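The parallelism described in this paragraph — collection feeding the pool while scheduling drains it — is the classic producer-consumer pattern. The sketch below is a hypothetical rendering with one collector thread and one scheduler thread sharing a thread-safe queue; the event names are illustrative.

```python
import queue
import threading

# Sketch of collection running in parallel with scheduling: a producer
# thread (event collection module) feeds a thread-safe queue while a
# consumer thread (event scheduling module) drains it concurrently.
def run_parallel(events):
    pool = queue.Queue()
    processed = []

    def collector():                 # event collection module 21
        for ev in events:
            pool.put(ev)
        pool.put(None)               # sentinel: no more events

    def scheduler():                 # event scheduling module 22
        while True:
            ev = pool.get()
            if ev is None:
                break
            processed.append(ev)     # stand-in for dispatching to a module

    t1 = threading.Thread(target=collector)
    t2 = threading.Thread(target=scheduler)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return processed
```

Because `queue.Queue` is thread-safe and FIFO, the scheduler can start dispatching the time-T walking event as soon as it arrives, without waiting for later events to be collected.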
Subsequently, in S209, the camera 311 generates a time-T shooting feedback event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool.
S210, the camera 311 generates a time-T+1 shooting feedback event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool.
S211, the walking wheel 341 generates a time-T walking feedback event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool.
S212, the walking wheel 341 generates a time-T+1 walking feedback event and sends it to the event collection module 21 of the main thread module 2, and the event collection module 21 adds the event to the event pool.
It can be seen that, because of parallel processing, the main thread module 2 does not need to wait for the feedback of the time-T walking event of S205, but can keep issuing instructions to the hardware modules. The feedback events of the hardware modules are inserted into the event pool as new events in the order in which they are generated.
S213, the main thread module 2 sends the time-T shooting feedback to the visual algorithm module 12;
S214, the main thread module 2 sends the time-T+1 shooting feedback to the visual algorithm module 12;
S215, the main thread module 2 sends the time-T walking feedback to the path planning module 13; and
S216, the main thread module 2 sends the time-T+1 walking feedback to the path planning module 13.
As can be seen from the above, the visual algorithm module 12 and the path planning module 13 of the sweeping robot 4 of the present invention run fully in parallel with the scheduling of the hardware modules, free from the serial-operation constraint of the prior art. The invention makes reasonable use of hardware resources, balances load, has low coupling, facilitates modification and debugging, and greatly improves running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
The foregoing shows and describes the general principles and broad features of the present invention and advantages thereof. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of the principles of the present invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, which is defined by the appended claims and their equivalents.
An embodiment of the invention also provides an event-driven device for a walking robot, which includes a processor and a memory storing executable instructions of the processor, where the processor is configured to perform the steps of the event-driven method for a walking robot by executing the executable instructions.
As shown above, this embodiment makes reasonable use of hardware resources, balances load, has low coupling, facilitates modification and debugging, and greatly improves running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit", "module" or "platform".
Fig. 5 is a schematic structural diagram of the event-driven device for a walking robot according to the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 600 shown in fig. 5 is only an example and should not impose any limitation on the functions and scope of use of embodiments of the present invention.
As shown in fig. 5, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 that connects the different platform components (including memory unit 620 and processing unit 610), and the like.
The storage unit stores program code executable by the processing unit 610, causing the processing unit 610 to perform the steps of the method according to various exemplary embodiments of the present invention described above in this specification. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
An embodiment of the present invention also provides a computer-readable storage medium storing a program that, when executed, implements the steps of the event-driven method for a walking robot. In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to perform the steps of the method according to various exemplary embodiments of the present invention described above in this specification.
As shown above, this embodiment makes reasonable use of hardware resources, balances load, has low coupling, facilitates modification and debugging, and greatly improves running speed, so that the algorithm keeps pace with the camera's data processing and the whole process runs in real time.
Fig. 6 is a schematic structural diagram of a computer-readable storage medium of the present invention. Referring to fig. 6, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium other than a readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
In summary, the present invention provides an event-driven method, system, device and storage medium for a walking robot, which make reasonable use of hardware resources, balance loads, have a low degree of coupling, facilitate modification and debugging, and greatly increase the running speed, so that the running speed of the algorithm is synchronized with the data processing of the camera, thereby making the entire process run in real time.
The foregoing is a detailed description of the invention in connection with specific preferred embodiments, and the invention is not to be construed as limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions may be made without departing from the spirit of the invention, and all such variations shall be considered to fall within the protection scope of the invention.

Claims (14)

1. An event-driven method for a walking robot, comprising the steps of:
at least one policy module in a policy module group sends an event to a main thread module, wherein the event comprises sending-module information, target-module information and event information, and the target module comprises at least one policy module in the policy module group or at least one hardware module in a hardware module group;
an event collection module of the main thread module receives the event and adds it to an event pool;
an event scheduling module of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information; and
the hardware module sends the feedback generated after executing the event information to the main thread module as an event.
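The dispatch cycle recited in claim 1 can be sketched as follows. This is an illustrative sketch only; the class and module names (`Event`, `MainThread`, `Wheel`, `Planner`) are hypothetical stand-ins and not part of the claims:

```python
# Sketch of the claim-1 cycle: a policy module emits an event, the main
# thread collects and dispatches it, and a hardware module's feedback
# re-enters the pool as a new event.

class Event:
    """Carries sending-module info, target-module info, and event info."""
    def __init__(self, sender, target, info):
        self.sender = sender
        self.target = target
        self.info = info

class MainThread:
    def __init__(self, modules):
        self.modules = modules   # module name -> policy or hardware module
        self.pool = []           # the event pool

    def collect(self, event):
        """Event collection module: receive an event, add it to the pool."""
        self.pool.append(event)

    def schedule(self):
        """Event scheduling module: call each event's target module."""
        while self.pool:
            event = self.pool.pop(0)
            feedback = self.modules[event.target].execute(event.info)
            if feedback is not None:   # hardware feedback re-enters as an event
                self.collect(feedback)

class Wheel:
    """Stand-in hardware module: executes a command, returns feedback."""
    def execute(self, info):
        return Event("wheel", "planner", "done:" + info)

class Planner:
    """Stand-in policy module: records the feedback it receives."""
    def __init__(self):
        self.log = []
    def execute(self, info):
        self.log.append(info)

planner = Planner()
main_thread = MainThread({"wheel": Wheel(), "planner": planner})
main_thread.collect(Event("planner", "wheel", "move"))
main_thread.schedule()
```

After the run, the planner has received the wheel's feedback (`planner.log == ["done:move"]`), illustrating how hardware feedback closes the loop through the same event pool.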
2. The event-driven method for a walking robot according to claim 1, wherein the event further comprises a generation timestamp, and the event collection module of the main thread module adds events to the event pool in order of their timestamps.
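The timestamp ordering of claim 2 can be sketched with a priority queue; `TimestampedPool` and its method names are hypothetical and not part of the claims:

```python
import heapq
import itertools

class TimestampedPool:
    """Event pool that releases events in generation-timestamp order.
    A monotonically increasing counter breaks ties between equal timestamps."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def add(self, timestamp, event):
        heapq.heappush(self._heap, (timestamp, next(self._counter), event))

    def pop(self):
        return heapq.heappop(self._heap)[2]

pool = TimestampedPool()
pool.add(3.0, "navigate")
pool.add(1.0, "sense")
pool.add(2.0, "plan")
order = [pool.pop() for _ in range(3)]
# order == ["sense", "plan", "navigate"]
```

Regardless of arrival order, events leave the pool by generation timestamp, which is the behavior the claim describes.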
3. The event-driven method for a walking robot according to claim 1, wherein the policy module group comprises a data fusion module, a visual algorithm module, a path planning module and a communication service module.
4. The event-driven method for a walking robot according to claim 1, wherein the hardware module group comprises a camera module, a navigation module, a sensor module and a motion control module.
5. The event-driven method for a walking robot according to claim 4, wherein the walking robot is a floor-sweeping robot or a floor-mopping robot, the camera module comprises a camera of the walking robot, and the motion control module comprises a walking wheel of the walking robot.
6. The event-driven method for a walking robot according to claim 4, wherein the motion control module comprises a motion instruction module and a motion response module.
7. An event-driven system of a walking robot for implementing the event-driven method for a walking robot according to any one of claims 1 to 6, comprising: a policy module group, a main thread module and a hardware module group;
wherein at least one policy module in the policy module group sends an event to the main thread module, the event comprising sending-module information, target-module information and event information, and the target module comprises at least one policy module in the policy module group or at least one hardware module in the hardware module group; an event collection module of the main thread module receives the event and adds it to an event pool; an event scheduling module of the main thread module processes the current event of the event pool and, according to the target-module information of the current event, calls the target module to execute the event information; and the hardware module sends the feedback generated after executing the event information to the main thread module as an event.
8. The event-driven system of a walking robot according to claim 7, wherein the event further comprises a generation timestamp, and the event collection module of the main thread module adds events to the event pool in order of their timestamps.
9. The event-driven system of a walking robot according to claim 7, wherein the policy module group comprises a data fusion module, a visual algorithm module, a path planning module and a communication service module.
10. The event-driven system of a walking robot according to claim 7, wherein the hardware module group comprises a camera module, a navigation module, a sensor module and a motion control module.
11. The event-driven system of a walking robot according to claim 10, wherein the walking robot is a floor-sweeping robot or a floor-mopping robot, the camera module comprises a camera of the walking robot, and the motion control module comprises a walking wheel of the walking robot.
12. The event-driven system of a walking robot according to claim 10, wherein the motion control module comprises a motion instruction module and a motion response module.
13. An event-driven device of a walking robot, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the steps of the event-driven method for a walking robot according to any one of claims 1 to 6.
14. A computer-readable storage medium storing a program, wherein the program, when executed, implements the steps of the event-driven method for a walking robot according to any one of claims 1 to 6.
CN201811197403.4A 2018-10-15 2018-10-15 Incident driving method, system, device and storage medium for walking robot Pending CN111061259A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811197403.4A CN111061259A (en) 2018-10-15 2018-10-15 Incident driving method, system, device and storage medium for walking robot

Publications (1)

Publication Number Publication Date
CN111061259A true CN111061259A (en) 2020-04-24

Family

ID=70296290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811197403.4A Pending CN111061259A (en) 2018-10-15 2018-10-15 Incident driving method, system, device and storage medium for walking robot

Country Status (1)

Country Link
CN (1) CN111061259A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114079698A (en) * 2020-08-12 2022-02-22 北京有限元科技有限公司 Method and device for polling intelligent outbound robot and storage medium
CN114968459A (en) * 2022-05-27 2022-08-30 重庆长安汽车股份有限公司 Event processing method based on automobile AI intelligent assistant

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120315920A1 (en) * 2011-06-10 2012-12-13 International Business Machines Corporation Systems and methods for analyzing spatiotemporally ambiguous events
CN103984235A (en) * 2014-05-27 2014-08-13 湖南大学 Space manipulator control system software architecture based on C/S structure and establishing method
CN106444780A (en) * 2016-11-10 2017-02-22 速感科技(北京)有限公司 Robot autonomous navigation method and system based on vision positioning algorithm
CN107426023A (en) * 2017-07-21 2017-12-01 携程旅游信息技术(上海)有限公司 Cloud platform log collection and retransmission method, system, equipment and storage medium
CN107515900A (en) * 2017-07-24 2017-12-26 宗晖(上海)机器人有限公司 Intelligent robot and its event memorandum system and method
CN107942753A (en) * 2017-12-07 2018-04-20 惠州市德赛西威汽车电子股份有限公司 The communication frame and the means of communication of the software of robot and terminal device


Similar Documents

Publication Publication Date Title
Du et al. Robot cloud: Bridging the power of robotics and cloud computing
CN110516971B (en) Anomaly detection method, device, medium and computing equipment
Calisi et al. OpenRDK: a modular framework for robotic software development
CN103034578B (en) A kind of application data method for supervising and device
CN107678752B (en) Task processing method and device for heterogeneous cluster
CN103699599A (en) Message reliable processing guarantee method of real-time flow calculating frame based on Storm
JP2020535517A (en) Methods for job processing in quantum computing-enabled cloud environments, quantum cloud environments (QCE), and computer programs
CN103488775A (en) Computing system and computing method for big data processing
CN113641413B (en) Target model loading updating method and device, readable medium and electronic equipment
CN109951553B (en) Data processing method, system, electronic device and computer readable storage medium
Wang et al. Transformer: A new paradigm for building data-parallel programming models
CN111061259A (en) Incident driving method, system, device and storage medium for walking robot
US9280383B2 (en) Checkpointing for a hybrid computing node
CN103488517A (en) PHP code compiling method and device and PHP code running method and device
CN114489622A (en) Js application, electronic device, and storage medium
CN112633502B (en) Cross-platform execution method and device of deep learning model and electronic equipment
CN111240686B (en) Cloud compiling method and system, terminal device, cloud server and storage medium
CN114706622B (en) Method, device, equipment, medium and product for starting model service
CN115567526B (en) Data monitoring method, device, equipment and medium
US20160147621A1 (en) Mobile agent based memory replication
Hu et al. The AST3 controlling and operating software suite for automatic sky survey
CN113141407B (en) Page resource loading method and device and electronic equipment
CN111949862B (en) Method and device for managing business task flow and electronic equipment
CN112152947B (en) Processor, implementation method, electronic device and storage medium
US11163603B1 (en) Managing asynchronous operations in cloud computing environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200424