CN106383521B - Motion processing module based on robot development platform - Google Patents


Info

Publication number
CN106383521B
CN106383521B (application CN201610970635.3A)
Authority
CN
China
Prior art keywords
action
processing unit
data
instruction
steering engine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610970635.3A
Other languages
Chinese (zh)
Other versions
CN106383521A (en)
Inventor
陈辉
于赛赛
洪定安
何仁渊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Changdong Intelligent Technology Co ltd
Original Assignee
Hangzhou Changdong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Changdong Intelligent Technology Co ltd filed Critical Hangzhou Changdong Intelligent Technology Co ltd
Priority to CN201610970635.3A priority Critical patent/CN106383521B/en
Publication of CN106383521A publication Critical patent/CN106383521A/en
Application granted granted Critical
Publication of CN106383521B publication Critical patent/CN106383521B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0692Rate of change of altitude or depth specially adapted for under-water vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63HMARINE PROPULSION OR STEERING
    • B63H1/00Propulsive elements directly acting on water
    • B63H1/30Propulsive elements directly acting on water of non-rotary type
    • B63H1/36Propulsive elements directly acting on water of non-rotary type swinging sideways, e.g. fishtail type
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63HMARINE PROPULSION OR STEERING
    • B63H25/00Steering; Slowing-down otherwise than by use of propulsive elements; Dynamic anchoring, i.e. positioning vessels by means of main or auxiliary propulsive elements
    • B63H25/06Steering by rudders
    • B63H25/08Steering gear
    • B63H25/14Steering gear power assisted; power driven, i.e. using steering engine

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a motion processing module based on a robot development platform, comprising an action execution unit, a processing unit, and a sensing unit, with the action execution unit and the sensing unit each connected bidirectionally to the processing unit. The processing unit issues action instructions; the action execution unit executes them and reports their execution status back to the processing unit. The processing unit also issues query instructions; the sensing unit measures the parameters named in each query and feeds the readings back to the processing unit. The processing unit then judges whether an action instruction has been completed by combining the sensor readings with the reported execution status. The invention offers flexible, high-precision motion control and greatly improves the lifelikeness of the robot's actions.

Description

Motion processing module based on robot development platform
Technical Field
The invention relates to a robot development motion module, in particular to a motion processing module based on a robot development platform.
Background
2016, the year Google's AlphaGo astonished the world, is widely regarded as the inaugural year of artificial intelligence. Because robots are the physical carriers of AI technology, research institutes and companies worldwide have launched many robots, from humanoid and service robots to military robots; the robotics industry today is as vibrant as the mobile internet was in its heyday. The robots on the market are mainly industrial and service robots, generally customized for specific functions, such as welding robots and sweeping robots; further modification or upgrading of such robots is difficult and usually amounts to a redesign. General-purpose robot development platforms have a high barrier to entry, are largely monopolized, and are expensive, putting them out of reach for small enterprises and individuals. The small robot development platforms that do exist often provide only a few development boards or modules, suitable only for learning or building simple toys. A robot development platform that is easy to develop on and highly modular is therefore urgently needed. Within such a platform, motion processing is the core of robot movement, and how to progress from single repetitive motions to intelligent, highly lifelike motion remains a blank in the current research field.
Disclosure of Invention
The invention aims to overcome the problems in the prior art by providing a motion processing module based on a robot development platform.
To achieve this technical purpose and effect, the invention adopts the following technical scheme:
a motion processing module based on a robot development platform comprises an action execution unit, a processing unit and a sensing unit, wherein the action execution unit and the sensing unit are respectively connected with the processing unit in a bidirectional mode; the processing unit sends out an action instruction; the action execution unit executes the action instruction and feeds back the execution condition of the action instruction to the processing unit; the processing unit sends out a query instruction; the sensing unit detects parameters in the query instruction and feeds the detected parameters back to the processing unit; the processing unit judges the completion condition of the action instruction by combining the detection parameters and the execution condition of the action instruction.
Further, the processing unit includes a user instruction processing unit, and the user instruction processing unit is configured to respond to an instruction sent by the user side.
Further, the processing unit also includes a free-motion process, used to respond with random instructions when user-side instructions are idle.
Furthermore, the action execution unit comprises a direction steering engine and a power steering engine; the direction steering engine responds to a direction instruction sent by the processing unit; and the power steering engine responds to a forward command sent by the processing unit.
Furthermore, the action execution unit also comprises an auxiliary steering engine; the auxiliary steering engine comprises a direction auxiliary steering engine and a power auxiliary steering engine; the direction auxiliary steering engine responds to a direction auxiliary instruction sent by the processing unit; and the power auxiliary steering engine responds to a forward auxiliary command sent by the processing unit.
Further, the action execution unit further comprises an action customizing unit; the action customizing unit comprises a sinking and floating action unit and a reverse power action unit; the sinking and floating action unit responds to a floating or sinking instruction sent by the processing unit; the reverse power action unit responds to a backward command sent by the processing unit.
Furthermore, the sensing unit comprises a gyroscope sensor, an acceleration sensor, a distance sensor, an angle sensor, a humidity sensor, a temperature sensor, a water pressure sensor, a current sensor, and a Hall sensor. The gyroscope sensor senses the robot's position and attitude and feeds the information back to the processing unit; the acceleration sensor senses the robot's motion acceleration and feeds it back; the distance sensor, which comprises an infrared ranging sensor, an ultrasonic ranging sensor, and a sonar ranging sensor, senses the distance between the robot and the external environment and feeds it back; the angle sensor senses the steering engine's rotation angle and feeds it back; the humidity sensor senses the humidity inside the steering engine and feeds it back; the temperature sensor senses the temperature inside the steering engine and feeds it back; the water pressure sensor senses the water pressure in the sinking-and-floating action unit and feeds it back; the current sensor senses the phase current in the steering engine and feeds it back; and the Hall sensor senses the position of the motor rotor in the steering engine and feeds it back to the processing unit.
Further, the user instruction processing comprises the following steps: variable initialization; action-instruction parsing (action type judgment, action state judgment, action instruction sending, action instruction execution, action synchronization judgment, action timeout judgment, and action cycle judgment); robot state update; error judgment; and action instruction deletion.
Further, the free-motion processing comprises the following steps: variable initialization; delay setting; idle state judgment; user configuration judgment; scene mode setting; scene mode judgment (scene mode execution); free state mode judgment; obstacle avoidance judgment (obstacle avoidance execution); random number generation; execution probability judgment; and random action execution.
The invention provides a motion processing module based on a robot development platform, comprising an action execution unit, a processing unit, and a sensing unit, with the action execution unit and the sensing unit each connected bidirectionally to the processing unit. The processing unit issues action instructions; the action execution unit executes them and reports their execution status back to the processing unit. The processing unit also issues query instructions; the sensing unit measures the parameters named in each query and feeds the readings back to the processing unit. The processing unit then judges whether an action instruction has been completed by combining the sensor readings with the reported execution status. The invention offers flexible, high-precision motion control and greatly improves the lifelikeness of the robot's actions.
The foregoing description is only an overview of the technical solutions of the present invention. To make the technical solutions more clearly understood and implementable in accordance with the contents of the description, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a motion processing module framework based on a robot development platform according to the present invention;
FIG. 2 is a user instruction processing flow diagram of the present invention;
FIG. 3 is a free-motion processing flow diagram of the present invention;
FIG. 4 is an exploded view of a robot of the present invention;
FIG. 5 is a schematic diagram of the internal structure of a robot according to the present invention;
FIG. 6 is a schematic turning diagram of a robot of the present invention;
reference numbers in the figures: the robot comprises a robot 1, a head 2, a tail 3, a bionic fish fin 4, a framework 5, a direction steering engine 6, an auxiliary steering engine 7, a lower jaw 21, a fish fin framework 41 and a power steering engine 60.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Referring to figs. 1-6: a motion processing module based on a robot development platform, as shown in fig. 1, includes an action execution unit, a processing unit, and a sensing unit, with the action execution unit and the sensing unit each connected bidirectionally to the processing unit. The processing unit issues action instructions; the action execution unit executes them and reports their execution status back to the processing unit. The processing unit also issues query instructions; the sensing unit measures the parameters named in each query and feeds the readings back to the processing unit. The processing unit then judges whether an action instruction has been completed by combining the sensor readings with the reported execution status.
As shown in fig. 2, the processing unit includes a user instruction processing unit configured to respond to instructions issued by a user terminal. User instruction processing comprises the following steps: variable initialization; action-instruction parsing (action type judgment, action state judgment, action instruction sending, action instruction execution, action synchronization judgment, action timeout judgment, and action cycle judgment); robot state update; error judgment; and action instruction deletion. The parser first judges whether an action is a real-time test action and, according to its classification, assigns the appropriate pointer value to the action-structure pointer to be parsed. It then parses the action step by step, setting the steering engine angle and speed according to the action type, the timeout, the repetition count, the synchronization bit of each action state, the action control mode, and so on. After one round of action parsing it judges the action type, releases the semaphore to notify the processing unit that it may continue, updates the robot state, performs error judgment, and deletes the completed action instruction.
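As a sketch only, the dispatch loop above might look like the following; the queue layout, the field names, and the `execute_step` callback are hypothetical, since the patent names the steps of Fig. 2 but not their implementation:

```python
def process_user_instructions(queue, execute_step):
    """Drain the action-instruction queue, mirroring Fig. 2:
    variable initialization -> step-by-step action parsing ->
    execution with sync/timeout checks -> robot state update ->
    error judgment -> instruction deletion."""
    robot_state = {"last_action": None, "errors": 0}      # variable initialization
    while queue:
        action = queue.pop(0)                             # delete instruction once handled
        ok = True
        for _ in range(action.get("repeat", 1)):          # action cycle judgment
            for step in action["steps"]:                  # step-by-step parsing
                if not execute_step(step):                # execute; False = timeout/fault
                    ok = False                            # error judgment
                    break
            if not ok:
                break
        if ok:
            robot_state["last_action"] = action["name"]   # update robot state
        else:
            robot_state["errors"] += 1
    return robot_state
```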
As shown in fig. 3, the processing unit further includes a free-motion process, used to respond with random instructions when user-side instructions are idle. The free-motion flow is as follows. Scene-mode special cases have the highest priority: the sensing-unit data are read and judged comprehensively, and if a scene mode is matched, the action command configured for that scene mode is executed directly. If no scene mode matches, the sensing-unit values are checked first and an obstacle avoidance maneuver is performed if needed. If no avoidance is needed, the current robot state is judged and the corresponding action executed. If the robot is in the free state, a random number within the range of available action commands is generated, an execution probability is evaluated to decide whether to act, and once the trigger count is reached the command corresponding to the random number is executed.
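A minimal sketch of one pass of this priority scheme; the sensor keys, the 30 cm obstacle threshold, the 20% execution probability, and the pool of 8 random actions are illustrative assumptions, not values from the patent:

```python
import random

def free_motion_tick(sensors, scene_rules, rng=random.random):
    """One pass of the free-motion loop of Fig. 3: scene modes have the
    highest priority, then obstacle avoidance, then a probabilistic
    random action in the free state."""
    for matches, command in scene_rules:          # scene mode judgment
        if matches(sensors):
            return command                        # execute scene mode directly
    if sensors.get("distance_cm", 1000) < 30:     # obstacle avoidance judgment
        return "turn_away"                        # execute obstacle avoidance
    if rng() < 0.2:                               # execution probability judgment
        n = random.randrange(8)                   # random number within command range
        return f"random_action_{n}"               # execute random action
    return None                                   # remain idle this tick
```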
Fig. 4 is an exploded schematic view of the structure of the robot 1 in a specific embodiment. The robot 1 includes a head 2, a tail 3, bionic fish fins 4, and a skeleton 5. As shown in fig. 5, a direction steering engine 6 is installed at the front of the skeleton 5, and a power steering engine 60 is installed at both the middle and the tail of the skeleton 5. The direction steering engine responds to direction instructions issued by the processing unit, and the power steering engines respond to forward instructions issued by the processing unit. The skeleton 5 also carries 4 auxiliary steering engines 7, arranged symmetrically, front and rear, inside a pair of bionic fish fins 4; each auxiliary steering engine 7 is connected to a bionic fish fin 4. As shown in fig. 6, when the robot 1 turns, the auxiliary steering engine 7 on one side moves to drive the bionic fish fin 4 on that side: when the robot 1 turns right, the left auxiliary steering engine 7 drives the left bionic fish fin 4, assisting the right turn; turning left works on the same principle. The bionic fish fins 4 also serve as auxiliary propulsion: when the robot 1 is not turning, the auxiliary steering engines 7 on both sides drive the bionic fish fins 4 on both sides, moving either synchronously or asynchronously.
Preferably, the action execution unit further comprises an action customizing unit, which includes a sinking-and-floating action unit and a reverse power action unit. The sinking-and-floating action unit responds to floating or sinking instructions issued by the processing unit, and the reverse power action unit responds to backward instructions. When the sinking-and-floating action unit responds to a floating or sinking instruction, the auxiliary steering engines 7 on both sides drive the bionic fish fins 4 on both sides, moving either synchronously or asynchronously, thereby realizing both lifelike motion and assisted floating and sinking.
Preferably, the direction steering engine 6, the auxiliary steering engines 7, and the power steering engines 60 are all equipped with servo driving systems. The servo driving system includes motion communication, which comprises data sending and data receiving and is carried by action instruction data packets. As shown in table 1, the format of an action instruction data packet is: frame header + machine code + instruction type + data length + sub-command + data + check frame + frame tail.
TABLE 1
Frame header (0xfe 0xef) | Machine code | Instruction type | Data length | Sub-command | Data | CRC check | Frame tail (0xfd 0xdf)
Description of the action instruction packet format:
(1) Frame header: two bytes, 0xfe and 0xef.
(2) Machine code: different devices are assigned different codes to distinguish them.
(3) Instruction type: indicates the function the data packet is to perform; user-definable.
(4) Data length: the length of the data after removing the four bytes of the frame header and frame tail.
(5) Sub-command: a sub-command class under the instruction type, i.e. a secondary command.
(6) Data: the transmitted user-side data.
(7) Check frame: a CRC computed over the data from the frame header up to (but not including) the CRC value.
(8) Frame tail: two bytes, 0xfd and 0xdf.
Preferably, data sending forms a series of data packets in the action instruction data format according to the user side's sending request, transmits them wirelessly, then waits for an acknowledgment flag and starts a response timeout counter. If the timeout expires without an acknowledgment, or the received acknowledgment is wrong, the data are retransmitted; if a correct acknowledgment arrives before the timeout, sending ends.
On the receiving end, after the first byte arrives the receiver checks whether it is 0xfe; if so, it receives the second byte and checks that it is 0xef. With the frame header confirmed, it begins storing the subsequent valid data and records the received length. When a byte 0xfd is received followed by 0xdf, the end of the packet has been reached. The receiver then compares the length value in the packet with the recorded received length: if they differ, data were lost in transit and the packet is unusable. If the lengths match, the packet length is correct, and a check value is computed by CRC and compared with the CRC check value in the packet. If they match, the data are correct; if not, an error occurred during sending, the packet is unusable, and retransmission is requested. If the received data pass the check, a correct acknowledgment flag is returned; otherwise an error acknowledgment is returned and the sender is asked to retransmit, ensuring the data of every communication are correct.
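The framing and receive-side checks described above can be sketched as follows. The patent does not specify the CRC polynomial, the CRC byte order, or the widths of the machine code, instruction type, length, and sub-command fields, so CRC-16/MODBUS, big-endian CRC placement, and one byte per field are assumed here:

```python
FRAME_HEADER = b"\xfe\xef"
FRAME_TAIL = b"\xfd\xdf"

def crc16(data: bytes) -> int:
    """CRC-16/MODBUS (polynomial 0xA001, reflected); an assumed choice,
    since the patent does not name the CRC variant."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def build_packet(machine_code: int, instr_type: int, sub_command: int, data: bytes) -> bytes:
    # Data length = everything except the four header/tail bytes,
    # here assumed to include the length byte itself and the CRC.
    body = bytes([machine_code, instr_type, len(data) + 6, sub_command]) + data
    crc = crc16(FRAME_HEADER + body)              # CRC over header..before CRC value
    return FRAME_HEADER + body + crc.to_bytes(2, "big") + FRAME_TAIL

def parse_packet(packet: bytes):
    """Receive-side checks in the order the patent gives them: frame
    header, frame tail, recorded length versus the length field, then
    CRC. Returns None to request retransmission."""
    if len(packet) < 10 or packet[:2] != FRAME_HEADER or packet[-2:] != FRAME_TAIL:
        return None
    machine_code, instr_type, length, sub_command = packet[2:6]
    if length != len(packet) - 4:                 # bytes lost in transit
        return None
    if crc16(packet[:-4]) != int.from_bytes(packet[-4:-2], "big"):
        return None                               # corrupted in transit
    return machine_code, instr_type, sub_command, packet[6:-4]
```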
The instruction types are shown in Table 2.
TABLE 2
Instruction type value  Definition
0x01  User terminal 1 (mobile phone/tablet) issues a read command
0x02  User terminal 2 (action simulator) issues a read command
0x03  User terminal 3 (PC configuration software) issues a read command
0x04  User terminal 1 (mobile phone/tablet) issues a write command
0x05  User terminal 2 (action simulator) issues a write command
0x06  User terminal 3 (PC configuration software) issues a write command
0x07  User terminal 1 (mobile phone/tablet) issues a control command
0x11  The terminal feeds back the read command result to user terminal 1 (mobile phone/tablet)
0x12  The terminal feeds back the read command result to user terminal 2 (action simulator)
0x13  The terminal feeds back the read command result to user terminal 3 (PC configuration software)
0x14  The terminal feeds back the write command result to user terminal 1 (mobile phone/tablet)
0x15  The terminal feeds back the write command result to user terminal 2 (action simulator)
0x16  The terminal feeds back the write command result to user terminal 3 (PC configuration software)
0x17  The terminal feeds back control command result 1 to user terminal 1 (mobile phone/tablet)
0x18  The terminal feeds back control command result 2 to user terminal 1 (mobile phone/tablet)
0x19  The terminal feeds back control command result 3 to user terminal 1 (mobile phone/tablet)
The sub-command types are listed in table 3.
TABLE 3
(sub-command type definitions; the original table is an image whose contents are not recoverable from this text)
Preferably, a data structure struct { data1; data2; data3; data4; data5; data6; data7; data8; } is defined. Its 8 bytes of data are: action execution value A + action execution value B + action execution value C + priority bit + fault bit + instruction + action execution reserved bit A + action execution reserved bit B. The meaning of each byte differs across action execution units:
(1) Steering engine execution module: data1 is the steering engine angle; data2 its speed; data3 its current; data4 the steering engine module ID; data5 the command; data6 the steering engine fault information; data7 and data8 are reserved.
(2) Servo motor execution module: data1 is the servo motor angle; data2 its speed; data3 its current; data4 the servo motor module ID; data5 the command; data6 the servo motor fault information; data7 and data8 are reserved.
(3) Stepping motor module: data1 is the stepping motor angle; data2 its speed; data3 its current; data4 the stepping motor module ID; data5 the command; data6 the stepping motor fault information; data7 and data8 are reserved.
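A sketch of packing and unpacking this 8-byte structure for the steering engine module; the one-unsigned-byte-per-field encoding and the field names are assumptions, since the patent lists the field meanings but not their binary widths:

```python
import struct

# Field meanings for the steering engine module, per the list above.
FIELDS = ("angle", "speed", "current", "id", "command",
          "fault", "reserved_a", "reserved_b")

def pack_servo_data(angle, speed, current, module_id, command,
                    fault=0, reserved_a=0, reserved_b=0):
    # "8B" = eight unsigned bytes, matching the 8-byte structure.
    return struct.pack("8B", angle, speed, current, module_id,
                       command, fault, reserved_a, reserved_b)

def unpack_servo_data(raw: bytes) -> dict:
    return dict(zip(FIELDS, struct.unpack("8B", raw)))
```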
Preferably, the sensing unit comprises a gyroscope sensor, an acceleration sensor, a distance sensor, an angle sensor, a humidity sensor, a temperature sensor, a water pressure sensor, a current sensor, and a Hall sensor. The gyroscope sensor senses the robot's position and attitude and feeds the information back to the processing unit; the acceleration sensor senses the robot's motion acceleration and feeds it back; the distance sensor, which comprises an infrared ranging sensor, an ultrasonic ranging sensor, and a sonar ranging sensor, senses the distance between the robot and the external environment and feeds it back; the angle sensor senses the steering engine's rotation angle and feeds it back; the humidity sensor senses the humidity inside the steering engine and feeds it back; the temperature sensor senses the temperature inside the steering engine and feeds it back; the water pressure sensor senses the water pressure in the sinking-and-floating action unit and feeds it back; the current sensor senses the phase current in the steering engine and feeds it back; and the Hall sensor senses the position of the motor rotor in the steering engine and feeds it back to the processing unit. The sonar sensor is particularly suited to underwater ranging: sonar signals propagate through water with little attenuation, so the signal is very stable.
In this application, two sonar sensors are installed below the fish head at an included angle of about 20 degrees, detecting distance changes to the front-left and front-right respectively. While the robotic fish moves, the processing unit intermittently commands the sonar sensors to emit sonar signals and receives the return signals in real time. From the time difference between emission and reception and the propagation speed of sound in water, the distance between the robotic fish and an obstacle can be computed: the interval from emission to reception of the reflected signal is the round-trip time, and the product of this time and the propagation speed in water, divided by 2, is the distance. The processing unit judges the change in distance in real time and steers the robot 1 left or right in advance, achieving obstacle avoidance. The gyroscope sensor, acceleration sensor, and water pressure sensor, together with sonar ranging in two directions (ahead and below), allow the motion attitude and planned motion path of the robot 1 to be detected in real time. A flowmeter measures the water inflow and outflow of the water storage tank, a liquid level meter detects the robot's depth in the pool, and the water pressure sensor detects the amount of water stored in the tank. Several visible-light sensors are mounted on each side of the body of the robot 1; by comparing the light intensity on the two sides, the processing unit controls the heading, realizing phototactic or photophobic movement.
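The round-trip distance computation above reduces to a one-line helper; the 1500 m/s propagation speed is an approximate assumed constant, not a value given in the patent:

```python
# Approximate speed of sound in water; the real value varies with
# temperature, salinity, and depth.
SOUND_SPEED_WATER_M_S = 1500.0

def sonar_distance_m(round_trip_s: float, speed: float = SOUND_SPEED_WATER_M_S) -> float:
    """Obstacle distance from one sonar ping: the product of the
    round-trip time and the propagation speed, divided by 2."""
    return round_trip_s * speed / 2.0
```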
The robot body is fitted with laser receivers that receive specially modulated laser beams; by shining a laser pointer at different receivers, a user can trigger the robot to perform corresponding actions. Some moving parts of the robot 1 might pinch or injure people and other animals, so pressure sensors and microswitches are added at those locations to prevent accidents. The current sensor detects the phase current of the brushless motor in the steering engine; the Hall sensor detects the position of the brushless motor's rotor in the steering engine; the temperature sensor detects the temperature of the motor module; the humidity sensor detects whether water has entered the motor in the steering engine; and the magnetic angle sensor detects the rotation angle of the motor in the steering engine.
The invention provides a motion processing module based on a robot development platform, comprising an action execution unit, a processing unit, and a sensing unit, with the action execution unit and the sensing unit each connected bidirectionally to the processing unit. The processing unit issues action instructions; the action execution unit executes them and reports their execution status back to the processing unit. The processing unit also issues query instructions; the sensing unit measures the parameters named in each query and feeds the readings back to the processing unit. The processing unit then judges whether an action instruction has been completed by combining the sensor readings with the reported execution status. The invention offers flexible, high-precision motion control and greatly improves the lifelikeness of the robot's actions.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention in any way. Those of ordinary skill in the art may readily implement the invention as illustrated in the accompanying drawings and described above. However, those skilled in the art should appreciate that they can use the disclosed conception and specific embodiments as a basis for designing or modifying other structures serving the same purposes without departing from the scope of the invention as defined by the appended claims. Likewise, any changes, modifications, and equivalent evolutions of the above embodiments according to the essential techniques of the invention still fall within the protection scope of the technical solution of the invention.

Claims (5)

1. A motion processing module based on a robot development platform, comprising an action execution unit, a processing unit, and a sensing unit, characterized in that: the action execution unit and the sensing unit are each bidirectionally connected to the processing unit; the processing unit issues an action instruction; the action execution unit executes the action instruction and feeds back its execution status to the processing unit; the processing unit issues a query instruction; the sensing unit measures the parameters named in the query instruction and feeds the measured values back to the processing unit; the processing unit judges the completion of the action instruction by combining the measured parameters with the reported execution status; the processing unit further comprises free motion processing, which responds with random instructions while the user-side instruction channel is idle; the free motion processing comprises the following steps: initializing variables, setting a delay, judging the idle state, judging the user configuration, setting a contextual model, judging the contextual model, judging the free-state model, avoiding obstacles, generating random numbers, judging the execution probability, and executing random actions; the free motion processing flow is as follows: special situations of the contextual models have the highest priority, so the data of the sensing units are read first and judged comprehensively; if the data match a contextual model, the action command configured for that contextual model is executed directly; if no contextual model matches, the sensing values are first checked and an obstacle avoidance operation is carried out if needed; if no obstacle avoidance is needed, the current state of the robot is judged and the corresponding action is executed; if the current state is the free state, a random number within the range of action command numbers is generated, the execution probability of the random number is evaluated to decide whether the action should be executed, and once the required number of occurrences is reached, the command corresponding to the random number is executed;
the action execution unit comprises a direction steering engine and a power steering engine; the direction steering engine responds to a direction instruction sent by the processing unit; the power steering engine responds to a forward instruction sent by the processing unit; the action execution unit further comprises an auxiliary steering engine; the auxiliary steering engine comprises a direction auxiliary steering engine and a power auxiliary steering engine; the direction auxiliary steering engine responds to a direction auxiliary instruction sent by the processing unit; the power auxiliary steering engine responds to a forward auxiliary instruction sent by the processing unit; the direction steering engine, the auxiliary steering engine, and the power steering engine are all provided with servo driving systems; the servo driving systems comprise motion communication, which consists of data sending and data receiving and is carried by motion instruction data packets; the format of a motion instruction data packet is: frame header, machine code, instruction type, data length, sub-command, data, check frame, and frame tail; the frame header is two bytes, 0xfe and 0xef, and the frame tail is two bytes, 0xfd and 0xdf; for data sending, a data packet is assembled from the user side's transmission request according to the action instruction data format and transmitted wirelessly, after which the sender waits for a response mark and starts a response timeout counter; if no response mark is received, or the received response mark is wrong, by the time the timeout counter expires, the data is retransmitted; if a correct response mark is received before the timeout, the transmission ends; at the data receiving end, after the first byte is received it is checked against 0xfe; if it matches, the second byte is received and checked against 0xef; once the frame header is verified, the subsequently received valid data is stored and its length recorded; when a byte 0xfd is received followed by 0xdf, the frame tail has arrived and the data packet is complete; the length value carried in the data packet is then compared with the recorded received length; if they are unequal, data was lost during sending and the packet is erroneous and unusable; if they are equal, the packet length is correct, and a check value is computed by CRC and compared with the CRC check value carried in the packet; if the two check values are equal, the data is correct; if they are unequal, the data was corrupted during sending for some other reason, the packet is unusable, and a resend is requested; if the received data passes the check, a correct response mark is returned; otherwise an error response mark is returned and the sending end is requested to resend the data, ensuring that the data of every communication is correct.
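The packet format and validation path described in claim 1 can be sketched as below. This is a non-authoritative illustration: the claim does not specify which CRC variant is used, so CRC-8 with polynomial 0x07 is assumed here, and no byte-stuffing is applied, so a payload containing the 0xfd 0xdf sequence would need escaping in a real implementation.

```python
# Sketch of the motion-instruction packet from claim 1:
# frame header (0xfe 0xef) | machine code | instruction type | data length |
# sub-command | data | check frame | frame tail (0xfd 0xdf).
# CRC-8 (poly 0x07) is an assumption; the patent names only "CRC".

def crc8(data: bytes) -> int:
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def encode_packet(machine_id: int, instr_type: int,
                  subcmd: int, payload: bytes) -> bytes:
    body = bytes([machine_id, instr_type, len(payload), subcmd]) + payload
    return b"\xfe\xef" + body + bytes([crc8(body)]) + b"\xfd\xdf"

def decode_packet(frame: bytes):
    # Raising ValueError models the "return an error response mark and
    # request a resend" path from the claim.
    if frame[:2] != b"\xfe\xef" or frame[-2:] != b"\xfd\xdf":
        raise ValueError("bad frame header or tail")
    body, check = frame[2:-3], frame[-3]
    machine_id, instr_type, length, subcmd = body[0], body[1], body[2], body[3]
    payload = body[4:]
    if len(payload) != length:           # length mismatch: data lost in transit
        raise ValueError("length mismatch")
    if crc8(body) != check:              # CRC mismatch: data corrupted
        raise ValueError("CRC mismatch, request retransmission")
    return machine_id, instr_type, subcmd, payload

pkt = encode_packet(0x01, 0x10, 0x02, b"\x5a")
print(decode_packet(pkt))  # (1, 16, 2, b'\x5a')
```

Checking the recorded length before the CRC mirrors the receive procedure in the claim: a length mismatch proves loss cheaply, while the CRC catches corruption that preserves the length.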
2. The robot-development-platform-based motion processing module of claim 1, wherein: the processing unit comprises a user instruction processing unit for responding to instructions sent by the user side.
3. The robot-development-platform-based motion processing module of claim 1, wherein: the action execution unit further comprises an action customizing unit; the action customizing unit comprises a sinking-and-floating action unit and a reverse power action unit; the sinking-and-floating action unit responds to a floating or sinking instruction sent by the processing unit; the reverse power action unit responds to a backward instruction sent by the processing unit.
4. The robot-development-platform-based motion processing module of claim 1, wherein: the sensing unit comprises a gyroscope sensor, an acceleration sensor, a distance sensor, an angle sensor, a humidity sensor, a temperature sensor, a water pressure sensor, a current sensor, and a Hall sensor; the gyroscope sensor senses the robot's position and attitude information and feeds it back to the processing unit; the acceleration sensor senses the robot's motion acceleration information and feeds it back to the processing unit; the distance sensor comprises an infrared ranging sensor, an ultrasonic ranging sensor, and a sonar ranging sensor, and senses the distance between the robot and the external environment and feeds it back to the processing unit; the angle sensor senses the rotation angle of the steering engine and feeds it back to the processing unit; the humidity sensor senses the humidity inside the steering engine and feeds it back to the processing unit; the temperature sensor senses the temperature inside the steering engine and feeds it back to the processing unit; the water pressure sensor senses the water pressure in the sinking-and-floating action unit and feeds it back to the processing unit; the current sensor senses the phase current in the steering engine and feeds it back to the processing unit; and the Hall sensor senses the position of the motor rotor in the steering engine and feeds it back to the processing unit.
5. The robot-development-platform-based motion processing module of claim 2, wherein the user instruction processing comprises the following steps: initializing variables, parsing the action instruction, updating the robot state, judging errors, and deleting the action instruction; parsing the action instruction comprises judging the action type, judging the action state, sending the action instruction, executing the action instruction, judging action synchronization, judging action timeout, and judging action looping.
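The user-instruction processing steps of claim 5 can be sketched as a queue-draining loop. This is a hypothetical illustration: the instruction dictionary layout, the action-type names, and the timeout value are all assumptions, not part of the patent.

```python
# Hypothetical sketch of the claim-5 steps: initialize, parse each action
# instruction (type / timeout / loop judgments), update the robot state,
# judge errors, and delete the finished instruction.
import time

def process_user_instructions(queue, robot_state, timeout_s=1.0):
    errors = []                                      # initialize variables
    while queue:
        instr = queue[0]
        if instr.get("type") not in ("move", "turn", "custom"):
            errors.append(("unknown type", instr))   # error judgment
            queue.pop(0)                             # delete the instruction
            continue
        deadline = time.monotonic() + timeout_s
        done = False
        while time.monotonic() < deadline:           # action timeout judgment
            done = instr["execute"]()                # send/execute the action
            if done:
                break
        if done:
            robot_state[instr["type"]] = "done"      # update the robot state
            if instr.get("loop", 0) > 1:             # action loop judgment
                instr["loop"] -= 1
                continue                             # re-run the same instruction
        else:
            errors.append(("timeout", instr))
        queue.pop(0)                                 # delete the instruction
    return robot_state, errors

state, errs = process_user_instructions(
    [{"type": "turn", "execute": lambda: True}], {})
print(state, errs)  # {'turn': 'done'} []
```

Deleting the instruction only after the state update and error judgment keeps the queue head valid while the instruction is still being looped or retried.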
CN201610970635.3A 2016-11-05 2016-11-05 Motion processing module based on robot development platform Active CN106383521B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610970635.3A CN106383521B (en) 2016-11-05 2016-11-05 Motion processing module based on robot development platform


Publications (2)

Publication Number Publication Date
CN106383521A CN106383521A (en) 2017-02-08
CN106383521B true CN106383521B (en) 2021-05-18

Family

ID=57958876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610970635.3A Active CN106383521B (en) 2016-11-05 2016-11-05 Motion processing module based on robot development platform

Country Status (1)

Country Link
CN (1) CN106383521B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107263526A (en) * 2017-06-05 2017-10-20 上海交通大学 A kind of multifunctional light quantification service joint of robot module
WO2019018963A1 (en) * 2017-07-22 2019-01-31 深圳市萨斯智能科技有限公司 Robot movement speed control method, and robot
CN108052109A (en) * 2018-02-08 2018-05-18 广东环境保护工程职业学院 A kind of automatic inspection robot and aquatic monitoring system
WO2020156684A1 (en) * 2019-02-01 2020-08-06 Positec Power Tools (Suzhou) Co., Ltd. Self moving device and magnetic boundary system
CN111405250A (en) * 2020-03-24 2020-07-10 北京海益同展信息科技有限公司 Video stream data processing method, device, robot and storage medium
CN114326495B (en) * 2021-12-24 2024-02-09 中电海康集团有限公司 Robot control system architecture and voice instruction processing method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005059656A (en) * 2003-08-08 2005-03-10 Fuji Heavy Ind Ltd Landing control device and method for flying object
CN100584695C (en) * 2007-04-30 2010-01-27 哈尔滨工程大学 Bionic underwater chelonian robot
CN102426455B (en) * 2011-12-31 2013-10-30 浙江中控研究院有限公司 Solar mirror surface cleaning robot system
CN102745320B (en) * 2012-07-26 2015-03-11 中国科学院自动化研究所 Backward swimming control method of biomimetic carangiform robot fish
CN203259877U (en) * 2013-05-09 2013-10-30 兰州商学院 Baby carriage intelligence control system
CN104849717A (en) * 2015-04-03 2015-08-19 吴李海 Automatic fish-finding system


Similar Documents

Publication Publication Date Title
CN106383521B (en) Motion processing module based on robot development platform
CN104865965B (en) The avoidance obstacle method and system that robot depth camera is combined with ultrasonic wave
CN106406328B (en) Motion control method based on robot development platform
EP3104284A1 (en) Automatic labeling and learning of driver yield intention
JP7185811B2 (en) Method for processing obstacle detection result by ultrasonic array, computer device, storage medium, program and system
CN109590986B (en) Robot teaching method, intelligent robot and storage medium
CN113561963B (en) Parking method and device and vehicle
CN112034735B (en) Simulation experiment platform for multi-AUV underwater cooperative operation
CN106364650B (en) Bionic machine fish
CN204423154U (en) A kind of automatic charging toy robot based on independent navigation
CN112631314B (en) Robot control method and system based on multi-line laser radar and event camera SLAM
CN113524265B (en) Robot anti-falling method, robot and readable storage medium
CN208953961U (en) A kind of sliceable intelligent carriage based on graphic programming
CN111429515A (en) Learning method of robot obstacle avoidance behavior based on deep learning
Lee et al. Fast perception, planning, and execution for a robotic butler: Wheeled humanoid m-hubo
CN110825106B (en) Obstacle avoidance method of aircraft, flight system and storage medium
CN106547558A (en) It is a kind of to be based on modularization robot platform development system
CN106648614B (en) Robot development system architecture based on modular platform and main control unit thereof
CN115619869B (en) Positioning method and device of automatic guiding transport vehicle and automatic guiding transport vehicle
CN110480636A (en) A kind of mechanical arm control system based on 3D vision
CN103317513A (en) Networked robot control system based on CPUs
CN115812646B (en) Fish behavior analysis method in fishway
CN103472838A (en) Fast sprint controller of four-wheel micro-mouse based on double processors
CN112947426A (en) Cleaning robot motion control system and method based on multi-sensing fusion
CN208969510U (en) A kind of submersible six-freedom motion real-time measurement apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant