CN106648614B - Robot development system architecture based on modular platform and main control unit thereof - Google Patents

Robot development system architecture based on modular platform and main control unit thereof

Info

Publication number
CN106648614B
CN106648614B CN201610971036.3A
Authority
CN
China
Prior art keywords
unit
interaction
control unit
module
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610971036.3A
Other languages
Chinese (zh)
Other versions
CN106648614A (en)
Inventor
陈辉
于赛赛
洪定安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Changdong Intelligent Technology Co ltd
Original Assignee
Hangzhou Changdong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Changdong Intelligent Technology Co ltd filed Critical Hangzhou Changdong Intelligent Technology Co ltd
Priority to CN201610971036.3A priority Critical patent/CN106648614B/en
Publication of CN106648614A publication Critical patent/CN106648614A/en
Application granted
Publication of CN106648614B publication Critical patent/CN106648614B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/20Software design
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/38Information transfer, e.g. on bus
    • G06F13/42Bus transfer protocol, e.g. handshake; Synchronisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot development system architecture based on a modular platform, comprising a sensing unit, an action execution unit and an interaction unit, all controlled by a main control unit. The main control unit comprises an initialization module, an instruction interaction module, an information interaction module and an action execution module: the initialization module initializes the parameters of the action execution units and the system variables in the main control unit; the instruction interaction module receives instructions from the interaction unit and compiles them into action instructions; the information interaction module acquires data from the sensing units; and the action execution module executes the action instructions. The architecture has a well-conceived logical structure and a high degree of modularity; it adopts a consistent communication standard that unifies the communication protocols of all sensing units, action execution units and the main control unit, which facilitates robot research and development on top of the platform.

Description

Robot development system architecture based on modular platform and main control unit thereof
Technical Field
The invention relates to robot development system architectures, and in particular to a robot development system architecture based on a modular platform and a main control unit thereof.
Background
In 2016, Google's AlphaGo stunned the world, and the year came to be seen as the dawn of the artificial intelligence era. Robots are the carriers of artificial intelligence technology, so research institutes and companies all over the world have been launching robots of many kinds, from humanoid robots and service robots to military armed robots; the robotics industry today is as vigorous as the mobile internet was in its boom years. The robots on the market are mainly industrial robots and service robots, generally customized for specific functions, such as welding robots or sweeping robots; modifying or upgrading such robots is difficult and usually amounts to a complete redesign. General-purpose robot development platforms have a high barrier to entry, are largely monopolized by a few big players, and are expensive, putting them out of reach of small enterprises and individuals. The small robot development platforms that do exist often provide only a few development boards or modules, suitable only for learning or building simple toys. A robot development platform that is easy and convenient to develop on and highly modular is therefore urgently needed.
Disclosure of Invention
The invention aims to overcome the problems in the prior art by providing a robot development system architecture based on a modular platform.
To achieve this technical purpose and effect, the invention is realized through the following technical scheme:
the robot development system architecture based on the modular platform comprises a main control unit, a sensing unit, an action execution unit and an interaction unit; the sensing unit, the action execution unit and the interaction unit are controlled by the main control unit; a user inputs an instruction to the main control unit through the interaction unit, the sensing unit senses an external signal and transmits the external signal to the main control unit, the main control unit processes and generates an instruction for controlling the action execution unit, and the action execution unit executes the instruction; the sensing unit, the action execution unit, the interaction unit and the main control unit adopt interactive communication, and communication protocols of the interactive communication are kept consistent.
Furthermore, the interaction unit comprises PC-side interaction, mobile-side interaction and action simulator interaction; the PC-side and mobile-side interactions are connected to a cloud server; transmission between the interaction unit and the main control unit uses wireless transparent transmission; the mobile-side interaction comprises voice interaction, an intelligent module, action recognition interaction, facial expression recognition interaction and action simulation interaction.
Further, the modular platform-based robot development system architecture also comprises a special action execution unit; feedback units are arranged in the special action execution unit, the sensing unit, the action execution unit and the interaction unit; the feedback units are connected with the main control unit and provide feedback information to it.
The robot development system main control unit based on the modular platform comprises an initialization module, an instruction interaction module, an information interaction module and an action execution module; the initialization module initializes the parameters of the action execution units and the system variables in the main control unit; the instruction interaction module receives instructions from the interaction unit and compiles them into action instructions; the information interaction module acquires data from the sensing units; the action execution module executes the action instructions; the initialization module has the highest priority in the main control unit.
Further, the initialization module comprises the following steps: hardware initialization, equipment initialization, system global variable initialization, and module task creation.
Further, the initialization module further comprises the following steps: main power supply starting, version information output, user configuration loading, clock interrupt starting, system multitask switching starting, and early-warning/alarm loop.
Further, the instruction interaction module comprises the following steps: waiting for instructions, interrupt detection, selecting instruction types.
Further, the instruction interaction module further comprises the following steps: setting and initializing variables, checking instructions and sending feedback messages.
Further, the information interaction module comprises the following steps: setting the sampling frequency, sampling by the sensing unit, and querying action execution unit information.
Further, the information interaction module further comprises the following steps: initializing variables in a task, filtering, judging a threshold value, counting cycle times, acquiring equipment signals, and judging whether information is wrong.
Further, the action execution module comprises the following steps: judging the action type, judging the action state, sending an action command and executing the action command.
Further, the action execution module further comprises the following steps: variable initialization, action synchronization judgment, action overtime judgment, action circulation judgment, robot state updating, error judgment and action instruction deletion.
Furthermore, the robot development system main control unit based on the modular platform also comprises a free state module and a debugging test module; the free state module controls the robot when the system operates without an interaction unit; the debugging test module is used for debugging and maintenance.
Further, the free state module comprises the following steps: idle state judgment, contextual model setting, contextual model judgment (executing the contextual model), free state model judgment, random number generation with execution probability judgment, and random action execution.
Further, the free state module further comprises the following steps: variable initialization, delay setting, user configuration judgment and obstacle avoidance judgment (executing obstacle avoidance).
The invention provides a modular platform-based robot development system architecture, which comprises a sensing unit, an action execution unit and an interaction unit, wherein the sensing unit, the action execution unit and the interaction unit are controlled by a main control unit; the initialization module initializes the parameters of the action execution unit and the parameters of system variables in the main control unit; the instruction interaction module receives an interaction unit instruction and compiles the interaction unit instruction into an action instruction; the information interaction module acquires data of the sensing unit; and the action execution module executes the action instruction. The robot control system has the advantages of ingenious logic structure and high modularization degree, adopts consistent communication standards, and unifies the communication protocols of all the sensing units, the action execution units and the main control unit, thereby facilitating the research and development of the robot based on the platform architecture.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings. The detailed description of the present invention is given in detail by the following examples and the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a modular platform based robot development system architecture diagram of the present invention;
FIG. 2 is a block diagram of the master control unit organization of the present invention;
FIG. 3 is a flow diagram of an initialization module of the present invention;
FIG. 4 is a schematic flow diagram of an instruction interaction module of the present invention;
FIG. 5 is a flow diagram of an information interaction module of the present invention;
FIG. 6 is a flow diagram of an action execution module of the present invention;
FIG. 7 is a free state block flow diagram of the present invention;
FIG. 8 is a schematic flow diagram of a debug test module of the present invention;
FIG. 9 is a schematic diagram of a user-side communication data transmission flow of the present invention;
FIG. 10 is a schematic diagram of a user-side communication data reception process of the present invention;
FIG. 11 is a schematic diagram of a sensing unit communication structure of the present invention;
FIG. 12 is a diagram of the communication structure of the action execution unit of the present invention;
FIG. 13 is a schematic diagram of a sensing unit communication flow of the present invention;
FIG. 14 is a communication flow diagram of the action execution unit of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Referring to fig. 1 to 14, a modular platform based robot development system architecture, as shown in fig. 1, includes a main control unit, a sensing unit, an action execution unit, an interaction unit, and a special action execution unit, where the sensing unit, the action execution unit, the interaction unit, and the special action execution unit are controlled by the main control unit; the special action execution unit comprises an underwater sinking and floating module and an underwater obstacle avoidance module. As shown in fig. 2, the main control unit includes an initialization module, an instruction interaction module, an information interaction module, and an action execution module; the initialization module initializes the parameters of the action execution unit and the parameters of system variables in the main control unit; the instruction interaction module receives an interaction unit instruction and compiles the interaction unit instruction into an action instruction; the information interaction module acquires data of the sensing unit; the action execution module executes the action instruction; the initialization module has the highest priority in the main control unit.
Preferably, feedback units are arranged in the sensing unit, the action execution unit and the interaction unit; the feedback units are connected with the main control unit and provide feedback information to it. Based on the feedback information, the main control unit judges whether to execute a user instruction, whether an instruction has finished, whether an instruction was transmitted completely, whether an instruction can be executed smoothly, whether the robot needs to avoid an obstacle, whether the robot should enter the free state, and so on.
With reference to fig. 3, the initialization module flow is as follows: initialize the system; initialize all hardware before system interrupts are enabled; start the interrupt (1 ms); enter system global variable initialization; read the user configuration information from the main control unit's flash and check for errors; create the other module tasks; then enter an endless loop that checks for errors and, if an error occurs, sets the alarm flashing frequency.
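For concreteness, this startup order can be sketched in C as below. It is only a minimal illustration of the flow in fig. 3: every function name (hardware_init, flash_read_user_config and so on) is a hypothetical placeholder, not an identifier from the patent.

```c
#include <stdint.h>
#include <stdbool.h>

/* Illustrative sketch of the initialization flow; all functions
 * below are assumed placeholders, not the patent's actual code. */
extern void hardware_init(void);          /* peripherals set up before interrupts start */
extern void start_tick_interrupt_1ms(void);
extern void globals_init(void);           /* system global variables */
extern bool flash_read_user_config(void); /* returns false on a read/check error */
extern void create_module_tasks(void);    /* instruction/information/action/free-state tasks */
extern bool system_error_pending(void);
extern void set_alarm_blink_hz(uint8_t hz);

void master_init_task(void)
{
    hardware_init();                 /* hardware init while interrupts are still off */
    start_tick_interrupt_1ms();      /* enable the 1 ms system tick */
    globals_init();

    if (!flash_read_user_config())   /* user configuration from main-control flash */
        set_alarm_blink_hz(5);       /* flag a configuration error */

    create_module_tasks();

    for (;;) {                       /* endless error-watch loop */
        if (system_error_pending())
            set_alarm_blink_hz(2);   /* error: set the alarm flashing frequency */
    }
}
```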
With reference to fig. 4, the instruction interaction module flow is as follows: the module task is a blocking task that mainly receives instructions (communication commands and data) from the interaction unit. It waits for messages delivered by interrupt; if the interrupt receives a matching frame header and frame tail, and the robot state and received length are judged to be normal, the instruction is sent. After the action execution unit receives the instruction, it first performs a CRC check; once the check passes, it parses the instruction type according to the structure-parsing command, executes the parsed instruction, and finally returns a message.
With reference to fig. 5, the information interaction module flow is as follows: initialize variables in the task; set the delay, which determines the sampling frequency; sample and filter the sensing units; count loop iterations; judge thresholds; query action execution unit information; acquire device signals; and judge whether the information is erroneous. The information interaction module collects data from the robot's sensing units at high frequency, processes and analyses it, and performs the corresponding actions or updates the robot's output data packet as required. Sensing unit types generally include voltage/current sensing units, depth sensing units, Hall sensing units, sonar sensing units, infrared distance sensing units and gyroscopes. Preferably, the action execution units further include steering engines, water pumps and air pumps. Angular velocity and gyroscope signals are collected through an IIC interface, mainly for balance and stability control; voltage, current and depth signals are collected through an A/D interface for undervoltage alarm, overcurrent protection, sink/float control judgment and the like; sonar ranging data and the angle and speed of each action execution unit are collected over the bus, used respectively for obstacle avoidance, communication with the main control unit, and judging each action execution unit's execution status.
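A minimal sketch of such a sampling task is given below, assuming an RTOS-style delay call and simple helper drivers; all names, channels and threshold values are illustrative assumptions rather than the patent's actual code.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical helpers for the information-interaction loop. */
extern void     task_delay_ms(uint32_t ms);
extern uint16_t adc_read(int channel);        /* voltage/current/depth via A/D */
extern bool     bus_query_actuator(uint8_t id, uint8_t *angle, uint8_t *speed);
extern void     raise_alarm(int code);

#define CH_VOLTAGE          0
#define UNDERVOLT_THRESHOLD 650u   /* invented raw-ADC threshold for illustration */

void info_interaction_task(void)
{
    uint32_t cycles = 0;           /* loop-iteration counter */
    uint16_t v_filt = 0;
    for (;;) {
        task_delay_ms(10);         /* the delay sets the sampling frequency (100 Hz here) */

        uint16_t v = adc_read(CH_VOLTAGE);
        /* first-order low-pass filter as a stand-in for "filtering" */
        v_filt = (uint16_t)((3u * v_filt + v) / 4u);

        if (v_filt < UNDERVOLT_THRESHOLD)      /* threshold judgment */
            raise_alarm(1);                    /* undervoltage alarm */

        if (++cycles % 10 == 0) {              /* every 10th cycle, poll an actuator */
            uint8_t angle, speed;
            if (!bus_query_actuator(1, &angle, &speed))
                raise_alarm(2);                /* "information erroneous" judgment */
        }
    }
}
```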
With reference to fig. 6, the action execution module flow is as follows: initialize variables; parse the action instruction (action type judgment, action state judgment, action instruction sending, action instruction execution, action synchronization judgment, action timeout judgment, action loop judgment) to judge whether it is a real-time test action, and assign different pointer values to the action-structure pointer to be parsed according to the classification. The action is then parsed step by step: the steering engine angle and speed are set according to the action type, timeout, action repetition count, the synchronization bit of each action state, the action control mode and so on. After one round of action parsing has been executed, the action type is judged, the semaphore is released to notify the main control unit that operation may continue, the robot state is updated, and erroneous action commands are judged and deleted.
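One pass of this per-action loop might look roughly as follows in C; the action_step_t layout and every helper function are assumptions made for illustration, since the patent does not publish source code.

```c
#include <stdint.h>
#include <stdbool.h>

/* Assumed layout of one parsed action step. */
typedef struct {
    uint8_t  type;        /* action type (e.g. real-time test vs. stored) */
    uint8_t  angle;       /* steering engine target angle */
    uint8_t  speed;       /* steering engine speed */
    uint8_t  sync_bit;    /* synchronization bit for this step */
    uint16_t timeout_ms;  /* action timeout */
    uint8_t  repeats;     /* action repetition count */
} action_step_t;

extern bool servo_set(uint8_t angle, uint8_t speed);
extern bool wait_done_or_timeout(uint16_t ms);
extern void semaphore_release(void);      /* tell the main control unit to continue */
extern void robot_state_update(void);
extern void action_cmd_delete(const action_step_t *s);

void execute_action(const action_step_t *s)
{
    for (uint16_t i = 0; i <= s->repeats; i++) {    /* action loop judgment */
        servo_set(s->angle, s->speed);
        if (!wait_done_or_timeout(s->timeout_ms)) { /* action timeout judgment */
            action_cmd_delete(s);                   /* error: delete the instruction */
            return;
        }
    }
    semaphore_release();       /* main control unit may continue */
    robot_state_update();      /* update the robot state */
}
```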
Preferably, as shown in fig. 1, the interaction unit includes PC-side interaction, mobile-side interaction and action simulator interaction; the PC-side and mobile-side interactions are connected to a cloud server; transmission between the interaction unit and the main control unit uses wireless transparent transmission. Mobile-side interaction includes voice interaction, an intelligent module, action recognition interaction, facial expression recognition interaction and action simulation interaction. The interaction unit connects to the main control unit through WiFi wireless transparent transmission; it can set the angle, speed and other parameters of every action execution unit of the robot, and read the state of every action execution unit and sensing unit. After testing and compilation, action combinations following certain rules can realize any desired action; these combinations can be stored as command codes in the main control unit's flash, and can also be uploaded to the cloud server to be shared or sold. When the robot needs to execute a command, the corresponding command sequence is read from the main control unit's flash and automatically parsed by the hardware system into corresponding commands transmitted to each action execution unit. The information interaction module can query the robot's states in real time and raise a timely alarm when a fault occurs; if an action execution unit is damaged, it can simply be replaced and a few parameters set, after which operation continues, so maintenance is simple and convenient. The whole system architecture thus realizes a truly modular, intelligent design with very strong expandability. The action simulator connects to the main control unit through a wireless module; it can directly drive each action execution unit through various basic actions and, combined with the interaction unit, lets a developer debug a desired combined action more quickly.
As shown in fig. 2, the main control unit further includes a free state module and a debugging test module; the free state module controls the robot when no interaction unit is operating; the debugging test module is used for debugging and maintenance.
With reference to fig. 7, the free state module flow is as follows: this module handles contextual models, obstacle avoidance and the generation of free action commands. In terms of priority, the special case of a contextual model ranks highest: the sensing unit data is examined and comprehensively judged, and if a contextual model is matched, the action command configured for that model is executed directly. When no contextual model is matched, the sensing unit values are judged first for obstacle avoidance; if no obstacle avoidance is needed, the current robot state is judged and the corresponding action is executed. If the robot is in the free state, a random number within the range of the action command count is generated, an execution probability is calculated from the random number to decide whether to act, and once the required number of occurrences is reached, the command corresponding to the random number is executed.
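The decision order just described (contextual model first, then obstacle avoidance, then a probability-gated random action) can be sketched as below; N_ACTIONS, EXEC_PROB_PERCENT and REQUIRED_HITS are invented example values, and the helper functions are assumptions.

```c
#include <stdlib.h>
#include <stdbool.h>

#define N_ACTIONS         16   /* assumed size of the free-action command table */
#define EXEC_PROB_PERCENT 30   /* assumed execution probability */
#define REQUIRED_HITS     3    /* assumed "occurrence count" before acting */

extern bool scene_mode_matched(int *scene_cmd);
extern bool obstacle_avoid_needed(void);
extern void run_obstacle_avoidance(void);
extern void run_command(int cmd);

void free_state_step(void)
{
    int scene_cmd;
    if (scene_mode_matched(&scene_cmd)) {      /* contextual model: highest priority */
        run_command(scene_cmd);
        return;
    }
    if (obstacle_avoid_needed()) {             /* then obstacle avoidance */
        run_obstacle_avoidance();
        return;
    }
    static int hits = 0;
    int cmd = rand() % N_ACTIONS;              /* random number in the command range */
    if (rand() % 100 < EXEC_PROB_PERCENT) {    /* probability gate on execution */
        if (++hits >= REQUIRED_HITS) {         /* act once the required count is reached */
            hits = 0;
            run_command(cmd);                  /* execute the chosen free action */
        }
    }
}
```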
Referring to fig. 8, the debugging and testing module uses an oscilloscope: by setting a delay and observing the oscilloscope output, it determines whether each unit of each robot module works normally.
It should be understood that units denote physical hardware entities, while modules denote software logic processes.
Preferably, the sensing unit, the action execution unit, the interaction unit and the main control unit communicate interactively, with consistent communication protocols. The interactive communication comprises user-side communication, sensing unit communication and action execution unit communication. As shown in fig. 1, user-side communication carries instructions from the interaction unit to the main control unit, and instruction feedback is returned to the interaction unit over user-side communication after processing by the main control unit; sensing unit communication carries instructions processed by the main control unit to the sensing units, which return sensing feedback to the main control unit over sensing unit communication after sensing; action execution unit communication carries instructions processed by the main control unit to the action execution units, which return action execution feedback to the main control unit over action execution unit communication after executing an action.
Preferably, user-side communication is transmitted by wireless transparent transmission, WiFi or Bluetooth.
User-side communication comprises data sending and data receiving and is carried in user-side data packets. As shown in Table 1, the user-side packet format is: frame header + machine code + instruction type + data length + sub-command + sub-data + check frame + frame tail; a worked sketch of how such a packet can be assembled in code follows the field specification below.
TABLE 1
Frame header (0xfe 0xef) | Machine code | Instruction type | Data length | Sub-command | Sub-data | Check frame (CRC) | Frame tail (0xfd 0xdf)
Specification of user-side data packet format:
(1) frame header: two bytes, 0xfe and 0xef;
(2) machine code: different devices are assigned different codes to distinguish them;
(3) instruction type: indicates the function the packet is to perform; it may be user-defined;
(4) data length: the packet length excluding the four bytes of frame header and frame tail;
(5) sub-command: a sub-command class under the instruction type, i.e. a secondary command;
(6) sub-data: the transmitted user-side data;
(7) check frame: a CRC value computed over the data from the frame header up to, but not including, the CRC value itself;
(8) frame tail: two bytes, 0xfd and 0xdf.
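As a worked example of this format, the following C routine assembles such a packet. The patent does not name a CRC variant or byte order, so a CRC-16/MODBUS routine and big-endian CRC placement are assumed purely for illustration.

```c
#include <stdint.h>
#include <string.h>

/* Assumed CRC-16/MODBUS; the patent says "CRC" without naming a variant. */
static uint16_t crc16(const uint8_t *p, size_t n)
{
    uint16_t crc = 0xFFFF;
    while (n--) {
        crc ^= *p++;
        for (int i = 0; i < 8; i++)
            crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : crc >> 1;
    }
    return crc;
}

/* Build: header + machine code + type + length + sub-command + sub-data +
 * CRC + tail.  Returns the total packet size written into buf. */
size_t packet_build(uint8_t *buf, uint8_t machine, uint8_t type,
                    uint8_t subcmd, const uint8_t *data, uint8_t dlen)
{
    size_t i = 0;
    buf[i++] = 0xfe; buf[i++] = 0xef;          /* frame header */
    buf[i++] = machine;                        /* machine code */
    buf[i++] = type;                           /* instruction type */
    buf[i++] = (uint8_t)(3 + 1 + dlen + 2);    /* length: total minus the 4 header/tail bytes */
    buf[i++] = subcmd;                         /* sub-command */
    memcpy(&buf[i], data, dlen); i += dlen;    /* sub-data */
    uint16_t crc = crc16(buf, i);              /* CRC over header .. sub-data */
    buf[i++] = (uint8_t)(crc >> 8);            /* assumed big-endian CRC */
    buf[i++] = (uint8_t)(crc & 0xff);
    buf[i++] = 0xfd; buf[i++] = 0xdf;          /* frame tail */
    return i;
}
```

Note how the length byte equals the total packet size minus the four header and tail bytes, matching item (4) above.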
As shown in fig. 9, for data transmission the sender assembles data into a series of packets according to the interaction unit's transmission request and the user-side packet format, transmits them wirelessly, then waits for the response flag while running a response timeout counter. If the timeout expires without a response flag being received, or the received response flag is wrong, the data is retransmitted; if a correct response flag is received before the timeout, the transmission ends.
As shown in fig. 10, after receiving the first byte the receiver checks whether it is 0xfe; if so, it receives the second byte and checks that it is 0xef. With the header confirmed, it stores the subsequent valid data and records the received length. When a byte 0xfd is received followed by 0xdf, the frame tail has been received; the length value in the packet is then compared with the recorded length. If they differ, data was lost during transmission and the packet is unusable. If they are equal, the packet length is correct; a check value is then calculated by CRC and compared with the CRC value in the packet. If they match, the data is correct; if not, the data was corrupted for some other reason during transmission, the packet is unusable, and retransmission is requested. If the received data passes the check, a correct response flag is returned; otherwise an error response flag is returned and the sender is asked to retransmit, ensuring that the data of every communication is correct.
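A byte-at-a-time receiver implementing the checks of fig. 10 could look like the following sketch; it reuses the assumed crc16() from the sender example, and send_ack() stands in for returning the response flag. (Like the patent's framing, it does not escape 0xfd 0xdf inside the payload.)

```c
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

extern uint16_t crc16(const uint8_t *p, size_t n);  /* same assumed CRC as the sender */
extern void send_ack(bool ok);                      /* response flag back to the sender */

#define BUF_MAX 256
static uint8_t  buf[BUF_MAX];
static uint16_t len;
static enum { WAIT_FE, WAIT_EF, BODY, MAYBE_TAIL } st = WAIT_FE;

void rx_byte(uint8_t b)
{
    switch (st) {
    case WAIT_FE:
        if (b == 0xfe) { len = 0; buf[len++] = b; st = WAIT_EF; }
        break;
    case WAIT_EF:
        if (b == 0xef) { buf[len++] = b; st = BODY; }
        else st = WAIT_FE;                    /* header wrong: resynchronize */
        break;
    case BODY:
        buf[len++] = b;
        if (b == 0xfd) st = MAYBE_TAIL;       /* possible start of the frame tail */
        if (len >= BUF_MAX - 1) st = WAIT_FE; /* overflow guard */
        break;
    case MAYBE_TAIL:
        if (b == 0xdf) {                      /* full frame tail 0xfd 0xdf seen */
            buf[len++] = b;
            /* byte 4 holds the length excluding the 4 header/tail bytes */
            bool ok = (buf[4] == len - 4);
            if (ok) {                         /* CRC over header .. sub-data */
                uint16_t got = (uint16_t)(buf[len - 4] << 8) | buf[len - 3];
                ok = (crc16(buf, len - 4) == got);
            }
            send_ack(ok);                     /* correct or error response flag */
            st = WAIT_FE;
        } else {
            buf[len++] = b;                   /* the 0xfd was data: keep receiving */
            st = (b == 0xfd) ? MAYBE_TAIL : BODY;
        }
        break;
    }
}
```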
The instruction types are shown in Table 2.
TABLE 2
Instruction type value    Definition
0x01    User side 1 (mobile phone/pad) issues a read command
0x02    User side 2 (action simulator) issues a read command
0x03    User side 3 (PC configuration software) issues a read command
0x04    User side 1 (mobile phone/pad) issues a write command
0x05    User side 2 (action simulator) issues a write command
0x06    User side 3 (PC configuration software) issues a write command
0x07    User side 1 (mobile phone/pad) issues a control command
0x11    Main control unit feeds back a read-command result to user side 1 (mobile phone/pad)
0x12    Main control unit feeds back a read-command result to user side 2 (action simulator)
0x13    Main control unit feeds back a read-command result to user side 3 (PC configuration software)
0x14    Main control unit feeds back a write-command result to user side 1 (mobile phone/pad)
0x15    Main control unit feeds back a write-command result to user side 2 (action simulator)
0x16    Main control unit feeds back a write-command result to user side 3 (PC configuration software)
0x17    Main control unit feeds back control-command result 1 to user side 1 (mobile phone/pad)
0x18    Main control unit feeds back control-command result 2 to user side 1 (mobile phone/pad)
0x19    Main control unit feeds back control-command result 3 to user side 1 (mobile phone/pad)
The sub-command types are shown in Table 3.
TABLE 3
Sub-command type value    Definition
0x01    Robot state update
0x02    At rest
0x03    Free play
0x11    Query robot information
0x12    Query action execution unit information
0x13    Query sink/float module information
0x14    Query communication error rate
0x21    Real-time action execution unit control
0x31    Write basic action combination
0x32    Test basic action combination
0x33    Write user configuration data
0x34    Erase command flash
0x35    IAP upgrade command
0x36    Read user configuration
0x37    Read action command
0x38    Set the absolute position of an action execution unit
0x39    Put the robot into free mode
0x3A    Put the robot into the idle state
0x41    Robot error code
0x41    Put a device into write mode
0x42    Put the main control unit into programming mode
Preferably, sensing unit communication is transmitted over a bus communication protocol. A data structure is defined in the sensing unit: struct { uint8_t data1; uint8_t data2; uint8_t data3; uint8_t data4; uint8_t data5; uint8_t data6; uint8_t data7; uint8_t data8; }. The structure holds eight bytes of data, namely sensing value A + sensing value B + fault bit + priority bit + instruction + sensing reserved bit A + sensing reserved bit B + sensing reserved bit C. The meaning of each byte differs between sensing units, as enumerated below (a C rendering follows the list):
(1) Sonar sensor: data1 is the sonar-measured distance; data2 the sonar current; data3 the sonar module's fault information; data4 the sonar module ID; data5 the command; data6-data8 are reserved.
(2) Video sensor: data1 is the image color; data2 the image brightness; data3 the video module's fault information; data4 the video module ID; data5 the command; data6-data8 are reserved.
(3) Ultrasonic ranging sensor: data1 is the upper 8 bits of the measured distance; data2 the lower 8 bits; data3 the ultrasonic module's fault information; data4 the ultrasonic module ID; data5 the command; data6-data8 are reserved.
(4) Infrared ranging sensor: data1 is the upper 8 bits of the measured distance; data2 the lower 8 bits; data3 the infrared module's fault information; data4 the infrared module ID; data5 the command; data6-data8 are reserved.
(5) Temperature and humidity detection sensor: data1 is the measured temperature value; data2 the measured humidity value; data3 the module's fault information; data4 the module ID; data5 the command; data6-data8 are reserved.
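Mapped into C, the eight-byte sensing frame and a typical decoding step (recombining the split 16-bit distance of the ranging modules) might look like this; the field names are chosen here for readability and are not from the patent.

```c
#include <stdint.h>

/* The eight-byte sensing frame described above; names are assumptions. */
typedef struct {
    uint8_t value_a;   /* data1: sensing value A (e.g. sonar distance, or distance high 8 bits) */
    uint8_t value_b;   /* data2: sensing value B (e.g. sonar current, or distance low 8 bits) */
    uint8_t fault;     /* data3: module fault information */
    uint8_t id;        /* data4: module ID / priority bit */
    uint8_t cmd;       /* data5: instruction */
    uint8_t reserved_a, reserved_b, reserved_c;  /* data6 .. data8 */
} sensor_frame_t;

/* Example: recover the 16-bit distance of an ultrasonic/infrared
 * ranging module from its high and low 8-bit halves. */
static inline uint16_t ranging_distance(const sensor_frame_t *f)
{
    return (uint16_t)((uint16_t)f->value_a << 8) | f->value_b;
}
```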
Preferably, action execution unit communication is transmitted over a bus communication protocol. A data structure is defined in the action execution unit: struct { uint8_t data1; uint8_t data2; uint8_t data3; uint8_t data4; uint8_t data5; uint8_t data6; uint8_t data7; uint8_t data8; }. The structure holds eight bytes of data, namely action execution value A + action execution value B + action execution value C + priority bit + instruction + fault bit + action execution reserved bit A + action execution reserved bit B. The meaning of each byte differs between action execution units, as enumerated below (a C rendering follows the list):
(1) Steering engine execution module: data1 is the steering engine angle; data2 its speed; data3 its current; data4 the steering engine module ID; data5 the command; data6 the steering engine's fault information; data7 and data8 are reserved.
(2) Servo motor execution module: data1 is the servo motor angle; data2 its speed; data3 its current; data4 the servo motor module ID; data5 the command; data6 the servo motor's fault information; data7 and data8 are reserved.
(3) Stepping motor module: data1 is the stepping motor angle; data2 its speed; data3 its current; data4 the stepping motor module ID; data5 the command; data6 the stepping motor's fault information; data7 and data8 are reserved.
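The action-execution frame admits the same treatment; the sketch below also shows how a steering engine command might be filled in, with the module ID, speed value and the reuse of sub-command 0x21 from Table 3 all being illustrative assumptions.

```c
#include <stdint.h>

/* Eight-byte action-execution frame; field names are illustrative. */
typedef struct {
    uint8_t angle;      /* data1: target angle */
    uint8_t speed;      /* data2: speed */
    uint8_t current;    /* data3: measured current */
    uint8_t id;         /* data4: module ID / priority bit */
    uint8_t cmd;        /* data5: instruction */
    uint8_t fault;      /* data6: fault information */
    uint8_t reserved_a, reserved_b;  /* data7, data8 */
} actuator_frame_t;

extern void bus_send(const void *frame, uint8_t len);  /* assumed bus driver call */

/* Command a steering engine (ID 3 here, arbitrarily) to 90 degrees.
 * 0x21 reuses the "real-time action execution unit control"
 * sub-command from Table 3 purely as an illustration. */
void servo_goto_90(void)
{
    actuator_frame_t f = { .angle = 90, .speed = 20, .id = 3, .cmd = 0x21 };
    bus_send(&f, sizeof f);
}
```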
As shown in figs. 11 to 14, when the bus is idle, any action execution unit (sensing unit) may transmit data, and the first to access the bus gains control. In the bus protocol, all messages are sent in a fixed format; when the bus is free, every action execution unit (sensing unit) connected to it may start sending a new message. When two or more units start sending messages at the same time, priority is determined by the identifier ID; the ID denotes not the destination address of the transmission but the priority of the message accessing the bus. Arbitration compares each message ID bit by bit: the unit winning arbitration (judged to have the highest priority) continues sending its message, while the losers immediately stop sending and switch to receiving. When the main control unit and an action execution unit (sensing unit) send information at the same time, the main control unit's transmission has the highest priority; the action execution unit (sensing unit) stops sending, enters the state of receiving instructions from the main control unit, and judges whether the instruction ID is its own ID: if so, it receives the instruction data, otherwise it abandons reception of the instruction. When several action execution units (sensing units) transmit simultaneously, the same bitwise ID arbitration applies: the winner continues sending feedback to the main control unit, while the losers immediately stop sending, enter the receive state, and receive the instructions sent by the main control unit.
The bus can determine whether an error is a temporary data error (e.g. external noise) or a persistent data error (e.g. a failure inside a unit, a driver failure, or a broken wire). With this function, a unit causing persistent data errors can be isolated from the bus. Multiple units can be connected to the bus at the same time; the total number of connectable units is theoretically unlimited, though in practice it is limited by delay on the bus and by electrical load. Lowering the communication speed increases the number of connectable units; raising the communication speed reduces that number.
During communication, the master action execution unit (sensing unit) places a data packet defined by the structure onto the bus using bus communication. The bus filter of each slave action execution unit (sensing unit) attached to the bus automatically recognizes when the ID contained in a data frame is not its own; such a packet is filtered out and no further data is received. If the filter recognizes the frame's data ID as its own, it receives the whole frame, parses the command function in the packet, executes the corresponding action according to the command code and data, and returns feedback information to the main control unit.
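A minimal version of that slave-side ID filter, under the assumption that data4 carries the module ID as described above, is:

```c
#include <stdint.h>
#include <stdbool.h>

/* Slave-side bus filter sketch: a frame is accepted only when the ID
 * carried in the frame matches this unit's own ID.  The frame layout
 * reuses the eight-byte structure above; my_id is assumed to be
 * assigned at configuration time. */
static uint8_t my_id = 3;

static bool bus_filter_accept(const uint8_t frame[8])
{
    return frame[3] == my_id;    /* data4 carries the module ID */
}

extern void parse_and_execute(const uint8_t frame[8]);
extern void send_feedback_to_master(void);

/* Receive path: frames addressed to other units are dropped without
 * further processing; matching frames are parsed and executed. */
void on_bus_frame(const uint8_t frame[8])
{
    if (!bus_filter_accept(frame))
        return;                   /* filtered out: not this unit's ID */
    parse_and_execute(frame);     /* act on the command code and data */
    send_feedback_to_master();    /* return feedback to the main control unit */
}
```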
In summary, the invention provides a robot development system architecture based on a modular platform, comprising a sensing unit, an action execution unit and an interaction unit, all controlled by a main control unit. The main control unit comprises an initialization module, an instruction interaction module, an information interaction module and an action execution module: the initialization module initializes the parameters of the action execution units and the system variables in the main control unit; the instruction interaction module receives instructions from the interaction unit and compiles them into action instructions; the information interaction module acquires data from the sensing units; and the action execution module executes the action instructions. The architecture has a well-conceived logical structure and a high degree of modularity; it adopts a consistent communication standard that unifies the communication protocols of all sensing units, action execution units and the main control unit, which facilitates robot research and development on top of the platform.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention in any way. Those of ordinary skill in the art may readily implement the invention as illustrated in the accompanying drawings and described above; however, they should appreciate that the disclosed conception and specific embodiments may readily serve as a basis for designing or modifying other structures that carry out the same purposes, without departing from the scope of the invention as defined by the appended claims. Likewise, any changes, modifications and equivalent evolutions of the above embodiments in accordance with the essential technique of the present invention remain within the protection scope of the technical solution of the invention.

Claims (13)

1. A robot development system architecture based on a modular platform, comprising a main control unit, a sensing unit, an action execution unit and an interaction unit, characterized in that: the sensing unit, the action execution unit and the interaction unit are controlled by the main control unit; a user inputs instructions to the main control unit through the interaction unit; the sensing unit senses external signals and transmits them to the main control unit; the main control unit processes them and generates instructions controlling the action execution unit, which executes them; the sensing unit, the action execution unit, the interaction unit and the main control unit communicate interactively, and the communication protocols of this interactive communication are kept consistent;
the main control unit further comprises a free state module and a debugging test module, wherein the free state module controls the robot when no interaction unit is operating and the debugging test module is used for debugging and maintenance; in the free state, when no contextual model is matched and no obstacle avoidance is needed, the free state module generates a random number within the range of the action instructions, calculates an execution probability from the random number to decide whether to act, and executes the command corresponding to the random number once the required number of occurrences is reached;
the main control unit further comprises an instruction interaction module comprising the following steps: waiting for an instruction, interrupt detection and instruction type selection, its flow being as follows: the instruction interaction module task is a blocking task that mainly receives instructions from the interaction unit; it waits for messages delivered by interrupt and, if the interrupt receives a matching frame header and frame tail and the robot state and received length are judged normal, sends the instruction; after the action execution unit receives the instruction, it first performs a CRC check, parses the instruction type according to the structure-parsing command once the check passes, executes the parsed instruction, and finally returns a message;
when the main control unit and an action execution unit transmit information simultaneously, the main control unit's transmission has the highest priority; the action execution unit stops transmitting and enters the state of receiving instructions sent by the main control unit;
when two or more action execution units start transmitting messages simultaneously, priority is determined by the identifier ID; the higher-priority unit continues transmitting while the lower-priority unit stops transmitting and receives;
the sensing unit communication is transmitted over a bus communication protocol; a data structure of eight bytes is defined in the sensing unit, the meaning of each byte differing between sensing units, and the sensing units include: sonar sensors, video sensors, ultrasonic ranging sensors, infrared ranging sensors and temperature and humidity detection sensors.
2. The modular platform based robot development system architecture of claim 1, wherein: the modular platform-based robot development system architecture further comprises a special action execution unit; feedback units are arranged in the special action execution unit, the sensing unit, the action execution unit and the interaction unit; the feedback unit is connected with the main control unit and provides feedback information for the main control unit.
3. The modular platform based robot development system architecture of claim 1, wherein: the interaction unit comprises PC-side interaction, mobile-side interaction and action simulator interaction; the PC-side and mobile-side interactions are connected to a cloud server; transmission between the interaction unit and the main control unit uses wireless transparent transmission; the mobile-side interaction comprises voice interaction, an intelligent module, action recognition interaction, facial expression recognition interaction and action simulation interaction.
4. A robot development system main control unit based on a modular platform, comprising an initialization module, an instruction interaction module, an information interaction module and an action execution module, characterized in that: the initialization module initializes the parameters of the action execution units and the system variables in the main control unit; the instruction interaction module receives instructions from the interaction unit and compiles them into action instructions; the information interaction module acquires data from the sensing units; the action execution module executes the action instructions; the initialization module has the highest priority in the main control unit;
the main control unit further comprises a free state module and a debugging test module, wherein the free state module controls the robot when no interaction unit is operating and the debugging test module is used for debugging and maintenance; in the free state, when no contextual model is matched and no obstacle avoidance is needed, the free state module generates a random number within the range of the action instructions, calculates an execution probability from the random number to decide whether to act, and executes the command corresponding to the random number once the required number of occurrences is reached;
the instruction interaction module comprises the following steps: waiting for an instruction, interrupt detection and instruction type selection, its flow being as follows: the instruction interaction module task is a blocking task that mainly receives instructions from the interaction unit; it waits for messages delivered by interrupt and, if the interrupt receives a matching frame header and frame tail and the robot state and received length are judged normal, sends the instruction; after the action execution unit receives the instruction, it first performs a CRC check, parses the instruction type according to the structure-parsing command once the check passes, executes the parsed instruction, and finally returns a message;
when the main control unit and an action execution unit transmit information simultaneously, the main control unit's transmission has the highest priority; the action execution unit stops transmitting and enters the state of receiving instructions sent by the main control unit;
when two or more action execution units start transmitting messages simultaneously, priority is determined by the identifier ID; the higher-priority unit continues transmitting while the lower-priority unit stops transmitting and receives;
the sensing unit data is sent to the main control unit by the sensing unit; the sensing unit communication is transmitted over a bus communication protocol; a data structure of eight bytes is defined in the sensing unit, the meaning of each byte differing between sensing units, and the sensing units include: sonar sensors, video sensors, ultrasonic ranging sensors, infrared ranging sensors and temperature and humidity detection sensors.
5. The modular platform based robot development system master control unit of claim 4, wherein the initialization module comprises the steps of: hardware initialization, equipment initialization, system global variable initialization and module task creation.
6. The modular platform based robot development system master control unit of claim 4, wherein the initialization module further comprises the steps of: main power supply starting, version information output, user configuration loading, clock interrupt starting, system multitask switching starting, and early-warning/alarm loop.
7. The modular platform based robot development system master control unit of claim 4, wherein the command interaction module further comprises the steps of: setting and initializing variables, checking instructions and sending feedback messages.
8. The modular platform based robot development system master control unit of claim 4, wherein the information interaction module comprises the following steps: setting the sampling frequency, sampling by the sensing unit, and querying action execution unit information.
9. The modular platform based robot development system master control unit of claim 4, wherein the information interaction module further comprises the steps of: initializing variables in a task, filtering, judging a threshold value, counting cycle times, acquiring equipment signals, and judging whether information is wrong.
10. The modular platform based robot development system master control unit of claim 4, wherein the action execution module comprises the steps of: judging the action type, judging the action state, sending an action command and executing the action command.
11. The modular platform based robot development system master control unit of claim 4, wherein the action execution module further comprises the steps of: variable initialization, action synchronization judgment, action overtime judgment, action circulation judgment, robot state updating, error judgment and action instruction deletion.
12. The modular platform based robot development system master control unit of claim 4, wherein the free state module comprises the following steps: idle state judgment, contextual model setting, contextual model judgment (executing the contextual model), free state model judgment, random number generation with execution probability judgment, and random action execution.
13. The modular platform based robot development system master control unit of claim 4, wherein the free state module further comprises the steps of: variable initialization, delay setting, user configuration judgment and obstacle avoidance judgment (executing obstacle avoidance).
CN201610971036.3A 2016-11-05 2016-11-05 Robot development system architecture based on modular platform and main control unit thereof Active CN106648614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610971036.3A CN106648614B (en) 2016-11-05 2016-11-05 Robot development system architecture based on modular platform and main control unit thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610971036.3A CN106648614B (en) 2016-11-05 2016-11-05 Robot development system architecture based on modular platform and main control unit thereof

Publications (2)

Publication Number Publication Date
CN106648614A CN106648614A (en) 2017-05-10
CN106648614B true CN106648614B (en) 2021-05-18

Family

ID=58821934

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610971036.3A Active CN106648614B (en) 2016-11-05 2016-11-05 Robot development system architecture based on modular platform and main control unit thereof

Country Status (1)

Country Link
CN (1) CN106648614B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765571B (en) * 2017-06-28 2019-11-08 北京光年无限科技有限公司 The data processing method and Development Framework of intelligent robot Development Framework
CN111600891A (en) * 2020-05-19 2020-08-28 上海傅利叶智能科技有限公司 Exoskeleton robot communication protocol, man-machine interaction system and multi-platform interaction software
CN114281315A (en) * 2021-12-27 2022-04-05 中科南京信息高铁研究院 Visual software development system and method applied to superconducting computer

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1416381A (en) * 2000-12-06 2003-05-07 索尼公司 Robot device, method for controlling motion of robot device, and system for controlling motion of robot device
CN104057458A (en) * 2014-06-16 2014-09-24 浙江大学 Multi-shaft mechanical arm visual control system and method based on somatosensation and touch
CN105752280A (en) * 2015-12-18 2016-07-13 南京理工大学 Robot ship system used for acquiring water-area information and control method of robot ship system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4868472A (en) * 1986-11-20 1989-09-19 Unimation Inc. Communication interface for multi-microprocessor servo control in a multi-axis robot control system
US6374155B1 (en) * 1999-11-24 2002-04-16 Personal Robotics, Inc. Autonomous multi-platform robot system
KR100835361B1 (en) * 2003-08-29 2008-06-04 삼성전자주식회사 Walking robot and its control method using simple ground reaction force sensors
JP2006092123A (en) * 2004-09-22 2006-04-06 Denso Wave Inc Simulation system
US8583282B2 (en) * 2005-09-30 2013-11-12 Irobot Corporation Companion robot for personal interaction
US20080027580A1 (en) * 2006-07-28 2008-01-31 Hui Zhang Robot programming method and apparatus with both vision and force
US9630317B2 (en) * 2014-04-03 2017-04-25 Brain Corporation Learning apparatus and methods for control of robotic devices via spoofing

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1416381A (en) * 2000-12-06 2003-05-07 索尼公司 Robot device, method for controlling motion of robot device, and system for controlling motion of robot device
CN104057458A (en) * 2014-06-16 2014-09-24 浙江大学 Multi-shaft mechanical arm visual control system and method based on somatosensation and touch
CN105752280A (en) * 2015-12-18 2016-07-13 南京理工大学 Robot ship system used for acquiring water-area information and control method of robot ship system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"导游机器人的设计与实现";杨志鹏;《中国优秀硕士学位论文全文数据库 信息科技辑》;20100815;I140-224 *
杨志鹏."导游机器人的设计与实现".《中国优秀硕士学位论文全文数据库 信息科技辑》.2010, *

Also Published As

Publication number Publication date
CN106648614A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN111881520B (en) Anomaly detection method and device for automatic driving test, computer equipment and storage medium
CN106648614B (en) Robot development system architecture based on modular platform and main control unit thereof
CN106383521B (en) Motion processing module based on robot development platform
CN107139179A (en) A kind of intellect service robot and method of work
CN105929839A (en) Unmanned aerial vehicle flight control method and control system
US20190340197A1 (en) System and method for controlling camera and program
CN106406328B (en) Motion control method based on robot development platform
CN110595799A (en) Method, device, equipment and storage medium for testing unmanned vehicle
CN218332314U (en) HIL simulation test platform based on intelligent driving area controller
CN115577151A (en) Vehicle fault real-time diagnosis method and device, electronic equipment and storage medium
CN103317513A (en) Networked robot control system based on CPUs
CN112822684B (en) Vehicle intrusion detection method and defense system
CN109597389B (en) Test system of embedded control system
WO2021212344A1 (en) Fault management method and device for unmanned aerial vehicle, and storage medium
CN113341921A (en) Automobile controller function testing method and system
CN115619869B (en) Positioning method and device of automatic guiding transport vehicle and automatic guiding transport vehicle
CN112327804A (en) Fault diagnosis method, device and system of traction control unit and train
CN115657494A (en) Virtual object simulation method, device, equipment and storage medium
CN113074955B (en) Method, apparatus, electronic device, and medium for controlling data acquisition
US11823452B2 (en) Video analytics evaluation
CN112634489B (en) Vehicle state determination method, device and system based on mobile terminal
CN112836219B (en) Target system security evaluation device and method
CN113703839A (en) Method for managing unmanned vehicle based on mobile human-computer interaction equipment
US11654566B2 (en) Robotic activity decomposition
CN112353652A (en) Intelligent blind guiding system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant