CN109118884B - Teaching device of robot experiment course - Google Patents


Info

Publication number
CN109118884B
Authority
CN
China
Prior art keywords
robot
teaching
obstacle
target object
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201811063985.7A
Other languages
Chinese (zh)
Other versions
CN109118884A (en)
Inventor
武仪
武青春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201811063985.7A
Publication of CN109118884A
Application granted
Publication of CN109118884B

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00: Teaching not covered by other main groups of this subclass

Abstract

The application provides a teaching device for a robot experiment course, comprising: a teaching robot, a robot running array disc, an image acquisition device, a user side, and a server. Robot positioning identifiers arranged in an array are marked on the robot running array disc and are used to identify and position the teaching robot, a moving target object, and an obstacle. The image acquisition device captures image information of the relative position of the teaching robot on the robot running array disc and sends the image information to the server. The user side displays the image information and receives an autonomous motion control program input by a user. The server parses the autonomous motion control program, generates control instructions, and issues them to the teaching robot. Through the robot teaching experiment course, students learn the hardware structure and software functions required for a robot to move autonomously, and participate in designing and building the robot's autonomous movement function.

Description

Teaching device of robot experiment course
Technical Field
The application relates to the technical field of robots, in particular to a teaching device for robot experiment courses.
Background
With the development of related technologies such as automation control, artificial intelligence, pattern recognition, and visual perception, intelligent robots are the subject of increasing research. In education, many colleges and universities have established robotics courses, which play a positive role in cultivating and improving students' scientific literacy. Robotics is also spreading in primary and secondary schools, where its hands-on, playful character makes it popular with young students; just as computers became ubiquitous in schools, robots entering the classroom has become a clear trend, raising students' knowledge of this new field. Accordingly, teaching robot platforms for student experiments are increasingly popular. A teaching robot is an open practical training platform suitable for students from primary school through university: students can assemble the robot structure, build the circuit connections, and design various software and hardware functions. As a fusion of multiple advanced skills, it exercises students' logical thinking, hands-on ability, and equipment control ability. At present, a teaching robot is a finished robot, kit, or set of parts specially developed by a manufacturer to stimulate students' interest in learning and cultivate their comprehensive abilities; besides the robot body, it also includes corresponding control software, teaching textbooks, and the like.
Autonomous movement is an important capability for many robots and one of their most attractive features. In practical applications, robots with autonomous movement capability can take on many functions, such as transporting people or goods, working and investigating in special environments, and entertainment and competition. For a robot, autonomous movement requires not only mechanical motion components such as legs or wheels, but also a camera, a range finder, and other sensing components, together with software implementing functions such as visual recognition, target positioning, path planning, and obstacle avoidance, for example control programs for image processing, target identification and extraction, positioning and measurement, path design, and obstacle avoidance. In short, a robot needs complex software and hardware support to move autonomously.
However, for student experiments, the software and hardware needed to realize autonomous robot movement are too deep and complicated. For example, the robot must first find a target in a picture taken by a camera, which requires many specialized algorithms such as image enhancement, edge extraction, and pattern recognition; it must then convert the target's position coordinates in the picture into its location in real space, which involves coordinate system transformation formulas; it must further perform distance measurement and path planning from its own location and the target's location in real space; and finally it must control its motion along the path in stages. Such highly specialized control programs are very abstract and far exceed the understanding of most students, particularly primary and secondary school students, who cannot master them within a short course and certainly cannot actually participate in building a set of autonomous movement functions for a robot. As a result, current robot teaching either does not touch autonomous movement, one of the most central functions of a robot, or merely lets students watch demonstrations of how a robot system acts; students are left with some visual impressions but do not learn the basic principle of autonomous robot movement, which basic hardware structures a robot needs to move autonomously, or which essential functions its control software must include.
Disclosure of Invention
In view of this, an object of the present application is to provide a teaching device for robot experiment courses that solves the following problems in the prior art: because of the complexity of the hardware structures and software functions involved in autonomous robot movement, most primary and secondary school students cannot understand the basic principle of autonomous robot movement through robot teaching experiment courses, cannot learn which basic hardware structures and software functions are required to realize it, and cannot participate in designing and building the robot's autonomous movement function, which hinders the popularization of teaching robots in primary and secondary schools.
Based on the above purpose, the present application provides a teaching device for a robot experiment course, comprising:
a teaching robot, a robot running array disc, an image acquisition device, a user side, and a server;
the robot running array disc is marked with robot positioning identifiers arranged in an array; the positioning identifiers are used to identify any one or more of the teaching robot, a moving target object, and an obstacle, and to determine the position of the teaching robot; the robot running array disc is also used to drive the moving target object and/or the obstacle;
the image acquisition device is used for capturing image information of the relative positions of the teaching robot, the moving target object, and the obstacle on the robot running array disc and sending the image information to the server;
the user side is used for displaying the image information of the relative positions of the teaching robot, the moving target object, and the obstacle on the robot running array disc, identifying the teaching robot, the moving target object, and the obstacle from the image information, extracting their relative position coordinates on the robot running array disc from the image information, receiving an autonomous motion control program input by a user, and sending the autonomous motion control program to the server; the autonomous motion control program is used for planning a motion path for the robot from the relative position coordinates of the teaching robot, the moving target object, and the obstacle, and for issuing control instructions for motion direction and speed in stages according to the motion path;
the server is used for analyzing the autonomous motion control program, generating the control instruction and issuing the control instruction to the teaching robot;
the teaching robot is used for receiving the control instructions from the server and moving according to them so as to change its relative position on the robot running array disc.
In some embodiments, the size of and spacing between the positioning identifiers on the robot running array disc are matched to the sizes of the teaching robot, the moving target object, and the obstacle placed or moving on the disc, such that each of them completely covers and obscures at least one robot positioning identifier.
In some embodiments, the robot running array disc comprises a motion driving mechanism and a start-stop synchronization signal generating circuit; the motion driving mechanism is used for driving the moving target object and/or the obstacle to move on the robot running array disc; the start-stop synchronization signal generating circuit is used for controlling the motion driving mechanism so that the moving target object and/or the obstacle moves on the disc in time synchronization with the teaching robot.
In some embodiments, the user side is configured to identify the teaching robot and any of the moving target object and the obstacle in the image information according to the number distribution of the robot positioning identifiers that are invisible in the image information, and to determine the relative position coordinates of each of them on the robot running array disc.
In some embodiments, the user side displays a graphical programming interface showing graphical program modules corresponding to the identification, positioning, and path planning functions, and graphical logic modules corresponding to basic operation logic such as judgment, loops, and jumps; the autonomous motion control program is generated by parsing the connection relations and option configurations that the user establishes between the graphical program modules and graphical logic modules on the graphical programming interface and calling the corresponding program code segments.
In some embodiments, the image capture device includes a first wireless communication module for establishing a communication connection with the server.
In some embodiments, the server includes a second wireless communication module for establishing communication connections with the teaching robot, the image acquisition device, and the robot running array disc.
In some embodiments, the server further comprises:
the analysis module is used for receiving the autonomous motion control program uploaded by the user side, analyzing and identifying its program code, generating control instructions executable by the robot, and sending them to the execution module;
and the execution module is used for issuing the control instructions sent by the analysis module to the teaching robot at each synchronization time interval and controlling the teaching robot to execute the corresponding actions.
In some embodiments, the teaching robot comprises:
the device comprises a sensing unit, a control unit and an execution unit;
the sensing unit is used for sensing surrounding environment signals; the control unit is used for controlling the execution unit to execute the corresponding actions in the control instructions, based jointly on the surrounding environment signals and the control instructions issued by the server.
In some embodiments, the robot running array disc comprises a grid arranged in an array, and the positioning identifiers are disposed within the grid cells.
In some embodiments, the server further comprises a loop logic module to:
when the current position and the end position of the teaching robot do not coincide, the execution module controls the teaching robot to continue moving along the motion path of the current stage in the next synchronization time interval, until the current position and the end position coincide.
In some embodiments, the loop logic module is further configured to:
judging whether the current position of the teaching robot lies on the current motion path; if so, controlling the teaching robot through the execution module to move along the motion path; if not, the loop logic module feeds back error information to the analysis module to indicate that the current position has deviated, and the teaching robot stops moving until the execution module obtains a new motion path after the program is updated and controls the teaching robot to move along the newly obtained path.
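The loop logic described in these embodiments can be sketched as a per-interval decision function. This is our own hypothetical illustration in Python; the function name, return values, and grid-coordinate positions are assumptions, not taken from the patent:

```python
def loop_step(current_pos, path, end_pos):
    """One synchronization-interval decision of the loop logic module.

    Hypothetical sketch: positions are grid coordinates on the running
    array disc; 'path' is the staged motion path produced by the planner.
    """
    if current_pos == end_pos:
        return "stop"        # current position and end position coincide
    if current_pos not in path:
        return "error"       # position deviated: report and halt until a new path arrives
    return "continue"        # keep moving along the current stage's path
```

For a path `[(0, 0), (1, 0), (2, 0)]` with end position `(2, 0)`, a robot at `(1, 0)` continues, a robot at `(2, 0)` stops, and a robot at `(1, 1)` triggers the error feedback described above.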
In summary, the present application provides a teaching device for robot experiment courses. The device first provides a multifunctional robot running array disc adapted to teaching: through camera capture and visual perception, the disc makes it easy to identify the teaching robot, moving target object, obstacle, and so on, and to determine their relative positions, producing output in the form of identifier numbers that is intuitive, easy to understand, and easy to program, so that students can readily grasp the concepts and principles of the visual perception, target identification, and relative positioning stages of autonomous robot movement. The disc can drive the target object and the obstacle according to the design of the experiment course, so it supports a variety of course goals and rich learning content while effectively guaranteeing synchronized movement of the robot, target object, and obstacle. The user side visually demonstrates the target extraction, identification, and positioning processes based on visual perception, and more importantly provides a graphical programming mode that allows students to participate in designing the robot's autonomous motion program; the teaching robot's movement on the running array disc is then controlled in stages according to the autonomous motion control program. This solves the technical problem that teaching robots depend on control programs, are not universally suitable for primary and secondary school students with weak programming ability, and are therefore hard to popularize in primary and secondary schools.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a schematic structural diagram of a frame of a teaching apparatus for a robot experiment course according to a first embodiment of the present application;
fig. 2 is a schematic structural diagram of a teaching device of a robot experiment course according to a first embodiment of the present application;
fig. 3 is a schematic view of the motion driving mechanism inside the robot running array disc according to the first embodiment of the present application;
fig. 4 is a schematic structural diagram of the start-stop synchronization signal generating circuit inside the robot running array disc according to the first embodiment of the present application;
fig. 5 is a schematic diagram of the graphical programming interface of the user side according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a server of a robot experiment course teaching apparatus according to a second embodiment of the present application;
fig. 7 is a schematic structural diagram of a teaching robot according to a second embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 is a schematic diagram of the frame structure of a teaching apparatus for a robot experiment course according to an embodiment of the present application. Fig. 2 is a schematic structural diagram of the teaching apparatus according to the first embodiment of the present application. As can be seen from figs. 1 and 2, the teaching apparatus for robot experiment courses provided in this embodiment includes:
a teaching robot 1 (the cylindrical figure in fig. 1), a robot running array disc 2, an image acquisition device 3, a user side 4, and a server 5. The figures and letters in this embodiment merely exemplify the technical solutions of the present application and should not be construed as limiting them.
The robot running array disc 2 is marked with robot positioning identifiers arranged in an array; for example, the identifiers A1, B1, C1, D1, E1 through A6, B6, C6, D6, E6 and the like shown in fig. 1 form an 8 x 6 array. Besides letters, the robot positioning identifiers may be patterns of various shapes or other distinguishable characters such as numerals or Chinese characters, which are not enumerated here. Because the identifiers are marked in an array on the upper surface of the disc, once image information is captured they are easily separated from the overall image of the disc by an image segmentation algorithm and recognized. For example, each identifier can be given a color clearly distinct from the background color of the disc's upper surface, so that each identifier has a clear boundary and its shape is easily extracted and recognized by an edge segmentation algorithm. In some other embodiments of the present application, the robot running array disc comprises a grid arranged in an array, and the positioning identifiers are placed within the grid cells (such as the squares shown in fig. 1) to improve the accuracy of position recognition.
In the rectangular array of robot positioning identifiers, the size of and spacing between the identifiers are designed in advance to match the sizes of the teaching robot 1, the moving target object, and the obstacle placed or moving on the robot running array disc 2. When the teaching robot 1, the moving target object, or the obstacle is placed on the disc, it completely covers and obscures at least one robot positioning identifier, so that after an image is captured, the covered identifier is invisible in the overall image of the disc. Therefore, by determining which robot positioning identifiers are covered by the teaching robot 1, the start position and/or end position of each stage of the robot's operation can be located. In this and subsequent embodiments, the start position is the current position from which the teaching robot begins executing the current stage of movement along the motion path, and the end position is the end of that stage as specified in the motion path. Likewise, by determining which identifiers are covered by the moving target object and/or the obstacle, the positions of the target object and the obstacle used in the course can be located.
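As a concrete illustration of this covered-identifier positioning, the following Python sketch maps the identifiers missing from a captured image to grid coordinates. It is our own illustration: the A1-style labeling follows the fig. 1 example, but the exact labeling scheme and grid size of a real disc may differ.

```python
def label_to_coord(label):
    """Map an identifier label like 'B3' to (column, row) grid coordinates."""
    col = ord(label[0]) - ord("A")   # 'A' -> 0, 'B' -> 1, ...
    row = int(label[1:]) - 1         # '1' -> 0, '2' -> 1, ...
    return (col, row)

def occluded_labels(all_labels, visible_labels):
    """Identifiers printed on the disc but not visible in the image."""
    return sorted(set(all_labels) - set(visible_labels))

# Example: a 5 x 6 sub-grid A1..E6; the robot covers B1, C1, B2, C2.
all_ids = [f"{c}{r}" for c in "ABCDE" for r in range(1, 7)]
visible = [l for l in all_ids if l not in {"B1", "C1", "B2", "C2"}]
covered = occluded_labels(all_ids, visible)
coords = [label_to_coord(l) for l in covered]
print(covered)   # ['B1', 'B2', 'C1', 'C2']
print(coords)    # [(1, 0), (1, 1), (2, 0), (2, 1)]
```

The coordinates of the covered identifiers directly give the footprint of the object standing on them, which is exactly the output the positioning stage needs.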
The robot running array disc 2 serves as the playing field of the teaching robot and can further be provided with supporting circuits and mechanisms, including a motion driving mechanism for the target object and/or obstacle and a start-stop synchronization signal generating circuit. Specifically, the disc supports the teaching robot 1 in autonomous movement and obstacle avoidance experiments involving movable targets and obstacles: when a moving target object and/or obstacle is placed on the disc, the motion driving mechanism can move it across the disc according to the course design configuration information. Both the motion driving mechanism and the start-stop synchronization signal generating circuit are installed below the upper surface of the disc. As shown in fig. 3, the motion driving mechanism comprises a magnetic head 201, an X-direction driving rail 202, an X-direction linear motor 203, a Y-direction extension plate 204, a Y-direction driving rail 205, and a Y-direction linear motor 206.
The X-direction driving rail 202 is suspended below the table top of the disc's upper surface. It may be a single-row rail extending from one end to the other along one X-direction side of the disc, with the same length as that side, or a double-row parallel rail extending along two parallel X-direction sides. The X-direction linear motor 203 is slidably fixed to the X-direction driving rail 202 (to either of the two parallel rails if the rail is double-row) and can move linearly in the X direction along it. The Y-direction extension plate 204 is fixedly mounted on the X-direction linear motor 203, so it moves in the X direction with the motor, and extends a certain length along the Y direction of the disc. The Y-direction driving rail 205 is a single-row rail fixedly mounted on the Y-direction extension plate 204, extending along the Y direction of the disc with the same length as the disc's Y-direction side. The Y-direction linear motor 206 is slidably fixed on the Y-direction driving rail 205 and moves linearly in the Y direction along it. The magnetic head 201 is fixedly mounted on the Y-direction linear motor 206 by a holding portion.
It can be seen that, driven by the X-direction linear motor 203 and the Y-direction linear motor 206, the magnetic head 201 can be moved in the X and Y directions to any position below the upper surface of the robot running array disc 2. The bases of the moving target object and the obstacle may be fitted with magnets, so that the magnetic head 201 can drag the target object and/or obstacle across the disc's upper surface by magnetic attraction. The structure of the start-stop synchronization signal generating circuit is shown in fig. 4; it comprises an X-direction motor start-stop signal circuit 207, a Y-direction motor start-stop signal circuit 208, a synchronization signal transceiver circuit 209, a controller circuit 210, a near-field communication circuit 211, and a networking communication circuit 212. The X-direction motor start-stop signal circuit 207 is connected to the X-direction linear motor 203 and outputs start-stop signals that control the start and end of the motor's linear movement; similarly, the Y-direction motor start-stop signal circuit 208 is connected to the Y-direction linear motor 206 and outputs start-stop signals controlling the start and end of its movement. The near-field communication circuit 211 establishes a short-range paired communication connection between the robot running array disc 2 and the teaching robot 1, for example via Bluetooth or infrared.
The synchronization signal transceiver circuit 209 is configured to generate a synchronization start signal and transmit it to the teaching robot through the near-field communication circuit 211, or to receive a synchronization start signal provided by the teaching robot through the same circuit; the synchronization start signal triggers the teaching robot 1 and the X- and Y-direction motor start-stop signal circuits 207 and 208 to execute the movement of each synchronization time interval in step. The networking communication circuit 212 connects the robot running array disc 2 to the server 5 over the network, so that it can receive the course design configuration information issued by the server 5. The controller circuit 210 obtains the movement paths of the moving target object and the obstacle for the experiment course from the course design configuration information, and controls the X- and Y-direction motor start-stop signal circuits 207 and 208 to emit start-stop signals in stages along those paths, driving the target object and obstacle to move as the course design requires.
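The staged start-stop signalling by the controller circuit 210 might be sketched as follows. This is illustrative only: the tuple encoding of the signals and the assumption of one grid cell per synchronization interval are ours, not the patent's specification.

```python
def segment_to_signals(frm, to):
    """Convert one straight path segment into X/Y start-stop signal tuples.

    Assumes the magnetic head moves one grid cell per synchronization
    interval, so each 'start' carries the number of cells to traverse.
    """
    dx, dy = to[0] - frm[0], to[1] - frm[1]
    signals = []
    if dx:  # move along the X-direction rail first
        signals.append(("X", "start", "+X" if dx > 0 else "-X", abs(dx)))
        signals.append(("X", "stop"))
    if dy:  # then along the Y-direction rail
        signals.append(("Y", "start", "+Y" if dy > 0 else "-Y", abs(dy)))
        signals.append(("Y", "stop"))
    return signals
```

A segment from cell `(1, 0)` to `(3, 0)` would yield a two-cell +X start followed by a stop, with no Y signals at all.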
On the robot running array disc 2, the teaching robot 1 can autonomously adjust its own direction and speed according to the control instructions and move autonomously along the motion path, thereby completing the goals of the robot teaching experiment course. The user in this embodiment is mainly a student. The robot experiment courses supported by the teaching device include traveling along a preset motion path, traveling toward a moving target position, traveling along a student-set path, autonomously bypassing obstacles, autonomous path planning and travel with movable targets and obstacles, robot soccer, and the like. Through these courses, students can fully understand the basic principle of autonomous robot travel, assemble the hardware modules necessary for it, and participate in designing and controlling the software functions that realize it.
The image acquisition device 3 is used for capturing image information of the relative positions of the teaching robot 1, the moving target object, and the obstacle on the robot running array disc 2 and sending it to the server 5. In this embodiment, the image acquisition device 3 may be a camera or other equipment with an image capture function, such as a smartphone. The device photographs the disc at a preset time interval to obtain an image containing the teaching robot 1, the moving target object, the obstacle, and the disc itself. The image shows which robot positioning identifiers are invisible because they are covered by the teaching robot 1 and/or the moving target object and obstacle, and the visible and invisible identifiers together determine the relative positions of the robot, target object, and obstacle on the disc. In this way, the image acquisition device 3 simulates the visual perception hardware common in practice on industrial robots, traffic robots, and home entertainment robots. The image information is sent to the server 5, which can forward it to the user side 4 for the user.
The user side 4 is used for displaying the image information of the relative positions of the teaching robot 1, the moving target object, and the obstacle on the robot running array disc 2, identifying them from the image information, extracting their relative position coordinates on the disc, receiving the autonomous motion control program input by the user, and sending that program to the server. The autonomous motion control program plans the robot's motion path from the relative position coordinates of the teaching robot, the moving target object, and the obstacle, and issues control instructions for motion direction and speed in stages along the path. In this embodiment, after the user side obtains the image information captured by the image acquisition device 3, it can identify the teaching robot, moving target object, and obstacle from the visible and invisible states of the robot positioning identifiers, determine their relative position coordinates on the disc, and display the identification results and coordinates on the user side's graphical programming interface, thereby simulating and visualizing the coordinate positioning step necessary for autonomous robot movement in practice. Specifically, as shown in fig. 5, there are three occluded areas where the robot positioning identifiers are invisible, while the remaining identifiers are visible.
The shape of each occluded area can be derived from the distribution of the serial numbers of the covered robot positioning identifiers in the image information, and from that shape it can be decided which occluded area is the teaching robot, which is the moving target object and which is the obstacle. In Fig. 5, the invisible robot positioning identifiers of the first occluded area are B1, C1, B2 and C2; from these serial numbers the area can be judged to be square and identified as the teaching robot. Similarly, the invisible identifiers of the second occluded area are E5 and E6; the strip shape indicated by these serial numbers identifies it as the moving target object. The third occluded area covers only the identifier D4 and is identified as the obstacle. Visual object recognition is thus realized from the distribution rule of the invisible robot positioning identifiers; the recognition process is easy and fast to implement, and students can intuitively understand the principle of visual recognition based on shape matching. Moreover, the serial number of each invisible robot positioning identifier maps directly to that identifier's position on the robot running array disc 2, which yields the relative position coordinates of the teaching robot, the moving target object and the obstacle on the robot running array disc.
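As an illustration of this shape-matching identification, the following Python sketch classifies one occluded area from its invisible markers and recovers grid coordinates. It is hypothetical: the patent does not specify how identifier serial numbers are encoded, so IDs such as "B2" are assumed to combine a column letter and a row number, and the three footprint shapes follow the Fig. 5 example.

```python
def marker_to_cell(marker_id):
    """Map a marker ID like 'B2' to 0-indexed (col, row) grid coordinates."""
    return (ord(marker_id[0]) - ord("A"), int(marker_id[1:]) - 1)

def classify_occlusion(invisible_ids):
    """Classify one occluded area by the shape of its invisible markers:
    2x2 square -> teaching robot, 1x2 strip -> moving target, single -> obstacle."""
    cells = [marker_to_cell(m) for m in invisible_ids]
    cols = {c for c, _ in cells}
    rows = {r for _, r in cells}
    width, height = len(cols), len(rows)
    if width == 2 and height == 2:   # square footprint, as B1/C1/B2/C2
        label = "teaching robot"
    elif width * height == 2:        # strip footprint, as E5/E6
        label = "moving target"
    else:                            # single covered marker, as D4
        label = "obstacle"
    return label, cells              # the cells double as relative coordinates

# The three occluded areas of the Fig. 5 example:
print(classify_occlusion(["B1", "C1", "B2", "C2"]))  # teaching robot
print(classify_occlusion(["E5", "E6"]))              # moving target
print(classify_occlusion(["D4"]))                    # obstacle
```

Because each serial number maps straight to a grid cell, identification and relative positioning fall out of the same lookup, which is the didactic point the text makes.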
The user terminal 4 further provides a graphical programming interface on which the user inputs an autonomous motion control program for the teaching robot 1; the program plans a motion path for the robot according to the relative position coordinates of the teaching robot, the moving target object and the obstacle. Specifically, as shown in Fig. 5, the graphical programming interface of the user terminal 4 may display three graphical program modules: an identification module O1, a positioning module O2 and a path module O3, corresponding respectively to the identification, positioning and path-planning functions. Each graphical program module O1-O3 has IO interfaces for inputting and outputting logic values, and a student can connect the corresponding IO interfaces of the modules O1-O3 on the graphical interface, completing the design of the robot's autonomous motion control program by drawing connections. Furthermore, the student can configure the control logic executed by each module O1-O3 and the inputs and outputs of its IO interfaces: clicking a module opens a menu listing the control logic options it supports and the input/output options of its IO interfaces, so the control function of each module O1-O3 is configured simply by selection.
A control program for autonomous robot motion also needs basic logic operations such as conditional judgment, loop and jump. The graphical programming interface of the user terminal 4 therefore also displays logic modules representing these operations, such as a graphical logic module O4 for conditional jump and a graphical logic module O5 for loop judgment. Students can connect the IO interfaces of the graphical program modules O1-O3 to the graphical logic modules O4 and O5, expressing that the logic values exchanged by the IO interfaces of O1-O3 in the control program's time sequence undergo the judgment-jump and loop-jump operations represented by O4 and O5. From the connection relationships and option configurations that the user establishes among the graphical program modules O1-O3 and the graphical logic modules O4 and O5, the user terminal 4 generates the autonomous motion control program by calling and parsing the corresponding program code segments, and the program can then be uploaded to the server 5. Through this simple and feasible graphical operation, students participate in designing the robot's autonomous motion control program and come to understand the logical connection among its basic steps of identification, positioning and path setting.
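One way to picture how the user terminal might turn the drawn connections into an executable program is a small interpreter that walks a wiring table. This is purely illustrative: the module names O1-O3 follow the text, but the wiring format, the "start"/"end" sentinels and the stub module behaviors are assumptions, not the patent's actual code generation.

```python
def compile_program(wiring, modules):
    """Return a function that runs modules in wiring order, feeding each
    module's output into the next module's input interface."""
    def program(initial_input):
        value, current = initial_input, wiring["start"]
        while current != "end":
            value = modules[current](value)   # execute the module's logic
            current = wiring[current]         # follow the drawn connection
        return value
    return program

# Stub behaviors standing in for the configured control logic of each module:
modules = {
    "O1": lambda img: ("recognized", img),  # identification module
    "O2": lambda r: ("located", r),         # positioning module
    "O3": lambda p: ("path", p),            # path-planning module
}
# Wiring mirrors the IO-interface connections drawn on screen: O1 -> O2 -> O3.
wiring = {"start": "O1", "O1": "O2", "O2": "O3", "O3": "end"}
run = compile_program(wiring, modules)
result = run("frame")
```

Adding loop or judgment nodes would amount to wiring entries whose successor depends on the value passed through, which is exactly the role the O4/O5 logic modules play in the text.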
For example, consider autonomous path planning toward a movable target object in the presence of an obstacle. For the acquired image information of the robot running array disc 2, the identification module O1 is configured to decide, from the serial-number distribution of the invisible robot positioning identifiers in each occluded area, whether that area belongs to the teaching robot, the moving target object or the obstacle. The output interface of the identification module O1 is connected to the input interface of the positioning module O2, which determines the relative position coordinates of the teaching robot, the moving target object and the obstacle on the robot running array disc 2 from the serial numbers of the invisible robot positioning identifiers in the occluded areas. The output interface of the positioning module O2 is in turn connected to the input interface of the path module O3, which, given these relative position coordinates, determines and outputs the shortest path that takes the teaching robot from its current position to the moving target object without passing through the obstacle. For staged autonomous motion, the path module O3 may further be connected to the judgment logic module O4 and the loop logic module O5: after the motion of the current stage, the judgment module O4 decides whether the teaching robot has reached the moving target object; if so, the experiment ends; if not, the not-reached result is output to the loop module O5, which loops the control program back to the positioning module O2 for the relative positioning and path control of the next stage.
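The path module's obstacle-avoiding shortest-path search can be sketched as a breadth-first search over the array-disc grid. The choice of BFS is an assumption (the patent names no algorithm), and the grid size, start, goal and obstacle cells below are illustrative, loosely matching the Fig. 5 layout.

```python
from collections import deque

def shortest_path(grid_size, start, goal, obstacles):
    """BFS for a shortest 4-connected path from start to goal that never
    enters an obstacle cell; a sketch of the path module O3's job."""
    cols, rows = grid_size
    frontier = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while frontier:
        cur = frontier.popleft()
        if cur == goal:                # reconstruct the path backwards
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= nx < cols and 0 <= ny < rows
                    and nxt not in obstacles and nxt not in came_from):
                came_from[nxt] = cur
                frontier.append(nxt)
    return None                        # no obstacle-free path exists

# Robot near B1, target near E5, obstacle at D4 (all as 0-indexed cells):
path = shortest_path((8, 8), (1, 0), (4, 4), {(3, 3)})
```

Since BFS explores cells in order of distance, the first time the goal is popped the reconstructed path is guaranteed shortest, and obstacle cells are simply never enqueued.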
The above robot autonomous motion control program can also be used for robot football course experiments, in which a football serves as the movable moving target object and an opponent robot serves as a movable obstacle.
The server 5 is used for parsing the autonomous motion control program, generating control instructions and issuing them to the teaching robot 1, so as to control the teaching robot 1; the teaching robot 1 moves autonomously on the robot running array disc 2 according to the control instructions, thereby changing its relative position on the robot running array disc 2. After receiving the autonomous motion control program sent by the user through the user terminal, the server 5 issues control instructions in stages, according to the motion path of each stage contained in the program, to control the motion of the teaching robot 1 on the robot running array disc 2. Moreover, according to the pairing information between the teaching robots 1 and the robot running array discs 2, and according to a course catalog stored on the server or provided by the user terminal 4, the server 5 also issues course design configuration information to the robot running array disc 2 paired with each teaching robot 1. As described above, the robot running array disc 2 drives the moving target object and the obstacle according to the course design configuration information, cooperating with the autonomous motion of the teaching robot 1 to complete the experiment target of the whole course.
As an alternative embodiment of the present application, on the basis of the above embodiment, the image acquisition device 3 includes a first wireless communication module configured to establish a communication connection with the server 5. In this embodiment, the first wireless communication module may be a wireless communication module with a Bluetooth function and/or a WiFi function. In addition, various wired interfaces may be provided on the image acquisition device 3 to establish a wired connection with the server 5.
Fig. 6 is a schematic structural diagram of the server of a teaching device for a robot experiment course according to a second embodiment of the present invention. As an embodiment of the present application, on the basis of the above embodiment, the server 5 may further include a second wireless communication module 501 configured to establish a communication connection with the image acquisition device 3. The second wireless communication module 501 establishes this connection with the first wireless communication module on the image acquisition device 3, so that the image acquisition device 3 can transmit the acquired image information to the server in real time over the communication channel, and the server forwards it to the user terminal 4 to be viewed by the user. In addition, the second wireless communication module 501 is further configured to establish communication connections with the teaching robot 1 and the robot running array disc 2, so as to issue control instructions to the teaching robot 1 and transmit course design configuration information to the robot running array disc 2.
Further, the server 5 further includes:
the analysis module 502 is used for receiving the autonomous motion control program uploaded by the user, parsing and identifying its program code, generating control instructions executable by the robot, and sending the control instructions to the execution module;
and the execution module 503 is configured to issue the control instructions sent by the analysis module 502 to the teaching robot 1 at each synchronization time interval, controlling the teaching robot 1 to execute the corresponding actions. As described above, the teaching robot 1 and the robot running array disc 2 share the synchronization time interval, and within each interval they perform the relevant motion actions, including maintaining or changing the motion direction and motion speed, according to the control instructions.
and a loop logic module 504, configured to:
judge, during the motion of each stage, whether the current position of the teaching robot 1 coincides with the end position of that stage's path; when they do not coincide, control the teaching robot through the execution module 503 to continue moving along the motion path of the current stage in the next synchronization time interval, until the current position of the teaching robot 1 coincides with the end position. In this scheme, the analysis module 502 parses the robot motion path generated by the autonomous motion control program into staged motion paths. To complete the motion path of each stage, the execution module 503 issues control instructions to the teaching robot 1 once per synchronization time interval, and at the end of each interval the loop logic module 504 determines whether the teaching robot 1 has reached the end position of that stage's motion path; if not, the execution module 503 loops into the next synchronization time interval, and if the end position has been reached, the analysis module 502 parses the motion path of the next stage.
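The interplay between the execution module and the loop logic module amounts to a per-interval loop, which the following minimal simulation sketches. It assumes one grid cell of motion per synchronization interval; the patent does not specify the distance covered per interval, so that rate and the axis-by-axis motion rule are illustrative only.

```python
def step_towards(pos, target):
    """One synchronization interval of motion: advance one cell toward target,
    first along the x axis, then along the y axis (an assumed motion rule)."""
    x, y = pos
    tx, ty = target
    if x != tx:
        x += 1 if tx > x else -1
    elif y != ty:
        y += 1 if ty > y else -1
    return (x, y)

def execute_stage(start, endpoint, max_intervals=100):
    """Loop-logic sketch: keep issuing per-interval commands until the robot's
    current position coincides with the stage endpoint."""
    pos, intervals = start, 0
    while pos != endpoint and intervals < max_intervals:
        pos = step_towards(pos, endpoint)  # execution module: one interval
        intervals += 1                     # loop module: re-check coincidence
    return pos, intervals

final_pos, n = execute_stage((1, 0), (4, 4))
```

When `execute_stage` returns, the stage endpoint has been reached and control would pass back to the analysis module for the next stage's path.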
In addition, the loop logic module 504 may be further configured to determine whether the current position of the teaching robot lies on the current motion path. If it does, the execution module 503 keeps the teaching robot moving along the motion path; if it does not, the loop logic module 504 feeds an error message back to the analysis module 502 to report the position deviation, and the motion of the teaching robot 1 is stopped until the execution module 503 reacquires a motion path after the program is updated and controls the teaching robot to move along the newly acquired path.
As an alternative embodiment of the present application, as shown in Fig. 7, the teaching robot includes: a sensing unit 101, a control unit 102 and an execution unit 103;
the sensing unit 101 is used for sensing ambient environment signals; for example, an infrared distance sensor may be arranged to avoid collisions during an experiment. The control unit 102 is configured to control the execution unit 103 (e.g., a wheel assembly) to execute the corresponding actions in the control instructions, jointly according to the ambient environment signals and the control instructions issued by the server 5. In the teaching process, students can assemble various types of robots by selectively assembling and connecting sensing units 101, control units 102 and execution units 103, which is not described in detail here.
In summary, the present application provides a teaching device for robot experiment courses. It provides a multifunctional robot running array disc adapted to teaching characteristics: through camera shooting and visual perception, the array disc makes it easy to identify the teaching robot, the moving target object, the obstacle and the like and to locate their relative positions, and it produces output results in the form of identifier serial numbers that are intuitive, easy to understand and easy to program, helping students grasp the concepts and principles of the visual perception, target identification and relative positioning stages of autonomous robot motion. The array disc can drive the target object and the obstacle according to the design of the experimental course, so it suits a variety of target courses, supports rich learning content, and effectively guarantees the synchronism of the motion of the robot, the target object and the obstacle. The user terminal visually presents the visual-perception-based target extraction, identification and positioning process and, more importantly, provides a graphical programming mode that allows students to participate in the design of the robot's autonomous motion program; the teaching robot is then controlled stage by stage on the robot running array disc according to the autonomous motion control program. This solves the technical problem that teaching robots depend on control programs and therefore lack universal suitability for primary and middle school students with weak programming ability, which hinders the popularization of teaching robots in primary and middle schools.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (6)

1. A teaching device of a robot experiment course, comprising: a teaching robot, a robot running array disc, an image acquisition device, a user side and a server;
the robot running array disc is marked with robot positioning identifiers arranged in an array manner, and the positioning identifiers are used for identifying any one or more of the teaching robot, the moving target object and the obstacle and for positioning the teaching robot; the robot running array disc is also used for driving a moving target object and/or an obstacle; the image acquisition device is used for acquiring image information of relative positions of the teaching robot, the moving target object and the obstacle on the robot running array disc and sending the image information to the server; the user side is used for displaying image information of relative positions of the teaching robot, the moving target object and the obstacle on the robot running array disc, identifying the teaching robot, the moving target object and the obstacle according to the image information, extracting relative position coordinates of the teaching robot, the moving target object and the obstacle on the robot running array disc according to the image information, receiving an autonomous motion control program input by a user, and sending the autonomous motion control program to the server; the autonomous motion control program is used for planning a motion path of the robot according to the relative position coordinates of the teaching robot, the moving target object and the obstacle, and sending control instructions related to the motion direction and motion speed in stages according to the motion path; the server is used for analyzing the autonomous motion control program, generating the control instruction and issuing the control instruction to the teaching robot;
the teaching robot is used for receiving the control instruction from the server and moving according to the control instruction, so as to change the relative position of the teaching robot on the robot running array disc;
the size and spacing of the positioning identifiers on the robot running array disc are matched with the size of any one of the teaching robot, the moving target object and the obstacle placed or moving on the robot running array disc, so that each of the teaching robot, the moving target object and the obstacle completely covers and shields at least one robot positioning identifier;
the robot running array disc comprises: a motion driving mechanism and a start-stop synchronization signal generating circuit; the motion driving mechanism is used for driving the moving target object and/or the obstacle to move on the robot running array disc; the start-stop synchronization signal generating circuit is used for controlling the motion driving mechanism to drive the moving target object and/or the obstacle to move on the robot running array disc in time synchronization with the teaching robot;
the user side is used for identifying any one of the teaching robot, the moving target object and the obstacle in the image information according to the serial-number distribution of the invisible robot positioning identifiers in the image information, and for determining the relative position coordinates of any one of the teaching robot, the moving target object and the obstacle on the robot running array disc;
the user side displays a graphical programming interface, in which graphical program modules corresponding to the identification function, the positioning function and the path planning function, and graphical logic modules corresponding to the judgment, loop and jump operation logic, are displayed; and the autonomous motion control program is generated by calling and parsing program code segments according to the connection relations and option configurations established by the user among the graphical program modules and the graphical logic modules on the graphical programming interface.
2. The teaching device of a robot experiment course according to claim 1, wherein the image acquisition device comprises a first wireless communication module configured to establish a communication connection with the server.
3. The teaching device of a robot experiment course according to claim 2, wherein the server comprises a second wireless communication module configured to establish communication connections with the teaching robot, the image acquisition device and the robot running array disc.
4. The teaching device of a robot experiment course according to claim 3, wherein the server further comprises:
the analysis module is used for receiving the autonomous motion control program uploaded by the user side, analyzing and identifying the program code of the autonomous motion control program, generating a control instruction executable by the robot and sending the control instruction to the execution module;
and the execution module is used for issuing the control instruction sent by the analysis module to the teaching robot at each synchronization time interval, and controlling the teaching robot to execute the corresponding actions.
5. The teaching device of a robot experiment course according to claim 4, wherein the teaching robot comprises: a sensing unit, a control unit and an execution unit;
the sensing unit is used for sensing ambient environment signals, and the control unit is used for controlling the execution unit to execute the corresponding actions in the control instructions, jointly according to the ambient environment signals and the control instructions issued by the server.
6. The teaching device of a robot experiment course according to claim 5, wherein the robot running array disc comprises a grid of arrays, and the positioning identifiers are disposed within the grid.
CN201811063985.7A 2018-09-12 2018-09-12 Teaching device of robot experiment course Expired - Fee Related CN109118884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811063985.7A CN109118884B (en) 2018-09-12 2018-09-12 Teaching device of robot experiment course


Publications (2)

Publication Number Publication Date
CN109118884A CN109118884A (en) 2019-01-01
CN109118884B true CN109118884B (en) 2020-05-08

Family

ID=64859127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811063985.7A Expired - Fee Related CN109118884B (en) 2018-09-12 2018-09-12 Teaching device of robot experiment course

Country Status (1)

Country Link
CN (1) CN109118884B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114055468B (en) * 2019-02-26 2023-08-18 深圳市越疆科技有限公司 Track reproduction method, track reproduction system and terminal equipment
WO2021053760A1 (en) * 2019-09-18 2021-03-25 三菱電機株式会社 Mobile interactive robot, control device, control method, and control program
CN112201116B (en) * 2020-09-29 2022-08-05 深圳市优必选科技股份有限公司 Logic board identification method and device and terminal equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1272145C (en) * 2002-03-21 2006-08-30 上海广茂达电子信息有限公司 Personal robot
CN101537618B (en) * 2008-12-19 2010-11-17 北京理工大学 Visual system for ball picking robot in stadium
CN201927271U (en) * 2011-03-14 2011-08-10 黑龙江科技学院 Multi-functional robot for teaching
CN102520721B (en) * 2011-12-08 2015-05-27 北京控制工程研究所 Autonomous obstacle-avoiding planning method of tour detector based on binocular stereo vision
CN103279949B (en) * 2013-05-09 2015-10-07 浙江大学 Based on the multi-camera parameter automatic calibration system operation method of self-align robot
CN105511457B (en) * 2014-09-25 2019-03-01 科沃斯机器人股份有限公司 Robot static path planning method
CN106354161A (en) * 2016-09-26 2017-01-25 湖南晖龙股份有限公司 Robot motion path planning method
WO2018097203A1 (en) * 2016-11-25 2018-05-31 株式会社村田製作所 Elastic wave filter device, multiplexer, high frequency front-end circuit, and communication device
CN108459599B (en) * 2017-12-21 2020-08-07 华为技术有限公司 Motion path planning method and device
CN108230869A (en) * 2018-03-19 2018-06-29 重庆鲁班机器人技术研究院有限公司 Teaching robot and teaching machine system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200508

Termination date: 20210912