CN110757448A - Interaction implementation method and system oriented to robot programming - Google Patents


Info

Publication number
CN110757448A
Authority
CN
China
Prior art keywords
robot
data
component
data acquisition
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810842921.0A
Other languages
Chinese (zh)
Inventor
吴超
丁烁宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chuangkegongchang Science & Technology Co Ltd
Original Assignee
Shenzhen Chuangkegongchang Science & Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chuangkegongchang Science & Technology Co Ltd filed Critical Shenzhen Chuangkegongchang Science & Technology Co Ltd
Priority to CN201810842921.0A priority Critical patent/CN110757448A/en
Publication of CN110757448A publication Critical patent/CN110757448A/en
Pending legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 — Programme-controlled manipulators
    • B25J9/16 — Programme controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an interaction implementation method and system oriented to robot programming. The method comprises the following steps: a data acquisition component adapted to the robot control component acquires the data carried by image blocks (tiles); the robot control component obtains the mapped action description information from the acquired data; and action control is executed on the corresponding robot device in the built robot hardware system according to the action description information. For a built robot hardware system, the robot can thus be programmed purely through the interplay of the tiles and the data acquisition component, which suits an educational scene oriented to children: programming no longer needs to be performed on a terminal device, the aim of introductory education is achieved, and the training of logical thinking is greatly enhanced. Because action control of the robot devices is realized through the tiles and the data acquisition component, the robot hardware system also gains a game function, greatly improving the effect of introductory education in the education field.

Description

Interaction implementation method and system oriented to robot programming
Technical Field
The invention relates to the technical field of artificial intelligence interaction, in particular to an interaction implementation method and system for robot programming.
Background
With the development of artificial intelligence interaction technology, more and more robots of various types have entered people's field of view, and robots have shifted from completing a fixed task according to a fixed, preset action sequence to performing actions under programmable control so as to complete whatever task is currently required. A robot is no longer limited to fixed actions and fixed tasks; through programming, it can flexibly realize a wide variety of action execution processes.
Robot programming is the process by which people program and control the motion and operation of a robot, thereby setting the action description of the tasks the robot is required to complete. Its implementation involves hardware on the one hand, namely the construction of a robot hardware system, and the corresponding program on the other: a program is written for the robot hardware system in a programming language, and the robot devices in the system are then operated by running that program.
In an educational scene, programming a robot depends on the programming foundation the child possesses and on a terminal device. Only after acquiring a certain programming skill can a child carry out the programming process on the terminal device, that is, write line-by-line program text that sets the action description for the robot hardware system. Because of this dependence on terminal devices and the high threshold involved, robot programming education at the child stage often finds itself in a dilemma.
Robot programming therefore has a very high threshold, is difficult to adapt to different groups of people, particularly children, and depends on terminal devices. Furthermore, it is limited to teach programming and offline programming: for example, a motion trajectory is taught and reproduced through teach programming, or program instructions obtained through offline programming are loaded onto the robot hardware system so that set motions are executed by running those instructions.
Consequently, existing robot programming is not suitable for robot devices in all settings, and an interactive implementation of robot programming for educational scenes is urgently needed to resolve the dilemma that robot programming performed by children depends on terminal devices and has a high threshold.
Disclosure of Invention
In order to solve the technical problems in the related art that robot programming performed by children in an educational scene depends on terminal devices and has a high threshold, the invention provides an interaction implementation method and system oriented to robot programming.
An interactive implementation method oriented to robot programming, the method comprising:
a data acquisition component adapted to the robot control component acquires the data carried by tiles, the data carried by the tiles corresponding to the action control description of the built robot hardware system;
the robot control component acquires mapped action description information through the acquired data;
and the robot control component executes action control on the corresponding robot device in the built robot hardware system according to the action description information.
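The three claimed steps can be pictured as a minimal sketch. All names here (`scan_tile`, `ACTION_MAP`, `execute`) are illustrative assumptions, not the patent's actual implementation:

```python
# Hypothetical sketch of the claimed three-step flow; every name is an
# illustrative assumption rather than the patent's real implementation.

# Step 1: the data acquisition component reads the data carried by a tile.
def scan_tile(tile):
    """Return the feature code carried by a tile (here, a plain dict)."""
    return tile["feature_code"]

# Step 2: the robot control component maps collected data to pre-stored
# action description information, using the data as an index.
ACTION_MAP = {
    "0x01": {"device": "wheels", "action": "forward", "steps": 2},
    "0x02": {"device": "display", "action": "show", "content": "smile"},
}

def to_action_description(collected_data):
    return ACTION_MAP[collected_data]

# Step 3: the control component drives the corresponding robot device.
def execute(description, log):
    log.append((description["device"], description["action"]))

log = []
for tile in [{"feature_code": "0x01"}, {"feature_code": "0x02"}]:
    execute(to_action_description(scan_tile(tile)), log)
print(log)  # [('wheels', 'forward'), ('display', 'show')]
```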
In an exemplary embodiment, the data collection component adapted to the robot control component performs collection of data carried by the tile, including:
the data acquisition component collects data from the tile where it is located to obtain the data carried by the tile;
and the data carried by the tile is transmitted to the robot control component through communication.
In an exemplary embodiment, the data acquisition component includes an optical image acquisition device, and acquiring data from the tile where the data acquisition component is located to obtain the data carried by the tile includes:
the optical image acquisition device scans the tile and obtains through the scan the feature code contained in the tile, the feature code being the data carried by the tile.
In an exemplary embodiment, the data acquisition component further includes an auxiliary acquisition device, and acquiring data from the tile where the data acquisition component is located to obtain the data carried by the tile further includes:
according to the type of the tile, acquiring the data in the tile through the corresponding auxiliary acquisition device to obtain the data carried in the tile, the data being a code corresponding to the features of the tile.
In one exemplary embodiment, the method further comprises:
the data acquisition component adapted to the robot control component performs acquisition of natural interaction data, wherein the natural interaction comprises voice interaction and/or image interaction;
sending the collected natural interaction data to the robot control component;
and the robot control component processes the natural interaction data to generate action description information, and the action description information is used for controlling a robot device in the built robot hardware system.
In an exemplary embodiment, the robot control component processing the natural interaction data to generate action description information includes:
the robot control component adapting to the recognition environment of the natural interaction and performing recognition of the natural interaction data to generate the action description information.
In an exemplary embodiment, the recognition environment of the natural interaction includes an online environment and an offline environment, and the robot control component adapting to that environment and performing recognition of the natural interaction data to generate the action description information includes:
the robot control component detects whether it can connect to server resources;
if server resources can be connected, the component is in the online environment of natural interaction recognition, and online recognition of the natural interaction data is performed through the connected server to generate the action description information;
and if server resources cannot be connected, the component is in the offline environment of natural interaction recognition, and offline recognition of the natural interaction data is performed to generate the action description information.
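This detect-then-fall-back logic can be sketched as follows, under the assumption that the online and offline recognizers are interchangeable callables (all names are hypothetical):

```python
# Hypothetical sketch of the online/offline recognition fallback described
# above; function names and return shapes are illustrative assumptions.

def recognize(interaction_data, ping,
              recognize_online=lambda d: ("online", d.upper()),
              recognize_offline=lambda d: ("offline", d.lower())):
    """Pick the recognizer that matches the current environment."""
    # Prefer the server-backed (online) recognizer when server resources
    # can be connected; otherwise fall back to local offline recognition.
    if ping():
        return recognize_online(interaction_data)
    return recognize_offline(interaction_data)

print(recognize("Turn Left", lambda: True))   # ('online', 'TURN LEFT')
print(recognize("Turn Left", lambda: False))  # ('offline', 'turn left')
```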
In an exemplary embodiment, the robot control component executing action control on the corresponding robot device in the built robot hardware system according to the action description information includes:
the robot control component determining, according to the action description information, a robot device in the robot hardware system that is at the near end or the far end relative to itself;
and executing program instructions according to the action description information, the robot device being controlled to execute actions through the running of those instructions.
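As a sketch, the near-end/far-end selection followed by instruction execution might look like the following; the `target` field and the device callables are assumptions for illustration only:

```python
# Hypothetical sketch: choose a near-end or far-end device from the action
# description, then run the mapped instructions on it. Names are assumed.

def select_device(desc, near, far):
    # The description indicates which execution subject it addresses.
    return near if desc.get("target") == "near" else far

def run(desc, near, far):
    device = select_device(desc, near, far)
    # Executing the program instructions drives the chosen device.
    return [device(instr) for instr in desc["instructions"]]

out = run(
    {"target": "near", "instructions": ["beep", "spin"]},
    near=lambda i: f"near:{i}",
    far=lambda i: f"far:{i}",
)
print(out)  # ['near:beep', 'near:spin']
```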
In one exemplary embodiment, the system comprises a data acquisition component and a built robot hardware system, wherein the robot hardware system comprises a robot control component and a robot device;
the data acquisition component is adapted to the robot control component and is configured to acquire data carried by tiles, and the data carried by the tiles corresponds to action control description of the built robot hardware system;
the robot control component is configured to:
acquiring mapped action description information through the acquired data;
and executing action control on the corresponding robot device in the built robot hardware system according to the action description information.
In one exemplary embodiment, the data acquisition component comprises:
the tile data acquisition module is used for acquiring data from the tile where the data acquisition component is located to obtain the data carried by the tile;
and the communication module is used for transmitting the data carried by the tile to the robot control component through communication.
The technical scheme provided by the embodiment of the invention can have the following beneficial effects:
In the educational scene of the child stage, a child builds a robot hardware system from given robot devices; the robot hardware system comprises a robot control component and the robot devices. To integrate the built system into the child's play, the child first lays out tiles according to the actions the robot hardware system is required to execute, the data carried by the tiles corresponding to the action control description of the system. The child then has the data acquisition component adapted to the robot control component collect the data carried by the laid-out tiles, whereupon the robot control component obtains the mapped action description information from the collected data and executes, in the built robot hardware system, the action control suited to the child's play on the corresponding robot device.
Thus, for the built robot hardware system, the robot can be programmed in a child-oriented educational scene purely through the interplay of the tiles and the data acquisition component, without relying on a terminal device: robot programming no longer needs to be performed on terminal equipment. Moreover, because the tiles correspond to the action control description of the built system, robot programming is no longer the writing of line-by-line program text but a process of laying out tiles. This achieves the aim of introductory education and greatly lowers the threshold for children, while action control of the robot devices at hand can be realized quickly and freely through the tiles and the data acquisition component, endowing the robot hardware system with a game function. Under this tangible-programming mode of interaction, introductory programming education in the education field can effectively lower the learning threshold, letting people at the child stage encounter programming ideas and acquire programming skills, thereby greatly enhancing the training of logical thinking and improving the effect of introductory education.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic illustration of an implementation environment with which the present invention is concerned;
FIG. 2 is a flow diagram illustrating a method for interactive implementation oriented to robotic programming in accordance with an exemplary embodiment;
FIG. 3 is a flowchart illustrating a description of step 310 according to a corresponding embodiment of FIG. 2;
FIG. 4 is a flowchart illustrating a method for interactive implementation oriented to robotic programming in accordance with another exemplary embodiment;
FIG. 5 is a flowchart illustrating a description of step 550 according to a corresponding embodiment of FIG. 4;
FIG. 6 is a flowchart illustrating a description of step 350 according to a corresponding embodiment of FIG. 2;
FIG. 7 is a block diagram illustrating an interactive implementation system oriented to robotic programming in accordance with an exemplary embodiment;
FIG. 8 is a block diagram illustrating a data acquisition component according to the corresponding embodiment of FIG. 7;
FIG. 9 is a block diagram illustrating an interactive implementation system oriented to robotic programming in accordance with another exemplary embodiment;
FIG. 10 is a block diagram illustrating a robot control component according to the corresponding embodiment of FIG. 7;
FIG. 11 is a schematic diagram illustrating interaction between a smart cart and a data acquisition component, according to an exemplary embodiment;
FIG. 12 is a schematic diagram of the physical assembly between the smart cart and the data acquisition component shown in the corresponding embodiment of FIG. 11;
FIG. 13 is a schematic diagram of an application of intelligent car control realized by blocks in a game scene;
FIG. 14 is a schematic diagram of an application of intelligent car control realized by the blocks in another game scene;
FIG. 15 is a schematic view of a smart cart and data collection components combined in another game scenario;
fig. 16 is a schematic view of a smart cart and data collection components combined in another game scenario.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
FIG. 1 is a schematic illustration of an implementation environment to which the present invention is directed. In an exemplary embodiment, the implementation environment includes a robot 110 and a data acquisition component 130, which are connected through Bluetooth to realize data interaction.
The robot 110 includes a robot control component and other robot devices; at least the robot control component and the data acquisition component 130 lie within the same communication space so that a Bluetooth connection can be established between them. The communication space is adapted to the communication method used between the robot control component and the data acquisition component 130: for example, if they communicate via Bluetooth, the communication space corresponds to the Bluetooth communication range between them; if they communicate via another form of wireless transmission, the communication space is the range that transmission can cover.
It should be understood that the robot control component and the data acquisition component 130 are flexibly deployed, and therefore, the communication manner and the communication space adopted by the robot control component and the data acquisition component will be different according to the flexible deployment and configuration, and are suitable for different situations and scenes.
The data acquisition component 130, by collecting data from tiles laid out in the communication space, senses the action description information set for the robot 110 by this tangible programming, for example the program logic, algorithm design and assignments corresponding to executing actions or even completing tasks.
The robot 110 performs action control of the corresponding robot device according to the tile data sensed by the data acquisition component 130.
That is, the corresponding program control settings are realized through the tiles: robot programming yields the logic, algorithms, syntax, structure and assignments by which the robot 110 moves, the laid-out tiles embody the resulting motion control process, and the robot 110 can be freely directed through the interaction between the tiles and the data acquisition component 130.
Therefore, children young and old can participate in building the robot hardware system and in tangible programming, obtaining entertainment and interaction through tile placement and the system's response to the placed tiles, and thereby stepping into the world of programming to gradually acquire programming skills.
FIG. 2 is a flow diagram illustrating a method for interactive implementation oriented to robotic programming in accordance with an exemplary embodiment. In an exemplary embodiment, the interactive implementation method oriented to robot programming, as shown in fig. 2, includes at least the following steps.
In step 310, a data collection component adapted to the robot control component collects data carried by the tiles, where the data carried by the tiles corresponds to the action control description of the built robot hardware system.
The robot control component is a programmable host, i.e., an electronic device that can be programmed and executes the programmed instructions. The robot control component and the other robot devices are built together into a robot hardware system; the actions executed by the robot hardware system are then controlled by the robot control component, so that the other robot devices execute the corresponding actions and the task set for the system is completed.
Relative to the freely moving robot hardware system, the data acquisition component collects the data that maps to action description information in robot programming, and the object of this collection is the tile where the component is located. Since the data acquisition component faces the tile to collect data, it should be understood that the component is adapted to the tile. In an exemplary embodiment, the data acquisition component comprises an optical image acquisition device or the like, with the chosen form adapted to the tiles in use. Data acquisition components differ in the manner in which they acquire data, and the data carried by the tiles will accordingly differ in form.
The data acquisition component and the robot devices in the robot hardware system are separable from each other, that is, they can be quickly assembled together and quickly detached. According to the motion needs of the robot hardware system and the data acquisition to be performed, the data acquisition component can be quickly and freely assembled with the corresponding robot device, or just as quickly detached from it so that the two are separate. In this way neither the required data acquisition nor the action execution is obstructed, and the motion performance and control performance of the robot hardware system are guaranteed.
The program of the robot hardware system is obtained by laying out a number of tiles for the robot programming performed on that system. That is, robot programming oriented to the robot hardware system is realized by splicing tiles together, and the action description information corresponding to the spliced tiles forms the control program of the robot hardware system.
After the robot hardware system is built, a person selects and splices tiles according to the robot devices in the system, the actions to be executed and the tasks to be completed. Since each tile has corresponding action description information, tiles can be placed for the robot hardware system at hand on a what-you-see-is-what-you-get basis.
The tiles are laid out for the action execution of the robot devices so as to build the program control logic of the robot hardware system; people therefore build this logic by splicing and placing tiles. A tile is thus a spliceable piece corresponding to a certain motion control description. Optionally, the edges of the tiles are designed so that they can be joined with other tiles, even interlocked, to ensure the reliability of the robot programming. Alternatively, the tiles can be arranged loosely and discretely in any required area, placed flexibly according to the motion requirements of the built robot hardware system.
As for the form of the tiles, it should be understood that a tile may exist physically or may exist virtually in an electronic device; however, to lower the threshold and avoid depending on a terminal device, the tiles may exist in physical form, in which case they are also referred to as physical tiles. Tiles are distinguished by set patterns or colors; from these features the motion control description of the robot hardware system corresponding to each tile is known, and the tiles are placed accordingly.
In one exemplary embodiment, a tile has printed on it a pattern indicating the corresponding motion control description, together with a feature code. For example, a pattern and feature code indicating that the display screen should show specific content are printed on a tile; if the display screen arranged in the robot hardware system needs to be lit up in a time sequence to show that content, the tile can be placed according to the control logic of the corresponding program's time sequence.
The selection and placement of tiles (which tile is selected, its timing relationship to other tiles, the placement position and order, and so on) are carried out according to the corresponding program control logic.
As for the tiles themselves, on the one hand they can be provided by an off-the-shelf robot kit; on the other hand, tiles can also be self-made for the program control to be carried out, for example from hard materials such as cardboard, thereby further enhancing the flexibility and expandability of robot programming.
The data acquisition component collects data from the placed tiles in order: as data acquisition on one tile completes and the corresponding action in the robot hardware system finishes executing, the component moves to the next tile to collect data again, and so on, until data acquisition has completed on all the placed tiles. At that point the actions programmed for the robot hardware system have also been completed, and the set task is finished.
The movement of the data acquisition component between tiles takes place under the control of either a person or the robot control component. On the one hand, a person may move the component so that it collects data on the next tile; on the other hand, the data acquisition component is itself a robot device controlled by the robot control component, and may therefore be driven by the control component to move to the next tile for data acquisition.
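The scan-execute-advance cycle described above can be sketched as a simple loop; `read`, `act` and `advance` are hypothetical stand-ins for the acquisition, control and movement steps:

```python
# Illustrative loop for sequential tile scanning: read one tile, run its
# action, then advance to the next tile. All names are assumptions.

def run_program(tiles, read, act, advance):
    executed = []
    for tile in tiles:
        data = read(tile)           # data acquisition on the current tile
        executed.append(act(data))  # control component executes the action
        advance()                   # move on (by hand or under motor control)
    return executed

moves = []
result = run_program(
    ["fwd", "left", "fwd"],
    read=lambda t: t,
    act=lambda d: f"did:{d}",
    advance=lambda: moves.append("step"),
)
print(result)      # ['did:fwd', 'did:left', 'did:fwd']
print(len(moves))  # 3
```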
The form of the data acquisition component and its relationship to the robot control component are not limited here; any device capable of collecting the data carried by the tiles can serve as a data acquisition component. Thus, matching the flexibility with which the robot hardware system is selected and built, the data acquisition components are very flexible in number and form and can be configured freely according to the robot application to be realized.
The data carried by a tile is encapsulated in the pattern and/or color the tile presents. For example, the feature code presented by a tile stores the data the tile carries; likewise, the color presented by a tile and its corresponding code are also data carried by the tile. Further examples are not listed here.
Of course, it should be understood that in one exemplary embodiment the tiles also provide guidance patterns for the robot programming being performed, ensuring ease of use and lowering the threshold of robot programming by tiles.
The adaptation between the robot control component and the data acquisition component refers to their pairing connection, which allows them to exchange data over a short distance so that the data collected by the acquisition component can be transmitted to the control component.
It should be noted that the tiles carrying data, i.e., the tiles laid out to implement robot programming, are in an exemplary embodiment physical tiles, each of which carries data that maps to action description information.
In step 330, the robot control component obtains the mapped action description information from the collected data.
Through the execution of the preceding step, the data acquisition component collects the data carried by the tile and transmits it to the robot control component. Correspondingly, the robot control component obtains the mapped action description information. The action description information is specific to a particular robot device, that is, it indicates an action execution subject and describes how that subject executes the specified action.
The action description information is data stored in advance, pre-stored with the collected data as its index, so that the robot control component can obtain the mapped action description information from the data it receives.
As explained above, the action description information is a logical statement that causes an action to be performed; the robot hardware system controls the corresponding robot device through the robot control component's acquisition of this information.
The robot control component obtains the mapped action description information directly from the data collected by the acquisition component, and this is realized locally, which guarantees the speed and efficiency with which tile-based robot programming responds and, in turn, the action execution efficiency of the robot hardware system.
In step 350, the robot control component executes action control on the corresponding robot device in the built robot hardware system according to the action description information.
Besides the robot control component there are many robot devices, for example a cart with a walking function, which act independently or interact with one another to realize the actions. Specifically, the robot devices include core robot devices, structural components and the like: a core robot device may be, for example, a joint constituting a robot arm, while the structural components may be various connecting members, which are not listed exhaustively here.
The action description information is run as the program control code of the robot hardware system to exert action control over the corresponding robot device so that the device can execute the corresponding action. The action to be performed is determined by the addressed robot device: if the device is a display screen, the executed action is the display of specific content; if it is a member that can be driven to walk, the executed action is a walking motion.
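The idea that the action performed depends on the addressed device can be sketched as a small dispatch table; the class and field names are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical dispatch of an action description to the right device type:
# a display shows content, a drivable member walks. Names are illustrative.

class Display:
    def perform(self, desc):
        return f"show:{desc['content']}"

class Wheels:
    def perform(self, desc):
        return f"walk:{desc['distance']}"

DEVICES = {"display": Display(), "wheels": Wheels()}

def control(desc):
    # The action to perform is determined by the addressed robot device.
    return DEVICES[desc["device"]].perform(desc)

print(control({"device": "display", "content": "hello"}))  # show:hello
print(control({"device": "wheels", "distance": 3}))        # walk:3
```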
This method realizes tile-based robot programming. The data acquisition component can move freely, that is, it is moved onto a tile to collect the data that tile carries, and the robot control component cooperating with it enables what-you-see-is-what-you-get robot programming of the built robot hardware system. This greatly enhances the practicability of robot programming for a broad audience: programming can be performed without a terminal device, i.e. without graphical programming inside a terminal device, which speeds up getting started, since no terminal device needs to be launched, and enhances the programmability and controllability of the robot hardware system.
Through the exemplary embodiments described above, the robot control component in the robot hardware system can be separated from the data acquisition component, which increases the freedom with which the robot hardware system executes actions and gives it very good motion performance.
In addition, the other robot devices in the robot hardware system can likewise be separated from the robot control component, further enhancing the system's motion performance.
These embodiments realize interaction between people and tiles, and thereby robot programming, so that interaction between people and the robot hardware system is achieved and the controllability of the system is enhanced.
The data acquisition component and the robot hardware system, in particular the robot control component, can be quickly separated or joined: the data acquisition component can be mounted on the robot control component at a provided mounting position as required, or detached from it. No limitation is imposed here; the configuration can be chosen flexibly according to the interaction scenario to be realized.
Through these embodiments, a robot kit in a user's hands supports free construction and free control of robot devices: simply by laying out tiles, the robot devices can be made to perform actions under the operation of the data acquisition component. Even very young children, including toddlers, can therefore take part; the robot hardware system and robot programming are no longer restricted to adults or to teaching use. The threshold is lowered, and the resulting product can be aimed at everyone.
Fig. 3 is a flow chart illustrating a description of step 310 according to a corresponding embodiment of fig. 2. In an exemplary embodiment, as shown in FIG. 3, this step 310 includes:
in step 311, the data acquisition component collects data from the tile where it is located, obtaining the data carried by that tile.
As mentioned above, the data acquisition component is adapted to the robot control component so that short-range data exchange can take place; the two are therefore communicatively connected. In an exemplary embodiment, the short-range data exchange is implemented over Bluetooth, i.e. the connection between the data acquisition component and the robot control component is a Bluetooth connection.
Once the built robot hardware system is started and begins to work, the robot control component connects to the data acquisition component and waits for the reading and transmission of data from the tiles laid out for robot programming.
As the data acquisition component collects data from the tiles, it obtains the data they carry, and the connected robot control component is in a position to receive that data.
In step 313, the data carried by the tiles is transmitted to the robot control component via the communication performed.
As can be understood from the foregoing description, the data acquisition component and the robot control component are located in the same communication space, whereas the other robot devices in the robot hardware system may be in that communication space or in other spaces; no limitation is imposed here. A device that serves only as a data acquisition component must share the communication space with the robot control component, so that data transmission can be achieved through one-way communication, for example one-way near-field communication, without depending on a network, which keeps the volume and cost of the data acquisition component down.
A data acquisition component that can also act as an action-executing body, however, plays a dual role: its data exchange with the robot control component is no longer limited to near-field communication, so it can be located in a different communication space from the robot control component and possesses a stronger, more flexible, freely programmable execution function, which enhances the linkage within the robot hardware system.
As the data acquisition component collects data from the tiles, the collected data is transmitted to the robot control component, so that the data carried by the tiles indicates the action control to be executed now, namely the action-executing body and the action-execution process described by the action description information. The action execution of the built robot hardware system is thus configured simply and freely: with the robot devices provided by the robot kit, the hardware system can be built freely, and the actions to be executed can be programmed in hardware just as freely, making robot hardware programming applicable to any situation and scene.
In this exemplary embodiment, it can be seen that the data acquisition component collects the data carried by the tiles and communicates the collected data to the robot control component one way. The data acquisition component may transmit data to the robot control component in real time as it is collected; alternatively, data collected while the robot control component was unreachable may be transmitted once the connection becomes available again. No limitation is imposed here.
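The two transmission policies just mentioned, real-time forwarding when connected and deferred delivery when not, can be sketched together as a buffered one-way link. The class and callback names here are hypothetical:

```python
from collections import deque

class DataAcquisitionLink:
    """Sketch (hypothetical API) of the one-way link from the data
    acquisition component to the robot control component: scanned data
    is forwarded immediately when connected, otherwise buffered and
    flushed once the control component can be reached again."""
    def __init__(self, send):
        self.send = send          # callable performing the one-way transfer
        self.connected = False
        self.buffer = deque()

    def on_tile_scanned(self, data):
        self.buffer.append(data)
        if self.connected:
            self.flush()

    def flush(self):
        # Deliver buffered scans in the order they were collected.
        while self.buffer:
            self.send(self.buffer.popleft())

received = []
link = DataAcquisitionLink(received.append)
link.on_tile_scanned("0x01")   # control component unreachable: buffered
link.connected = True
link.on_tile_scanned("0x02")   # connected: both scans are delivered
print(received)  # ['0x01', '0x02']
```

Keeping the link strictly one way, with no acknowledgements from the control component, is what lets the data acquisition component stay small and cheap, as the text emphasizes.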
In an exemplary embodiment, the data acquisition component comprises an optical image acquisition device, the step 311 comprising:
the optical image acquisition equipment scans the image block, and obtains the feature code contained in the image block through scanning, wherein the feature code is data carried by the image block.
The optical image acquisition equipment senses digital information on an image block through photoelectric recognition. The optical image acquisition equipment is internally provided with a photoelectric recognizer to sense the digital information on the image block. The optical image acquisition equipment touches the image block, and at the moment, the built-in photoelectric recognizer starts to scan the contact part so as to obtain the feature code by scanning.
For robot programming, every item of action description information has a corresponding feature code, so the control program of the robot hardware system can be composed by arranging and splicing tiles.
For example, the feature code may be scanned by projecting infrared light onto the tile surface, photographing it, and reading the image after infrared absorption and penetration, whereby the feature code is recognized. In a specific implementation of an exemplary embodiment, the optical image acquisition device is a reading pen with only a feature-code recognition function, connected to the robot control component over a short range; correspondingly, the scanned feature code may be a two-dimensional graphic code adapted to the reading pen. This is merely an example, not a limitation.
Through this exemplary embodiment, an optical image acquisition device that only collects and transmits tile data is provided for the programming of the robot hardware system, ensuring small size, low cost and easy portability, avoiding the influence of conditions in natural scenes, and enhancing reliability and stability.
In another exemplary embodiment, the data acquisition component further comprises an auxiliary acquisition device, and step 311 further comprises:
and the data acquisition in the image block is carried out through the corresponding auxiliary acquisition equipment to obtain the data carried in the image block, wherein the data is the code corresponding to the characteristics of the image block.
Besides the collection of data carried by the image blocks by the optical image collection equipment, the data collection component also comprises auxiliary collection equipment, so that the collection of data in more forms of image blocks is realized, and abundant available image blocks are provided for the programming of the robot. In one specific implementation of the exemplary embodiment, the auxiliary acquisition device includes an NFC (near field Communication) data acquisition device, a color sensor, and the like.
And for the NFC data collector, the data collection is carried out on the attached image blocks, so that the collection of the data carried by the image blocks is realized. However, the image blocks that the NFC data collector can be adapted to need to be specially made, and cannot be made by self at will, so that the action description information that can be mapped to is limited, and therefore, the map data collector can only exist as an aid.
The color sensor senses the color of a tile; the sensed color is the tile feature, from which the mapped code is obtained, and through the code the action description information.
Because the colors the color sensor can recognize are limited, the action description information it can map to is also very limited, so it too can exist only as an auxiliary device.
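The two-step mapping for the color sensor, sensed color to code and code to action description information, can be sketched as follows. All color names, codes and actions are illustrative assumptions:

```python
# Hypothetical tables: the sensed color maps to a code, and the code
# maps to action description information. The small table sizes reflect
# why the color sensor serves only as an auxiliary acquisition device.
COLOR_TO_CODE = {"red": "0x11", "green": "0x12", "blue": "0x13"}
CODE_TO_ACTION = {"0x11": "stop", "0x12": "go", "0x13": "turn"}

def action_from_color(color):
    """Map a sensed tile color to its action description, or None
    when the color is outside the sensor's recognizable set."""
    code = COLOR_TO_CODE.get(color)
    return CODE_TO_ACTION.get(code) if code else None

print(action_from_color("green"))  # go
```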
Through this example, robot programming becomes compatible with various types of tiles, with the different data acquisition devices enabled adaptively, greatly widening the applicable range and scenarios.
FIG. 4 is a flowchart illustrating a method for interactive implementation oriented to robotic programming in accordance with another exemplary embodiment. In an exemplary embodiment, as shown in fig. 4, the interactive implementation method facing robot programming further includes at least the following steps.
In step 510, the data acquisition component adapted to the robot control component collects natural interaction data, the natural interaction comprising voice interaction and/or image interaction.
Here, the natural interaction is the voice and/or video interaction between a person and the data acquisition component in a natural scene; correspondingly, the natural interaction data obtained includes audio data and/or video data.
That is, a natural-interaction control mechanism is provided for the built robot hardware system; accordingly, voice acquisition hardware and/or image acquisition hardware, such as a microphone or a camera, is also installed on the data acquisition component to collect the audio and/or video data.
In step 530, the collected natural interaction data is sent to the robot control component.
As the natural interaction data is acquired, it is transmitted to the robot control component. Specifically, as mentioned above, over the Bluetooth connection between the data acquisition component and the robot control component, continuously acquired natural interaction data is continuously transmitted to the robot control component; the data acquisition component itself does not process the acquired data, which keeps its volume down, i.e. preserves portability and low cost.
In step 550, the robot control component processes the natural interaction data to generate action description information for controlling the robot devices in the built robot hardware system.
The robot control component receiving the natural interaction data must convert it into action description information before control of the corresponding robot device can be achieved.
It should be understood that this exemplary embodiment adds, to the robot programming being performed, interaction between a person and the robot hardware system in a natural scene; interaction is no longer limited to laying out tiles and collecting the data they carry, so the interactivity of the robot programming is enhanced.
The robot control component processes the received natural interaction data, including audio and video content recognition, i.e. speech recognition and/or video recognition, to obtain the input control content, from which the action description information is generated.
In an exemplary embodiment, the processing of the natural interaction data may be online recognition performed by a cloud server, or offline recognition; which is used depends on whether the robot control component can access the network and thus connect to server resources.
That is, the robot control component determines the recognition environment for natural interaction from the currently available network environment and processes the natural interaction data in accordance with the determined recognition environment.
In one exemplary embodiment, this step 550 includes: the robot control component is adapted to the natural interaction recognition environment, and performs natural interaction data recognition to generate action description information.
Further, fig. 5 is a flowchart illustrating the description of step 550 according to the corresponding embodiment of fig. 4. In an exemplary embodiment, the natural interaction recognition environment includes an online environment and an offline environment, as shown in FIG. 5, this step 550 includes at least:
in step 551, the robot control component detects whether it can connect to server resources.
In step 553, if server resources can be connected, the component is in the online environment of natural interaction recognition, and the connected server performs online recognition of the natural interaction data to generate the action description information.
In step 555, if server resources cannot be connected, the component is in the offline environment of natural interaction recognition, and offline recognition of the natural interaction data is performed to generate the action description information.
Of the natural interaction recognition environments, the online environment has the capability of online recognition of natural interaction and can perform it; the offline environment means that online recognition is unavailable and only offline recognition of the natural interaction can be performed.
If the robot control component can access the Internet within wireless signal coverage, it can connect to a cloud server with recognition capability and convert the natural interaction data into action description information using the available server resources.
If, however, the robot control component cannot reach the server network, that is, it cannot connect to a cloud server with recognition capability, offline recognition of the natural interaction is performed instead: the natural interaction data is converted into action description information using a locally stored recognition library.
Richer control modes are thus provided for the interactive control of the robot hardware system, and control performance is ensured by processing the natural interaction data in a manner suited to the environment in which the system finds itself.
As for the data acquisition component, whether the natural interaction is recognized locally or through server interaction, the work is performed by the robot control component; the data acquisition component only collects data and performs short-range data exchange, establishing a one-way link to the robot control component. Its simplicity is thus preserved, and it does not grow redundant or bloated as performance improves and functions multiply.
In an exemplary embodiment, the online recognition is performed by sending the natural interaction data to the cloud server through internet access provided by the wireless network, and obtaining a recognition result returned by the cloud server, that is, action description information generated by converting the natural interaction data, so that the robot device performs control.
Offline recognition is performed by the robot control component using an offline recognition library, such as an offline voice library, to obtain the corresponding action description information.
The offline recognition library, however, is limited by storage space and is usually less complete than the online recognition library deployed on the cloud server. Online recognition is therefore preferred to ensure the recognition rate, with offline recognition performed only when online recognition is impossible, so that both the recognition rate and the reliability and stability of recognition are ensured.
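The online-first policy with offline fallback can be sketched as follows; the function names and library contents are illustrative assumptions, not an actual recognition API:

```python
def recognize(natural_interaction_data, cloud_recognize, offline_library):
    """Prefer the cloud server's richer online recognition; fall back to
    the locally stored offline library only when server resources
    cannot be reached."""
    try:
        return cloud_recognize(natural_interaction_data)  # online recognition
    except ConnectionError:
        # Offline environment: consult the local recognition library.
        return offline_library.get(natural_interaction_data, "unrecognized")

def cloud_down(_data):
    # Stand-in for a cloud server that cannot be reached.
    raise ConnectionError("no server resources available")

offline_lib = {"go forward": "forward"}  # hypothetical offline voice library
print(recognize("go forward", cloud_down, offline_lib))  # forward
```

Putting the fallback decision inside the robot control component, rather than the data acquisition component, matches the text's division of labor: the acquisition side only collects and forwards data.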
Fig. 6 is a flow chart illustrating a description of step 350 according to a corresponding embodiment of fig. 2. In an exemplary embodiment, as shown in FIG. 6, this step 350 includes at least:
in step 351, the robot control component determines, from the action description information, the robot device in the robot hardware system that is proximal or distal to itself.
In step 353, program instructions are executed according to the action description information, and through their execution the robot device is controlled to execute the action.
The action-executing body corresponding to the action description information may be a robot device at the proximal end relative to the robot control component, i.e. one in close contact with it, or a robot device at the distal end, i.e. one far away from the robot control component; no limitation is imposed here. The robot device may be connected to the robot control component or exist separately from it. Proximal and distal refer to whether the robot control component must control the robot device remotely: for example, a robot device assembled with the robot control component, or one sharing the same space with it, is considered proximal.
As another example, if the robot device and the robot control component are on different floors, or the robot control component is deployed in the indoor environment where the user is while the robot device moves outdoors under remote control, the robot device is at the distal end relative to the robot control component.
The action description information is a logical statement formed from several program instructions; by executing those instructions the robot device performs the specified action and thereby completes the set task.
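Steps 351 and 353 can be sketched as dispatching each program instruction to its executing device, whether proximal or distal, and executing it there. The classes and instruction format are hypothetical:

```python
class RobotDevice:
    """Minimal stand-in for a robot device that records the actions
    it is instructed to execute."""
    def __init__(self, name):
        self.name, self.log = name, []

    def execute(self, action):
        self.log.append(action)

def run_action_description(instructions, devices):
    """Action description information as a sequence of program
    instructions, each naming its executing device and action."""
    for instr in instructions:
        devices[instr["device"]].execute(instr["action"])

cart = RobotDevice("cart")       # proximal: assembled with the controller
screen = RobotDevice("screen")   # distal: controlled remotely
run_action_description(
    [{"device": "cart", "action": "forward"},
     {"device": "screen", "action": "show:smile"}],
    {"cart": cart, "screen": screen},
)
print(cart.log, screen.log)  # ['forward'] ['show:smile']
```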
Through the exemplary embodiments described above, threshold-free robot programming becomes possible: the robot hardware system can be built and its actions controlled at will from the obtained robot kit, and through interactively realized robot programming its actions are adapted to set rules, for example set robot level-passing rules, broadening the applications of robot programming.
Taking as an example a built robot hardware system that must find its way out of a maze according to set rules, the above processes are described below in conjunction with the method.
The built robot hardware system has a walking function. Within one communication space, one area is the motion area of the system, and another is the area in which people lay out tiles for robot programming.
Tiles are therefore laid out for the robot hardware system present in the motion area and the actions it must execute, and the laid-out tiles cause the robot hardware system to walk according to the set rules.
A person collects the data on each tile one by one with the held data acquisition component, mapping them one by one to action description information, so that the robot hardware system completes the movement exactly as intended.
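The maze scenario can be sketched by mapping each scanned tile to a move and applying the moves in order; the grid coordinates and tile names are illustrative assumptions:

```python
# Hypothetical tile-to-move table for the maze: each tile the data
# acquisition component scans becomes one step of the walking program.
MOVES = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def walk(start, tile_sequence):
    """Apply the scanned tile sequence in order and return the
    robot hardware system's final position on the grid."""
    x, y = start
    for tile in tile_sequence:
        dx, dy = MOVES[tile]
        x, y = x + dx, y + dy
    return (x, y)

# Scanning the tiles "right, right, up" one by one programs the
# system to walk from (0, 0) to (2, 1).
print(walk((0, 0), ["right", "right", "up"]))  # (2, 1)
```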
Through this implementation of the method, a variety of interactive ways of playing are provided for the construction of a robot hardware system and for robot programming, enhancing the interactivity of the robot.
The following is an embodiment of the apparatus of the present invention, which is used to implement the above-mentioned embodiment of the interaction implementation method oriented to robot programming of the present invention. For details not disclosed in the embodiments of the apparatus of the present invention, please refer to the embodiments of the interaction implementation method for robot programming of the present invention.
FIG. 7 is a block diagram illustrating an interactive implementation system oriented to robotic programming in accordance with an exemplary embodiment. In an exemplary embodiment, as shown in fig. 7, the interaction implementation system for robot programming at least includes: a data acquisition component 710 and a robot hardware system 730, wherein the robot hardware system 730 comprises a robot control component 731 and a robot device 733.
A data acquisition part 710 adapted to the robot control part 731, configured to perform acquisition of data carried by tiles corresponding to action control descriptions for the set up robot hardware system 730;
the robot control part 731 is configured to:
acquiring mapped action description information through the acquired data;
and performing motion control on the built robot hardware system 730 to the corresponding robot device 733 according to the motion description information.
Fig. 8 is a block diagram illustrating a data acquisition component according to a corresponding embodiment of fig. 7. In one exemplary embodiment, as shown in FIG. 8, the data acquisition component 710 includes at least a tile data acquisition module 711 and a close range communication module 713.
a tile data acquisition module 711, configured to collect data from the tile where the component is located, obtaining the data carried by that tile;
a close-range communication module 713, configured to transmit the data carried by the tile to the robot control part 731 through the performed communication.
In an exemplary embodiment, the data acquisition component 710 includes an optical image acquisition device, and the tile data acquisition module 711 is further configured to scan the tile where the component is located, obtaining through the scan the feature code contained in the tile, the feature code being the data carried by the tile.
In another exemplary embodiment, the data acquisition component 710 further includes an auxiliary acquisition device, and the tile data acquisition module 711 is further configured to perform, with the auxiliary acquisition device adapted to the tile type, data collection on the tile where the component is located, obtaining the data carried by the tile, the data being a code corresponding to a feature of the tile.
FIG. 9 is a block diagram illustrating an interactive implementation system for robotic programming in accordance with another exemplary embodiment. In an exemplary embodiment, as shown in fig. 9, the interactive implementation system for robot programming includes: a natural interaction collection module 910 and a sending module 930.
a natural interaction acquisition module 910, configured to perform, in adaptation with the robot control component 731, the collection of natural interaction data, the natural interaction including voice interaction and/or image interaction;
a sending module 930, configured to send the acquired natural interaction data to the robot control component 731;
and a robot control part 731, configured to process the natural interaction data to generate action description information, where the action description information is used to control the robot device 733 in the built robot hardware system.
In an exemplary embodiment, the robot control part 731, adapted to the recognition environment in which the natural interaction takes place, performs recognition of the natural interaction data to generate the action description information.
Further, the recognition environment of the natural interaction includes an online environment and an offline environment, and the robot control part 731 is configured to:
detect whether server resources can be connected;
if server resources can be connected, being in the online environment of natural interaction recognition, perform online recognition of the natural interaction data through the connected server to generate the action description information;
and if server resources cannot be connected, being in the offline environment of natural interaction recognition, perform offline recognition of the natural interaction data to generate the action description information.
Fig. 10 is a block diagram illustrating a description of a robot control component according to a corresponding embodiment of fig. 7. In an exemplary embodiment, the robot control part 731 includes at least:
a body determination module 1001, configured to determine, according to the action description information, the robot device 733 in the robot hardware system 730 that is proximal or distal relative to the component;
an instruction execution module 1003, configured to execute program instructions according to the action description information and, through their execution, control the robot device 733 to execute the action.
The method is illustrated below taking as examples the smart cart in an educational robot kit and the configured data acquisition component.
The educational robot kit is an educational toy for children, and the smart cart is in essence the robot toy corresponding to it. FIG. 11 is a schematic diagram illustrating the interaction between a smart cart and a data acquisition component according to an exemplary embodiment.
As shown in fig. 11, the smart cart 1110 and the data acquisition component 1130 communicate wirelessly, the data acquisition component 1130 communicating one way with the smart cart 1110 over this wireless link. Fig. 12 is a schematic diagram of the physical assembly of the smart cart and the data acquisition component according to the embodiment of fig. 11.
As shown in fig. 12, the smart cart 1110 and the data acquisition component 1130 are separable or can be combined; for example, the data acquisition component 1130 is attached to the smart cart 1110 by clamping, magnetic attraction, or the like.
Programming for the smart cart 1110 is implemented through the aforementioned tiles, which take the form of cards, picture books, props, and the like. For example, fig. 13 is a schematic diagram of an application in which tile-based control of the smart cart is realized in a game scene. As shown in fig. 13, the data acquisition component 1130 collects data from tiles in the form of cards, picture books, props, and so on; correspondingly, the smart cart 1110 is placed on a map, scene drawing, or picture book, and program control of the smart cart 1110 is performed according to the data read by the data acquisition component 1130, causing it to execute the corresponding actions.
In other words, a child programs by having the data acquisition component 1130 read data above the area where the tiles are placed, while the smart cart 1110 executes the actions in the motion area provided by the map, the scene (which may be a stage), or even the picture book. The two areas may also be combined into one, as shown in fig. 14.
Fig. 14 is a schematic diagram of an application in which tile-based control of the smart cart is realized in another game scene. As shown in fig. 14, the target of the data acquisition component 1130 is the map itself, and clicking different parts of the map guides the smart cart 1110 to perform different actions.
Fig. 15 is a schematic view of a smart cart and data collection components combined in another game scenario. As shown in fig. 15, smart cart 1110 and data acquisition component 1130 are combined to control the execution of actions by smart cart 1110 by performing in-map interactions.
Specifically, tiles in the form of cards, prop stickers, and the like are placed on the map in which the smart cart 1110 moves, so that when the smart cart 1110 moves to a certain position, the data acquisition component 1130 collects the data from the tile there and controls the smart cart 1110 to perform a specific action. The map and the tiles may be independent of each other, but the tiles may also be part of the map itself.
Fig. 16 is a schematic diagram of the smart cart and the data acquisition component combined in yet another game scene. As shown in fig. 16, the combined smart cart 1110 and data acquisition component 1130 perform off-map interaction. Specifically, tiles in the form of cards, prop stickers, and the like are placed outside the map; the data acquisition component 1130 collects their data, whereupon the smart cart 1110 travels to a certain position on the map and performs a specific action.
In this way, the educational robot kit is adapted to programming by children: through visible, tangible hardware programming, the threshold for learning programming faced by young children in the education field is greatly lowered.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. An interaction implementation method oriented to robot programming, the method comprising:
a data acquisition component adapted to a robot control component acquiring data carried by graphic blocks, wherein the data carried by the graphic blocks corresponds to an action control description for a built robot hardware system;
the robot control component obtaining mapped action description information from the acquired data;
and the robot control component executing action control on the corresponding robot device in the built robot hardware system according to the action description information.
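The three steps of the method above can be sketched as follows. All names (`ACTION_MAP`, `acquire`, `map_to_action`, `execute`) are hypothetical illustrations under an assumed feature-code encoding, not the claimed implementation:

```python
# Minimal sketch of the claimed three-step method:
# 1) acquire the data carried by a graphic block,
# 2) map that data to an action description,
# 3) execute action control on the corresponding robot device.

ACTION_MAP = {           # feature code -> action description (illustrative)
    "0x01": {"device": "left_motor", "command": "forward"},
    "0x02": {"device": "buzzer", "command": "beep"},
}

def acquire(block):
    """Step 1: the data acquisition component reads the block's data."""
    return block["feature_code"]

def map_to_action(code):
    """Step 2: the control component looks up the mapped action description."""
    return ACTION_MAP[code]

def execute(action, hardware):
    """Step 3: dispatch the command to the corresponding robot device."""
    hardware.setdefault(action["device"], []).append(action["command"])

hardware = {}
code = acquire({"feature_code": "0x01"})
execute(map_to_action(code), hardware)
print(hardware)
```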
2. The method of claim 1, wherein the data acquisition component adapted to the robot control component acquiring the data carried by the graphic blocks comprises:
the data acquisition component performing data acquisition on a graphic block to obtain the data carried by the graphic block;
and transmitting the data carried by the graphic block to the robot control component through communication.
3. The method of claim 1, wherein the data acquisition component comprises an optical image acquisition device, and the data acquisition component performing data acquisition on a graphic block to obtain the data carried by the graphic block comprises:
the optical image acquisition device scanning the graphic block and obtaining, through the scanning, a feature code contained in the graphic block, the feature code being the data carried by the graphic block.
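As an illustration of extracting a feature code from a scanned block, the sketch below assumes the block carries a simple binary pattern; the actual encoding used by the optical image acquisition device is not specified in the claim, so this is purely a stand-in:

```python
# Hypothetical decoding of a feature code from thresholded camera cells.
# The binary-row encoding is an assumption for illustration only.

def decode_feature_code(pixels):
    """Turn a row of dark/light cells into an integer feature code."""
    code = 0
    for cell in pixels:
        code = (code << 1) | (1 if cell else 0)
    return code

scanned = [True, False, True, True]   # e.g. thresholded cells from the camera
print(decode_feature_code(scanned))   # 0b1011 == 11
```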
4. The method according to claim 3, wherein the data acquisition component further comprises an auxiliary acquisition device, and the data acquisition component performing data acquisition on a graphic block to obtain the data carried by the graphic block further comprises:
according to the type of the graphic block, acquiring data from the graphic block through the corresponding auxiliary acquisition device to obtain the data carried by the graphic block, the data being a code corresponding to a feature of the graphic block.
5. The method of claim 1, further comprising:
the data acquisition component adapted to the robot control component acquiring natural interaction data, wherein the natural interaction comprises voice interaction and/or image interaction;
sending the acquired natural interaction data to the robot control component;
and the robot control component processing the natural interaction data to generate action description information, the action description information being used for controlling a robot device in the built robot hardware system.
6. The method of claim 5, wherein the robot control component processing the natural interaction data to generate the action description information comprises:
the robot control component, adapted to a natural interaction recognition environment, performing recognition of the natural interaction data to generate the action description information.
7. The method of claim 6, wherein the natural interaction recognition environment comprises an online environment and an offline environment, and the robot control component, adapted to the natural interaction recognition environment, performing recognition of the natural interaction data to generate the action description information comprises:
the robot control component detecting whether it can connect to a server resource;
if the server resource can be connected, the robot control component being in the online environment of natural interaction recognition and performing online recognition of the natural interaction data through the connected server to generate the action description information;
and if the server resource cannot be connected, the robot control component being in the offline environment of natural interaction recognition and performing offline recognition of the natural interaction data to generate the action description information.
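The online-first, offline-fallback behavior of claim 7 can be sketched as follows. The names (`recognize`, `offline_recognize`, `connect_server`) are hypothetical; a real implementation would call an actual speech or image recognition service:

```python
# Sketch of the online/offline recognition fallback: try to reach the server;
# if unreachable, recognize on-device. All names are illustrative.

def offline_recognize(data):
    # A tiny built-in vocabulary standing in for an on-device model.
    return {"go": "forward", "stop": "halt"}.get(data, "unknown")

def recognize(data, connect_server):
    """Try online recognition first; fall back to offline recognition."""
    server = connect_server()
    if server is not None:            # online environment
        return {"mode": "online", "action": server(data)}
    return {"mode": "offline", "action": offline_recognize(data)}

# Simulate an unreachable server:
result = recognize("go", connect_server=lambda: None)
print(result)
```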
8. The method according to claim 1, wherein the robot control component executing action control on the corresponding robot device in the built robot hardware system according to the action description information comprises:
the robot control component determining, according to the action description information, a robot device in the robot hardware system that is at the near end or the far end relative to the robot control component;
and executing program instructions according to the action description information, the robot device being controlled to execute actions through the execution of the program instructions.
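The near-end/far-end dispatch of claim 8 can be sketched as routing each instruction either to a local bus or to a remote link. The device names and transports below are hypothetical illustrations:

```python
# Sketch of claim 8's dispatch: the control component decides whether the
# target device is local (near end) or remote (far end) and routes the
# instruction accordingly. Names and transports are illustrative only.

LOCAL_DEVICES = {"led", "buzzer"}     # devices wired to the control component

def dispatch(action, local_bus, radio_link):
    """Route an action to the near-end bus or the far-end radio link."""
    target = local_bus if action["device"] in LOCAL_DEVICES else radio_link
    target.append((action["device"], action["command"]))

local_bus, radio_link = [], []
dispatch({"device": "led", "command": "on"}, local_bus, radio_link)
dispatch({"device": "arm_motor", "command": "lift"}, local_bus, radio_link)
print(local_bus, radio_link)
```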
9. An interaction implementation system oriented to robot programming, characterized by comprising a data acquisition component and a built robot hardware system, wherein the robot hardware system comprises a robot control component and a robot device;
the data acquisition component is adapted to the robot control component and is configured to acquire data carried by graphic blocks, the data carried by the graphic blocks corresponding to an action control description for the built robot hardware system;
the robot control component is configured to:
obtain mapped action description information from the acquired data;
and execute action control on the corresponding robot device in the built robot hardware system according to the action description information.
10. The system of claim 9, wherein the data acquisition component comprises:
a graphic block data acquisition module, configured to perform data acquisition on a graphic block to obtain the data carried by the graphic block;
and a communication module, configured to transmit the data carried by the graphic block to the robot control component through communication.
CN201810842921.0A 2018-07-27 2018-07-27 Interaction implementation method and system oriented to robot programming Pending CN110757448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810842921.0A CN110757448A (en) 2018-07-27 2018-07-27 Interaction implementation method and system oriented to robot programming

Publications (1)

Publication Number Publication Date
CN110757448A true CN110757448A (en) 2020-02-07

Family

ID=69327663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810842921.0A Pending CN110757448A (en) 2018-07-27 2018-07-27 Interaction implementation method and system oriented to robot programming

Country Status (1)

Country Link
CN (1) CN110757448A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6175954B1 (en) * 1997-10-30 2001-01-16 Fuji Xerox Co., Ltd. Computer programming using tangible user interface where physical icons (phicons) indicate: beginning and end of statements and program constructs; statements generated with re-programmable phicons and stored
CN102136208A (en) * 2011-03-30 2011-07-27 中国科学院软件研究所 Material object programming method and system
CN105137887A (en) * 2015-09-24 2015-12-09 苏州乐派特机器人有限公司 Materialized programming method based on programming plate and application thereof in robot field
CN105893060A (en) * 2016-05-09 2016-08-24 福建省闽骏科教设备有限公司 Graphical programming system and graphical programming method
CN106095096A (en) * 2016-06-12 2016-11-09 苏州乐派特机器人有限公司 Utilize the method for block in kind programming and in the application of robot field

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111625003A (en) * 2020-06-03 2020-09-04 上海布鲁可科技有限公司 Mobile robot toy and use method thereof
CN111625003B (en) * 2020-06-03 2021-06-04 上海布鲁可积木科技有限公司 Mobile robot toy and use method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200207