CN117519469A - Space interaction device and method applied to man-machine interaction - Google Patents


Info

Publication number
CN117519469A
CN117519469A
Authority
CN
China
Prior art keywords
interaction
robot
task
sensor data
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311360908.9A
Other languages
Chinese (zh)
Inventor
王舒捷
李延征
王文林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaoyu Intelligent Manufacturing Technology Co ltd
Original Assignee
Beijing Xiaoyu Intelligent Manufacturing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaoyu Intelligent Manufacturing Technology Co ltd
Priority to CN202311360908.9A
Publication of CN117519469A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a spatial interaction device and method applied to man-machine interaction. The device comprises: a sensor unit for acquiring sensor data generated when a user, according to a task to be executed by the robot, moves the spatial interaction device in the working space of that task; an interaction unit for acquiring interaction information generated by the user's interaction with the spatial interaction device; a computing unit for generating a robot control instruction from the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, the robot control instruction comprising at least a movement path and a target operation, determined according to the sensor data and the interaction information, for the task to be executed by the robot; and a sending unit for sending the robot control instruction to the robot, so that the robot executes the task according to the movement path and the target operation. The application provides a novel spatial interaction device for interaction with a robot, which improves the efficiency and accuracy of interaction between the user and the robot.

Description

Space interaction device and method applied to man-machine interaction
Technical Field
The application relates to the technical field of robots, in particular to a space interaction device and an interaction method applied to man-machine interaction.
Background
In man-machine interaction, it is often necessary to communicate the work object, work area and workflow to the robot; that is, the robot needs to know where it works, what task it performs, and which specific actions or steps to carry out. Taking carrying an article as an example, the robot needs to know: what the article to be handled is (e.g., a box or a piece of sheet metal); where the article is (e.g., on a shelf, or on a table); and where it needs to be moved (e.g., another shelf, or another workstation). In addition, the robot needs to know how to accomplish this task, including the steps of: moving to the position of the article; operating the robotic arm to pick up the article in the proper pose (e.g., the article may need to be picked up at a particular angle or position to prevent slipping or damage); moving to the target position; and putting down the article in the correct pose.
To achieve the above object, the following two main methods are generally adopted at present:
1. Adding a separate teach pendant. In this method, an operator uses a teach pendant to instruct the robot how to perform a transport task. For example, the operator uses the teach pendant to input the position of the item and then instructs the robot to move to that position. Next, the operator may need to manually operate the robot's arm to pick up the item in the correct pose. The operator then instructs the robot to move to the target location and finally to put down the item in the correct pose. The problem with this approach is that each time the item's position or the target position changes, or a different item needs to be handled, the operator must manually enter new commands, which is very time-consuming and inefficient. Furthermore, the operator must always remain in the work area around the robot in order to use the teach pendant, which severely restricts the working environment.
2. Placing markers in the work area. In this approach, markers are placed in the work area; they may be physical, such as bar codes or two-dimensional codes, or virtual, such as augmented reality markers. The robot can recognize these markers and use them to locate itself and perform tasks. For example, the item may be placed at a location bearing a marker; the robot recognizes the marker and moves to that location. The robot can then determine from another marker how to pick up the item in the correct pose. Finally, the robot may determine how to move and put down the item based on the marker at the target location. However, this approach also has problems. First, installing and maintaining the markers takes time and effort. Second, if the working environment changes, for example if the item is moved or the target location changes, the markers may need to be reconfigured. Finally, this approach restricts the working environment, as not all environments are suitable for placing markers.
Both of the above methods have problems of low efficiency and severe restrictions on the working environment. Therefore, a new interaction mode between the operator and the robot is needed, so that the interaction efficiency and accuracy of the operator and the robot can be improved, and the requirements of the application field are really met.
Disclosure of Invention
In order to overcome the problems in the related art, the embodiment of the invention provides a space interaction device and a method applied to man-machine interaction, and the technical scheme is as follows:
according to a first aspect of an embodiment of the present invention, there is provided a spatial interaction device applied to man-machine interaction, including:
the sensor unit is used for acquiring sensor data generated when a user, according to a task to be executed by the robot, moves the spatial interaction device in the working space of the task to be executed;
the interaction unit is used for acquiring interaction information generated by interaction between a user and the space interaction device;
the computing unit is used for generating a robot control instruction according to the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, wherein the robot control instruction at least comprises a movement path and a target operation, determined according to the sensor data and the interaction information, for the task to be executed by the robot;
and the sending unit is used for sending the robot control instruction to the robot so as to enable the robot to execute the task to be executed according to the moving path and the target operation.
In an embodiment, the calculating unit generates the robot control instruction according to the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, and includes:
determining position information of a space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
In an embodiment, the calculating unit determines the position information of the spatial interaction device according to the interaction information and the sensor data, and includes:
modeling the surrounding environment to determine a robot coordinate system;
and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
In an embodiment, the spatial interaction device includes a preset portion, and the sensor unit is configured to acquire sensor data generated by the user moving the spatial interaction device and using the preset portion as a reference object.
According to a second aspect of the embodiments of the present invention, there is provided a spatial interaction method, which is applied to the spatial interaction device, including:
acquiring, from a sensor unit, sensor data generated when a user, according to a task to be executed by the robot, moves the spatial interaction device in the working space of the task to be executed;
acquiring interaction information generated by interaction of a user and a space interaction device from an interaction unit;
generating a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, wherein the robot control instruction at least comprises a movement path and a target operation, determined according to the sensor data and the interaction information, for the task to be executed by the robot;
and sending the robot control instruction to the robot so that the robot executes the task to be executed according to the moving path and the target operation.
In an embodiment, the generating the robot control command according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit includes:
determining the position information of the space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
In an embodiment, the determining the location information of the spatial interaction device according to the interaction information and the sensor data includes:
modeling the surrounding environment to determine a robot coordinate system;
and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
In an embodiment, the spatial interaction device includes a preset portion; the acquiring, from a sensor unit, of sensor data generated by a user moving the spatial interaction device in the robot's working space comprises:
sensor data generated by the user moving the spatial interaction device and referenced to the preset portion is acquired from the sensor unit.
According to a third aspect of embodiments of the present invention, there is provided a spatial interaction device applied to man-machine interaction, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring, from a sensor unit, sensor data generated when a user, according to a task to be executed by the robot, moves the spatial interaction device in the working space of the task to be executed;
acquiring interaction information generated by interaction of a user and a space interaction device from an interaction unit;
generating a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, wherein the robot control instruction at least comprises a movement path and a target operation, determined according to the sensor data and the interaction information, for the task to be executed by the robot;
and sending the robot control instruction to the robot so that the robot executes the task to be executed according to the moving path and the target operation.
According to a fourth aspect of embodiments of the present invention there is provided a computer readable storage medium having stored thereon computer instructions which when executed by a processor implement the steps of the method of any of the second aspects of embodiments of the present invention.
The spatial interaction device provided by the embodiments of the invention is simple and easy to use and requires no professional training: the movement path and target operation of the task to be executed by the robot can be acquired simply by clicking a button and moving the spatial interaction device in space. It has strong universality: tasks can be conveyed to the robot directly through the user's actions, whether simple movements, the sketching of complex shapes, or even full trajectory planning. And it depends very little on the environment: no modification of the environment or placement of special markers is needed, while precise control of the robot is achieved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention as claimed.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a spatial interaction device applied to man-machine interaction according to an embodiment of the present application;
FIG. 2 is a flowchart of a spatial interaction method according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a spatial interaction device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms first and second and the like in the description and in the claims of the present application and in the above-described figures are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to the listed steps or elements but may include steps or elements not expressly listed.
The space interaction device provided by the application can be applied to man-machine interaction, including but not limited to man-machine interaction with various robots and intelligent devices.
In an embodiment of the present application, a robot may include a mobile base and a robotic arm. The mobile base can move the robot as a whole, and the robotic arm can move relative to the mobile base. The end of the robotic arm also typically carries a replacement module for swapping end tools to perform different operations. End tools that may be fitted at the arm end include, but are not limited to: welding tools, painting tools, sanding tools, handling tools, and the like; the corresponding end tool may be fitted according to the usage scenario.
In an embodiment of the present application, the spatial interaction device is completely independent of the robot and does not belong to any part of it. As shown in fig. 1, a spatial interaction device according to an embodiment of the present application, applied to interaction with a robot, includes:
a sensor unit 101, configured to acquire sensor data generated when a user, according to a task to be performed by the robot, moves the spatial interaction device in the working space of the task to be performed;
the interaction unit 102 is configured to obtain interaction information generated by interaction between a user and the spatial interaction device;
a computing unit 103, configured to generate a robot control instruction according to the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, where the robot control instruction at least includes a movement path and a target operation, determined according to the sensor data and the interaction information, for the task to be executed by the robot;
and a sending unit 104, configured to send the robot control instruction to the robot, so that the robot executes the task to be executed according to the movement path and the target operation.
In an embodiment of the present application, to facilitate operation, the shape and size of the spatial interaction device may be chosen so that an operator can hold, move, and reorient it with a single hand. In an embodiment of the present application, the spatial interaction device may include a preset portion, which may be, for example, a rigid metal rod (which may be referred to as a "probe"); in an embodiment of the present application, the preset portion may be detachable from the spatial interaction device body. Alternatively, the preset portion may be a part of the body of the spatial interaction device itself, for example a more protruding part (such as a corner). The sensor unit 101 is configured to acquire sensor data generated when the user moves the spatial interaction device, with the preset portion as the reference point. Providing the preset portion makes the recorded position and route of the moving spatial interaction device more accurate.
In an embodiment of the present application, the sensor unit 101 may include various sensors, such as a position sensor, an attitude sensor, an image sensor (e.g., a camera), a vision sensor, a force sensor, a temperature and humidity sensor, a barometer, a speed sensor, an acceleration sensor, and the like. Since the spatial interaction device comprises the sensor unit, a user can move the device in the working space of the task according to the task to be executed by the robot. In an embodiment, a task object, such as steel to be welded, is already present in the working space, and the task to be executed by the robot is to weld the seam of that steel. Before the welding task begins, the user holds the spatial interaction device, places its probe at the start of the weld seam, and moves it along the seam to the end point. During this movement, the various sensors in the sensor unit provide sensor data about, for example, position, attitude, and speed.
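As an illustrative sketch only (not part of the patent's disclosure), the sensor samples recorded during such a movement could be modeled as timestamped pose records; all names and fields here are hypothetical:

```python
import time
from dataclasses import dataclass, field


@dataclass
class SensorSample:
    t: float          # timestamp in seconds
    position: tuple   # (x, y, z) of the probe tip, in metres
    attitude: tuple   # (roll, pitch, yaw) in radians
    speed: float      # scalar speed in m/s


@dataclass
class SensorLog:
    samples: list = field(default_factory=list)

    def record(self, position, attitude, speed):
        # Append one sample stamped with the current time.
        self.samples.append(SensorSample(time.time(), position, attitude, speed))


log = SensorLog()
log.record((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.0)
log.record((0.1, 0.0, 0.0), (0.0, 0.0, 0.0), 0.05)
```

A downstream consumer would then read `log.samples` as the raw trace of the probe's movement along the weld seam.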
In an embodiment of the present application, the interaction unit 102 may include a variety of interaction components, including but not limited to a touchscreen and various physical keys (buttons, knobs, switches, etc.). The interaction information may be generated from the various interactive operations the user performs with these components. Types of interaction include, but are not limited to: single click, double click, multiple consecutive clicks, long press, key combinations, dragging, and so on. The interaction information includes, but is not limited to: the type of interaction, duration, pressure, interval, drag path, and so on.
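How raw button-press timestamps might be grouped into the interaction types listed above (single vs. double click) can be sketched as follows; the 0.4 s double-click window is an assumed value, not one stated in the patent:

```python
def classify_clicks(press_times, double_click_window=0.4):
    """Group sorted button-press timestamps into ('single'|'double', start_time) events."""
    events = []
    i = 0
    while i < len(press_times):
        # Two presses within the window form one double click.
        if i + 1 < len(press_times) and press_times[i + 1] - press_times[i] <= double_click_window:
            events.append(("double", press_times[i]))
            i += 2
        else:
            events.append(("single", press_times[i]))
            i += 1
    return events
```

For example, `classify_clicks([0.0, 0.2, 3.0])` yields a double click starting at t=0.0 followed by a single click at t=3.0.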
In an embodiment of the present application, the computing unit 103 may be a microprocessor, such as a System-on-a-Chip (SoC), which receives data and information and processes it to generate robot control instructions. Various algorithms run on the computing unit to process and consolidate the raw sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit into a robot control instruction, so that the robot can complete the task the user indicated through the spatial interaction device.
In an embodiment of the present application, the sending unit 104 may use any available wired or wireless transmission technology; wireless technologies include, but are not limited to, Bluetooth, infrared, ZigBee, and 4G or 5G mobile communication, and wired technologies include, but are not limited to, network cable, optical fiber, and the like. This embodiment is not limited in this respect. In an embodiment of the present application, the sending unit 104 may send the robot control instruction to an industrial personal computer of the robot.
In an embodiment of the application, a task object, such as steel to be welded, is already present in the working space, and the task to be executed by the robot is to weld the seam of that steel. Before the welding task begins, the user holds the spatial interaction device, places its probe at the start point of the weld seam, clicks physical key 1 on the interaction unit, moves along the seam, and clicks physical key 1 again on reaching the end point. During this movement, the various sensors in the sensor unit 101 provide sensor data about, for example, position, attitude, and speed; the interaction unit 102 provides the interaction information of the user's two clicks of physical key 1; the computing unit 103 then generates a robot control instruction from the sensor data and the interaction information, the instruction comprising at least the movement path for the task (i.e., the path of the probe from the weld start point to the weld end point) and the target operation (welding); and the sending unit 104 sends the robot control instruction to the robot, so that the robot executes the task according to the movement path and the target operation.
In an embodiment of the present application, a task object, such as target object 1, is already present in the working space, and the task to be executed by the robot is to carry target object 1 from position A to position B. Before the carrying task begins, the user holds the spatial interaction device, places its probe at position A where target object 1 is located, double-clicks physical key 1 on the interaction unit, moves to position B, and double-clicks physical key 1 again. During this movement, the various sensors in the sensor unit 101 provide sensor data about, for example, position, posture, and speed; the interaction unit 102 provides the interaction information of the user's two double clicks of physical key 1; the computing unit 103 then generates a robot control instruction from the sensor data and the interaction information, the instruction comprising at least the movement path for the task (i.e., the path of the probe from position A to position B) and the target operation (pick up at position A, put down at position B); and the sending unit 104 sends the robot control instruction to the robot, so that the robot executes the task according to the movement path and the target operation.
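Both the welding and carrying scenarios follow the same pattern: the interaction events delimit a time window, the sensor samples inside that window form the movement path, and the interaction type selects the target operation. A minimal sketch of that pattern, with a hypothetical operation mapping (the `operation_map` convention is an assumption for illustration):

```python
def build_instruction(samples, events, operation_map):
    """samples: list of (t, (x, y, z)) position samples;
    events: two (type, t) pairs marking the start and end interaction.
    Returns a control-instruction dict with movement path and target operation."""
    (start_type, start_t), (_, end_t) = events
    # Keep only the samples recorded between the two interaction events.
    path = [p for (t, p) in samples if start_t <= t <= end_t]
    return {"path": path, "operation": operation_map[start_type]}


samples = [(0.0, (0, 0, 0)), (1.0, (1, 0, 0)), (2.0, (2, 0, 0)), (5.0, (9, 9, 9))]
events = [("single", 0.0), ("single", 2.0)]
instr = build_instruction(samples, events, {"single": "weld", "double": "carry"})
```

The sample at t=5.0 falls outside the window delimited by the two clicks and is excluded from the path.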
The spatial interaction device provided by the application is simple and easy to use and requires no professional training: a user can convey the movement path and target operation of the task to be executed by the robot simply by clicking a button and moving the device in space. It has strong universality: tasks can be conveyed to the robot directly through the user's actions, whether simple movements, the sketching of complex shapes, or even full trajectory planning. And it depends very little on the environment: no modification of the environment or placement of special markers is needed, while precise control of the robot is achieved.
In an embodiment of the present application, the calculating unit 103 generates a robot control instruction according to the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, including:
determining position information of a space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
Determining the position information of the spatial interaction device may comprise determining, within a period of time, the positions of the spatial interaction device (the sampling rate of the sensors may be set as desired), the change of position, the speed of movement, the posture during movement, and so on. The period of time may be delimited by the interaction information, for example, the interval between two single clicks of the first key.
In an embodiment of the present application, the movement path of the spatial interaction device may be determined according to the position information of the spatial interaction device, and the movement path of the spatial interaction device is directly used as the movement path of the task to be executed by the robot. The moving path of the task to be executed by the robot may be a moving path of an end tool of a robot arm.
In an embodiment of the present application, the movement path of the spatial interaction device may be determined from its position information, and the path may form a closed loop; in that case, the area enclosed by the path may be used as the region for the robot's movement path when executing the task. For example, in a painting scenario, a user may move the spatial interaction device over a wall surface to frame the area to be painted by the robot; the robot's movement path for the task then lies within that area and fills it.
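Deciding whether a candidate point lies inside the closed loop framed by the user can be done with standard even-odd ray casting. This sketch (an assumed approach, using 2D wall-plane coordinates) shows one way the enclosed region could be tested before generating the fill path:

```python
def point_in_polygon(pt, poly):
    """Even-odd ray-casting test: is pt strictly inside the closed polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


# A unit square framed by the user's closed-loop movement.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
```

Points passing the test would be covered by the robot's fill path; points outside the loop are skipped.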
In an embodiment of the present application, not only the position information of the spatial interaction device may be determined according to the interaction information and the sensor data, but also the posture information of the spatial interaction device may be determined, and the posture of the spatial interaction device may be used as the posture of the robot to perform the task.
In an embodiment of the present application, the target operation of the robot's task may be determined according to the interaction information; for example, a single click of the first key indicates that the mechanical arm is to perform a welding operation, a double click of the first key indicates a pick-up operation, and a further click of the first key indicates a put-down operation.
In an embodiment of the present application, when the computing unit 103 determines the position information of the spatial interaction device according to the interaction information and the sensor data, the surrounding environment may be modeled first to determine the robot coordinate system; and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
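Once a robot coordinate system has been established from the environment model, positions measured in the device's own frame must be expressed in that system. A planar (2D) homogeneous-transform sketch of this step, with an assumed rotation angle and offset between the two frames:

```python
import math


def make_transform(theta, tx, ty):
    """Homogeneous 2D transform: rotate by theta about z, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0,  0, 1]]


def to_robot_frame(T, pt):
    """Map a point from the device frame into the robot coordinate system."""
    x, y = pt
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])


# Assume the device frame is rotated 90 degrees and offset by (1, 0)
# relative to the robot frame.
T = make_transform(math.pi / 2, 1.0, 0.0)
```

In practice the 3D analogue (a 4x4 transform, or rotation plus translation from a calibration step) would be used; the 2D version keeps the arithmetic visible.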
In an embodiment of the present application, when the computing unit generates the robot control instruction, a first movement path for the robot's mobile base and a second movement path for the robotic arm may further be derived from the movement path of the arm-end tool, yielding a first control instruction and a second control instruction: the robot first moves its base to the target area along the first movement path according to the first control instruction, and then moves the arm according to the second movement path and the target operation according to the second control instruction. The second control instruction may further include control instructions for the arm components needed to complete the target operation, such as its joints, motors, and transmission, which cooperate so that the arm performs the target operation.
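One simple way to derive the first (base) and second (arm) paths is to park the base near the end-tool path and verify that every waypoint stays within the arm's reach. This is a sketch under that assumption only (planar coordinates, hypothetical reach value), not the patent's prescribed decomposition:

```python
import math


def plan_base_and_arm(tool_path, arm_reach):
    """Pick a base target near the centroid of the end-tool path (the first movement
    path ends here); the arm then traces tool_path (the second movement path),
    provided every waypoint lies within arm_reach of the base."""
    cx = sum(p[0] for p in tool_path) / len(tool_path)
    cy = sum(p[1] for p in tool_path) / len(tool_path)
    base_target = (cx, cy)
    reachable = all(math.dist(p, base_target) <= arm_reach for p in tool_path)
    return base_target, reachable


base, ok = plan_base_and_arm([(0.0, 0.0), (2.0, 0.0)], arm_reach=1.5)
```

If `ok` is false, the path would need to be split into segments, each with its own base stop, before issuing the second control instruction.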
As shown in fig. 2, an embodiment of the present application provides a spatial interaction method applied to the spatial interaction device, which includes steps S201 to S204 as follows:
in step S201, sensor data generated by a user according to a task to be performed by the robot, moving the spatial interaction device in a work space where the task is to be performed, is acquired from the sensor unit.
In step S202, interaction information generated by the user interacting with the spatial interaction device is acquired from the interaction unit.
In step S203, a robot control instruction is generated according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, where the robot control instruction at least includes a movement path and a target operation, determined according to the sensor data and the interaction information, of the task to be executed by the robot.
In step S204, the robot control command is sent to the robot, so that the robot executes the task to be executed according to the movement path and the target operation.
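Steps S201 to S204 can be sketched as the pipeline below. All object and method names (`sensor_unit.read()`, `sending_unit.send()`, the `compute` callback) are hypothetical interfaces assumed for illustration; the patent only defines the four steps, not these APIs.

```python
from dataclasses import dataclass

@dataclass
class RobotControlInstruction:
    movement_path: list      # waypoints of the task to be executed
    target_operation: str    # e.g. "weld", "pick_up", "put_down"

def spatial_interaction_method(sensor_unit, interaction_unit, sending_unit, compute):
    sensor_data = sensor_unit.read()        # S201: device motion in the workspace
    interaction = interaction_unit.read()   # S202: key presses and other input
    # S203: derive movement path and target operation from both data sources
    path, operation = compute(sensor_data, interaction)
    instruction = RobotControlInstruction(path, operation)
    sending_unit.send(instruction)          # S204: robot executes the task
    return instruction
```

Keeping S203 as an injected `compute` callback mirrors the text's point that the path and operation are jointly determined from the sensor data and the interaction information.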
In an embodiment of the present application, the generated control instruction may include a first control instruction for controlling the moving base of the robot and a second control instruction for controlling the mechanical arm, so that the robot first moves its base to the target area according to the first control instruction, and then moves the mechanical arm according to the movement path and performs the target operation according to the second control instruction.
According to the space interaction method, the user moves the spatial interaction device according to the task to be executed by the robot, and a robot instruction carrying at least the moving path and the target operation of the task is generated directly and transmitted to the robot, thereby achieving a mapping from human actions to robot instructions. The method has wide applicability, guarantees accurate positioning without depending on environment markers, and the spatial interaction device is easy to operate and requires no professional training, so the efficiency and accuracy of interaction between the user and the robot are greatly improved.
In an embodiment of the present application, the generating a robot control command according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit includes steps A1-A3:
step A1: determining the position information of the space interaction device according to the interaction information and the sensor data;
step A2: determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
step A3: and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
In an embodiment of the present application, the determining the location information of the spatial interaction device according to the interaction information and the sensor data includes steps B1-B2:
step B1: modeling surrounding environment to determine a robot coordinate system;
step B2: and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
When the user uses the spatial interaction device for the first time, the coordinate system of the spatial interaction device needs to be aligned with the robot coordinate system; that is, the surrounding environment is modeled to determine the robot coordinate system, and the position information of the spatial interaction device in the robot coordinate system is determined, so as to ensure that the map built by the VSLAM (Visual Simultaneous Localization and Mapping) algorithm can accurately track the absolute positions of the spatial interaction device and the robot in space.
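Once the one-time alignment has produced a transform between the device's VSLAM map frame and the robot coordinate system, each device position can be expressed in the robot frame by a homogeneous transform. The sketch below assumes such a 4x4 calibration matrix is available; the patent does not specify how it is represented.

```python
import numpy as np

def to_robot_frame(T_robot_from_map, p_device_in_map):
    """Express a 3-D device position, given in the VSLAM map frame, in the
    robot coordinate system via an assumed 4x4 homogeneous calibration
    transform T_robot_from_map."""
    p = np.append(np.asarray(p_device_in_map, dtype=float), 1.0)  # homogeneous coords
    return (np.asarray(T_robot_from_map, dtype=float) @ p)[:3]
```

With this transform fixed, every subsequent VSLAM pose estimate of the device maps to an absolute position in the robot frame without re-modeling the environment.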
The following are apparatus embodiments of the present application, which may be used to perform the method embodiments of the present application.
Fig. 3 is a block diagram of a spatial interaction device, which may be a terminal or a part of a terminal, implemented as part or all of an electronic device by software, hardware, or a combination of both, according to an example embodiment. As shown in fig. 3, the apparatus includes:
a first obtaining module 301, configured to obtain, from the sensor unit, sensor data generated by the user moving the spatial interaction device, according to a task to be performed by the robot, in the work space where the task is to be performed;
a second obtaining module 302, configured to obtain, from the interaction unit, interaction information generated by interaction between the user and the spatial interaction device;
a processing module 303, configured to generate a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, where the robot control instruction at least includes a movement path and a target operation of a task to be executed by the robot, which are determined according to the sensor data and the interaction information;
and a sending module 304, configured to control the sending unit to send the robot control instruction to the robot, so that the robot executes the task to be executed according to the movement path and the target operation.
Optionally, the processing module 303 is configured to:
determining the position information of the space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
Optionally, the processing module 303 is configured to:
modeling surrounding environment to determine a robot coordinate system;
and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
In another embodiment of the present application, there is also provided a readable storage medium having stored thereon a computer program which, when executed by a processor, implements a spatial interaction method as described in any of the above.
In another embodiment of the present application, there is also provided a spatial interaction device, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring, from the sensor unit, sensor data generated by the user moving the spatial interaction device, according to a task to be executed by the robot, in the work space where the task is to be executed;
acquiring, from the interaction unit, interaction information generated by interaction between the user and the spatial interaction device;
generating a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, wherein the robot control instruction at least includes a moving path and a target operation, determined according to the sensor data and the interaction information, of the task to be executed by the robot;
and controlling the sending unit to send the robot control instruction to the robot, so that the robot executes the task to be executed according to the moving path and the target operation.
It should be noted that, the specific implementation of the processor in this embodiment may refer to the corresponding content in the foregoing, which is not described in detail herein.
In the present specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another. Since the device disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and reference may be made to the description of the method section for relevant details.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may reside in Random Access Memory (RAM), flash memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A spatial interaction device for human-computer interaction, comprising:
the sensor unit is used for acquiring sensor data generated by a user moving the spatial interaction device, according to a task to be executed by the robot, in a work space where the task is to be executed;
the interaction unit is used for acquiring interaction information generated by interaction between a user and the space interaction device;
the computing unit is used for generating a robot control instruction according to the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, wherein the robot control instruction at least comprises a moving path and a target operation, determined according to the sensor data and the interaction information, of the task to be executed by the robot;
and the sending unit is used for sending the robot control instruction to the robot so as to enable the robot to execute the task to be executed according to the moving path and the target operation.
2. The apparatus according to claim 1, wherein the calculation unit generates the robot control instruction based on the sensor data acquired by the sensor unit and the interaction information acquired by the interaction unit, comprising:
determining position information of a space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
3. The apparatus of claim 2, wherein the computing unit determining location information of the spatial interaction device from the interaction information and the sensor data comprises:
modeling surrounding environment to determine a robot coordinate system;
and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
4. The apparatus according to claim 1, wherein the spatial interaction device includes a preset portion, and the sensor unit is configured to acquire sensor data generated by the user moving the spatial interaction device with reference to the preset portion.
5. A spatial interaction method applied to the spatial interaction device of any one of claims 1-4, comprising:
acquiring, from the sensor unit, sensor data generated by a user moving the spatial interaction device, according to a task to be executed by the robot, in a work space where the task is to be executed;
acquiring, from the interaction unit, interaction information generated by interaction between the user and the spatial interaction device;
generating a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, wherein the robot control instruction at least comprises a moving path and a target operation, determined according to the sensor data and the interaction information, of the task to be executed by the robot;
and controlling the sending unit to send the robot control instruction to the robot, so that the robot executes the task to be executed according to the moving path and the target operation.
6. The method of claim 5, wherein generating the robot control command from the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit comprises:
determining the position information of the space interaction device according to the interaction information and the sensor data;
determining a moving path and target operation of a task to be executed by the robot according to the interaction information and the position information of the space interaction device;
and generating a robot control instruction so that the robot executes the task to be executed according to the moving path and the target operation.
7. The method of claim 6, wherein determining location information of a spatial interaction device based on the interaction information and sensor data comprises:
modeling surrounding environment to determine a robot coordinate system;
and determining the position information of the space interaction device in the robot coordinate system according to the interaction information and the sensor data.
8. The method of claim 5, wherein the spatial interaction device comprises a preset portion, and acquiring, from the sensor unit, sensor data generated by the user moving the spatial interaction device in the work space of the robot comprises:
acquiring, from the sensor unit, sensor data generated by the user moving the spatial interaction device with reference to the preset portion.
9. A spatial interaction device for human-computer interaction, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring, from the sensor unit, sensor data generated by a user moving the spatial interaction device, according to a task to be executed by the robot, in a work space where the task is to be executed;
acquiring, from the interaction unit, interaction information generated by interaction between the user and the spatial interaction device;
generating a robot control instruction according to the sensor data acquired from the sensor unit and the interaction information acquired from the interaction unit, wherein the robot control instruction at least comprises a moving path and a target operation, determined according to the sensor data and the interaction information, of the task to be executed by the robot;
and controlling the sending unit to send the robot control instruction to the robot, so that the robot executes the task to be executed according to the moving path and the target operation.
10. A computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the steps of the method of any of claims 5-8.
CN202311360908.9A 2023-10-19 2023-10-19 Space interaction device and method applied to man-machine interaction Pending CN117519469A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311360908.9A CN117519469A (en) 2023-10-19 2023-10-19 Space interaction device and method applied to man-machine interaction


Publications (1)

Publication Number Publication Date
CN117519469A (en) 2024-02-06

Family

ID=89742826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311360908.9A Pending CN117519469A (en) 2023-10-19 2023-10-19 Space interaction device and method applied to man-machine interaction

Country Status (1)

Country Link
CN (1) CN117519469A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107199566A (en) * 2017-06-02 2017-09-26 东南大学 A kind of remote control system of the space-oriented station robot based on virtual arm
US20200368904A1 (en) * 2019-05-20 2020-11-26 Russell Aldridge Remote robotic welding with a handheld controller
CN113183133A (en) * 2021-04-28 2021-07-30 华南理工大学 Gesture interaction method, system, device and medium for multi-degree-of-freedom robot


Similar Documents

Publication Publication Date Title
Ong et al. Augmented reality-assisted robot programming system for industrial applications
US11724388B2 (en) Robot controller and display device using augmented reality and mixed reality
CN110394780B (en) Simulation device of robot
US10737396B2 (en) Method and apparatus for robot path teaching
US10166673B2 (en) Portable apparatus for controlling robot and method thereof
US20190202058A1 (en) Method of programming an industrial robot
Fang et al. A novel augmented reality-based interface for robot path planning
EP2263837A1 (en) Operation teaching system and operation teaching method
CN103250109A (en) Method and means for controlling a robot
US20050251290A1 (en) Method and a system for programming an industrial robot
US20150321351A1 (en) Intuitive Motion Coordinate System for Controlling an Industrial Robot
US11833697B2 (en) Method of programming an industrial robot
KR101876845B1 (en) Robot control apparatus
EP4177015B1 (en) Robot teaching system
CN117519469A (en) Space interaction device and method applied to man-machine interaction
US20220241980A1 (en) Object-Based Robot Control
CN112454363A (en) Control method of AR auxiliary robot for welding operation
JP7068416B2 (en) Robot control device using augmented reality and mixed reality, computer program for defining the position and orientation of the robot, method for defining the position and orientation of the robot, computer program for acquiring the relative position and orientation, and method for acquiring the relative position and orientation.
CN111152230B (en) Robot teaching method, system, teaching robot and storage medium
CN117260776A (en) Man-machine interaction method and device applied to robot
Jaju et al. Telepresence System with 3D Mouse and Path-Planning Functionality
IT202100027485A1 (en) APPARATUS AND PROCEDURE FOR PROGRAMMING ROBOTS BY MEANS OF DEMONSTRATION
CN117140498A (en) Robot motion trail teaching method and teaching device
Sziebig et al. Visual Programming of Robots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination