WO2024166238A1 - Robot control device, robot control system, and robot control method
- Publication number
- WO2024166238A1 (PCT/JP2023/004112)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- display
- processing unit
- robot control
- person
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/06—Safety devices
Description
- This disclosure relates to a robot control device, a robot control system, and a robot control method.
- A robot control device controls a robot's movements according to a control program or to commands given by a user via a user interface.
- When a worker works in a workspace shared with a robot controlled by such a robot control device, the worker's safety must be guaranteed. Methods have therefore been proposed to help workers work safely near robots.
- For example, in Patent Document 1, the control program that controls the robot's operations is read ahead, and when a branch command is found, multiple predicted trajectories, that is, the trajectories from the branch origin position to the target position of each branch destination, are generated as virtual images and displayed together with the robot. Furthermore, attributes such as the thickness of each predicted trajectory are changed according to the likelihood that its branch destination will be taken.
- In Patent Document 1, however, the robot's predicted trajectory is merely displayed along with its possible branching paths. It is therefore difficult for workers to determine at a glance whether the predicted trajectory matters to human safety.
- This disclosure has been made in consideration of the above problems, and aims to provide a robot control device, a robot control system, and a robot control method that make it easy for people to understand the robot's operations that affect human safety.
- A robot control device according to this disclosure controls a robot that operates according to a control program, and includes: a look-ahead operation calculation unit that reads ahead the control program and calculates robot operation information, which is information defined in the control program; a display image generation unit that generates a display image showing the robot operation information; a recognition processing unit that identifies whether or not a detection signal received from a detection device includes a person; and a display switching processing unit that switches the display image depending on whether or not the recognition processing unit recognizes that the detection signal includes a person.
- A robot control system according to this disclosure includes the robot control device according to this disclosure, a robot controlled by the robot control device, a detection device that detects an area including the robot's operating area, and a display device that receives the display image from the robot control device and displays it.
- A robot control method according to this disclosure includes the steps of: reading ahead the control program and calculating robot operation information, which is information defined in the control program; generating a display image showing the robot operation information; identifying whether or not a detection signal received from a detection device includes a person; and switching the display image depending on whether or not the detection signal includes a person.
- This disclosure makes it easier for people to understand the robot's behavior that affects their safety.
- FIG. 1 is a schematic configuration diagram showing a robot control system according to a first embodiment.
- FIG. 2 is a functional block diagram of the robot control system according to the first embodiment.
- FIG. 3 is a flowchart of the processing performed by the robot control device according to the first embodiment.
- FIG. 4 is an example of displaying a warning when a person approaches the robot according to the first embodiment.
- FIG. 5 is a hardware configuration diagram for implementing the robot control device according to the first embodiment.
- FIG. 6 is an example of a display image in which the robot operation information according to the first embodiment is expressed as a robot posture.
- FIG. 7 is an example in which the projection surface of the display device according to the first embodiment is a belt conveyor.
- FIG. 8 is an example in which an LED display serving as the display device is incorporated in the base of the robot according to the first embodiment.
- FIG. 9 is an example of displaying that the robot of the robot control system according to the first embodiment is stopped.
- FIG. 10 is a functional block diagram of a robot control system according to a second embodiment.
- FIG. 11 is a flowchart of the processing performed by the robot control device according to the second embodiment.
- FIG. 12 is a functional block diagram of a robot control system according to a third embodiment.
- FIG. 13 shows an example of changing the character direction by the image direction changing unit according to the third embodiment.
- FIG. 14 is a functional block diagram showing a robot control system according to a fourth embodiment.
- FIG. 15 is a diagram for explaining the display interval according to the fourth embodiment.
- FIG. 16 shows an example in which the display interval is changed in the robot control system according to the fourth embodiment.
- FIG. 17 shows an example of setting the display interval in the robot control system according to the fourth embodiment.
- FIG. 18 shows another example of setting the display interval in the robot control system according to the fourth embodiment.
- Embodiment 1. FIG. 1 is a schematic configuration diagram showing a robot control system 100.
- The robot control system 100 includes a robot 1, a robot control device 3 that controls the operation of the robot 1, an imaging device 4 that detects people, such as a worker 2, and objects in the operating area of the robot 1, and a display device 5 that displays robot operation information (described later) in the actual workspace.
- Robot 1 is, for example, a six-axis vertical articulated robot, and the arms of each axis are driven by a motor mounted on each axis.
- Robot 1 is equipped with an encoder (not shown), which measures the current rotation angle of the motor.
- In this example, robot 1 works together with the worker to grasp and move a workpiece 7 placed on a belt conveyor 6.
- In this embodiment, the robot 1 operates on the workpiece 7 placed on the belt conveyor 6, but this is not limiting.
- There may be peripheral devices other than the belt conveyor 6, such as a workbench, and the robot 1 may perform operations other than handling the workpiece 7.
- Worker 2 is a collaborating worker who works in the same workspace as robot 1. Worker 2 works on workpiece 7 in a workspace that includes the motion area of robot 1.
- Imaging device 4 detects an area that includes the motion area of robot 1.
- The imaging device 4 is a detection device that detects people and objects present around the robot, and the imaging information captured by the imaging device 4 serves as the detection signal.
- The robot control device 3 reads ahead the control program and, referring to the positions of the robot 1's arm and end effector, calculates robot operation information such as the robot 1's operation start position, current position, target position, operation path (trajectory), operation direction, and operation speed.
- Hereinafter, information related to the robot 1's operation defined in the control program is referred to as robot operation information.
- The robot control device 3 also holds a robot model with the arm length and rotation direction of each axis, and calculates the end effector's current feedback position, current feedback posture, current feedback speed, and so on by acquiring the motor rotation angles sent from the robot 1's encoders.
- The robot control device 3 generates an image that presents the robot operation information to the worker 2 and sends it to the display device 5.
- The display device 5 is, for example, a projector, and is connected to the robot control device 3 wirelessly or by wire.
- The display device 5 projects the image showing the robot operation information received from the robot control device 3 onto the workspace.
- In FIG. 1, the area surrounded by a dotted line is the display image.
- The display image includes the operation start position P1, current position Pc, target position P2, operation path L1, and operation direction of the end effector.
- The operation direction is represented by an arrow.
- The operation start position P1, current position Pc, target position P2, operation path L1, operation direction, and operation speed constitute the robot operation information.
- The operation path L1 is the operation trajectory from the operation start position P1 to the target position P2.
- FIG. 2 is a functional block diagram of a robot control system 100 according to the first embodiment.
- The robot control device 3 includes a robot control unit 10, a look-ahead operation calculation unit 11, a display image generation unit 12, a recognition processing unit 13, a judgment processing unit 14, and a display switching processing unit 15.
- The look-ahead operation calculation unit 11 reads ahead the control program and obtains the target position P2 and the type of motion interpolation.
- The types of motion interpolation include interpolation that moves the end effector in a straight line, interpolation that moves it along an arc, and interpolation that moves it along a pre-set free curve; other interpolation methods can also be specified.
- The look-ahead operation calculation unit 11 performs interpolation processing by calculating the command position P(t) of the end effector at each time t from the target position P2 in the control program and the type of motion interpolation.
- The look-ahead operation calculation unit 11 calculates the operation path L1 based on the command positions P(t).
- The look-ahead operation calculation unit 11 calculates the operation path L1, operation start position P1, target position P2, current position Pc, operation speed, and so on as robot operation information, and transmits them to the display image generation unit 12, the recognition processing unit 13, and the robot control unit 10.
- The operation start position P1 is the position where interpolation toward the target position P2 starts, or the position where the robot is currently stopped.
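- As an illustration of the interpolation processing described above, the following is a minimal sketch assuming linear interpolation and a fixed sampling period; the function names, vector positions, and the parameter dt are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def interpolate_linear(p1, p2, t, duration):
    """Command position P(t) on a straight line from start P1 to target P2."""
    s = min(max(t / duration, 0.0), 1.0)  # normalized progress in [0, 1]
    return p1 + s * (p2 - p1)

def look_ahead_path(p1, p2, speed, dt=0.01):
    """Sample the operation path L1 from P1 to P2 at the commanded speed."""
    duration = np.linalg.norm(p2 - p1) / speed  # time needed to reach P2
    times = np.arange(0.0, duration + dt, dt)
    return [interpolate_linear(p1, p2, t, duration) for t in times]

# Example: operation path from P1 to P2 at 0.2 m/s, sampled every 10 ms.
path_L1 = look_ahead_path(np.array([0.0, 0.0, 0.3]),
                          np.array([0.4, 0.2, 0.3]), speed=0.2)
```

- Circular or free-curve interpolation would replace interpolate_linear while the same sampling scheme produces the command positions P(t).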
- The robot control unit 10 controls the movement of the robot 1 by driving the motors of each axis based on the robot operation information obtained from the look-ahead operation calculation unit 11.
- The recognition processing unit 13 receives imaging information from the imaging device 4 and acquires information including the positions and movements of objects, such as machine tools, structures, and the workpiece 7, and of people, such as the worker 2.
- The recognition processing unit 13 recognizes whether or not a person is included in the imaging information. If a person is included, the recognition processing unit 13 analyzes, based on the imaging information, where the person is located and how the person is moving (for example, the direction and speed of movement).
- The recognition processing unit 13 also analyzes where objects are located. For example, model images of people and of objects such as machine tools and structures are registered in advance, and the analysis is performed by comparing the imaging information with the model images.
- The recognition processing unit 13 transmits the analysis results to the judgment processing unit 14.
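- This disclosure does not prescribe a specific recognition algorithm; as one possible realization of the model-image comparison, the sketch below uses OpenCV template matching, with the function name and the match threshold as illustrative assumptions.

```python
import cv2

def find_person(frame_gray, person_model_gray, threshold=0.8):
    """Return the top-left corner of the best match against a pre-registered
    person model image, or None if the match score is below the threshold."""
    scores = cv2.matchTemplate(frame_gray, person_model_gray, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc if max_score >= threshold else None
```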
- The judgment processing unit 14 judges whether or not the person and the robot 1 are approaching each other based on the robot operation information received from the look-ahead operation calculation unit 11 and the analysis results of the recognition processing unit 13. To judge whether or not the person and the robot 1 are approaching each other, for example, the relative distance between the person and the robot 1 is calculated over time, and if the relative distance becomes shorter over time, it is judged that they are approaching each other. The judgment processing unit 14 transmits the judgment result to the display switching processing unit 15.
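- A minimal sketch of the relative-distance judgment described above, assuming that person and robot positions are sampled at the same fixed interval; the function name and the strictly decreasing criterion are illustrative choices.

```python
import numpy as np

def is_approaching(person_positions, robot_positions):
    """Judge approach: True if the relative distance between the sampled
    person and robot positions becomes shorter over time."""
    distances = [np.linalg.norm(p - r)
                 for p, r in zip(person_positions, robot_positions)]
    return all(later < earlier for earlier, later in zip(distances, distances[1:]))
```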
- The display image generation unit 12 generates a display image showing the robot operation information obtained from the look-ahead operation calculation unit 11 and transmits it to the display switching processing unit 15.
- For example, display images of the operation start position P1, current position Pc, target position P2, and operation path L1 are generated.
- The display switching processing unit 15 switches the display image based on the judgment result of the judgment processing unit 14.
- In other words, the display switching processing unit 15 switches the display image depending on whether or not the recognition processing unit 13 recognizes that the imaging information includes a person. Specifically, if the judgment processing unit 14 judges that a person is approaching in the direction of the robot 1's movement, the display image received from the display image generation unit 12 is modified, for example by adding a warning, and the modified display image is sent to the display device 5.
- FIG. 3 is a flowchart of the processing performed by the robot control device 3 according to the first embodiment.
- First, the look-ahead operation calculation unit 11 of the robot control device 3 reads ahead the control program, calculates the robot operation information, and transmits it to the display image generation unit 12 (step S1).
- The display image generation unit 12 generates a display image based on the robot operation information and transmits it to the display switching processing unit 15 (step S2).
- The recognition processing unit 13 acquires the imaging information captured by the imaging device 4 (step S3).
- The recognition processing unit 13 then analyzes the imaging information to determine where objects are located and whether a person is present and, if so, where the person is located and the direction and speed of the person's movement, and transmits the analysis results to the judgment processing unit 14 (step S4). If a person is present (step S4: Yes), the process proceeds to step S5; if not (step S4: No), the process proceeds to step S7.
- The judgment processing unit 14 determines from the received analysis results whether or not the person and the robot 1 are approaching each other, and transmits the result to the display switching processing unit 15 (step S5).
- If the person and the robot 1 are approaching each other (step S5: Yes), the display switching processing unit 15 transmits a display image modified, for example, by adding a warning to the display device 5 (step S6). Specific modifications include adding a warning and changing the font size; if a display element other than the warning would interfere with the warning, the warning is displayed with priority. If the person and the robot 1 are not approaching each other (step S5: No), the display switching processing unit 15 transmits the display image received from the display image generation unit 12 to the display device 5 as is (step S7).
- The order of steps S1 and S2 relative to steps S3 and S4 is not limited to the above; steps S3 and S4 may be performed before steps S1 and S2.
- The display image indicated by the dotted line in FIG. 4 is an example of the display image of FIG. 1 modified by the display switching processing unit 15.
- Here, the display switching processing unit 15 adds a warning text 20 and a mark 21, makes the characters "P1" and "Pc" smaller than the warning text 20, and deletes the lower-priority characters "P2" and "L1" as well as the arrow.
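- As an illustration of this switching logic, the sketch below represents each display element with a priority and a size; the element scheme, priorities, and sizes are assumptions for illustration and are not defined in this disclosure.

```python
def switch_display(elements, person_approaching):
    """elements: list of dicts such as {"name": "P1", "priority": 2, "size": 12}.
    Returns the element list to be sent to the display device."""
    if not person_approaching:
        return elements  # step S7: pass the display image through as is
    kept = [dict(e) for e in elements if e["priority"] >= 2]  # drop low-priority items
    for e in kept:
        e["size"] = 8  # shrink remaining labels so the warning dominates
    warning = {"name": "WARNING", "priority": 3, "size": 24}
    return [warning] + kept  # step S6: warning displayed with priority
```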
- FIG. 5 is a hardware configuration diagram for realizing the robot control device 3. As shown in FIG. 5, part or all of the robot control device 3 is composed of a CPU 31 (Central Processing Unit), a storage device 32, an IO (input/output) interface 33, and the like.
- The storage device 32 is composed of a ROM (Read Only Memory), an HDD (Hard Disk Drive), and the like.
- The imaging device 4, the display device 5, and the robot 1 are connected wirelessly or by wire to the IO interface 33 of the robot control device 3.
- The storage device 32 stores the control program, the model images of people and of objects such as machine tools and structures used by the recognition processing unit 13, the arm model of the robot 1, and the like.
- A series of processes of the robot control device 3 is realized by the CPU 31 executing the control program stored in the storage device 32.
- As described above, the imaging device 4 captures the situation around the robot 1, and it is detected whether or not a person is present. If a person is present, the judgment processing unit 14 judges whether or not the person and the robot 1 are approaching based on the robot operation information and the movement of the person. If a person is approaching (danger is anticipated), the image generated by the display image generation unit 12 is modified and displayed, for example by adding a warning, so that the robot's operations related to human safety can be displayed in a way that is easy for people to understand.
- Although the robot control system 100 has been described as being equipped with the imaging device 4, a distance sensor, a visual sensor, an infrared sensor, or a position receiver may be used instead of the imaging device 4.
- When a position receiver is used, the worker 2 need only carry a position transmitter while working.
- When the position receiver receives a position from the worker 2's position transmitter, the recognition processing unit 13 can recognize that a person is present at that position. In this case as well, the same effects as described above can be achieved.
- People and objects may also be identified from the amount of change in the acquired imaging information over time. If a distance sensor or visual sensor is used instead of the imaging device 4, people and objects are identified from the amount of change in the acquired sensor information over time. If an infrared sensor is used instead of the imaging device 4, a threshold value corresponding to human body temperature can be registered in advance, and the recognition processing unit 13 can recognize that a person is present when the sensor output exceeds the threshold. In these cases as well, the same effects as described above can be achieved.
- In step S7 of FIG. 3, an example is shown in which the display image is sent as is when the person and the robot are not approaching each other, but changes different from those made when they are approaching (step S6) may be made instead. In that case, the changes are less conspicuous than the attention-drawing changes of step S6, such as using less noticeable colors or smaller text.
- The posture of the robot 1 may also be displayed as the robot operation information.
- FIG. 6 is an example of a display image in which the robot operation information is the posture of the robot 1.
- The display image generation unit 12 uses the arm model to generate the posture of the robot after a certain period of time. In this way, the robot operation information can be recognized three-dimensionally. In other words, the robot's operations that relate to people's safety can be displayed in a more understandable manner.
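- A minimal sketch of generating such a future posture from an arm model, assuming a planar two-link arm for brevity (the robot 1 in this disclosure is a six-axis arm); the link lengths, angles, and prediction step dt are illustrative.

```python
import math

def forward_kinematics(lengths, angles):
    """Joint positions of a planar serial arm from link lengths and joint angles."""
    x = y = heading = 0.0
    joints = [(x, y)]
    for length, angle in zip(lengths, angles):
        heading += angle
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        joints.append((x, y))
    return joints

def future_posture(lengths, angles, velocities, dt):
    """Posture after a certain period dt, advancing each joint by its velocity."""
    predicted = [a + w * dt for a, w in zip(angles, velocities)]
    return forward_kinematics(lengths, predicted)

# Example: two 0.3 m links, joints rotating at 0.5 rad/s, posture 1 s ahead.
posture = future_posture([0.3, 0.3], [0.0, 0.5], [0.5, 0.5], dt=1.0)
```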
- FIG. 7 shows an example in which the projection surface of the display device 5 is the belt conveyor 6.
- The display device 5 may also be a head-mounted display or an LED display.
- FIG. 8 shows an example in which an LED display serving as the display device 5 is built into the base of the robot 1. As robot operation information, the operation direction is displayed as a graphic and the operation speed as text.
- When the robot 1 is stopped, a display to that effect may be shown.
- FIG. 9 is an example of a display indicating that the robot 1 is stopped.
- For example, the robot 1 may be temporarily stopped and then restarted to continue its operation from where it left off. If a person is within the operating range of the robot 1, the person may collide with the robot 1 when it restarts, or may be startled by the robot 1's movements and fall over. For this reason, by displaying that the robot is stopped together with the trajectory of its movement after restart, the person can be informed, even while the robot is stopped, of the movements the robot 1 will make on restart, which are movements that concern the person's safety.
- Although the worker 2 has been described as an example of a person, people other than the worker 2, such as a supervisor or a visitor, can also be detected and handled in the same way.
- Embodiment 2. FIG. 10 is a functional block diagram showing a robot control system 100 according to the second embodiment. Descriptions of the parts that are the same as in the first embodiment are omitted.
- The robot control system 100 according to the second embodiment will be described with reference to FIG. 10.
- The robot control system 100 according to the second embodiment differs from that of the first embodiment in that the judgment processing unit 14 is replaced by a collision determination unit 16.
- The rest of the system is the same as in the first embodiment.
- In the second embodiment, the recognition processing unit 13 analyzes the imaging information to determine whether a person is present, where the person is located, where objects are located, the speed of the person or objects, and so on, and transmits the results to the collision determination unit 16.
- The collision determination unit 16 determines whether or not a person or an object will collide with the robot 1 based on the robot operation information received from the look-ahead operation calculation unit 11 and the analysis results of the recognition processing unit 13.
- The collision determination unit 16 transmits the determination result to the display switching processing unit 15.
- FIG. 11 is a flowchart of the processing performed by the robot control device 3 according to the second embodiment. Compared to the flowchart in FIG. 3, it differs in that steps S5 and S6 are not included and steps S8 to S11 have been added, but the rest is similar. Explanations of the same parts as in FIG. 3 will be omitted.
- If a person is present (step S4: Yes), the collision determination unit 16 determines whether or not the person will collide with the robot 1 based on the analysis results of the recognition processing unit 13 (step S8). If it is determined that a collision will occur (step S8: Yes), the display switching processing unit 15 adds a warning to the display image and transmits it to the display device 5 (step S10). If no collision will occur (step S8: No), the display switching processing unit 15 transmits the display image as is (step S7).
- If no person is present (step S4: No), the collision determination unit 16 determines whether or not an object will collide with the robot 1 from the analysis results of the recognition processing unit 13 (step S9). If the object will collide with the robot 1 (step S9: Yes), the display switching processing unit 15 adds a warning that is less conspicuous than the warning of step S10 to the display image and transmits it to the display device 5 (step S11). If no collision will occur (step S9: No), the display switching processing unit 15 transmits the display image as is (step S7).
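- A minimal sketch of the collision determination, assuming straight-line extrapolation of the person's (or object's) motion against the robot's predicted path; the time horizon and clearance threshold are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def will_collide(pos, vel, robot_path, dt=0.01, horizon=2.0, clearance=0.1):
    """True if the extrapolated person/object position comes within
    `clearance` meters of the robot's predicted path within `horizon` seconds."""
    steps = min(int(horizon / dt), len(robot_path))
    for k in range(steps):
        extrapolated = pos + vel * (k * dt)  # straight-line prediction at step k
        if np.linalg.norm(extrapolated - robot_path[k]) < clearance:
            return True
    return False
```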
- In step S10, to make the warning stand out, the line thickness may be made thicker than that of the warning used when an object collides with the robot 1, the letters may be made larger, or the color may be made different from the color of the warning in step S11.
- Not only may the image be changed, but a sound may also be generated.
- When a person is about to collide with the robot 1, the display switching processing unit 15 adds a warning in a more conspicuous manner than when an object is about to collide with the robot 1. This makes it easier for the worker 2 to understand the behavior of the robot 1 that has a large impact on safety, that is, behavior that will result in a collision with a person unless evasive action is taken.
- The display switching processing unit 15 also adds a warning when an object is about to collide with the robot 1, so that even in such a case the worker 2 can take action, such as removing the workpiece 7, before it collides with the robot 1. In other words, interference between an object and the robot 1 can be prevented without temporarily stopping the work, improving work efficiency.
- In the second embodiment, the collision determination unit 16 is provided instead of the judgment processing unit 14 of the first embodiment, but the collision determination unit 16 may be provided in addition to the judgment processing unit 14.
- In that case, when a collision is determined, the display switching processing unit 15 adds a warning that is more noticeable than the warning displayed when a person is merely approaching. This makes it possible to inform the worker of the different safety levels of the robot 1's operations that relate to human safety.
- Embodiment 3. FIG. 12 is a functional block diagram showing a robot control system 100 according to the third embodiment.
- The robot control system 100 according to the third embodiment will be described with reference to FIG. 12.
- The third embodiment differs from the first embodiment in that an image direction changing unit 17 is added to the display switching processing unit 15; the other configurations are the same. Descriptions of the parts that are the same as in the first embodiment are omitted.
- The image direction changing unit 17 acquires the person's position from the recognition processing unit 13.
- The image direction changing unit 17 further modifies the display image, to which a warning or the like has been added, so that it is easier to see from the acquired position.
- Specifically, the image direction changing unit 17 changes the orientation of the display image according to the person's position.
- FIG. 13 is an example of changing the character direction by the image direction changing unit 17.
- The characters of the display image created by the display image generation unit 12 are rotated so that they face the worker 2.
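- A sketch of this orientation change, assuming the projected image and the worker's position lie in the same 2D plane; the angle convention and function name are illustrative assumptions.

```python
import math

def text_rotation_deg(display_center, worker_pos):
    """Rotation that turns the baseline of the projected characters toward the
    worker so the text reads upright from the worker's viewpoint."""
    dx = worker_pos[0] - display_center[0]
    dy = worker_pos[1] - display_center[1]
    return math.degrees(math.atan2(dy, dx)) - 90.0

# Example: worker standing in the +x direction from the projected image.
angle = text_rotation_deg((0.0, 0.0), (1.0, 0.0))
```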
- In this way, by adding the image direction changing unit 17 to the display switching processing unit 15 of the robot control system 100 of the first embodiment, the display image can be presented in an orientation that is easy for the person to see.
- Embodiment 4. FIG. 14 is a functional block diagram showing a robot control system 100 according to the fourth embodiment.
- The robot control system 100 according to the fourth embodiment will be described with reference to FIG. 14.
- The robot control system 100 of the fourth embodiment differs from that of the first embodiment in that the look-ahead operation calculation unit 11 is provided with a display interval setting unit 18; the other configurations are the same. Descriptions of the parts that are the same as in the first embodiment are omitted.
- The look-ahead operation calculation unit 11 has a display interval setting unit 18 that acquires the display interval of the display image.
- The display interval setting unit 18 stores the display interval acquired in advance from the worker 2 in the storage device 32, such as a ROM (not shown).
- FIG. 15 is a diagram explaining the display interval.
- In FIG. 15, the current position is Pc, the current time is Tc, the operation start position is P1, the start time is T1, the target position is P2, and the time corresponding to the target position is T2.
- The display interval extends from time Tc-Ts, obtained by going back the past display interval Ts from the current time Tc, to time Tc+Te, obtained by going forward the future display interval Te from the current time Tc.
- The display interval setting unit 18 stores the past display interval Ts and the future display interval Te in the storage device 32.
- The look-ahead operation calculation unit 11 uses the past display interval Ts and the future display interval Te to calculate, as robot operation information, the changed display start position Ps corresponding to time Tc-Ts, the changed display end position Pe corresponding to time Tc+Te, and the motion trajectory of the robot 1 from the changed display start position Ps to the changed display end position Pe, and transmits these to the display image generation unit 12.
- In other words, the look-ahead operation calculation unit 11 calculates, as robot operation information, the position of the robot 1 corresponding to the time Tc+Te, which is the future display interval Te ahead of the current time Tc.
- The display image generation unit 12 then generates a display image including the position of the robot 1 corresponding to the time Tc+Te, which is the future display interval Te ahead of the current time Tc.
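- A minimal sketch of limiting the displayed trajectory to the interval from Tc-Ts to Tc+Te, assuming the look-ahead unit yields timestamped path samples (an assumption for illustration).

```python
def display_interval_slice(path, Tc, Ts, Te):
    """path: list of (time, position) samples. Returns the samples from the
    changed display start position Ps at time Tc-Ts up to the changed display
    end position Pe at time Tc+Te."""
    return [(t, p) for (t, p) in path if Tc - Ts <= t <= Tc + Te]

# Example: show 1 s of past and 2 s of future trajectory around the current time.
# section = display_interval_slice(path, Tc=10.0, Ts=1.0, Te=2.0)
```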
- FIG. 16 is an example in which the display interval is changed by the display image generation unit 12 according to the fourth embodiment.
- The display image includes the operation start position P1, the current position Pc, the operation path L1, the changed display start position Ps on the trajectory between the operation start position P1 and the current position Pc, the changed display end position Pe on the trajectory between the current position Pc and the target position P2, and the operation trajectory and operation direction from the changed display start position Ps to the changed display end position Pe.
- FIGS. 17 and 18 are examples of setting the display interval in the robot control system 100 according to the fourth embodiment.
- In FIG. 17, the display interval is set longer than in FIG. 18.
- The length of the display interval is determined according to the worker 2. For example, if the worker 2 is a beginner, it is difficult for the worker 2 to predict the robot's movements, so the display interval is made longer. Conversely, if the worker 2 is an expert, the worker 2 can predict the robot's movements, so the display interval is made shorter.
- In FIG. 18, the changed display start position Ps is set to the same position as the current position Pc.
- In this way, the display interval can be set by the display interval setting unit 18, so that the display interval can be changed according to, for example, the proficiency level of the worker 2.
- The past display interval Ts and the future display interval Te may be set to the same value, in which case only a single value needs to be acquired.
- Alternatively, the past display interval Ts may be set to a fixed value and only the future display interval Te may be acquired. In these cases as well, the same effects as described above can be achieved.
- Reference signs: 1 Robot, 2 Worker, 3 Robot control device, 4 Imaging device, 5 Display device, 6 Belt conveyor, 7 Workpiece, 10 Robot control unit, 11 Look-ahead operation calculation unit, 12 Display image generation unit, 13 Recognition processing unit, 14 Judgment processing unit, 15 Display switching processing unit, 16 Collision determination unit, 17 Image direction changing unit, 18 Display interval setting unit, 100 Robot control system.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/004112 WO2024166238A1 (ja) | 2023-02-08 | 2023-02-08 | Robot control device, robot control system, and robot control method |
JP2023563279A JP7589833B1 (ja) | 2023-02-08 | 2023-02-08 | Robot control device, robot control system, and robot control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/004112 WO2024166238A1 (ja) | 2023-02-08 | 2023-02-08 | Robot control device, robot control system, and robot control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024166238A1 (ja) | 2024-08-15
Family
ID=92262159
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/004112 WO2024166238A1 (ja) | 2023-02-08 | 2023-02-08 | ロボット制御装置、ロボット制御システム、およびロボット制御方法 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7589833B1 |
WO (1) | WO2024166238A1 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011089885A1 (ja) * | 2010-01-25 | 2011-07-28 | Panasonic Corporation | Danger presentation device, danger presentation system, danger presentation method, and program |
JP2018051653A (ja) * | 2016-09-27 | 2018-04-05 | Denso Wave Inc. | Display system for robot |
US20190299412A1 (en) * | 2018-03-29 | 2019-10-03 | Sick Ag | Augmented Reality System |
JP2019188531A (ja) * | 2018-04-25 | 2019-10-31 | FANUC Corporation | Robot simulation device |
WO2021095316A1 (ja) * | 2019-11-11 | 2021-05-20 | Hitachi, Ltd. | Robot system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7370755B2 (ja) * | 2019-08-05 | 2023-10-30 | Canon Inc. | Information processing device, information processing method, program, recording medium, and article manufacturing method |
JP7347169B2 (ja) * | 2019-11-29 | 2023-09-20 | Omron Corporation | Information presentation device, information presentation method, and information presentation program |
- 2023-02-08: WO application PCT/JP2023/004112 (WO2024166238A1), Application Filing, active
- 2023-02-08: JP application JP2023563279A (JP7589833B1), active
Also Published As
Publication number | Publication date |
---|---|
JP7589833B1 (ja) | 2024-11-26 |
JPWO2024166238A1 | 2024-08-15
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | WIPO information: entry into national phase | Ref document number: 2023563279; Country of ref document: JP |
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 23921070; Country of ref document: EP; Kind code of ref document: A1 |