JP2009012106A - Remote operation supporting device and program


Info

Publication number
JP2009012106A
Authority
JP
Japan
Prior art keywords
manipulator
object
coordinate system
laser scanner
model expression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2007175240A
Other languages
Japanese (ja)
Inventor
Keisuke Wada
圭介 和田
Original Assignee
Fuji Electric Systems Co Ltd
富士電機システムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Electric Systems Co Ltd
Priority to JP2007175240A
Publication of JP2009012106A
Application status: Pending


Abstract

PROBLEM TO BE SOLVED: To support remote operation of an object while allowing the object to be confirmed from an arbitrary direction, even when the position and attitude of the object are not known in advance.

SOLUTION: An environmental data acquisition means 21a acquires point cloud data on the object 11 measured by a laser scanner 14 as environmental data 15. An object model expression means 21b generates an object model expression 18 reflecting the shape and attitude of the object 11 by performing three-dimensional recognition processing based on the environmental data 15 of the object 11. A manipulator model expression means 21c generates a manipulator model expression 20 reflecting the operation state of a manipulator 12 based on the state 19 of each axis of the manipulator 12. A three-dimensional image generation means 21d displays the object model expression 18 and the manipulator model expression 20, viewed from a viewpoint designated in a three-dimensional space, three-dimensionally on the same screen.

COPYRIGHT: (C)2009,JPO&INPIT

Description

The present invention relates to a remote operation support device and a remote operation support program, and is particularly suitable for remotely operating a work robot placed in an environment that humans cannot easily enter, such as the inside of a nuclear reactor.

In places where humans cannot easily enter, such as radiation environments or space environments, operations such as maintenance and inspection are generally performed by remotely operating a work robot placed in the environment.
Here, if the work to be performed in maintenance or inspection is known in advance, or if the content of periodic maintenance work is the same every time, the robot's operation can be programmed beforehand so that the work is carried out automatically, without a human operating the robot.

On the other hand, even when the same work is performed regularly, the system must cope with unexpected situations, such as an obstacle that was not present last time or a device subject to maintenance being broken. To this end, Patent Document 1, for example, stores as an environmental model the position and orientation of objects in the work environment together with the positioning information needed to operate the robot. A composite image is then generated by combining the image captured by a camera with a rendering of the positioning information obtained from the environmental model, and manual operation is guided by displaying this composite image.
JP 2003-316661 A

However, the method disclosed in Patent Document 1 requires that the position and orientation of objects in the work environment and the positioning information needed for robot operation be prepared in advance as an environmental model. Moreover, it assumes that each object exists at the predetermined position registered in the environmental model, and that the state of the remotely operated object is maintained unchanged during operation.

For this reason, with the method of Patent Document 1, the object can no longer be operated remotely if its position differs from the registered initial position or changes during operation. A further problem is that only composite images from preset viewpoints can be displayed, so the object cannot be confirmed from an arbitrary direction.
Therefore, an object of the present invention is to provide a remote operation support device and a remote operation support program that can support remote operation of an object while allowing the object and the work environment to be confirmed from an arbitrary direction, even when the position and orientation of the object are not known in advance.

In order to solve the above problem, the remote operation support device according to claim 1 comprises: environmental data acquisition means for acquiring measurement data from a laser scanner and measurement data from a stereo camera as environmental data; object model expression means for generating an object model expression reflecting the shape and orientation of an object by performing three-dimensional recognition processing based on the laser scanner's measurement data of the object's shape; manipulator model expression means for generating a manipulator model expression reflecting the operation state of a manipulator based on the state of each axis of the manipulator; and three-dimensional image generation means for three-dimensionally displaying, on the same screen together with the environmental data, the object model expression and the manipulator model expression viewed from a viewpoint designated in a three-dimensional space.

The remote operation support device according to claim 2 further comprises camera coordinate system conversion means for converting the coordinate system of the stereo camera into the coordinate system of the laser scanner, and the three-dimensional image generation means displays the stereo camera measurement data converted into the laser scanner coordinate system together with the laser scanner measurement data.

The remote operation support program according to claim 3 causes a computer to execute: a step of acquiring measurement data from a laser scanner and measurement data from a stereo camera as environmental data; a step of generating an object model expression reflecting the shape and orientation of an object by performing three-dimensional recognition processing based on the laser scanner's measurement data of the object's shape; a step of generating a manipulator model expression reflecting the operation state of a manipulator based on the state of each axis of the manipulator; and a step of three-dimensionally displaying, on the same screen together with the environmental data, the object model expression and the manipulator model expression viewed from a viewpoint designated in a three-dimensional space.

The remote operation support program according to claim 4 further comprises a step of converting the coordinate system of the stereo camera into the coordinate system of the laser scanner, and the stereo camera measurement data converted into the laser scanner coordinate system is displayed together with the laser scanner measurement data.

As described above, according to the present invention, three-dimensional models reflecting the current state of the object and the manipulator are generated and displayed three-dimensionally on the same screen together with the environmental data from the laser scanner and the stereo camera. Therefore, even when the position and orientation of the object change, the current state of the object and the manipulator can be displayed accurately along with the work environment, and because the object and the manipulator are modeled three-dimensionally, the display can be switched to an image from an arbitrary viewpoint. This enables smooth remote operation of objects even in environments that humans cannot easily enter, such as radiation or space environments.

Hereinafter, a remote control support device according to an embodiment of the present invention will be described with reference to the drawings.
FIG. 1 is a block diagram showing a schematic configuration of a remote control support device according to an embodiment of the present invention.
In FIG. 1, a target object 11 and a manipulator 12 that operates the object 11 are arranged in a work area where work such as maintenance and inspection is performed. The object 11 and the manipulator 12 can be installed in a place that humans cannot easily enter, such as a radiation environment or a space environment. The term manipulator 12 covers all mechanical devices used in remote operation, including work robots.

Here, the manipulator 12 is provided with a gripper for gripping the object 11, and the gripper is connected to arms that move it to an arbitrary position in the three-dimensional space or rotate it in an arbitrary direction. The arms are connected to one another via joints, and each arm can be configured to rotate about the X, Y, and Z axes. An axis is assigned to each arm and a three-dimensional coordinate system is assigned to each axis, so the position of each arm can be determined by observing the state of its axis. The manipulator 12 is also provided, near the gripper, with a stereo camera 13 that captures stereoscopic images of the environment around the object 11.

On the other hand, a remote operation support device 15 that supports remote operation of the manipulator 12 is installed in an area from which a human gives commands to the manipulator 12.
The remote operation support device 15 is provided with environment data acquisition means 15a, object model expression means 15b, manipulator model expression means 15c, camera coordinate system conversion means 15d, and three-dimensional image generation means 15e, and is connected to a laser scanner 14 that measures the shape of the object 11 and to a display device 16 that displays the current states of the object 11 and the manipulator 12.

The environment data acquisition means 15a acquires the point cloud data on the object 11 measured by the laser scanner 14 as environmental data P1, and acquires the measurement data obtained by the stereo camera 13 as environmental data P3. The object model expression means 15b generates an object model expression reflecting the shape and orientation of the object 11 by performing three-dimensional recognition processing based on the environmental data P1 of the object 11. The manipulator model expression means 15c generates a manipulator model expression reflecting the operation state of the manipulator 12 based on the state of each axis of the manipulator 12. The camera coordinate system conversion means 15d converts the coordinate system of the stereo camera 13 into the coordinate system of the laser scanner 14 by performing a calibration between the stereo camera 13 and the laser scanner 14. The three-dimensional image generation means 15e displays the object model expression and the manipulator model expression, viewed from the viewpoint designated in the three-dimensional space, three-dimensionally on the same screen together with the environmental data from the laser scanner 14 and the stereo camera 13.

For the object model expression and the manipulator model expression, for example, a surface model generally used in computer graphics can be used.
The remote operation support device 15 is connected to manipulator environment data storage means 17, stereo camera environment data storage means 18, manipulator calibration result storage means 19, stereo camera calibration result storage means 20, object transformation matrix storage means 21, and manipulator transformation matrix storage means 22.
The manipulator environment data storage means 17 stores the three-dimensional coordinates A1, B1, C1, D1, ... in the laser scanner coordinate system for points in the manipulator coordinate system. The stereo camera environment data storage means 18 stores the three-dimensional coordinates A2, B2, C2, ... in the laser scanner coordinate system for coordinate values in the stereo camera coordinate system.

The manipulator calibration result storage means 19 stores the calibration result of the manipulator 12. The calibration result of the manipulator 12 is a rigid body transformation matrix that converts coordinate values in the manipulator coordinate system into the laser scanner coordinate system. By measuring, with the laser scanner 14, the laser scanner coordinate values corresponding to three or more points on the manipulator 12, the laser scanner 14 and the manipulator 12 can be calibrated (aligned). Specifically, given three measurement points on the manipulator 12, a 4×4 rigid transformation matrix can be calculated, composed of a 3×3 matrix representing the rotational relationship between the laser scanner coordinate system and the manipulator coordinate system and a three-dimensional vector representing their translational relationship. This calibration of the laser scanner 14 and the manipulator 12 can be performed beforehand, and the calibration result of the manipulator 12 registered in advance in the manipulator calibration result storage means 19.
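The patent does not spell out how this rigid transformation is computed from the measured point pairs. A common approach, shown here only as a minimal sketch, is an SVD-based (Kabsch) fit over corresponding points; the function name and array layout are assumptions:

```python
import numpy as np

def fit_rigid_transform(pts_src, pts_dst):
    """Estimate the 4x4 rigid transform T such that pts_dst ~= T @ pts_src.

    pts_src: (N, 3) points in the manipulator coordinate system.
    pts_dst: (N, 3) the same physical points measured in the laser scanner
             coordinate system. Requires N >= 3 non-collinear points.
    """
    c_src = pts_src.mean(axis=0)                    # centroids
    c_dst = pts_dst.mean(axis=0)
    H = (pts_src - c_src).T @ (pts_dst - c_dst)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                                  # 3x3 rotation (Kabsch)
    if np.linalg.det(R) < 0:                        # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src                           # translation vector
    T = np.eye(4)                                   # assemble 4x4 rigid matrix
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

The returned matrix has exactly the form described above: a 3×3 rotation block and a translation vector embedded in a 4×4 homogeneous matrix, suitable for registration in the manipulator calibration result storage means 19.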

The stereo camera calibration result storage means 20 stores the calibration result of the stereo camera 13, a rigid body transformation matrix that converts coordinate values in the stereo camera coordinate system into the laser scanner coordinate system. This calibration result can be calculated from the angle of each axis of the manipulator 12 and the calibration result of the manipulator 12.

The object transformation matrix storage means 21 stores a transformation matrix that converts points on the object 11 into three-dimensional coordinates in the reference coordinate system. The recognition result of the object 11 from the three-dimensional recognition processing corresponds to a 4×4 transformation matrix composed of a rotation element and a translation element, which aligns an object 11 placed beforehand in the basic coordinate system with the position and orientation of the object 11 in the reference coordinate system. Multiplying the basic coordinate system of the object 11 by this transformation matrix therefore converts the object 11 into the reference coordinate system.

The manipulator transformation matrix storage means 22 stores transformation matrices that convert points on the manipulator 12 into three-dimensional coordinates in the reference coordinate system. A transformation matrix can be provided for each axis 1, 2, ... of the manipulator 12; the current angle of each axis is read periodically from the manipulator's arms, converted into the corresponding axis coordinate system transformation matrix (axis 1 transformation matrix, axis 2 transformation matrix, ..., tip transformation matrix), and stored. A designated point on the manipulator 12 is then converted into three-dimensional coordinates in the reference coordinate system by multiplying its coordinates by these transformation matrices.
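As an illustration of what such per-axis transformation matrices might look like, the sketch below builds each matrix from a measured joint angle and a link length and chains them to map a designated point into the reference coordinate system. The joint convention (rotation about the local Z axis, link offset along X) is an assumption, not taken from the patent:

```python
import numpy as np

def axis_transform(angle_rad, link_length):
    """Hypothetical per-axis transformation matrix: rotation about the
    local Z axis by the measured joint angle, combined with a translation
    of link_length along the parent X axis to the next joint."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0, link_length],
                     [s,  c, 0.0, 0.0],
                     [0., 0., 1.0, 0.0],
                     [0., 0., 0.0, 1.0]])

def point_to_reference(axis_matrices, p_local):
    """Chain the stored per-axis matrices (axis 1 matrix, axis 2 matrix,
    ...) and map a designated point on the manipulator into the
    reference coordinate system."""
    T = np.eye(4)
    for T_axis in axis_matrices:
        T = T @ T_axis
    p = np.append(p_local, 1.0)     # homogeneous coordinates
    return (T @ p)[:3]
```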

The laser scanner 14 performs three-dimensional measurement by scanning the object 11 with laser light. When this measurement is performed, the environment data acquisition means 15a acquires the point cloud data on the object 11 as environmental data P1 and stores it in the remote operation support device 15.
The object model expression means 15b then generates an object model expression reflecting the shape and orientation of the object 11 by performing three-dimensional recognition processing based on the environmental data P1 of the object 11 while referring to a three-dimensional recognition database. The database stores spin images for the objects 11 that the manipulator 12 may operate. When three-dimensional recognition is performed on the environmental data P1, normal vectors are computed for the point cloud data of the object 11, and the spin images of the point cloud data measured by the laser scanner 14 are collated against the spin images stored in the three-dimensional recognition database. The three-dimensional recognition processing using spin images is described in detail in Japanese Patent Application Laid-Open No. 2006-139713.
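The spin-image details are in the cited publication; in outline, a spin image for an oriented point (a point paired with its normal vector) is a 2D histogram of neighboring points expressed by their radial distance from the normal line and their signed height along it. A minimal sketch under that reading, with arbitrary bin parameters:

```python
import numpy as np

def spin_image(p, n, cloud, bin_size=0.01, n_bins=32):
    """Accumulate a spin image for the oriented point (p, n).

    p: (3,) basis point; n: (3,) unit normal at p;
    cloud: (N, 3) point cloud data (e.g. from the laser scanner).
    Each point x is mapped to (alpha, beta): its radial distance from
    the line through p along n, and its signed height along n.
    """
    d = cloud - p
    beta = d @ n                                            # height along normal
    alpha = np.sqrt(np.maximum(np.einsum('ij,ij->i', d, d) - beta**2, 0.0))
    img = np.zeros((n_bins, n_bins))
    i = (beta / bin_size + n_bins / 2).astype(int)          # signed beta -> row
    j = (alpha / bin_size).astype(int)                      # alpha -> column
    ok = (i >= 0) & (i < n_bins) & (j >= 0) & (j < n_bins)  # keep in-range bins
    np.add.at(img, (i[ok], j[ok]), 1)                       # 2D histogram
    return img
```

Recognition then amounts to comparing such histograms, computed from the measured point cloud, against those stored in the three-dimensional recognition database.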

With spin-image-based three-dimensional recognition, the point cloud data of the entire object 11 need not be measured in order to recognize the object 11 three-dimensionally: measurement from a single direction by the laser scanner 14 suffices, which reduces the amount of calculation.
The object model expression means 15b also calculates a transformation matrix of the object 11 through this three-dimensional recognition processing based on the environmental data P1 and stores it in the object transformation matrix storage means 21.
In addition, the stereo camera 13 performs three-dimensional measurement of the surroundings of the object 11 by photographing them from a plurality of viewpoints. The environment data acquisition means 15a acquires the resulting measurement data as environmental data P3 and stores it in the remote operation support device 15.

Further, the manipulator 12 detects the state P2 of each of its axes and sends it to the remote operation support device 15. The manipulator model expression means 15c then generates a manipulator model expression reflecting the operation state of the manipulator 12 by constructing a shape analogous to the manipulator 12 in which its actual movement, derived from the state P2 of each axis, is reflected. The dimensions of the components of the manipulator 12, such as the arms, that are needed to generate the manipulator model expression can be registered in the remote operation support device 15 in advance. The angle of each axis of the manipulator 12 can serve as the state P2, and an angle sensor can be provided on each axis to detect it.

When the manipulator model expression means 15c acquires the state P2 of each axis of the manipulator 12, it converts it into the transformation matrices of the axis coordinate systems of the manipulator 12 and stores them in the manipulator transformation matrix storage means 22.
The camera coordinate system conversion means 15d then calculates the calibration result of the stereo camera 13 from the per-axis transformation matrices of the manipulator 12 and the calibration result of the manipulator 12, and registers it in the stereo camera calibration result storage means 20. Using the calibration result of the stereo camera 13, the coordinate values of the stereo camera coordinate system are converted into coordinate values of the laser scanner coordinate system, yielding the three-dimensional coordinates A2, B2, C2, ... in the laser scanner coordinate system, which are stored in the stereo camera environment data storage means 18.

The three-dimensional image generation means 15e then generates a two-dimensional image that shows, in three dimensions, the object model expression and the manipulator model expression viewed from the viewpoint designated in the three-dimensional space, and displays it overlaid on the same screen of the display device 16 together with the environmental data P1 from the laser scanner 14 and the environmental data P3 from the stereo camera 13 converted into the laser scanner coordinate system. When they are displayed on the display device 16, graphics software such as OpenGL can be used to overlay the object model expression and the manipulator model expression, viewed from an arbitrary viewpoint, on the same screen.
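The patent names OpenGL only as an example of suitable graphics software. As a stand-in for illustration, the following matplotlib sketch conveys the core idea: all four data sets are drawn in the common laser scanner coordinate system, and the viewpoint is a free parameter. All array names are hypothetical:

```python
import matplotlib.pyplot as plt

def show_scene(env_p1, env_p3, object_model_pts, manipulator_model_pts,
               elev=30, azim=45):
    """Overlay environmental data P1/P3 and the two model expressions on
    one screen, viewed from a designated viewpoint (elev/azim). All
    inputs are (N, 3) arrays in the laser scanner coordinate system."""
    ax = plt.figure().add_subplot(projection='3d')
    ax.scatter(*env_p1.T, s=1, c='gray', label='laser scanner (P1)')
    ax.scatter(*env_p3.T, s=1, c='lightblue', label='stereo camera (P3)')
    ax.scatter(*object_model_pts.T, s=4, c='red', label='object model')
    ax.scatter(*manipulator_model_pts.T, s=4, c='green', label='manipulator model')
    ax.view_init(elev=elev, azim=azim)   # switch to an arbitrary viewpoint
    ax.legend()
    plt.show()
```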

With an image showing the object model expression and the manipulator model expression three-dimensionally displayed on the display device 16 together with the environmental data P1 and P3, the operator can switch the viewpoint to whichever makes the remote operation easiest.
As a result, three-dimensional models reflecting the current states of the object 11 and the manipulator 12 are generated and displayed three-dimensionally on the same screen together with the environmental data P1 and P3 from the laser scanner 14 and the stereo camera 13. Even when the position and posture of the object 11 change or an obstacle is present near the object 11, the current states of the object 11 and the manipulator 12 are displayed accurately together with the work environment; and because the object 11 and the manipulator 12 are modeled three-dimensionally, the display can be switched to an image from an arbitrary viewpoint. Remote operation of the object 11 can thus proceed smoothly even in radiation environments, space environments, and other places that humans cannot easily enter.

The environment data acquisition means 15a, object model expression means 15b, manipulator model expression means 15c, camera coordinate system conversion means 15d, and three-dimensional image generation means 15e can be realized by having a computer execute a program in which instructions for the processing performed by these means are described.
If this program is stored on a storage medium such as a CD-ROM, the processing performed by the environment data acquisition means 15a, the object model expression means 15b, the manipulator model expression means 15c, the camera coordinate system conversion means 15d, and the three-dimensional image generation means 15e can be realized by loading the storage medium into the computer of the remote operation support device 15 and installing the program on that computer.

A program describing the instructions for the processing performed by the environment data acquisition means 15a, the object model expression means 15b, the manipulator model expression means 15c, the camera coordinate system conversion means 15d, and the three-dimensional image generation means 15e may be executed by a stand-alone computer, or its processing may be distributed across a plurality of computers connected to a network.

FIG. 2 is a diagram showing an example of the manipulator coordinate system and the stereo camera coordinate system applied to the remote operation support device according to the embodiment of the present invention.
In FIG. 2, a three-dimensional coordinate system can be provided for each axis 1, 2, ... of the manipulator 12. For example, the reference coordinate system of the manipulator 12 can be the X0/Y0/Z0 coordinate system, the coordinate system of axis 1 the X1/Y1/Z1 coordinate system, that of axis 2 the X2/Y2/Z2 coordinate system, that of axis 3 the X3/Y3/Z3 coordinate system, that of axis 4 the X4/Y4/Z4 coordinate system, and the coordinate system of the stereo camera 13 attached to the manipulator 12 the X5/Y5/Z5 coordinate system. The distances d1, d2, d3, and d4 between the axes of the manipulator 12 can also be set.

For example, a point in the coordinate system of axis 1 can be converted into a coordinate value in the laser scanner coordinate system by the following equation (1):
(calibration result of manipulator 12) × (transformation matrix of axis 1 coordinate system) × (coordinates of the point on axis 1) (1)
Similarly, a transformation matrix that converts coordinate values of the stereo camera coordinate system into coordinate values of the laser scanner coordinate system is given by the following equation (2):
(calibration result of manipulator 12) × (transformation matrix of axis 1 coordinate system) × (transformation matrix of axis 2 coordinate system) × (transformation matrix of axis 3 coordinate system) × (transformation matrix of axis 4 coordinate system) × (transformation matrix of stereo camera coordinate system) (2)
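Both equations are plain chain products of 4×4 homogeneous matrices. A small numpy sketch, assuming every factor is already stored as a 4×4 array (all function names are hypothetical):

```python
import numpy as np
from functools import reduce

def eq1_point_to_scanner(T_calib_manip, T_axis1, p_axis1):
    """Equation (1): map a point expressed in the axis-1 coordinate
    system into the laser scanner coordinate system."""
    p = np.append(p_axis1, 1.0)          # homogeneous point
    return (T_calib_manip @ T_axis1 @ p)[:3]

def eq2_stereo_calibration(T_calib_manip, T_axis1, T_axis2,
                           T_axis3, T_axis4, T_stereo):
    """Equation (2): the stereo camera calibration result, i.e. the
    matrix converting stereo camera coordinates to laser scanner
    coordinates."""
    return reduce(np.matmul, [T_calib_manip, T_axis1, T_axis2,
                              T_axis3, T_axis4, T_stereo])
```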

FIG. 3 is a flowchart showing the method of converting the position of the environmental data obtained by the stereo camera according to the embodiment of the present invention. The method in FIG. 3 is described on the assumption that the manipulator 12 has four axes.
In FIG. 3, the camera coordinate system conversion means 15d of FIG. 1 first reads the calibration result of the manipulator 12 from the manipulator calibration result storage means 19 (step S1). It then reads the axis angles of the manipulator 12 and the arm lengths d1 to d4 (step S2), multiplies the transformation matrix of the preceding axis by that of the current axis sequentially for the number of axes of the manipulator 12 (steps S3 and S5), and holds the multiplication result (step S4).

A movement amount that aligns the coordinate system of axis 4 of the manipulator 12 with the stereo camera coordinate system is then calculated (step S6), and the calibration result of the stereo camera 13 is obtained by multiplying the calibration result of the manipulator 12 by the transformation matrices of all axes of the manipulator 12 and adding the movement amount of axis 4 to the product (step S7).

Next, the camera coordinate system conversion means 15d reads the environmental data P3 and color information from the stereo camera 13 (step S8) and multiplies the environmental data P3 by the calibration result of the stereo camera 13, converting the coordinate values of the stereo camera coordinate system in the environmental data P3 into coordinate values of the laser scanner coordinate system (step S9). The color information is attached to the converted environmental data P3 (step S10), which is stored in the stereo camera environment data storage means 18 (step S11). It is then determined whether steps S9 to S11 have been repeated for the number of items of environmental data P3 (step S12); if not, steps S9 to S11 are repeated.
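Steps S8 to S11 amount to applying the stereo camera calibration result to every measured point and carrying the color information along. A vectorized sketch (array names and shapes are assumptions):

```python
import numpy as np

def convert_stereo_environment_data(T_stereo_calib, points_p3, colors):
    """Steps S8-S11 of FIG. 3: convert environmental data P3 from the
    stereo camera coordinate system to the laser scanner coordinate
    system and attach the color information.

    T_stereo_calib: 4x4 stereo camera calibration result (step S7).
    points_p3: (N, 3) stereo camera points; colors: (N, 3) RGB values.
    Returns an (N, 6) array ready for the stereo camera environment
    data storage means 18.
    """
    homo = np.hstack([points_p3, np.ones((len(points_p3), 1))])  # (N, 4)
    in_scanner = (homo @ T_stereo_calib.T)[:, :3]   # step S9 for all points
    return np.hstack([in_scanner, colors])          # step S10
```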

FIG. 4A shows a display example of the environmental data from the laser scanner, the model expression of the object, and the model expression of the manipulator according to the embodiment of the present invention; FIG. 4B shows a display example of the environmental data from the stereo camera, the environmental data from the laser scanner, the model expression of the object, and the model expression of the manipulator.
In FIG. 4A, the object model expression reflecting the shape and posture of the object 11 and the manipulator model expression reflecting the actual movement of the manipulator 12 are displayed on the display device 16 together with the environmental data P1 from the laser scanner 14.
In FIG. 4B, the same model expressions are displayed on the display device 16 together with the environmental data P1 from the laser scanner 14 and the environmental data P3 from the stereo camera 13 converted into the laser scanner coordinate system.

FIG. 1 is a block diagram showing the schematic configuration of a remote operation support device according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the manipulator coordinate system and the stereo camera coordinate system applied to the remote operation support device. FIG. 3 is a flowchart showing the method of converting the position of environmental data obtained by the stereo camera. FIG. 4A is a diagram showing a display example of the environmental data from the laser scanner, the model expression of the object, and the model expression of the manipulator; FIG. 4B is a diagram showing a display example that additionally includes the environmental data from the stereo camera.

Explanation of symbols

11 Object
12 Manipulator
13 Stereo camera
14 Laser scanner
15 Remote operation support device
15a Environment data acquisition means
15b Object model expression means
15c Manipulator model expression means
15d Camera coordinate system conversion means
15e Three-dimensional image generation means
16 Display device
17 Manipulator environment data storage means
18 Stereo camera environment data storage means
19 Manipulator calibration result storage means
20 Stereo camera calibration result storage means
21 Object transformation matrix storage means
22 Manipulator transformation matrix storage means

Claims (4)

1. A remote operation support device comprising:
environmental data acquisition means for acquiring measurement data from a laser scanner and measurement data from a stereo camera as environmental data;
object model expression means for generating an object model expression reflecting the shape and orientation of an object by performing three-dimensional recognition processing based on the laser scanner's measurement data of the object's shape;
manipulator model expression means for generating a manipulator model expression reflecting the operation state of a manipulator based on the state of each axis of the manipulator; and
three-dimensional image generation means for three-dimensionally displaying, on the same screen together with the environmental data, the object model expression and the manipulator model expression viewed from a viewpoint designated in a three-dimensional space.
2. The remote operation support device according to claim 1, further comprising camera coordinate system conversion means for converting the coordinate system of the stereo camera into the coordinate system of the laser scanner,
wherein the three-dimensional image generation means displays the stereo camera measurement data converted into the laser scanner coordinate system together with the laser scanner measurement data.
3. A remote operation support program causing a computer to execute:
a step of acquiring measurement data from a laser scanner and measurement data from a stereo camera as environmental data;
a step of generating an object model expression reflecting the shape and orientation of an object by performing three-dimensional recognition processing based on the laser scanner's measurement data of the object's shape;
a step of generating a manipulator model expression reflecting the operation state of a manipulator based on the state of each axis of the manipulator; and
a step of three-dimensionally displaying, on the same screen together with the environmental data, the object model expression and the manipulator model expression viewed from a viewpoint designated in a three-dimensional space.
4. The remote operation support program according to claim 3, further causing the computer to execute a step of converting the coordinate system of the stereo camera into the coordinate system of the laser scanner,
wherein the stereo camera measurement data converted into the laser scanner coordinate system is displayed together with the laser scanner measurement data.
JP2007175240A 2007-07-03 2007-07-03 Remote operation supporting device and program Pending JP2009012106A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007175240A JP2009012106A (en) 2007-07-03 2007-07-03 Remote operation supporting device and program


Publications (1)

Publication Number Publication Date
JP2009012106A (en) 2009-01-22

Family

ID=40353646

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007175240A Pending JP2009012106A (en) 2007-07-03 2007-07-03 Remote operation supporting device and program

Country Status (1)

Country Link
JP (1) JP2009012106A (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63124107A (en) * 1986-11-14 1988-05-27 Toshiba Corp Producing device for working environment model
JPH0857784A (en) * 1994-03-29 1996-03-05 General Electric Co <Ge> Remote maintenance system using robotic arm

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10271909B2 (en) 1999-04-07 2019-04-30 Intuitive Surgical Operations, Inc. Display of computer generated image of an out-of-view portion of a medical device adjacent a real-time image of an in-view portion of the medical device
US10433919B2 (en) 1999-04-07 2019-10-08 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US9101397B2 (en) 1999-04-07 2015-08-11 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US9232984B2 (en) 1999-04-07 2016-01-12 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US9345387B2 (en) 2006-06-13 2016-05-24 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US10137575B2 (en) 2006-06-29 2018-11-27 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US10008017B2 (en) 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc Synthetic representation of a surgical instrument
US9801690B2 (en) 2006-06-29 2017-10-31 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US10188472B2 (en) 2007-06-13 2019-01-29 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US10271912B2 (en) 2007-06-13 2019-04-30 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9901408B2 (en) 2007-06-13 2018-02-27 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US9333042B2 (en) 2007-06-13 2016-05-10 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US9629520B2 (en) 2007-06-13 2017-04-25 Intuitive Surgical Operations, Inc. Method and system for moving an articulated instrument back towards an entry guide while automatically reconfiguring the articulated instrument for retraction into the entry guide
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US8864652B2 (en) 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9516996B2 (en) 2008-06-27 2016-12-13 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the position and orienting of its tip
US10368952B2 (en) 2008-06-27 2019-08-06 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9717563B2 (en) 2008-06-27 2017-08-01 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxilary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US10258425B2 (en) 2008-06-27 2019-04-16 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
JP2012521855A (en) * 2009-03-31 2012-09-20 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Synthetic representation of surgical robot
US10282881B2 (en) 2009-03-31 2019-05-07 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US9956044B2 (en) 2009-08-15 2018-05-01 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US8903546B2 (en) 2009-08-15 2014-12-02 Intuitive Surgical Operations, Inc. Smooth control of an articulated instrument across areas with different work space conditions
US10271915B2 (en) 2009-08-15 2019-04-30 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US8918211B2 (en) 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9622826B2 (en) 2010-02-12 2017-04-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
CN103717358A (en) * 2011-08-02 2014-04-09 索尼公司 Control system, display control method, and non-transitory computer readable storage medium
US9199379B2 (en) 2012-12-28 2015-12-01 Fanuc Corporation Robot system display device
JP2014128845A (en) * 2012-12-28 2014-07-10 Fanuc Ltd Robot system display device
CN103909525A (en) * 2012-12-28 2014-07-09 发那科株式会社 Robot system display device
US10507066B2 (en) 2013-02-15 2019-12-17 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
JP2015164073A (en) * 2015-05-13 2015-09-10 東芝プラントシステム株式会社 Three-dimensional cad data creation system and three-dimensional cad data creation method


Legal Events

Date Code Title Description
2010-05-14 A625 Written request for application examination (by other person) — JAPANESE INTERMEDIATE CODE: A625
2011-04-22 A711 Notification of change in applicant — JAPANESE INTERMEDIATE CODE: A712
2011-11-17 A977 Report on retrieval — JAPANESE INTERMEDIATE CODE: A971007
2012-01-04 A131 Notification of reasons for refusal — JAPANESE INTERMEDIATE CODE: A131
2012-01-04 RD03 Notification of appointment of power of attorney — JAPANESE INTERMEDIATE CODE: A7423
2012-05-08 A02 Decision of refusal — JAPANESE INTERMEDIATE CODE: A02