US11833698B2 - Vision system for a robot - Google Patents
- Publication number
- US11833698B2 (application US17/273,235)
- Authority
- US
- United States
- Prior art keywords
- eye
- eye camera
- area
- robot
- images
- Prior art date
- Legal status (an assumption, not a legal conclusion)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0075—Manipulators for painting or coating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/006—Geometric correction
-
- G06T5/80—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N3/00—Scanning details of television systems; Combination thereof with generation of supply voltages
- H04N3/10—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
- H04N3/30—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical otherwise than with constant velocity or otherwise than in pattern formed by unidirectional, straight, substantially horizontal or vertical lines
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37074—Projection device, monitor, track tool, workpiece form, process on display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
Definitions
- the present disclosure relates to a robot system.
- Technologies are known in which an operator remotely manipulates a manipulator while watching a work target object displayed on a stereoscopic display unit (e.g., see Patent Document 1).
- The present disclosure is made in view of the above problem, and one purpose thereof is to provide a robot system which enables an operator to manipulate a robot body while three-dimensionally watching a part of a work area in detail.
- A robot system according to one aspect of the present disclosure includes a robot body having a working part configured to perform a work; a robot manipulation device used by an operator to manipulate the robot body; a left-eye camera and a right-eye camera configured to capture a left-eye capturing image and a right-eye capturing image, respectively, of a work area where the working part of the robot body performs the work; a stereoscopic display unit configured to display parallax images seen three-dimensionally by the operator with both eyes; an area manipulation device operated by the operator to specify a stereoscopic vision target area, to be seen three-dimensionally through the parallax images displayed on the stereoscopic display unit, in an absolute space in a field of view common between the left-eye camera and the right-eye camera; a robot controlling module configured to control operation of the robot body according to the operation of the robot manipulation device; and a stereoscopic display controlling module configured to extract images corresponding to the stereoscopic vision target area specified by the operation of the area manipulation device from the left-eye capturing image and the right-eye capturing image, and to display the extracted images on the stereoscopic display unit as the parallax images.
- the “absolute space” means a space where the left-eye camera and the right-eye camera exist, and a “position in the absolute space” is defined based on a given coordinate system, for example, a reference coordinate system of the robot body.
- the “left-eye camera and right-eye camera” mean a pair of cameras which have optical axes parallel to each other with a given gap therebetween.
- Fields of view of the left-eye camera and the right-eye camera mean spaces within angles of view of the left-eye camera and the right-eye camera, respectively.
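These definitions lend themselves to a simple geometric check. The sketch below models each field of view as a cone around the camera's optical axis (an idealization; the function names and the cone model are the editor's assumptions, not from the patent) and tests whether a point in the absolute space lies in the field of view common to both cameras:

```python
import math

def in_field_of_view(point, cam_pos, optical_axis, angle_of_view_deg):
    """Return True if `point` lies within the camera's view cone.

    `optical_axis` must be a unit vector; the field of view is modeled
    as a cone of half-angle (angle of view / 2) around that axis.
    """
    v = [p - c for p, c in zip(point, cam_pos)]
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        return True
    cos_angle = sum(a * b for a, b in zip(v, optical_axis)) / norm
    return cos_angle >= math.cos(math.radians(angle_of_view_deg / 2.0))

def in_common_field_of_view(point, left_cam_pos, right_cam_pos, axis, fov_deg):
    # The target area must be visible to BOTH cameras (the common field
    # of view); the parallel-axis assumption lets both share `axis`.
    return (in_field_of_view(point, left_cam_pos, axis, fov_deg) and
            in_field_of_view(point, right_cam_pos, axis, fov_deg))
```
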
- the stereoscopic display controlling module extracts the images corresponding to the stereoscopic vision target area specified by the operation of the area manipulation device, from the left-eye capturing image and the right-eye capturing image captured by the left-eye camera and the right-eye camera, respectively, and displays the extracted images on the stereoscopic display unit as the parallax images. Therefore, the operator can three-dimensionally see a desired part of the work area where the working part of the robot body performs the work in detail.
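As a concrete, hypothetical sketch of that extraction: with an idealized pinhole model, parallel optical axes, and the left camera at the origin, the center of the specified target area can be projected into each captured image and a window cropped around it. All names, the intrinsics, and the fixed-size crop here are illustrative assumptions, not the patent's method:

```python
import numpy as np

def project_pinhole(point_cam, f_px, cx, cy):
    """Project a 3D point in camera coordinates (z forward) to pixels."""
    x, y, z = point_cam
    return (cx + f_px * x / z, cy + f_px * y / z)

def extract_parallax_crops(left_img, right_img, target_center, baseline,
                           f_px, crop_half):
    """Crop matching windows around the target area in both images.

    Cameras are modeled with parallel optical axes along +z, the left
    camera at the origin and the right camera offset by `baseline` in +x.
    """
    h, w = left_img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    crops = []
    for img, cam_x in ((left_img, 0.0), (right_img, baseline)):
        # Express the target center in this camera's coordinates.
        pc = (target_center[0] - cam_x, target_center[1], target_center[2])
        u, v = project_pinhole(pc, f_px, cx, cy)
        u0 = int(round(float(np.clip(u - crop_half, 0, w - 2 * crop_half))))
        v0 = int(round(float(np.clip(v - crop_half, 0, h - 2 * crop_half))))
        crops.append(img[v0:v0 + 2 * crop_half, u0:u0 + 2 * crop_half])
    return crops  # [left-eye image, right-eye image]
```
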
- the stereoscopic display controlling module may extract images of the stereoscopic vision target area in the absolute space corresponding to a position of the working part of the robot body, from the left-eye capturing image and the right-eye capturing image captured by the left-eye camera and the right-eye camera, respectively, and display the extracted images on the stereoscopic display unit as the parallax images of an initial setting.
- the image of the stereoscopic vision target area in the absolute space corresponding to the position of the working part of the robot body is displayed so as to follow the operation of the robot body. Therefore, the operator can promptly and three-dimensionally see the desired part of the work area where the working part of the robot body performs the work in detail, by operating the area manipulation device while manipulating the robot body with the robot manipulation device.
- The stereoscopic display unit displays the parallax images of only the very small stereoscopic vision target area within the work area. Therefore, when an area far away from the currently displayed stereoscopic vision target area is desired to be displayed, the direction in which the stereoscopic vision target area should be moved is undecidable in some cases. According to this configuration, when the area manipulation device is not operated for such a reason, the image of the stereoscopic vision target area in the absolute space corresponding to the position of the working part of the robot body is displayed on the stereoscopic display unit as the parallax images of the initial setting. Therefore, by moving the stereoscopic vision target area from the initial setting as a start point, the area far away from the currently displayed stereoscopic vision target area can easily be displayed as the stereoscopic vision target area.
- The area manipulation device may be operated to adjust at least one of the size of the stereoscopic vision target area, the position of the stereoscopic vision target area, the parallax of the parallax images, and the enlargement or reduction in size of the parallax images.
- In this case, the stereoscopic display controlling module may execute image processing of the left-eye capturing image and the right-eye capturing image, and display, on the stereoscopic display unit, the parallax images in which at least one of the size of the stereoscopic vision target area, the position of the stereoscopic vision target area, the parallax of the parallax images, and the enlargement or reduction in size of the parallax images is adjusted.
- the desired part of the work area where the working part of the robot body performs the work can be seen three-dimensionally in detail, in the desired mode.
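Such adjustments can be realized purely as 2-D image processing on the two extracted crops. A minimal numpy sketch follows; the shift-based parallax change and nearest-neighbor zoom are stand-ins chosen by the editor, not the patent's stated implementation:

```python
import numpy as np

def adjust_parallax(left_crop, right_crop, shift_px):
    """Shift the two crops horizontally in opposite directions to change
    the apparent parallax (np.roll wraps pixels at the border; a real
    system would re-crop from the full images instead)."""
    return (np.roll(left_crop, shift_px, axis=1),
            np.roll(right_crop, -shift_px, axis=1))

def zoom(img, factor):
    """Enlarge or reduce an image by nearest-neighbor resampling."""
    h, w = img.shape[:2]
    ys = (np.arange(int(h * factor)) / factor).astype(int).clip(0, h - 1)
    xs = (np.arange(int(w * factor)) / factor).astype(int).clip(0, w - 1)
    return img[np.ix_(ys, xs)]
```
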
- An angle of view of each of the left-eye camera and the right-eye camera may be 150° or above and 360° or below.
- the stereoscopic display controlling module may correct the images extracted from the left-eye capturing image and the right-eye capturing image so as to remove image distortion caused by wide-angle lenses, and display the corrected images on the stereoscopic display unit as the parallax images.
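One way such a correction can work is sketched below for an idealized equidistant fisheye (r = f·θ) remapped to a rectilinear image (r = f·tan θ). This closed-form model is the editor's illustrative assumption; a real wide-angle lens would use calibrated distortion coefficients:

```python
import math

def undistort_point(u, v, cx, cy, f_px):
    """Map a pixel from an equidistant fisheye image (r = f * theta)
    to its rectilinear position (r = f * tan(theta))."""
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    if r == 0.0:
        return (u, v)            # the image center is undistorted
    theta = r / f_px             # incidence angle under the fisheye model
    r_rect = f_px * math.tan(theta)
    scale = r_rect / r           # > 1 away from the center: pixels move outward
    return (cx + dx * scale, cy + dy * scale)
```
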
- a plurality of pairs of the left-eye camera and the right-eye camera may be disposed surrounding the work area where the working part of the robot body performs the work, and the stereoscopic display controlling module may display, on the stereoscopic display unit, the parallax images corresponding to a selected pair of the left-eye camera and the right-eye camera.
- the operator can three-dimensionally see the desired part of the work area in detail as if he/she circles around the periphery of the work area.
- According to the present disclosure, a robot system which enables an operator to manipulate a robot body while three-dimensionally watching a part of a work area in detail can be provided.
- FIG. 1 is a perspective view illustrating one example of a configuration of hardware of a robot system and a work environment of the robot system according to Embodiment 1 of the present disclosure.
- FIG. 2 is a perspective view illustrating a situation where an operator manipulates a robot body, and performs a manipulation for specifying a stereoscopic vision target area to be seen three-dimensionally in detail.
- FIG. 3 is a functional block diagram illustrating one example of a configuration of a control system of the robot system illustrated in FIG. 1 .
- FIGS. 4 ( a ) to 4 ( d ) are schematic views schematically illustrating modes of adjusting the stereoscopic vision target area to be seen three-dimensionally in detail.
- FIG. 5 is a schematic view illustrating a relation between fields of view of a left-eye camera and a right-eye camera, and the stereoscopic vision target area to be seen three-dimensionally in detail by the operator.
- FIG. 6 is a schematic view schematically illustrating image processing to extract an image corresponding to the stereoscopic vision target area to be seen three-dimensionally in detail by the operator.
- FIG. 7 is a schematic view illustrating relations between fields of view of a plurality of pairs of the left-eye camera and the right-eye camera and a stereoscopic vision target area to be seen three-dimensionally in detail by the operator according to Embodiment 3.
- A robot system 100 is provided with a robot body 1 having a working part 11 which performs a work, a robot manipulation device 2 used by an operator 31 to manipulate the robot body 1 , a left-eye camera 3 and a right-eye camera 4 which capture a left-eye capturing image and a right-eye capturing image, respectively, of a work area where the working part 11 of the robot body 1 performs the work, a stereoscopic display unit 5 which displays parallax images 43 to be seen three-dimensionally by the operator 31 with both eyes, an area manipulation device 6 operated by the operator 31 to specify a stereoscopic vision target area 50 which is to be seen three-dimensionally through the parallax images 43 displayed on the stereoscopic display unit 5 and is in an absolute space in a field of view 23 common between the left-eye camera 3 and the right-eye camera 4 , a robot controlling module 9 which controls the operation of the robot body 1 according to the manipulation of the robot manipulation device 2 , and a stereoscopic display controlling module 10 which extracts images corresponding to the stereoscopic vision target area 50 from the left-eye capturing image and the right-eye capturing image and displays the extracted images on the stereoscopic display unit 5 as the parallax images 43 .
- the robot system 100 is provided with the robot body 1 , the robot manipulation device 2 , the left-eye camera 3 , the right-eye camera 4 , the stereoscopic display unit 5 , the area manipulation device 6 , and stereoscopic glasses 7 .
- the robot system 100 further includes a controller 8 .
- the controller 8 includes the robot controlling module 9 and the stereoscopic display controlling module 10 .
- a pair of the left-eye camera 3 and the right-eye camera 4 is provided corresponding to one robot body 1 .
- these components are described in detail in order.
- the robot body 1 is provided with the working part 11 .
- the working part 11 is configured at least to perform a given work.
- the working part 11 may be an end effector, for example. Examples of the end effector include a hand, a painting gun, a welding gun, and a nut runner.
- the working part 11 is the painting gun.
- the robot body 1 and the robot controlling module 9 constitute a robot.
- the robot is defined as, for example, “an intelligent machine system having three elemental technologies of a sensor, an intelligence/control system, and a drive system,” (see “Summary of WHITE PAPER Information and Communications in Japan,” the Japanese Ministry of Internal Affairs and Communications, 2015).
- the robot body 1 is comprised of, for example, an industrial robot, such as a vertical articulated robot, a horizontal articulated robot, a parallel link robot, a polar coordinates robot, a cylindrical coordinates robot, and a rectangular coordinates robot.
- a case where the robot body 1 is comprised of a robotic arm of a vertical articulated robot is illustrated.
- the manipulation device 2 may be any device, as long as it can manipulate the robot body 1 (including the working part 11 ).
- the manipulation device 2 may be constituted by a master robot having a similar shape to the robot body 1 , and the robot body 1 may be controlled as a slave robot.
- the manipulation device 2 may be a joystick.
- the manipulation device 2 may be a manipulation device for exclusive use customized by a specific application.
- In this embodiment, the manipulation device 2 is a manipulation device for exclusive use, customized as illustrated in FIG. 2 .
- the left-eye camera 3 and the right-eye camera 4 are comprised of digital cameras or analog cameras.
- the left-eye camera 3 and the right-eye camera 4 are disposed so that optical axes 3 b and 4 b of the respective cameras are parallel to each other having a given gap therebetween.
- the left-eye camera 3 and the right-eye camera 4 are disposed so as to be able to image the work area where the working part 11 of the robot body 1 performs a work to a work target object 21 .
- the left-eye camera 3 and the right-eye camera 4 have given angles of view (fields of view) 3 a and 4 a , respectively.
- Each of the angles of view 3 a and 4 a of the left-eye camera 3 and the right-eye camera 4 is, for example, 150° or above and 360° or below.
- Although the areas (fields of view) in the absolute space which can be imaged increase as the angles of view 3 a and 4 a increase, distortion of the captured images (the captured images curve more in a part closer to the periphery) also increases.
- the stereoscopic display unit 5 is comprised of, for example, a panel display.
- the panel display may be a known panel display.
- the stereoscopic display unit 5 is installed near the robot manipulation device 2 so that the operator can easily watch.
- the area manipulation device 6 may be any device, as long as it can be operated to specify the stereoscopic vision target area 50 in the absolute space in the field of view 23 which is common between the left-eye camera 3 and the right-eye camera 4 .
- the shape of the stereoscopic vision target area 50 is arbitrary.
- When the operator 31 operates the area manipulation device 6 , the area manipulation device 6 outputs area specifying information, including a position (e.g., a representative position) of the stereoscopic vision target area 50 in the absolute space, as an area manipulating signal.
- the area manipulation device 6 is comprised of, for example, a joystick.
- the area manipulation device 6 comprised of the joystick can be tilted in an arbitrary direction.
- the joystick is provided at its tip-end part with a forward button (not illustrated) and a rearward button (not illustrated).
- the joystick is further provided at its tip-end part with a plurality of buttons to control a mode of the stereoscopic vision target area (hereinafter, referred to as “area mode controlling buttons” (not illustrated)).
- The area mode controlling buttons are configured to be operated with a thumb while the operator 31 grips the area manipulation device 6 , and are pressed down according to a pressing force. Note that, in FIG. 2 , the area manipulation device 6 is illustrated with the forward button, the rearward button, and the area mode controlling buttons omitted for simplification.
- the robot controlling module 9 has a reference coordinate system of the robot body 1 (hereinafter, simply referred to as a “reference coordinate system”).
- the stereoscopic display controlling module 10 shares this reference coordinate system, and identifies a position in the absolute space based on this reference coordinate system.
- the “position in the absolute space” means a position in the space defined based on the reference coordinate system.
- When the area manipulation device 6 is tilted, the stereoscopic vision target area 50 is moved in the absolute space according to the amount of tilt, assuming that the extending directions of the optical axes of the left-eye camera 3 and the right-eye camera 4 correspond to the direction of the area manipulation device 6 from its tip-end part to its base-end part.
- When the forward button or the rearward button of the area manipulation device 6 is pressed down, the stereoscopic vision target area 50 moves forward or rearward according to the amount of pressing down of the forward button or the rearward button, assuming that the extending directions of the optical axes of the left-eye camera 3 and the right-eye camera 4 correspond to the pressing-down direction of the forward button.
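The mapping from device operation to motion of the target area can be sketched as follows; the camera-aligned frame, the gain value, and the function name are the editor's assumptions, not values from the patent:

```python
def move_target_area(center, tilt_x, tilt_y, push, gain=0.01):
    """Move the stereoscopic vision target area in a frame aligned with
    the cameras: x/y from the amount of joystick tilt, z (along the
    optical axes) from the forward/rearward buttons.

    `gain` converts device units to meters per update (an assumed scale).
    """
    x, y, z = center
    return (x + gain * tilt_x, y + gain * tilt_y, z + gain * push)
```
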
- An initial setting position is set for the stereoscopic vision target area 50 .
- This initial setting position serves as a reference position of the stereoscopic vision target area 50 in the absolute space corresponding to the position of the working part 11 of the robot body 1 .
- the image of the stereoscopic vision target area 50 at this initial setting position is displayed on the stereoscopic display unit 5 as the parallax images 43 of an initial setting corresponding to the position of the working part 11 of the robot body 1 .
- This initial setting position is suitably set according to the content of the work.
- In this embodiment, the initial setting position is set at a position separated by a given distance in the direction in which the painting gun 11 injects paint. This “given distance” is set to, for example, a distance suitable for painting the work target object 21 .
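The initial setting position described above can be computed from the tool pose, for example as below (a plain-Python sketch; the names are illustrative, and the spray direction need not be unit length):

```python
def initial_target_position(gun_pos, spray_dir, given_distance):
    """Place the initial stereoscopic vision target area a given
    distance ahead of the painting gun along its spray direction."""
    n = sum(c * c for c in spray_dir) ** 0.5  # normalize the direction
    return tuple(p + given_distance * c / n
                 for p, c in zip(gun_pos, spray_dir))
```
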
- Area specifying information other than the position of the stereoscopic vision target area 50 is also outputted as the area manipulating signal.
- The area specifying information other than the position of the stereoscopic vision target area 50 will be described later in detail.
- the stereoscopic glasses 7 are configured so that optical filters are attached to a frame instead of lenses of normal glasses.
- Examples of the optical filter include a polarizing filter (polarizer), a liquid crystal shutter, etc.
- For example, waveplates which produce circular polarization are attached to a display screen of the stereoscopic display unit 5 so that the rotating direction of the circular polarization alternates for every scan line, and odd scan lines and even scan lines display a left-eye image 41 and a right-eye image 42 , respectively.
- Left and right polarizing filters which can only transmit circular polarization corresponding to the left-eye image 41 and the right-eye image 42 , respectively, are attached to the stereoscopic glasses 7 .
- Alternatively, when liquid crystal shutters are used, the stereoscopic display unit 5 is driven at a high speed, and the left-eye image 41 and the right-eye image 42 are alternately displayed in a time division manner. The left and right liquid crystal shutters are opened and closed in synchronization with the time-division display.
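For the line-polarized variant, composing the display frame amounts to interleaving scan lines of the two images. A small numpy sketch (treating row 0 as the first scan line is an assumed convention):

```python
import numpy as np

def interleave_scanlines(left, right):
    """Compose a frame for a line-by-line polarized display: alternate
    scan lines take pixels from the left-eye image and the right-eye
    image, matching the alternating circular polarization per line."""
    out = np.empty_like(left)
    out[0::2] = left[0::2]    # 1st, 3rd, ... scan lines: left eye
    out[1::2] = right[1::2]   # 2nd, 4th, ... scan lines: right eye
    return out
```
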
- the controller 8 includes, for example, a processor and a memory.
- the controller 8 controls operation of the robot body 1 and controls the stereoscopic display on the stereoscopic display unit 5 by the processor reading and executing a given operation program stored in the memory.
- the controller 8 is comprised of, for example, a microcontroller, an MPU, an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), or a logic circuit.
- the controller 8 includes the robot controlling module 9 and the stereoscopic display controlling module 10 .
- the robot controlling module 9 and the stereoscopic display controlling module 10 are functional blocks implemented by the processor reading and executing a given operation program stored in the memory.
- the controller 8 is comprised of a sole controller which executes a centralized control, or a plurality of controllers which execute a distributed control.
- the controller 8 may be comprised of two controllers so that these two controllers implement the robot controlling module 9 and the stereoscopic display controlling module 10 , respectively.
- the controller 8 may be installed at an arbitrary place.
- the controller 8 may be installed, for example, inside a pedestal 12 which supports the robot body 1 .
- When the operator 31 operates the robot manipulation device 2 , the robot manipulation device 2 outputs a robot manipulating signal to the robot controlling module 9 .
- the robot controlling module 9 controls the operation of the robot body 1 according to the inputted robot manipulating signal.
- the robot body 1 operates according to the manipulation.
- the robot controlling module 9 has the reference coordinate system as described above, and identifies the position of the robot body 1 based on the reference coordinate system so as to control the operation of the robot body 1 .
- the stereoscopic display controlling module 10 controls operation of the left-eye camera 3 and the right-eye camera 4 , such as ON/OFF and focusing operation.
- the left-eye camera 3 and the right-eye camera 4 capture the work area where the working part 11 of the robot body 1 performs the work to the work target object 21 , and output the imaged left-eye capturing image 61 and right-eye capturing image 62 (see FIG. 6 ) to the stereoscopic display controlling module 10 , respectively.
- the area manipulation device 6 outputs the area manipulating signal to the stereoscopic display controlling module 10 .
- the stereoscopic display controlling module 10 executes image processing to the inputted left-eye capturing image 61 and right-eye capturing image 62 according to the inputted area manipulating signal so as to generate the parallax images 43 . Then, the stereoscopic display controlling module 10 outputs to the stereoscopic display unit 5 an image displaying signal for displaying the generated parallax images.
- the stereoscopic display unit 5 displays the parallax images 43 according to the inputted image displaying signal.
- the left-eye image 41 and the right-eye image 42 constituting the parallax images 43 are displayed next to each other on the display screen.
- the left-eye image 41 and the right-eye image 42 may be displayed overlapping with each other on the display screen.
- the stereoscopic display controlling module 10 displays, on the stereoscopic display unit 5 , the parallax images 43 of the initial setting corresponding to the position of the working part 11 of the robot body 1 . Therefore, the parallax images 43 can be displayed so as to follow the operation of the robot body 1 .
- In this embodiment, the parallax images 43 of the initial setting corresponding to the position of the working part (painting gun) 11 of the robot body 1 are displayed only when a given condition is satisfied while the area manipulation device 6 is not operated.
- a given first condition is defined to be a start timing of the work.
- The image of the stereoscopic vision target area 50 in the absolute space corresponding to the position of the working part 11 of the robot body 1 is displayed as the parallax images 43 of the initial setting when the work starts.
- the image of the stereoscopic vision target area 50 in the absolute space corresponding to the working part (painting gun) 11 of the robot body 1 is displayed so as to follow the operation of the robot body 1 . Therefore, the operator 31 can promptly and three-dimensionally see a desired part of the work area where the working part 11 of the robot body 1 performs the work in detail, by operating the area manipulation device 6 while manipulating the robot body 1 by the robot manipulation device 2 .
- a given second condition is defined that the area manipulation device 6 is not operated for a given period of time.
- The “given period of time” is suitably determined through calculation, experiment, simulation, etc. According to this configuration, the following operation and effects are achieved.
- The stereoscopic display unit 5 displays the parallax images 43 of only the very small stereoscopic vision target area 50 within the work area of the robot body 1 . Therefore, when an area far away from the currently displayed stereoscopic vision target area 50 is desired to be displayed, the direction in which the stereoscopic vision target area should be moved is undecidable in some cases. According to this configuration, when the area manipulation device 6 is not operated for the given time period for the reason described above, the image of the stereoscopic vision target area 50 in the absolute space corresponding to the position of the working part (painting gun) 11 of the robot body 1 is displayed on the stereoscopic display unit 5 as the parallax images of the initial setting. Therefore, by moving the stereoscopic vision target area 50 from the initial setting of the parallax images 43 as a start point, the area far away from the currently displayed stereoscopic vision target area 50 can easily be displayed as the stereoscopic vision target area 50 .
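The inactivity condition can be implemented with a simple timeout. In the sketch below the class and attribute names are the editor's, and the time source is passed in explicitly for testability; it decides whether to show the operator-specified area or fall back to the initial setting:

```python
class StereoViewSelector:
    """Choose the displayed stereoscopic vision target area: the
    operator-specified one, or the initial-setting area that follows
    the working part after a period of inactivity."""

    def __init__(self, timeout_s):
        self.timeout_s = timeout_s   # the "given period of time"
        self.last_op = None          # time of last area-device operation

    def operated(self, t):
        self.last_op = t             # area manipulation device operated

    def source(self, t):
        if self.last_op is None or (t - self.last_op) >= self.timeout_s:
            return "initial"         # follow the working part
        return "operator"
```
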
- the robot system 100 is installed, for example, inside a work room 14 .
- the work room 14 is illustrated to be see-through for convenience.
- a lift 22 , which suspends the work target object 21 and transfers it, passes through an upper part of the work room 14 .
- the work target object 21 is, for example, a link member which constitutes a robotic arm of an articulated robot.
- two robot systems 100 are disposed along the lift 22 .
- the robot body 1 of each robot system 100 is comprised of a vertical articulated robot, and a painting gun is attached to a tip-end part of the robot body 1 as the working part 11 .
- the link member transferred by the lift 22 as the work target object 21 is painted by the robot body 1 . The work area is the area including the work target object 21 (which keeps moving until the painting work by the robot body 1 finishes), the surroundings of the work target object 21 , and the working part 11 of the robot body 1 .
- the robot body 1 is provided on the pedestal 12 .
- the pair of the left-eye camera 3 and the right-eye camera 4 are disposed next to the robot body 1 .
- the pair of the left-eye camera 3 and the right-eye camera 4 are disposed on a placing stand so that their optical axes pass through the area where the work target object 21 hung from the lift 22 passes.
- the pair of the left-eye camera 3 and the right-eye camera 4 are disposed so that the work area where the working part (painting gun) 11 of the robot body 1 works is within their fields of view (angles of view).
- the pair of the left-eye camera 3 and the right-eye camera 4 , and the placing stand are accommodated in a transparent case 13 so as not to be painted.
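The placement constraint above, that the work area must lie within each camera's field of view (angle of view), can be checked with simple geometry. This is a simplified sketch assuming a conical field of view; the function name and the cone model are illustrative, as the patent gives no formula.

```python
import numpy as np

def in_field_of_view(camera_pos, optical_axis, point, half_angle_deg):
    """Return True if `point` lies within the camera's angle of view,
    modeled as a cone of half-angle `half_angle_deg` about the optical
    axis. Simplified placement check; names/model are assumptions."""
    to_point = np.asarray(point, float) - np.asarray(camera_pos, float)
    axis = np.asarray(optical_axis, float)
    axis = axis / np.linalg.norm(axis)
    dist = np.linalg.norm(to_point)
    if dist == 0:
        return True  # the camera's own position is trivially visible
    # Angle between the optical axis and the direction to the point.
    cos_angle = float(np.dot(to_point, axis)) / dist
    return cos_angle >= np.cos(np.radians(half_angle_deg))
```

The work area is within the common field of view of a camera pair when every sampled point of the area passes this check for both the left-eye and right-eye cameras.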
- An operation desk 16 and an operation chair 15 are disposed in a room next to the work room 14 . Although an operation desk 16 and an operation chair 15 are provided for each robot system 100 , only those corresponding to one robot system 100 are illustrated in FIG. 1 .
- the robot manipulation device 2 is provided to a right-side part of the operation chair 15 .
- the area manipulation device 6 is provided to a left-side part of the operation chair 15 .
- the operation desk 16 is placed in front of the operation chair 15 .
- the stereoscopic display unit 5 comprised of the panel display is placed on the operation desk 16 .
- the operator 31 is seated on the operation chair 15 , grips the robot manipulation device 2 by his/her right hand, and grips the area manipulation device 6 by his/her left hand.
- the operator 31 wears the stereoscopic glasses 7 and operates the robot manipulation device 2 and the area manipulation device 6 with his/her right and left hands, respectively, while watching three-dimensionally, through the stereoscopic glasses 7 , the parallax images 43 displayed on the stereoscopic display unit 5 . The robot body 1 then operates according to the manipulation of the robot manipulation device 2 , and the parallax images 43 are displayed on the stereoscopic display unit 5 according to the manipulation of the area manipulation device 6 .
- FIG. 6 is a schematic view illustrating the image processing that extracts the image corresponding to the stereoscopic vision target area the operator wants to see three-dimensionally in detail.
- the parallax images 43 of the stereoscopic vision target area 50 at the initial setting position are displayed on the stereoscopic display unit 5 as an initial setting image.
- the operator 31 operates the area manipulation device 6 by his/her left hand while operating the robot manipulation device 2 by his/her right hand.
- the operator 31 first operates the area manipulation device 6 (and the forward button or the rearward button) to position the stereoscopic vision target area 50 at the desired position. This positional information is then outputted to the stereoscopic display controlling module 10 as the area manipulating signal.
- the images 51 and 52 corresponding to the stereoscopic vision target area 50 specified by the positional information are extracted from the left-eye captured image 61 and the right-eye captured image 62 , which are inputted from the left-eye camera 3 and the right-eye camera 4 , respectively. These extracted images 51 and 52 are then corrected to remove the distortion caused by the wide-angle lenses, so that their distortion level becomes almost the same as that of an image captured with a standard lens. Note that the degree to which the distortion is removed by the correction is arbitrary, as long as it is corrected to the extent that the operator 31 can visually and accurately recognize the target object of the stereoscopic vision.
- the stereoscopic display controlling module 10 enlarges the pair of corrected images to a given size to generate the left-eye image 41 and the right-eye image 42 , and displays these images on the stereoscopic display unit 5 as the parallax images 43 .
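The extract-then-enlarge pipeline above can be sketched as follows. This is a minimal Python/NumPy illustration, not the patented implementation: the rectangular crop, the fixed horizontal disparity, and nearest-neighbor enlargement are all assumptions for brevity, and the wide-angle distortion removal is omitted (a real system would apply a lens model, e.g. OpenCV's `cv2.undistort`, before cropping).

```python
import numpy as np

def extract_parallax_images(left_img, right_img, center, size, out_size):
    """Extract the images corresponding to the stereoscopic vision target
    area from the left-eye and right-eye captured images, then enlarge
    them to a given output size. Distortion correction is omitted here;
    names, the crop model, and the fixed disparity are assumptions."""
    def crop_and_enlarge(img, cx, cy):
        h, w = size
        y0, x0 = cy - h // 2, cx - w // 2
        patch = img[y0:y0 + h, x0:x0 + w]
        # Nearest-neighbor enlargement: each output pixel maps back
        # into the cropped patch.
        oh, ow = out_size
        rows = np.arange(oh) * h // oh
        cols = np.arange(ow) * w // ow
        return patch[np.ix_(rows, cols)]
    cx, cy = center
    left = crop_and_enlarge(left_img, cx, cy)
    # The same area appears horizontally shifted in the right-eye image;
    # the shift (disparity) is modeled here as a fixed 4-pixel offset.
    right = crop_and_enlarge(right_img, cx - 4, cy)
    return left, right  # displayed together as the parallax images
```

In practice the disparity would come from the camera geometry rather than a constant, and interpolation (bilinear or better) would replace the nearest-neighbor sampling.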
- the operator 31 three-dimensionally sees the parallax images 43 through the stereoscopic glasses 7 .
- when the operator 31 tilts the area manipulation device 6 in a desired direction, he/she can three-dimensionally watch that direction of the work area of the robot body 1 within the field of view common to the pair of the left-eye camera 3 and the right-eye camera 4 . Therefore, merely by operating the area manipulation device 6 , the operator 31 can look around the work area as if positioning his/her head at the position of the pair of the left-eye camera 3 and the right-eye camera 4 and turning it in the up-and-down and left-and-right directions. Moreover, the desired partial area of the work area can be seen three-dimensionally in detail.
- the operator 31 can three-dimensionally watch the desired partial area of the work area in detail as if the operator 31 looks around the work area around him/her.
- the operability of the robot body 1 improves compared to the conventional technology.
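The look-around behavior described above, where tilting the area manipulation device pans the target area within the cameras' common field of view, can be sketched as a simple update rule. This is an illustrative sketch assuming a joystick-like tilt signal and a rectangular common field of view; all names and the step size are assumptions.

```python
def pan_target_area(center, tilt, common_fov, step=5.0):
    """Move the stereoscopic vision target area center according to the
    tilt of the area manipulation device, clamped to the field of view
    common to the left-eye and right-eye cameras. Illustrative model."""
    x_min, y_min, x_max, y_max = common_fov
    dx, dy = tilt  # each component in [-1.0, 1.0]
    x = min(max(center[0] + dx * step, x_min), x_max)
    y = min(max(center[1] + dy * step, y_min), y_max)
    return (x, y)
```

Clamping to the common field of view reflects the constraint that the stereoscopic vision target area must stay visible to both cameras at once, since both the left-eye and right-eye crops are needed.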
- the operator 31 operates the plurality of area mode controlling buttons of the area manipulation device 6 .
- the stereoscopic display controlling module 10 changes the size of the stereoscopic vision target area 50 (specifically, of the images 51 and 52 corresponding to the stereoscopic vision target area 50 ) as seen in the optical-axis direction of the pair of the left-eye camera 3 and the right-eye camera 4 , according to area specifying information for changing the size of the stereoscopic vision target area 50 . The stereoscopic display controlling module 10 then displays, on the stereoscopic display unit 5 , the parallax images 43 on which this change is reflected.
- the stereoscopic display controlling module 10 changes the parallax between the image 51 and the image 52 according to area specifying information for changing the parallax. Then, the stereoscopic display controlling module 10 displays, on the stereoscopic display unit 5 , the left-eye image 41 and the right-eye image 42 on which this changed parallax is reflected.
- the stereoscopic display controlling module 10 specifies the zooming of the image 51 and the image 52 according to area specifying information for specifying the zooming. Then, the stereoscopic display controlling module 10 displays, on the stereoscopic display unit 5 , the left-eye image 41 and the right-eye image 42 which are enlarged or reduced in size according to the specified zooming.
- This zooming is a pseudo zooming, carried out by adjusting the enlargement rate of the images 51 and 52 illustrated in FIG. 6 .
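The three adjustable display modes described above (target-area size, parallax, and pseudo zoom) can be gathered into one small state object. This is a hedged sketch: the field names, units, and defaults are invented for illustration and do not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class TargetArea:
    """State of the stereoscopic vision target area as adjusted by the
    area mode controlling buttons. All names/values are illustrative."""
    width: int = 160     # crop size in the captured images (pixels)
    height: int = 120
    disparity: int = 8   # horizontal shift between images 51 and 52
    zoom: float = 1.0    # pseudo zoom: enlargement rate of the crops

    def resize(self, factor):
        # Change the size of the target area as seen in the
        # optical-axis direction of the camera pair.
        self.width = int(self.width * factor)
        self.height = int(self.height * factor)

    def set_parallax(self, disparity):
        # Change the parallax reflected in the left/right-eye images.
        self.disparity = disparity

    def set_zoom(self, zoom):
        # Pseudo zooming: only the enlargement rate of the extracted
        # images changes; the cameras themselves are not zoomed.
        self.zoom = zoom
```

Because the zoom only rescales already-extracted crops, it needs no camera hardware support, which matches the "pseudo zooming" described in the text.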
- the desired part of the work area where the working part 11 of the robot body 1 performs the work can be seen three-dimensionally in detail, in the desired mode.
- the operator 31 can three-dimensionally see the desired part of the work area where the working part 11 of the robot body 1 performs the work, in detail.
- the pair of the left-eye camera 3 and the right-eye camera 4 of the robot system 100 of Embodiment 1 are provided so that the directions of their optical axes are changeable.
- a device for changing the directions of the optical axes can be comprised of a known posture changing device, such as a small articulated robot or a driving device for a parabolic antenna.
- the operator 31 can three-dimensionally see the desired part of a wider work area of the robot body 1 in detail.
- Embodiment 3 of the present disclosure is different from Embodiment 1 in terms of the following point, and the other points are similar to Embodiment 1. Below, the different point is described.
- FIG. 7 is a schematic view illustrating relations between fields of view of a plurality of pairs of the left-eye camera and the right-eye camera and the stereoscopic vision target area to be seen three-dimensionally in detail by the operator according to Embodiment 3.
- a plurality of pairs (here, four pairs) of the left-eye camera 3 and the right-eye camera 4 are disposed.
- the plurality of pairs of the left-eye camera 3 and the right-eye camera 4 are disposed surrounding the work area (not illustrated in FIG. 7 ) where the working part (painting gun) 11 of the robot body 1 performs the work.
- the plurality of pairs of the left-eye camera 3 and the right-eye camera 4 are disposed so that the common field of view 23 of each pair of the left-eye camera 3 and the right-eye camera 4 overlaps that of the adjacent pair.
- the area manipulation device 6 is provided with a camera selecting button which selects one pair from the plurality of pairs of the left-eye camera 3 and the right-eye camera 4 .
- the stereoscopic display controlling module 10 displays, on the stereoscopic display unit 5 , the parallax images 43 captured by the selected pair of the left-eye camera 3 and the right-eye camera 4 .
- the stereoscopic display controlling module 10 may automatically switch the pair of the left-eye camera 3 and the right-eye camera 4 according to the position of the stereoscopic vision target area 50 , and display the parallax images 43 thereof on the stereoscopic display unit 5 .
- the stereoscopic display controlling module 10 selects the pair of the left-eye camera 3 and the right-eye camera 4 which is closest to the stereoscopic vision target area 50 and whose optical axes are closest to the stereoscopic vision target area 50 .
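The automatic selection rule above can be sketched as a scoring function over camera pairs: distance to the target area first, alignment of the optical axis second. This is an illustrative 2-D Python sketch; the data model (position plus unit axis per pair) and the tie-breaking order are assumptions, since the patent does not give a formula.

```python
import math

def select_camera_pair(pairs, target):
    """Pick the camera pair closest to the stereoscopic vision target
    area, breaking ties by how closely the pair's optical axis points
    at the target. `pairs` is a list of (position, unit_axis) tuples
    in 2-D for brevity; this data model is an assumption."""
    def score(pair):
        pos, axis = pair
        to_t = (target[0] - pos[0], target[1] - pos[1])
        dist = math.hypot(*to_t)
        if dist == 0:
            return (0.0, 0.0)
        # Cosine of the angle between the optical axis and the
        # direction to the target; larger is better, so negate it.
        cos_a = (to_t[0] * axis[0] + to_t[1] * axis[1]) / dist
        return (dist, -cos_a)
    return min(pairs, key=score)
```

Re-running this selection whenever the target area moves gives the effect described next: the display hops between surrounding camera pairs as if the operator circled around the work area.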
- the operator 31 can three-dimensionally see the desired part of the work area in detail as if he/she circles around the periphery of the work area. Therefore, the operability of the robot body 1 further improves.
- the stereoscopic display unit 5 may be an HMD (Head Mounted Display) which is mounted on a head of the operator 31 .
- the robot system of the present disclosure is useful as a robot system which enables an operator to manipulate a robot body while three-dimensionally watching a part of a work area in detail.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-164779 | 2018-09-03 | ||
JP2018164779A JP7169130B2 (ja) | 2018-09-03 | 2018-09-03 | ロボットシステム |
PCT/JP2019/034227 WO2020050179A1 (ja) | 2018-09-03 | 2019-08-30 | ロボットシステム |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210323165A1 US20210323165A1 (en) | 2021-10-21 |
US11833698B2 true US11833698B2 (en) | 2023-12-05 |
Family
ID=69722286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/273,235 Active US11833698B2 (en) | 2018-09-03 | 2019-08-30 | Vision system for a robot |
Country Status (5)
Country | Link |
---|---|
US (1) | US11833698B2 (ja) |
EP (1) | EP3848164A4 (ja) |
JP (1) | JP7169130B2 (ja) |
CN (1) | CN112423942A (ja) |
WO (1) | WO2020050179A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230010975A1 (en) * | 2019-12-13 | 2023-01-12 | Kawasaki Jukogyo Kabushiki Kaisha | Robot system |
CN111645077A (zh) * | 2020-06-19 | 2020-09-11 | 国电南瑞科技股份有限公司 | 配网线路带电作业机器人地面监控系统及监控方法 |
WO2022124398A1 (ja) | 2020-12-10 | 2022-06-16 | 三菱電機株式会社 | 遠隔制御マニピュレータシステムおよび遠隔制御支援システム |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0421105A (ja) * | 1990-05-16 | 1992-01-24 | Hitachi Ltd | マニピユレータの立体教示装置 |
JPH05318361A (ja) * | 1992-05-20 | 1993-12-03 | Nec Corp | 物体操作方式 |
US5673082A (en) * | 1995-04-10 | 1997-09-30 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Light-directed ranging system implementing single camera system for telerobotics applications |
JPH08336166A (ja) * | 1995-06-07 | 1996-12-17 | Matsushita Electric Ind Co Ltd | 映像視聴装置 |
FR2894684B1 (fr) * | 2005-12-14 | 2008-03-21 | Kolpi Sarl | Systeme de visualisation pour la manipulation d'un objet |
JP2011182808A (ja) * | 2010-03-04 | 2011-09-22 | Fujifilm Corp | 医用画像生成装置、医用画像表示装置、医用画像生成方法及びプログラム |
JP5920911B2 (ja) | 2011-11-10 | 2016-05-18 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
- 2018
- 2018-09-03 JP JP2018164779A patent/JP7169130B2/ja active Active
- 2019
- 2019-08-30 WO PCT/JP2019/034227 patent/WO2020050179A1/ja unknown
- 2019-08-30 CN CN201980045586.4A patent/CN112423942A/zh active Pending
- 2019-08-30 US US17/273,235 patent/US11833698B2/en active Active
- 2019-08-30 EP EP19858091.2A patent/EP3848164A4/en active Pending
Patent Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06292240A (ja) | 1993-04-05 | 1994-10-18 | Nippon Steel Corp | 遠隔操作支援立体表示装置 |
JPH089423A (ja) | 1994-06-20 | 1996-01-12 | Mitsubishi Heavy Ind Ltd | 遠隔モニタ装置 |
US5684531A (en) * | 1995-04-10 | 1997-11-04 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Ranging apparatus and method implementing stereo vision system |
JPH09327044A (ja) | 1996-06-06 | 1997-12-16 | Tokyo Electric Power Co Inc:The | マニピュレータの立体視装置 |
JP2001039398A (ja) | 1999-07-30 | 2001-02-13 | Kawasaki Heavy Ind Ltd | ヘリコプタの操縦支援装置 |
JP2002354505A (ja) | 2001-05-29 | 2002-12-06 | Vstone Kk | 立体視システム |
US20170099433A1 (en) * | 2009-06-17 | 2017-04-06 | Lc Technologies, Inc. | Eye/Head Controls for Camera Pointing |
US20160165130A1 (en) | 2009-06-17 | 2016-06-09 | Lc Technologies, Inc. | Eye/Head Controls for Camera Pointing |
US8475377B2 (en) * | 2009-09-28 | 2013-07-02 | First Sense Medical, Llc | Multi-modality breast cancer test system |
US20110122232A1 (en) | 2009-11-26 | 2011-05-26 | Kenji Hoshino | Stereoscopic image display apparatus, compound-eye imaging apparatus, and recording medium |
JP2011114547A (ja) | 2009-11-26 | 2011-06-09 | Fujifilm Corp | 立体画像表示装置、複眼撮像装置及び立体画像表示プログラム |
US20110234584A1 (en) | 2010-03-25 | 2011-09-29 | Fujifilm Corporation | Head-mounted display device |
JP2011205358A (ja) | 2010-03-25 | 2011-10-13 | Fujifilm Corp | ヘッドマウントディスプレイ装置 |
US20120095619A1 (en) * | 2010-05-11 | 2012-04-19 | Irobot Corporation | Remote Vehicle Missions and Systems for Supporting Remote Vehicle Missions |
US9392258B2 (en) * | 2011-02-01 | 2016-07-12 | National University Of Singapore | Imaging system and method |
JP2013036243A (ja) | 2011-08-09 | 2013-02-21 | Topcon Corp | 建設機械制御システム |
WO2014004717A2 (en) | 2012-06-27 | 2014-01-03 | CamPlex LLC | Surgical visualization systems |
JP2015521913A (ja) | 2012-06-27 | 2015-08-03 | キャンプレックス インコーポレイテッド | 手術可視化システム |
US20140005484A1 (en) * | 2012-06-27 | 2014-01-02 | CamPlex LLC | Interface for viewing video from cameras on a surgical visualization system |
US9283680B2 (en) * | 2013-06-07 | 2016-03-15 | Kabushiki Kaisha Yaskawa Denki | Workpiece detector, robot system, method for producing to-be-processed material, method for detecting workpiece |
US20190290371A1 (en) * | 2016-09-29 | 2019-09-26 | Medrobotics Corporation | Optical systems for surgical probes, systems and methods incorporating the same, and methods for performing surgical procedures |
US20190355148A1 (en) * | 2017-02-06 | 2019-11-21 | Fujifilm Corporation | Imaging control device, imaging control method, and program |
US20180222056A1 (en) * | 2017-02-09 | 2018-08-09 | Canon Kabushiki Kaisha | Method of teaching robot and robot system |
US20190187477A1 (en) * | 2017-12-20 | 2019-06-20 | Seiko Epson Corporation | Transmissive display device, display control method, and computer program |
WO2019210322A1 (en) * | 2018-04-27 | 2019-10-31 | Truevision Systems, Inc. | Stereoscopic visualization camera and integrated robotics platform |
US20200084423A1 (en) * | 2018-09-06 | 2020-03-12 | Toyota Jidosha Kabushiki Kaisha | Mobile robot, remote terminal, control program for mobile robot, control program for remote terminal, control system, control method for mobile robot, and control method for remote terminal |
Non-Patent Citations (1)
Title |
---|
"Key Points of the 2015 White Paper on Information and Communications in Japan," the Japanese Ministry of Internal Affairs and Communications, 2015. |
Also Published As
Publication number | Publication date |
---|---|
WO2020050179A1 (ja) | 2020-03-12 |
EP3848164A4 (en) | 2022-06-15 |
US20210323165A1 (en) | 2021-10-21 |
CN112423942A (zh) | 2021-02-26 |
JP7169130B2 (ja) | 2022-11-10 |
EP3848164A1 (en) | 2021-07-14 |
JP2020037145A (ja) | 2020-03-12 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| AS | Assignment | Owner name: KAWASAKI JUKOGYO KABUSHIKI KAISHA, JAPAN. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KAMON, MASAYUKI; SUGIYAMA, HIROKAZU; SIGNING DATES FROM 20210325 TO 20210329; REEL/FRAME: 055920/0902
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
| STCF | Information on status: patent grant | PATENTED CASE