JP6297784B2 - Manipulator device - Google Patents

Manipulator device

Info

Publication number
JP6297784B2
JP6297784B2 (application JP2013059018A)
Authority
JP
Japan
Prior art keywords
manipulator
unit
touch panel
camera
screen
Prior art date
Legal status
Active
Application number
JP2013059018A
Other languages
Japanese (ja)
Other versions
JP2014184494A (en)
Inventor
大志茂 純
綿貫 啓一
楓 和憲
坂井田 千摩
平野 雅大
Original Assignee
株式会社椿本チエイン
国立大学法人埼玉大学
Priority date
Filing date
Publication date
Application filed by 株式会社椿本チエイン and 国立大学法人埼玉大学
Priority to JP2013059018A
Publication of JP2014184494A
Application granted
Publication of JP6297784B2
Legal status: Active

Description

  The present invention relates to a manipulator device including a manipulator that processes an object.
  There are manipulator devices that process an object under the naked eye or under a microscope (for example, Patent Documents 1 and 2). The user processes the object by operating an operation unit (for example, a dial or an operation lever) of the manipulator.
Japanese Patent Laid-Open No. 10-3043 (Patent Document 1); Japanese Patent Laid-Open No. 2003-84209 (Patent Document 2)
  However, in the case of the manipulator devices disclosed in Patent Documents 1 and 2, the operation unit of the manipulator does not enter the field of view of a user who is viewing the object to be processed. Therefore, the user cannot process the object and operate the manipulator at the same time until he or she becomes familiar with the operation of the manipulator.
  The present application has been made in view of such circumstances. An object of the present invention is to provide a manipulator device that is easy to operate and allows a user to place an object and an operation unit in the same field of view.
According to one aspect of the present application, a manipulator device that processes an object with a processing tool attached to a distal end portion of a manipulator includes: an imaging unit that images the object to be processed by the processing tool; a touch panel that displays the image captured by the imaging unit on a screen and receives an instruction for controlling the operation of the manipulator; a control unit that controls the operation of the manipulator based on the instruction received by the touch panel; and a storage unit that stores a distance on the screen of the touch panel in association with a movement distance of the distal end portion and stores a three-dimensional position of the object. The touch panel receives a three-dimensional target position of the processing tool corresponding to the object, and the control unit controls the operation of the manipulator based on the three-dimensional target position received by the touch panel and the contents stored in the storage unit. Further, the control unit switches between a coarse operation and a fine operation of the manipulator according to the gesture input received by the touch panel: when the touch panel receives a tap operation, the control unit switches to the coarse operation, and when the touch panel receives a drag operation, it switches to the fine operation.
  According to one aspect of the present application, the user can put the object and the operation unit in the same field of view.
FIG. 1 is a perspective view showing the appearance of the manipulator device from the front side. FIG. 2 is a block diagram showing a hardware configuration example of the manipulator device. FIG. 3 is a block diagram showing a hardware configuration example of a computer. FIG. 4 is a top view schematically showing the arrangement of components in the manipulator unit. FIG. 5 is an explanatory diagram illustrating the imaging screens of the first camera and the second camera displayed on the screen of the touch panel. FIG. 6 is an explanatory diagram illustrating the imaging screens of the first camera and the second camera displayed on the screen of the touch panel. FIG. 7 is a flowchart showing an example of the procedure by which the CPU transmits a pulse signal to the manipulator. FIG. 8 is a flowchart showing an example of the procedure by which the CPU transmits a pulse signal related to three-dimensional movement to the manipulator. FIG. 9 is a perspective view showing the appearance of an input device. FIG. 10 is a block diagram showing a hardware configuration example of the manipulator device. FIG. 11 is an explanatory diagram showing a configuration example of a clean box.
  Hereinafter, a manipulator device according to one example of the present application will be described with reference to the drawings showing embodiments. The manipulator device according to the present application is used to manipulate or process microorganisms, animal and plant cells, thin films, polycrystalline silicon, semiconductors, mechanical parts, tools, rocks, minerals, and the like. In the following, the manipulator device according to the present application is described by taking as an example the extraction of a shoot apex, which is the growth point of a plant body, for shoot tip culture (mericlone culture). The object to be processed in the present embodiment is, for example, a potato sprout.
Embodiment 1
FIG. 1 is a perspective view showing the appearance of the manipulator device 1 from the front side. The manipulator device 1 is placed on a substantially horizontal desk surface. Here, when the user faces the object to be processed by the manipulator device 1, the user's side of the object is the front or near side, and the opposite side is the rear or back side. Viewed from the user toward the object, the right side of the object is the right side of the manipulator device 1, and the left side of the object is the left side of the manipulator device 1. The desk-surface side of the object is the lower side of the manipulator device 1, and the opposite side is the upper side of the manipulator device 1.
FIG. 2 is a block diagram illustrating a hardware configuration example of the manipulator device 1. The manipulator device 1 includes a manipulator unit 10 and an operation display unit 20.
The manipulator unit 10 is a component that processes or manipulates (cuts, peels, removes, moves, assembles, disassembles, separates, etc.) an object with the manipulator 11. The manipulator unit 10 is also the component that captures images of the work of processing the object. The operation display unit 20 is a component that operates the manipulator 11 and displays the work of extracting the shoot apex from the potato sprout.
The manipulator unit 10 and the operation display unit 20 are connected by a cable that transmits an operation signal, an image signal, and the like. The manipulator unit 10 and the operation display unit 20 may be connected wirelessly.
In the example of FIG. 1, the manipulator unit 10 and the operation display unit 20 are arranged at adjacent positions. However, since the operation display unit 20 can remotely control the manipulator unit 10, the arrangement position of the manipulator unit 10 and the operation display unit 20 is not restricted by a physical distance.
The manipulator unit 10 includes a manipulator 11, a camera (imaging unit) 12, a movable stage 13, a base plate 14, and a clean box (box body) 15. In FIG. 2, the base plate 14 shown in FIG. 1 is omitted.
The manipulator 11 is a three-axis actuator that moves a processing tool (effector) for processing an object back and forth, left and right, and up and down. The manipulator 11 is driven by, for example, a stepping motor, a piezo element, or a linear motor. In the following, it is assumed that the manipulator 11 is driven by stepping motors that operate in synchronization with pulse signals.
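For orientation, a minimal sketch of how a pulse-driven stepping-motor axis relates pulse counts to linear travel is shown below. The step angle and lead value are illustrative assumptions; the patent does not give the motor or lead-screw specifications.

```python
# Minimal sketch: converting a desired linear travel into a pulse count for a
# stepping-motor-driven axis. Step angle and lead are assumed example values,
# not values taken from the patent.
STEP_ANGLE_DEG = 1.8        # full-step angle of the motor (assumed)
LEAD_MM_PER_REV = 1.0       # linear travel per motor revolution (assumed)

def pulses_for_distance(distance_mm: float) -> int:
    """Number of pulses needed to move the axis by distance_mm."""
    steps_per_rev = 360.0 / STEP_ANGLE_DEG
    mm_per_step = LEAD_MM_PER_REV / steps_per_rev
    return round(distance_mm / mm_per_step)

print(pulses_for_distance(0.2))  # e.g. a 0.2 mm move -> 40 pulses
```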
  The manipulator 11 has three rod-shaped bodies that are orthogonal to each other; each rod-shaped body corresponds to an arm 111 of the manipulator 11. Various processing tools for operating on the potato sprout are attached to the tip of one arm 111. The processing tool is, for example, a blade, a syringe, scissors 11m (Metzenbaum scissors), or a wire-driven articulated unit. FIG. 1 illustrates a state in which the scissors 11m are attached to the tip of the arm 111.
  The camera 12 is, for example, a CCD camera. The camera 12 may be a microscope camera equipped with a magnifying lens or a normal camera without a magnifying function. The camera 12 may be appropriately selected according to the processing target and the processing content. In the present embodiment, for example, a microscope camera is used as the camera 12 in order to image a minute shoot apex of about 0.2 mm.
  The number of cameras 12 may be one or more. In the present embodiment, since it is preferable to image the potato sprout three-dimensionally, the manipulator device 1 includes two cameras 12 whose optical axes are oriented in the front-rear direction and the left-right direction, respectively. Of course, the manipulator device 1 may include a third camera 12 that images the potato sprout from the vertical direction. Conversely, when it is possible to work with an image from only one direction, the manipulator device 1 may include only one camera 12.
  The movable stage 13 is a mounting table on which an object is mounted. The movable stage 13 can move an object placed on a stage surface substantially parallel to the desk surface, for example, by driving a motor under the control of the operation display unit 20. The movement of the movable stage 13 may be, for example, a translational movement in the front-rear or left-right direction, or may be a rotational movement that rotates around the position where the object is placed. Alternatively, the movement by the movable stage 13 may be translational movement and rotational movement.
  In the example of FIG. 1, a tray 131t on which the potato sprout is placed and a medium tray 132t for cultivating the extracted shoot apex are placed side by side in the left-right direction on the movable stage 13. When the scissors 11m attached to the tip of the manipulator 11 have extracted the potato shoot apex, the movable stage 13 moves the medium tray 132t to a position immediately below the scissors 11m. The manipulator 11 then moves the scissors 11m downward and transplants the potato shoot apex onto the medium tray 132t.
  The base plate 14 is a flat mounting plate on which the manipulator 11, the camera 12, and the movable stage 13 are mounted. The base plate 14 has a rectangular shape extending in the left-right direction.
  The clean box 15 has a rectangular parallelepiped shape. The clean box 15 is composed of, for example, a steel frame and transparent acrylic plates fitted into the frame. The clean box 15 is subjected to sterilization and antibacterial treatment. The size of the bottom of the clean box 15 is substantially the same as or slightly smaller than that of the base plate 14. The bottom of the clean box 15 is open. When the clean box 15 is placed on the base plate 14 along the circumference of the base plate 14, the manipulator 11, the camera 12, and the movable stage 13 placed on the base plate 14 are shielded from the outside by the wall surfaces of the clean box 15. The clean box 15 can keep the processing space of the object in a clean, aseptic state by preventing the entry of foreign matter, microorganisms, and the like.
  The manipulator unit 10 may include an illuminating device that irradiates the object on the movable stage 13 with light.
The operation display unit 20 includes a computer 21, a touch panel 22, a keyboard 23, a mouse 24, and a monitor 25.
The computer 21 is a device that controls the manipulator 11 and the movable stage 13 and displays an image captured by the camera 12 on the touch panel 22 and the monitor 25. The computer 21 is a desktop PC (personal computer), a notebook PC, a tablet PC, a smartphone, or the like. In the following, it is assumed that the computer 21 is a desktop PC. When the computer 21 is a tablet PC, a smartphone, or the like having an input function such as the touch panel 22, the manipulator device 1 may not include the touch panel 22, the keyboard 23, and the mouse 24.
  FIG. 3 is a block diagram illustrating a hardware configuration example of the computer 21. The computer 21 includes a central processing unit (CPU) 211, a read only memory (ROM) 212, and a random access memory (RAM) 213. The computer 21 includes a hard disk (storage unit) 214, a disk drive 215, an interface 216, and a communication unit 217. Each component of the computer 21 is connected to each other via a bus 21b.
  The CPU 211 is a processor and controls each component of the manipulator device 1. The CPU 211 reads the program 21P stored in the hard disk 214 to the RAM 213, and executes the program 21P read to the RAM 213.
The ROM 212 is a non-volatile semiconductor memory or a read-only storage medium other than the semiconductor memory. The ROM 212 stores a basic input / output system (BIOS) executed by the CPU 211 when the computer 21 is started up, firmware, and the like.
The RAM 213 is, for example, an SRAM or a DRAM, and temporarily stores work variables, data, and the like that are necessary during the process executed by the CPU 211. The RAM 213 is an example of a main storage device, and a flash memory, a memory card, or the like may be used instead of the RAM 213.
  The hard disk 214 is an auxiliary storage device that stores the program 21P executed by the CPU 211 and various files. The program 21P records a procedure of processing executed by the CPU 211.
  The hard disk 214 may be mounted inside the computer 21 or placed outside the computer 21. The hard disk 214 is one example of an auxiliary storage device, and may be replaced by a flash memory capable of storing a large amount of information or by an optical disk 21a such as a CD (Compact Disc), a DVD (Digital Versatile Disc), or a BD (Blu-ray Disc, registered trademark).
  The disk drive 215 is an auxiliary storage device that reads information from an optical disk 21a such as a CD, DVD, or BD, which is an external storage medium, and records information on the optical disk 21a. When the CPU 211 receives, via the interface 216, a command to eject the optical disk 21a from the disk drive 215, the CPU 211 ejects a tray (not shown) of the disk drive 215.
The interface 216 is a connector that connects the touch panel 22, the keyboard 23, the mouse 24, and the monitor 25 to the bus 21b. Note that the interface 216 may include a USB connector, a Light Peak connector, and the like.
The interface 216 is also connected to the manipulator 11, the camera 12, and the movable stage 13 of the manipulator unit 10.
The communication unit 217 is a wired or wireless communication modem, a LAN (Local Area Network) card, a router, or the like. The communication unit 217 is connected to a communication network such as a LAN, a WAN (Wide Area Network), the Internet, a telephone line, and a satellite line. Note that the communication unit 217 may include a parallel port or a printer port.
In addition, when the manipulator unit 10 and the operation display unit 20 are respectively disposed at remote locations, the communication unit 217 serves as a connection device for connecting the operation display unit 20 to the manipulator unit 10. In such a case, the same configuration unit as the communication unit 217 is added to the manipulator unit 10.
  The program 21P may be read from the optical disk 21a into the RAM 213 via the disk drive 215. Alternatively, the program 21P may be read into the RAM 213 from an external information processing apparatus or recording apparatus via the communication unit 217. Furthermore, a semiconductor memory 21c such as a flash memory in which the program 21P is recorded may be mounted in the computer 21.
Returning to FIG. 2, the description will be continued.
The touch panel 22 is an electronic component that combines a display device and a position input device such as a touch pad, and is an input device that operates a device by pressing a display on a screen. The display device of the touch panel 22 has a screen such as a liquid crystal display, an organic EL (Electro Luminescence) display, or a CRT (Cathode Ray Tube) display.
  The computer 21 displays an image captured by the camera 12 on the screen of the touch panel 22. The touch panel 22 generates input signals related to the manipulator 11 and the movable stage 13 based on a user's touch operation, and outputs the generated input signals to the computer 21. The computer 21 operates the manipulator 11 and the movable stage 13 based on the input signal received from the touch panel 22.
  The keyboard 23 is an input device that inputs character signals and the like to the computer 21. When the computer 21 receives a signal corresponding to an arrow key or the like from the keyboard, the computer 21 may operate the manipulator 11 according to the signal.
  The mouse 24 is an input device that operates pointers and icons displayed on the screens of the touch panel 22 and the monitor 25. When the computer 21 receives a signal related to the coordinates of the pointer from the mouse 24, the computer 21 operates the manipulator 11 based on the signal.
The monitor 25 is a display device that displays an image captured by the camera 12. The monitor 25 is used as an auxiliary when the user extracts the shoot apex from the potato sprout. The monitor 25 may be used as an educational display device when the user teaches a beginner how to use the manipulator device 1.
The image captured by the camera 12 may be transmitted directly to the monitor 25 without using the computer 21.
Next, the operation of the manipulator device 1 will be described.
FIG. 4 is a top view schematically showing the arrangement of components in the manipulator unit 10. In FIG. 4, the base plate 14 and the clean box 15 are omitted. In FIG. 4, the upper side shows the front side of the manipulator device 1, and the right side shows the left side of the manipulator device 1.
Hereinafter, it is assumed that the movable stage 13 is a disk-shaped rotary stage 13r that performs rotational movement. The potato sprout 1g is placed at substantially the center position of the rotary stage 13r. FIG. 4 shows the tip of the blade 11b extending from the left side toward the right. The blade 11b is an example of a processing tool attached to the distal end portion of the arm 111 included in the manipulator 11.
Hereinafter, the front-rear, left-right, and up-down directions are referred to as the x-axis, y-axis, and z-axis directions, respectively. The direction from the rear side to the front side is defined as + on the x-axis. The direction from right to left is defined as + on the y-axis. The direction from the bottom to the top is defined as + on the z-axis.
  The camera 12 of the manipulator unit 10 includes a first camera (imaging unit) 12x that images the potato sprout 1g from the negative side of the x-axis, and a second camera (imaging unit) 12y that images the potato sprout 1g from the positive side of the y-axis. The first camera 12x and the second camera 12y are microscope cameras equipped with magnifying lenses 121x and 121y, respectively.
  FIG. 5 is an explanatory diagram illustrating the imaging screen 22p of the first camera 12x and the second camera 12y displayed on the screen of the touch panel 22. The imaging screen 22p includes a first camera screen 221p and a second camera screen 222p. The first camera screen 221p is a child screen of the imaging screen 22p on which an image captured by the first camera 12x is displayed. The second camera screen 222p is a child screen of the imaging screen 22p on which an image captured by the second camera 12y is displayed.
Simple layout diagrams showing the positional relationship of the cameras 12 are respectively displayed above the first camera screen 221p and the second camera screen 222p. In the layout diagram above the first camera screen 221p, the symbol of the camera corresponding to the first camera 12x is highlighted. In the layout diagram above the second camera screen 222p, the symbol of the camera corresponding to the second camera 12y is highlighted.
In the layout diagram, a symbol corresponding to the tip of the blade 11b is displayed at the intersection of each optical axis corresponding to the symbol of each camera.
Below the first camera screen 221p and the second camera screen 222p, the orientations of the images captured by the first camera 12x and the second camera 12y are displayed, respectively. The first camera 12x captures an image obtained by projecting the potato sprout 1g onto the yz plane. On the other hand, the second camera 12y captures an image obtained by projecting the potato sprout 1g onto the xz plane.
On the first camera screen 221p and the second camera screen 222p, a pointer 2p is displayed at the tip of the blade 11b.
  FIG. 6 is an explanatory diagram illustrating the imaging screen 22p of the first camera 12x and the second camera 12y displayed on the screen of the touch panel 22. FIG. 6 shows a state after the manipulator 11 moves the blade 11b to the vicinity of the outside of the potato sprouts 1g by one gesture input on the touch panel 22, for example. A black arrow in FIG. 6 indicates a movement locus of the blade 11b.
The user performs a gesture input on the touch panel 22 when sending a command to the manipulator 11 to move the blade 11b.
The user taps, for example on the surface of the touch panel 22 on which the first camera screen 221p is displayed, the target position 1t to which the blade 11b is to be moved. Here, the target position is a portion of the potato sprout 1g in the vicinity of the intended destination of the blade 11b. For example, in FIGS. 5 and 6, it is assumed that the portion of the potato sprout 1g indicated by reference numeral 1t on the first camera screen 221p is the target position 1t. When the user taps the target position 1t, the manipulator 11 moves the tip of the blade 11b to the vicinity of the outside of the tapped potato sprout 1g. In the example of FIG. 6, the blade 11b has moved to a position slightly to the +y side and the −x side of the target position 1t. The movement of the blade 11b by the tap operation described above is a high-speed coarse operation.
  In the above description, when the user taps the target position 1t on the first camera screen 221p, the manipulator 11 moves the blade 11b in both the yz plane and the xz plane. That is, the manipulator 11 also moves the blade 11b in the depth direction of the first camera screen 221p. Therefore, on the first camera screen 221p of FIG. 5, for example, even if the blade 11b is located on the near side (−x side) of the potato sprout 1g before the movement, the manipulator 11 moves the blade 11b in the depth direction to the vicinity of the potato sprout 1g located on the far side.
  When the user performs a drag operation on the pointer 2p on the first camera screen 221p, the manipulator 11 moves the blade 11b on the yz plane in the direction in which the pointer 2p is dragged, by a distance corresponding to the dragged length. Similarly, when the user performs a drag operation on the pointer 2p on the second camera screen 222p, the manipulator 11 moves the blade 11b on the xz plane in the dragged direction by a distance corresponding to the dragged length.
  The manipulator 11 moves the blade 11b in synchronization with the drag operation performed on the touch panel 22 by the user. The manipulator 11 moves the blade 11b at a speed proportional to the speed of the drag operation performed on the touch panel 22 by the user. Therefore, the user can choose, by the speed of the gesture input to the touch panel 22, whether the manipulator 11 performs a coarse operation or a fine operation.
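A minimal sketch of how the gesture type might be mapped to coarse or fine operation, with the fine-move speed proportional to the drag speed, is given below. The Gesture class, the move_coarse/move_fine interface, and the SPEED_SCALE factor are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Tuple

SPEED_SCALE = 1.0  # assumed proportionality between drag speed and tool speed

@dataclass
class Gesture:
    kind: str                      # "tap" or "drag"
    position: Tuple[float, float]  # screen coordinates (pixels)
    displacement: Tuple[float, float] = (0.0, 0.0)
    speed: float = 0.0             # drag speed (pixels/s)

def on_gesture(manipulator, gesture: Gesture) -> None:
    """Dispatch a touch-panel gesture to a coarse or fine manipulator move."""
    if gesture.kind == "tap":
        # Coarse operation: move the tool quickly to the vicinity of the target.
        manipulator.move_coarse(target=gesture.position)
    elif gesture.kind == "drag":
        # Fine operation: follow the drag at a speed proportional to the drag speed.
        manipulator.move_fine(vector=gesture.displacement,
                              speed=SPEED_SCALE * gesture.speed)
```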
The CPU 211 of the computer 21 causes the manipulator 11 to perform the above-described operations based on input signals from the touch panel 22. The processing executed by the computer 21 is described below.
First, processing when the CPU 211 causes the manipulator 11 to move the processing tool in the in-plane direction of one plane based on a drag operation will be described. Next, a process when the CPU 211 moves the processing tool in the three-dimensional direction in the vicinity of the outside of the potato sprouts 1g based on the tap operation will be described.
  First, as a preparation for causing the computer 21 to control the manipulator 11, the CPU 211 stores in the hard disk 214 the correspondence between the distance on the screen of the touch panel 22 and the distance in the xyz space that is the real space.
  The CPU 211 executes edge detection of the imaged object on the images captured by the first camera 12x and the second camera 12y at regular time intervals, and stores the execution results in the RAM 213. When the blade 11b is included in each image, the CPU 211 stores the tip position of the blade 11b in the RAM 213 at regular time intervals based on the result of the edge detection. The CPU 211 causes the touch panel 22 to display the pointer 2p at the tip position of the blade 11b.
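One plausible way to locate the blade tip from a camera frame is standard edge detection. The sketch below uses OpenCV's Canny detector and simply takes the right-most edge pixel as the tip; the thresholds and the "blade enters from the left" heuristic are assumptions for illustration rather than the patent's algorithm.

```python
import cv2
import numpy as np

def find_blade_tip(frame_gray: np.ndarray):
    """Return (x, y) pixel coordinates of the blade tip, or None if no edges.

    Assumption for illustration: the blade enters the image from the left,
    so its tip is taken to be the right-most edge pixel.
    """
    edges = cv2.Canny(frame_gray, 50, 150)   # thresholds are assumed values
    ys, xs = np.nonzero(edges)
    if xs.size == 0:
        return None
    i = int(np.argmax(xs))                   # right-most edge pixel
    return int(xs[i]), int(ys[i])
```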
  The user drags the pointer 2p displayed on the first camera screen 221p by an arbitrary length in an arbitrary direction. The CPU 211 receives an input signal corresponding to the user's drag end point from the touch panel 22. The CPU 211 calculates the moving direction of the blade 11b on the yz plane from the coordinates of the drag start point and the received drag end point. The CPU 211 transmits to the manipulator 11 a pulse signal that moves the blade 11b in the calculated direction by a certain distance (unit: mm, for example). Based on the pulse signal received from the CPU 211, the manipulator 11 moves the blade 11b by a certain distance in the in-plane direction on the yz plane.
  The first camera 12x captures an image of the blade 11b after the manipulator 11 has moved, and transmits the captured image signal to the CPU 211. The CPU 211 acquires the position of the blade 11b after movement in the image of the first camera screen 221p from edge detection performed based on the image signal received from the first camera 12x. The CPU 211 calculates the movement distance of the blade 11b on the first camera screen 221p from the position of the blade 11b after movement in the image of the first camera screen 221p and the position of the blade 11b before movement.
  The CPU 211 associates the calculated movement distance of the blade 11b on the first camera screen 221p (unit: pixel, for example) with the fixed distance (unit: mm, for example) corresponding to the pulse signal that the CPU 211 itself transmitted to the manipulator 11 in response to the user's drag operation, and stores the association in the hard disk 214. Thereby, the distance on the first camera screen 221p and the distance in the yz plane of the xyz space, which is the real space, are associated with each other and stored in the hard disk 214. Here, the information stored in the hard disk 214 is referred to as first conversion information.
  Next, the user drags the pointer 2p displayed on the second camera screen 222p in an arbitrary direction. Thereafter, the CPU 211 performs the same processing as described above. The CPU 211 associates the calculated movement distance of the blade 11b on the second camera screen 222p (unit: pixel, for example) with the fixed distance (unit: mm, for example) corresponding to the pulse signal that the CPU 211 itself transmitted to the manipulator 11 in response to the user's drag operation, and stores the association in the hard disk 214. As a result, the distance on the second camera screen 222p and the distance in the xz plane of the xyz space, which is the real space, are associated with each other and stored in the hard disk 214. Here, the information stored in the hard disk 214 is referred to as second conversion information.
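A minimal sketch of how the first and second conversion information might be derived and stored is shown below: a commanded real-space move of known length is compared with the resulting on-screen displacement of the blade tip, giving a mm-per-pixel scale for each camera screen. The dictionary layout, the example pixel positions, and the 1.0 mm commanded move are assumptions.

```python
import math

def mm_per_pixel(commanded_mm: float,
                 tip_before_px: tuple, tip_after_px: tuple) -> float:
    """Scale factor relating on-screen distance to real-space distance.

    commanded_mm  : known real-space move sent to the manipulator
    tip_before_px : blade-tip screen position before the move (pixels)
    tip_after_px  : blade-tip screen position after the move (pixels)
    """
    moved_px = math.dist(tip_before_px, tip_after_px)
    return commanded_mm / moved_px

# Assumed storage layout for the conversion information (one entry per screen).
conversion_info = {
    "first_camera_screen":  mm_per_pixel(1.0, (120, 200), (120, 320)),  # y-z plane
    "second_camera_screen": mm_per_pixel(1.0, (340, 180), (460, 180)),  # x-z plane
}
print(conversion_info)  # e.g. {'first_camera_screen': 0.00833..., ...}
```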
FIG. 7 is a flowchart illustrating an example of a processing procedure in which the CPU 211 transmits a pulse signal to the manipulator 11.
In response to a gesture input such as a tap, double tap, drag, pinch-in, or pinch-out, the CPU 211 receives information related to a distance and a direction on the first camera screen 221p or the second camera screen 222p (step S11). The information is, for example, coordinates on the screen. From the received information, the CPU 211 calculates the movement distance and movement direction of the processing tool on the first camera screen 221p or the second camera screen 222p (step S12).
  The CPU 211 reads the first conversion information and the second conversion information from the hard disk 214 (step S13). The CPU 211 converts the calculated distance on each screen into the distance in the xyz space based on the read first conversion information and second conversion information (step S14). The CPU 211 generates a pulse signal corresponding to the calculated movement direction of the processing tool and the converted distance (step S15). The CPU 211 transmits the generated pulse signal to the manipulator 11 (step S16) and ends the process.
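The procedure of FIG. 7 (steps S11 to S16) could be sketched roughly as follows. It reuses the conversion_info table and the pulses_for_distance helper from the sketches above; the send_pulses callback and the per-axis pulse layout are assumed interfaces, not the patent's actual implementation.

```python
def handle_drag(screen_name, drag_start_px, drag_end_px,
                conversion_info, send_pulses):
    """Rough sketch of steps S11-S16: screen-space drag -> real-space pulse command.

    conversion_info : dict mapping screen name -> mm-per-pixel scale (see above)
    send_pulses     : assumed callback that transmits (pulses_u, pulses_v) for the
                      two in-plane axes of the selected camera screen
    """
    # S11/S12: distance and direction of the drag on the screen (pixels)
    du_px = drag_end_px[0] - drag_start_px[0]
    dv_px = drag_end_px[1] - drag_start_px[1]

    # S13/S14: convert the on-screen distance to a real-space distance (mm)
    scale = conversion_info[screen_name]
    du_mm, dv_mm = du_px * scale, dv_px * scale

    # S15: generate a pulse count for each in-plane axis
    pulses = (pulses_for_distance(du_mm), pulses_for_distance(dv_mm))

    # S16: transmit the pulse signal to the manipulator
    send_pulses(screen_name, pulses)
```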
Next, the processing when the CPU 211 causes the manipulator 11 to move the processing tool three-dimensionally to the vicinity of the outside of the potato sprout 1g will be described.
First, as a preparation for causing the computer 21 to three-dimensionally control the operation of the manipulator 11, the three-dimensional coordinate distribution of the surface of the potato bud 1g is stored in the hard disk 214. To that end, the y-coordinate and z-coordinate on the first camera screen 221p and the x-coordinate and z-coordinate on the second camera screen 222p (unit: pixel, for example) are obtained by edge detection processing executed on the potato bud 1g included in each image.
  On the other hand, the x-coordinate and y-coordinate (unit: mm) respectively corresponding to the depth direction of the first camera screen 221p and the second camera screen 222p are acquired by a known depth direction distance estimation method. As a known distance estimation method in the depth direction, there are, for example, a binocular stereoscopic method, a method using contrast, a method using chromatic aberration of a lens, and a method using the degree of blurring generated in an object. Alternatively, the distance in the depth direction may be obtained as an approximate numerical value by approximating the shape of the potato bud 1g to a simple shape. Furthermore, the distance in the depth direction may be measured manually.
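As an illustration of the binocular stereoscopic method cited above, depth can be recovered from disparity with the standard pinhole relation Z = f·B/d (focal length f, baseline B, disparity d). Note that the two cameras of this embodiment view orthogonal directions rather than forming a stereo pair, so this is only a sketch of the referenced known method; the numbers are illustrative.

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Standard binocular-stereo relation: Z = f * B / d.

    focal_px     : focal length expressed in pixels
    baseline_mm  : distance between the two camera centres (mm)
    disparity_px : horizontal shift of the same point between the two images (pixels)
    """
    return focal_px * baseline_mm / disparity_px

# Illustrative numbers only (not from the patent):
print(depth_from_disparity(focal_px=2000, baseline_mm=60, disparity_px=400))  # 300.0 mm
```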
  The y-coordinate and z-coordinate on the first camera screen 221p (unit: pixel, for example) acquired from the edge detection for the potato bud 1g are associated with the x-coordinate (unit: mm, for example) acquired by a distance estimation method or the like, and the association is stored in the hard disk 214. Here, the information stored in the hard disk 214 is referred to as first depth information.
  Similarly, the x-coordinate and z-coordinate on the second camera screen 222p (unit: pixel, for example) acquired from the edge detection processing for the potato bud 1g are associated with the y-coordinate (unit: mm, for example) acquired by a distance estimation method or the like, and the association is stored in the hard disk 214. Here, the information stored in the hard disk 214 is referred to as second depth information.
  On the first camera screen 221p or the second camera screen 222p, the CPU 211 receives the two-dimensional coordinates of the potato bud 1g (unit: pixel, for example) as the target position 1t. The CPU 211 can acquire the depth distance (unit: mm, for example) corresponding to the target position 1t by referring to the first depth information or the second depth information.
FIG. 8 is a flowchart illustrating an example of a processing procedure in which the CPU 211 transmits a pulse signal related to three-dimensional movement to the manipulator 11.
The CPU 211 receives the target position 1t from the first camera screen 221p or the second camera screen 222p in response to a gesture input such as a tap or double tap (step S21). The information is, for example, two-dimensional coordinates on the screen. The CPU 211 calculates the distance and the moving direction of the processing tool on the first camera screen 221p or the second camera screen 222p from the received information (step S22).
  The CPU 211 reads the first conversion information and the second conversion information from the hard disk 214 (step S23). The CPU 211 converts the calculated distance on each screen into a distance in the xyz space based on the read first conversion information and second conversion information (step S24). The distance converted in step S24 is a distance on the yz plane or the xz plane (unit: mm).
  The CPU 211 reads the first depth information and the second depth information from the hard disk 214 (step S25). Based on the read first depth information and second depth information, the CPU 211 acquires a depth distance (unit: mm) corresponding to the two-dimensional coordinates of the target position 1t received in step S21 (step S26).
  The CPU 211 determines the movement distance and movement direction of the processing tool in the xyz space based on the moving direction of the processing tool calculated in step S22, the in-plane distance converted in step S24 (unit: mm), and the depth distance acquired in step S26 (unit: mm) (step S27). Note that in step S27 the CPU 211 does not set the destination of the processing tool exactly at the target position 1t. Since the received target position 1t is itself a surface position of the potato sprout 1g, the processing tool might contact the potato sprout 1g if it were moved all the way to the target position 1t. Therefore, the CPU 211 sets the final destination at a position a certain distance away from the target position 1t, for example with respect to the vertical central axis of the potato bud 1g.
  The CPU 211 generates a pulse signal corresponding to the moving distance and moving direction of the processing tool in the determined xyz space (step S28). The CPU 211 transmits the generated pulse signal to the manipulator 11 (step S29) and ends the process.
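A rough sketch of the three-dimensional move in FIG. 8 is given below: the tapped two-dimensional target is converted to millimetres, the depth coordinate is looked up from the stored depth information, and a clearance offset keeps the final destination just outside the bud surface. The depth_info table layout, the offset direction, and the CLEARANCE_MM value are assumptions for illustration.

```python
CLEARANCE_MM = 1.0  # assumed stand-off so the tool stops just outside the bud surface

def target_in_xyz(screen_name, tap_px, conversion_info, depth_info):
    """Rough sketch of steps S21-S27: tapped 2-D target -> 3-D destination (mm).

    depth_info : assumed table mapping (screen name, pixel coordinates) to the
                 depth coordinate in mm (the first/second depth information)
    """
    # S23/S24: in-plane coordinates converted from pixels to mm
    scale = conversion_info[screen_name]
    u_mm, v_mm = tap_px[0] * scale, tap_px[1] * scale

    # S25/S26: depth coordinate looked up from the stored depth information
    depth_mm = depth_info[(screen_name, tap_px)]

    if screen_name == "first_camera_screen":   # y-z view, depth along the x-axis
        x, y, z = depth_mm, u_mm, v_mm
    else:                                      # x-z view, depth along the y-axis
        x, y, z = u_mm, depth_mm, v_mm

    # S27: offset the destination so the tool does not touch the bud
    # (offset direction and magnitude are assumptions for illustration)
    return x, y, z + CLEARANCE_MM
```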
  In this way, the three-dimensional coordinates of the potato sprout 1g are stored in the hard disk 214 in advance. When the CPU 211 accepts, from the touch panel 22, a portion of the potato sprout 1g whose three-dimensional coordinates are stored as the target position 1t, it generates a pulse signal corresponding to the movement distance and movement direction of the processing tool in the xyz space. Thereby, through the processing shown in FIG. 8, the manipulator device 1 can receive two-dimensional information related to the target position 1t from the touch panel 22 and instruct the manipulator 11 to perform a three-dimensional operation.
The movement of the processing tool received by the touch panel 22 is not limited to linear movement. When the user performs a drag operation that draws a curve on the touch panel 22, the manipulator 11 moves the processing tool so as to draw a locus corresponding to the drawn curve. The CPU 211 can move the processing tool along the curve by causing the manipulator 11 to repeat short linear movements.
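A minimal sketch of moving along a drawn curve by chaining short straight moves, as just described, is shown below. The segment length and the send_linear_move callback are assumptions for illustration.

```python
import math

SEGMENT_MM = 0.05  # assumed length of each short linear segment

def follow_curve(send_linear_move, waypoints_mm):
    """Approximate a curved path by a chain of short linear moves.

    send_linear_move(dx, dy): assumed callback that commands one straight move.
    waypoints_mm: list of (x, y) points along the dragged curve, in mm.
    """
    for (x0, y0), (x1, y1) in zip(waypoints_mm, waypoints_mm[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(length / SEGMENT_MM))
        for _ in range(steps):
            send_linear_move((x1 - x0) / steps, (y1 - y0) / steps)
```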
Further, when the user performs a drag operation for drawing an irregular trajectory on the touch panel 22, the manipulator 11 moves the processing tool so as to draw a trajectory corresponding to the drawn drag operation. That is, the manipulator 11 moves the processing tool so as to follow the drag operation performed by the user on the touch panel 22.
The touch panel 22 receives an instruction to rotate the rotary stage 13r by a gesture input.
When rotating the rotary stage 13r, the user drags the lower area of the first camera screen 221p or the second camera screen 222p with, for example, two fingers in the left-right direction. When receiving a drag operation from the left to the right from the touch panel 22, the CPU 211 transmits a signal for rotating counterclockwise to the rotation stage 13r. When the CPU 211 receives a drag operation from the right to the left from the touch panel 22, the CPU 211 transmits a signal for rotating in the clockwise direction to the rotation stage 13r. When the rotation stage 13r receives a rotation signal from the CPU 211, the rotation stage 13r rotates counterclockwise or clockwise depending on the content of the signal.
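A sketch of the two-finger drag handling for the rotary stage is given below; the stage.rotate interface is an assumption, while the mapping of drag direction to rotation direction follows the description above.

```python
def on_two_finger_drag(stage, start_x_px: float, end_x_px: float) -> None:
    """Rotate the stage according to the horizontal direction of a two-finger drag.

    Left-to-right drag -> counterclockwise; right-to-left drag -> clockwise,
    as described for the rotary stage 13r. `stage.rotate` is an assumed interface.
    """
    if end_x_px > start_x_px:
        stage.rotate(direction="counterclockwise")
    elif end_x_px < start_x_px:
        stage.rotate(direction="clockwise")
```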
Based on the above, when extracting the shoot apex from the potato sprout 1g, the user operates the manipulator device 1 as follows.
The user holds the clean box 15, lifts it upward from the base plate 14, and removes it from the manipulator device 1. The user places the potato sprout 1g at the approximate center of the rotary stage 13r. The user then grasps the clean box 15 and places it back onto the base plate 14 so that the components of the manipulator unit 10 are housed in the clean box 15.
The user performs the above-described preparatory work, and causes the hard disk 214 to store the first conversion information, the second conversion information, the first depth information, and the second depth information.
  On the imaging screen 22p displayed on the touch panel 22, the user decides on an arbitrary part of the potato sprout 1g as the target position 1t in order to move the processing tool (for example, the blade 11b) to the vicinity of the potato sprout 1g by a coarse operation. The user taps the target position 1t on the first camera screen 221p or the second camera screen 222p. The CPU 211 generates a pulse signal based on the input signal from the touch panel 22 and the stored contents of the hard disk 214, and transmits the generated pulse signal to the manipulator 11. In response to the pulse signal received from the CPU 211, the manipulator 11 moves the processing tool to a position near the target position 1t and outside the potato sprout 1g.
  The user performs a drag operation on the touch panel 22 to cause the manipulator 11 to move the processing tool (for example, the blade 11b) to the top portion of the potato sprout 1g. The user slowly drags while drawing a curve along, for example, the left epidermis of the potato sprout 1g displayed on the first camera screen 221p or the second camera screen 222p. The manipulator 11 follows the user's drag operation and slowly moves the processing tool so as to peel off the left epidermis of the potato sprout 1g. When the left epidermis of the potato sprout 1g has been peeled off, the user drags the lower region of the first camera screen 221p or the second camera screen 222p in the left-right direction with two fingers and rotates the rotary stage 13r by a certain angle. Then, the user again moves the processing tool to the top portion of the potato sprout 1g by a drag operation. The user repeats this operation.
When the entire epidermis of the potato sprout 1g has been peeled off, the user moves the processing tool (for example, the scissors 11m) to the top portion of the potato sprout 1g by a drag operation, slices off the top portion of the sprout, and extracts the shoot apex. The user then moves the processing tool, to which the extracted shoot apex adheres, to the medium tray 132t, for example, by a drag operation.
Thus, in order to extract the shoot apex from the potato sprout 1g, the epidermis of the potato sprout 1g is peeled off piece by piece and the upper part of the sprout is sliced. The manipulator device 1 can perform this fine work in a shorter time than when it is performed manually.
The manipulator device 1 also has a function for a rough operation of moving the processing tool to a position away from the object by a predetermined distance or more by gesture input. With this function, the manipulator device 1 moves the processing tool to the standby position outside the imaging screen 22p in a short time.
Further, the manipulator device 1 also has a function for a rough operation for returning a processing tool in a state separated from the object by a predetermined distance or more to an original position by a gesture input.
  The operation of the manipulator device 1 for realizing the above functions will be described. For example, it is assumed that the blade 11b is at the position shown in FIG. 6 on the first camera screen 221p or the second camera screen 222p. In this state, it is assumed that the user flicks in the + direction of the z-axis at the position of the pointer 2p or at an arbitrary position on the first camera screen 221p or the second camera screen 222p. When the CPU 211 receives the flick input in the + direction of the z-axis from the touch panel 22, the CPU 211 transmits to the manipulator 11 a pulse signal that moves the blade 11b upward from the current position by a certain distance. Based on the pulse signal received from the CPU 211, the manipulator 11 rotates the stepping motor corresponding to the z-axis direction so as to coarsely move the blade 11b in the + direction of the z-axis.
The user may also flick in the ± directions of the x-axis and the ± directions of the y-axis in addition to the + direction of the z-axis. The CPU 211 transmits to the manipulator 11 a pulse signal corresponding to the direction of the flick input received from the touch panel 22. Alternatively, when the CPU 211 receives a flick input from the touch panel 22, the CPU 211 may transmit to the manipulator 11 a pulse signal for moving the blade 11b to a predetermined standby position.
When CPU 211 receives a flick input from touch panel 22, CPU 211 stores in RAM 213 the coordinate position (coordinate position in xyz space) of blade 11b when the flick input is received.
  Next, the returning operation of the blade 11b will be described. With the blade 11b at a standby position that is a predetermined distance or more away from the object, the user double-taps an arbitrary position on the first camera screen 221p or the second camera screen 222p, for example. When the CPU 211 receives the double-tap input from the touch panel 22 while the blade 11b is a certain distance or more away from the object, the CPU 211 reads out from the RAM 213 the coordinate position that the blade 11b had when the flick input was received. The CPU 211 transmits to the manipulator 11 a pulse signal that returns the blade 11b from the standby position to the read-out position by a coarse operation. Based on the pulse signal received from the CPU 211, the manipulator 11 rotates the stepping motors corresponding to the x-axis, y-axis, and z-axis directions in order to return the blade 11b to its original position.
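A sketch of this retract-and-return behaviour is shown below: a flick stores the current tool position and retracts the tool coarsely, and a later double tap returns it to the stored position. The manipulator interface, the retract distance, and the state handling are assumptions.

```python
RETRACT_MM = 20.0  # assumed coarse retract distance along the flicked axis

class RetractController:
    """Sketch of the flick (retract) / double-tap (return) behaviour."""

    def __init__(self, manipulator):
        self.manipulator = manipulator
        self.saved_position = None          # xyz position stored on flick

    def on_flick(self, direction_xyz):
        # Remember where the tool was, then move it away coarsely.
        self.saved_position = self.manipulator.current_position()
        self.manipulator.move_coarse_by(
            tuple(RETRACT_MM * d for d in direction_xyz))

    def on_double_tap(self):
        # Return to the stored position only if a retract happened first.
        if self.saved_position is not None:
            self.manipulator.move_coarse_to(self.saved_position)
            self.saved_position = None
```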
  The manipulator device 1 also has a function of controlling the operation of the processing tool by gesture input. For example, when the processing tool is the scissors 11m, the user pinches out at an arbitrary position on the first camera screen 221p or the second camera screen 222p to make the manipulator device 1 open the cutting edge of the scissors 11m. Conversely, when the user wants the manipulator device 1 to close the cutting edge of the scissors 11m, the user pinches in at an arbitrary position on the first camera screen 221p or the second camera screen 222p. When the CPU 211 receives a pinch-out or pinch-in input from the touch panel 22, the CPU 211 transmits a signal related to opening or closing the scissors 11m to the manipulator 11. When the manipulator 11 receives the signal from the CPU 211, it opens or closes the scissors 11m.
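A short sketch of mapping pinch gestures to opening and closing the scissors follows; the open_blades/close_blades interface is an assumption.

```python
def on_pinch(scissors, kind: str) -> None:
    """Pinch-out opens the scissors, pinch-in closes them (assumed interface)."""
    if kind == "pinch_out":
        scissors.open_blades()
    elif kind == "pinch_in":
        scissors.close_blades()
```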
  In the above description, the manipulator device 1 receives instructions from the user via the touch panel 22. However, the manipulator device 1 may also accept instructions from the user via the mouse 24. For example, in FIG. 5, the CPU 211 may accept coordinates designated by a click operation of the mouse 24 as the target position 1t. Further, the CPU 211 may accept a drag operation of the pointer 2p using the mouse 24 instead of a drag operation of the pointer 2p on the touch panel 22.
  The manipulator device 1 may be used for medical purposes. In such a case, a surgical instrument is attached to the tip of the manipulator 11. Thereby, the manipulator device 1 can perform a fine work such as a suturing process. An endoscope may be used as the camera 12.
According to the manipulator device 1, the user can put the object and the operation unit of the manipulator 11 in the same field of view.
The user operates the manipulator 11 by gesture input on the touch panel 22 while simultaneously viewing the object displayed on the screen of the touch panel 22 and the processing tool that operates on the object. Therefore, the user's field of view when looking at the touch panel 22 is not a closed one, such as that through the eyepiece of a microscope, but an open one. Consequently, compared with a conventional manipulator device, the spatial range in which the manipulator device 1 can be operated is wider. Since the user can operate the manipulator device 1 in an open space, the user's fatigue is reduced compared with a conventional manipulator device.
According to the manipulator device 1, the user can quickly become familiar with the operation of the manipulator 11.
Intuitive gesture input to the manipulator 11 from the touch panel 22 does not require an understanding of the complicated mechanical configuration of the manipulator 11, so even a beginner can immediately start operating the manipulator device 1. Input from the touch panel 22 is easy to perform even when it relates to fine work. Therefore, the manipulator device 1 can be operated even by an elderly or handicapped person whose hands tremble.
Embodiment 2
The second embodiment relates to a configuration in which the manipulator device 1 includes a force feedback type input device that inputs a three-dimensional position. The input device is a 3D input device that amplifies the reaction force of the force exerted on the target object by the processing tool according to the hardness of the target object and feeds it back to the input device itself.
In the second embodiment, the same reference numerals are assigned to the same components as those in the first embodiment, and detailed description thereof is omitted.
  FIG. 9 is a perspective view showing an external appearance of the input device 26. The input device 26 includes a base 261, a main body 262, a first arm 263, a second arm 264, and a pen 265. The base 261 is located at the lowermost part of the input device 26 and is a base that supports the upper constituent members. The main body 262 has a spherical shape, for example, and is fixed on the base 261 so as to be rotatable in the horizontal direction. The main body 262 is a component that calculates the three-dimensional position received from the user and outputs the calculated three-dimensional position to the computer 21.
  The base of the first arm 263 is pivotally supported on the main body 262 by a horizontal axis. One end of the second arm 264 is pivotally supported by a horizontal axis at the tip of the first arm 263. The pen 265 is a rod-shaped body gripped directly by the user, and is pivotally supported near its root close to the pen tip by a horizontal axis at the other end of the second arm 264. The user designates the position of the pen tip in three-dimensional space while pressing the button 265b provided on the pen 265, thereby inputting the three-dimensional position to the main body 262 via the first arm 263 and the second arm 264. If the arm 111 of the manipulator 11 is regarded as a slave arm, the first arm 263, the second arm 264, and the pen 265 can be regarded as master arms. Hereinafter, the main body 262, the first arm 263, the second arm 264, and the pen 265 are referred to as a receiving unit 260.
  The main body 262 includes an actuator (force generation unit) 266 that applies a load to the user's operation of the pen 265. The actuator 266 is joined, for example, to wires that apply resistance to the pivots between the main body 262 and the first arm 263, between the first arm 263 and the second arm 264, and between the second arm 264 and the pen 265. The main body 262 operates the actuator 266 in accordance with a signal from the manipulator 11 received via the computer 21, thereby controlling the force applied to the receiving unit 260. Thereby, the user can feel the resistance to the operation through the pen 265.
  A fixed virtual plane can be set for the input device 26. In such a case, the input device 26 can be used in place of the mouse 24 by the user designating a position on the virtual plane with the tip of the pen 265.
FIG. 10 is a block diagram illustrating a hardware configuration example of the manipulator device 1. The manipulator 11 includes a reaction force detection unit (detection unit) 112 and an output unit (detection unit) 113.
The reaction force detection unit 112 is, for example, a strain gauge, a pressure sensor, or a displacement sensor attached to the tip of the arm 111 in the manipulator 11. The reaction force detection unit 112 detects a reaction force of a force exerted on the object by the processing tool, and generates an electrical signal corresponding to the reaction force. The output unit 113 is a component that amplifies the electrical signal generated by the reaction force detection unit 112 and transmits the amplified signal to the main body 262 of the input device 26 as a detection signal.
  The CPU 211 generates pulse signals corresponding to the x-axis, y-axis, and z-axis, respectively, according to the change in the three-dimensional position input from the input device 26. The CPU 211 transmits each generated pulse signal to the manipulator 11. The manipulator 11 moves the processing tool based on the pulse signal received from the CPU 211. When the processing tool comes into contact with the object and exerts a force on the object, the reaction force detection unit 112 detects the reaction force from the object and generates an electrical signal corresponding to the reaction force. The output unit 113 amplifies the detection signal detected by the reaction force detection unit 112 and transmits it to the computer 21. The computer 21 transmits the detection signal received from the output unit 113 to the input device 26. The actuator 266 of the input device 26 generates a force corresponding to the detection signal transmitted from the manipulator 11 as a target value. The input device 26 controls the receiving unit 260 with the force generated by the actuator 266.
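A sketch of one cycle of the force-feedback loop in Embodiment 2 follows: the reaction force measured at the tool is amplified and used as the target force for the input-device actuator. The gain value and all interfaces (sensor, actuator, manipulator) are assumptions for illustration.

```python
FEEDBACK_GAIN = 5.0  # assumed amplification of the measured reaction force

def feedback_step(reaction_sensor, actuator, manipulator, pen_displacement_mm):
    """One cycle of the master-slave force-feedback loop (assumed interfaces)."""
    # Slave side: move the processing tool according to the pen displacement.
    manipulator.move_fine_by(pen_displacement_mm)

    # Measure the reaction force the object exerts on the tool (N).
    reaction_n = reaction_sensor.read()

    # Master side: amplify it and use it as the actuator's target force,
    # so the user feels resistance through the pen.
    actuator.set_target_force(FEEDBACK_GAIN * reaction_n)
```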
  The manipulator device 1 may include a head mounted display. For example, two cameras 12 are arranged at a position where a parallax angle is formed, and an image captured by the camera 12 is displayed on a head mounted display. Thereby, since the user can further improve the three-dimensional recognition of the target object, the manipulator device 1 can improve the workability of the process on the target object even when the target object has a complicated shape.
  When a human performs an operation such as gripping, cutting, peeling, or removing an object by hand, he or she perceives the result of the operation through subtle tactile sensations. The manipulator device 1 amplifies the reaction force of the force exerted on the object by the processing tool and returns it to the receiving unit 260 of the input device 26. Thereby, the user can perform finer operations on the object.
Embodiment 3
The third embodiment relates to a form in which the state in the clean box 15 is maintained in a sterile state.
In the third embodiment, the same reference numerals are assigned to the same components as those in the first and second embodiments, and detailed description thereof is omitted.
  FIG. 11 is an explanatory diagram showing a configuration example of the clean box 15. FIG. 11 shows a side cross section of the clean box 15. The clean box 15 includes an inflow port 151, a filter 152, an outflow port 153, and a blower unit (blowout unit) 154.
  The inflow port 151 is an opening through which outside air flows into the clean box 15. The inflow port 151 is provided, for example, in the ceiling plate of the clean box 15. A filter 152 for purifying and sterilizing the air flowing into the clean box 15 covers the inflow port 151 from above. The filter 152 is, for example, a HEPA filter (High Efficiency Particulate Air filter). The filter 152 may be an ULPA filter (Ultra Low Penetration Air filter), which has a higher particle collection efficiency than a HEPA filter.
The outlet 153 is an opening through which the air in the clean box 15 is discharged to the outside.
The blower unit 154 is a component that sends outside air into the clean box 15. The air blowing unit 154 is provided on the upper side of the filter 152. The blower unit 154 includes a motor 154m and a fan 154f. The motor 154m is an electric motor for rotating the fan 154f. The fan 154f is, for example, a sirocco fan or a silent fan. The fan 154f is rotated by the motor 154m and sends the air above the clean box 15 to the inflow port 151.
  The clean box 15 or the base plate 14 may include a sterilizing gas burner for sterilizing a processing tool for processing an object or a heater for adjusting the culture environment temperature of animals and plants.
According to the clean box 15, an aseptic condition inside can be secured by sending clean air into the inside.
The clean box 15 can be regarded as a compact clean bench of so-called desktop size. When performing operations such as shoot apex extraction and seedling division, the working environment must be kept clean; normally this requires a clean room, a clean bench, or the like, and thus a large capital investment. However, since the clean box 15 keeps only its small internal volume clean, it contributes to cost reduction.
The manipulator device 1 can also be used for component processing in a semiconductor substrate or a liquid crystal device that requires a clean environment, component processing using a polymer material, and the like. At that time, the clean box 15 can provide a clean working environment in which dust is eliminated.
  The disclosed embodiments should be considered as illustrative in all points and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.
  Further, the technical features (components) described in the embodiments can be combined with each other, and new technical features can be formed by combining them.
  Regarding the above embodiment, the following additional notes are disclosed.
(Appendix 7)
The manipulator has a detection unit that detects a reaction force against a force exerted on the object by the processing tool and outputs a signal corresponding to the detected reaction force.
A receiving device for receiving a force corresponding to a signal output from the detection unit and receiving a three-dimensional target position of the processing tool;
The control unit is configured to control the operation of the manipulator based on the three-dimensional target position received by the receiving device. The manipulator device according to any one of appendices 1 to 6.
(Appendix 10)
The manipulator device according to any one of appendices 1 to 9, wherein a processing tool for processing plant cells is attached to the distal end portion.
DESCRIPTION OF SYMBOLS 1 Manipulator device 11 Manipulator 111 Arm 112 Reaction force detection unit (detection unit)
113 Output unit (detection unit)
12 Camera (imaging unit)
12x First camera (imaging unit)
12y Second camera (imaging unit)
121x, 121y Magnifying lens 13 Movable stage 13r Rotary stage 14 Base plate 15 Clean box (box body)
151 Inflow port 152 Filter 153 Outflow port 154 Blower unit (blowout unit)
21 Computer 211 CPU (control unit)
212 ROM
213 RAM
214 Hard disk (storage unit)
215 Disk drive 217 Communication unit 22 Touch panel 23 Keyboard 24 Mouse 25 Monitor 26 Input device (receiving device)
260 Receiving unit 266 Actuator (force generation unit)
11b Blade (processing tool)
11m Scissors (processing tool)
1g Potato sprout (object)

Claims (7)

  1. A manipulator device that processes an object with a processing tool attached to a tip portion of a manipulator, the manipulator device comprising:
    An imaging unit for imaging an object to be processed by the processing tool;
    A touch panel for displaying an image captured by the imaging unit on a screen and receiving an instruction for controlling the operation of the manipulator;
    A control unit for controlling the operation of the manipulator based on an instruction received by the touch panel;
    A storage unit that stores a distance on the screen of the touch panel and a moving distance of the tip part in association with each other, and stores a three-dimensional position of the object;
    The touch panel is adapted to receive a three-dimensional target position of the processing tool corresponding to the object,
    The control unit is configured to control the operation of the manipulator based on the three-dimensional target position received by the touch panel and the content stored in the storage unit,
    The control unit is configured to switch between a coarse operation and a fine operation of the manipulator according to a gesture input received by the touch panel, switching to the coarse operation when the touch panel receives a tap operation and to the fine operation when the touch panel receives a drag operation (a control-mapping sketch is given after the claims).
  2. The manipulator device according to claim 1, further comprising a box body that houses the manipulator and the imaging unit.
  3. The manipulator device according to claim 1, further comprising a movable stage on which the object is placed, wherein
    the touch panel is adapted to receive instructions for controlling the operation of the movable stage, and
    the control unit is configured to control the operation of the movable stage based on an instruction received by the touch panel.
  4. The manipulator device according to any one of claims 1 to 3, wherein the imaging unit is configured to magnify and image the object.
  5. The manipulator device according to any one of claims 1 to 4, wherein
    the manipulator has a detection unit that detects a reaction force against the force exerted on the object by the processing tool and outputs a signal corresponding to the detected reaction force,
    the manipulator device further comprises a reception device including a force generation unit that generates a force according to the signal output from the detection unit, and a reception unit that is given the force generated by the force generation unit and receives the three-dimensional target position of the processing tool through the applied force, and
    the control unit controls the operation of the manipulator based on the three-dimensional target position received by the reception device.
  6. The manipulator device according to claim 2, wherein the box includes:
    a blowout section for blowing out air;
    a filter for purifying the air blown out by the blowout section;
    an inlet through which clean air from the filter flows into the interior; and
    an outflow port through which internal air flows out.
  7. The manipulator device according to any one of claims 1 to 6, comprising a plurality of the imaging units for imaging the object from different directions, wherein
    the touch panel has screen areas that respectively display the images captured by the plurality of imaging units and receives instructions for controlling the operation of the manipulator in each of the screen areas, and
    the control unit controls the operation of the manipulator based on the instructions received in the respective screen areas of the touch panel (a two-view sketch is given after the claims).
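
Claim 1 ties a distance on the touch-panel screen to a movement distance of the tool tip and switches between a coarse operation (tap) and a fine operation (drag). The Python sketch below shows one way such a mapping could look; the scale factors, function name, and gesture representation are assumptions made for the example and are not specified in the patent.

```python
# Hypothetical gesture-to-motion mapping; the scale factors are illustrative only.

COARSE_MM_PER_PIXEL = 0.5   # assumed screen-to-tip scale for tap (coarse) moves
FINE_MM_PER_PIXEL = 0.05    # assumed screen-to-tip scale for drag (fine) moves

def tip_displacement(gesture: str, dx_px: float, dy_px: float) -> tuple[float, float]:
    """Convert a screen-space gesture into a tool-tip displacement in millimetres.

    A tap commands a coarse move toward the touched point; a drag commands a
    fine move proportional to the drag vector, as in claim 1.
    """
    if gesture == "tap":
        scale = COARSE_MM_PER_PIXEL
    elif gesture == "drag":
        scale = FINE_MM_PER_PIXEL
    else:
        raise ValueError(f"unsupported gesture: {gesture}")
    return dx_px * scale, dy_px * scale

# Example: a 40-pixel drag moves the tip 2 mm, while a 40-pixel tap offset moves it 20 mm.
print(tip_displacement("drag", 40, 0))  # (2.0, 0.0)
print(tip_displacement("tap", 40, 0))   # (20.0, 0.0)
```

Storing the two scales as the screen-distance-to-movement-distance association (the role of the storage unit in claim 1) lets the same touch input serve both rapid approach moves and precise processing moves.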
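Claim 7 has the operator give instructions in separate screen areas, each showing the object from a different direction. If the two cameras are roughly orthogonal (say, one viewing the X-Z plane and the other the Y-Z plane), a point touched in each area can be combined into one three-dimensional target. The sketch below assumes that camera arrangement and a simple linear screen-to-stage scale; neither assumption comes from the patent.

```python
# Assumed arrangement: camera 1 looks along +Y (its screen shows X and Z),
# camera 2 looks along +X (its screen shows Y and Z). Scaling is linear.

MM_PER_PIXEL = 0.1  # illustrative screen-to-stage scale

def target_from_two_views(view1_px: tuple[float, float],
                          view2_px: tuple[float, float]) -> tuple[float, float, float]:
    """Combine touch points from two orthogonal screen areas into one 3-D target.

    view1_px: (u, v) touched in the X-Z view; view2_px: (u, v) touched in the Y-Z view.
    The two views share the Z axis, so Z is averaged to reduce touch noise.
    """
    x = view1_px[0] * MM_PER_PIXEL
    y = view2_px[0] * MM_PER_PIXEL
    z = 0.5 * (view1_px[1] + view2_px[1]) * MM_PER_PIXEL
    return x, y, z

# Example: touching (120, 80) in the X-Z view and (50, 84) in the Y-Z view
print(target_from_two_views((120, 80), (50, 84)))  # (12.0, 5.0, 8.2)
```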
JP2013059018A 2013-03-21 2013-03-21 Manipulator device Active JP6297784B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013059018A JP6297784B2 (en) 2013-03-21 2013-03-21 Manipulator device

Publications (2)

Publication Number Publication Date
JP2014184494A JP2014184494A (en) 2014-10-02
JP6297784B2 true JP6297784B2 (en) 2018-03-20

Family

ID=51832569

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013059018A Active JP6297784B2 (en) 2013-03-21 2013-03-21 Manipulator device

Country Status (1)

Country Link
JP (1) JP6297784B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6497021B2 (en) * 2014-10-01 2019-04-10 株式会社デンソーウェーブ Robot operation device, robot system, and robot operation program
EP3263290A4 (en) 2015-02-25 2018-11-21 Olympus Corporation Manipulator system and medical system
JP6601201B2 (en) * 2015-03-19 2019-11-06 株式会社デンソーウェーブ Robot operation device and robot operation program
JP6713190B2 (en) * 2015-10-09 2020-06-24 住友重機械工業株式会社 Shovel operating device and shovel operating method
WO2018051435A1 (en) * 2016-09-14 2018-03-22 三菱電機株式会社 Numerical control apparatus

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2645866B2 (en) * 1988-09-05 1997-08-25 株式会社日立製作所 Manipulator control method and device
JP3675004B2 (en) * 1995-10-04 2005-07-27 株式会社安川電機 Robot control device
JPH11262883A (en) * 1998-03-19 1999-09-28 Denso Corp Manual operation device for robot
JP3465613B2 (en) * 1999-01-13 2003-11-10 松下電器産業株式会社 Operating device for fine objects
JP2001072247A (en) * 1999-09-06 2001-03-21 Murata Mach Ltd Conveying system
JP2002253574A (en) * 2001-03-01 2002-09-10 Hitachi Ltd Operation support device
JP2006350602A (en) * 2005-06-15 2006-12-28 Yushin Precision Equipment Co Ltd Operation terminal equipment
JP4992491B2 (en) * 2007-03-14 2012-08-08 日本精工株式会社 Manipulator system
JP5217662B2 (en) * 2007-08-02 2013-06-19 日本精工株式会社 Manipulator and manipulator system
JP2009148869A (en) * 2007-12-21 2009-07-09 Olympus Corp Assembly apparatus
EP2279444A4 (en) * 2008-03-19 2014-01-01 Univ Toronto System and method for micromanipulating samples
JP5466965B2 (en) * 2010-02-08 2014-04-09 オリンパス株式会社 Apparatus, method, program, and system for controlling stage
JP5526881B2 (en) * 2010-03-12 2014-06-18 株式会社デンソーウェーブ Robot system
JP2012119557A (en) * 2010-12-02 2012-06-21 Hitachi Kokusai Electric Inc Substrate processing device
JP5739680B2 (en) * 2011-01-27 2015-06-24 株式会社日立製作所 Manipulator device and working device with manipulator
JP5694277B2 (en) * 2012-11-16 2015-04-01 株式会社日立製作所 Automatic culture equipment

Also Published As

Publication number Publication date
JP2014184494A (en) 2014-10-02

Legal Events

Code  Title (Description)

A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621; effective date: 2015-12-10)
A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007; effective date: 2016-10-21)
A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131; effective date: 2016-11-01)
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523; effective date: 2016-12-19)
A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131; effective date: 2017-06-06)
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523; effective date: 2017-08-03)
A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131; effective date: 2017-12-05)
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523; effective date: 2018-01-29)
TRDD  Decision of grant or rejection written
A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01; effective date: 2018-02-06)
A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61; effective date: 2018-02-22)
R150  Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150; ref document number: 6297784; country of ref document: JP)