WO2005085785A1 - Optical tactile sensor, sensing method, sensing system, object manipulation force control method, object manipulation force control device, object grasping force control device, robot hand - Google Patents
- Publication number
- WO2005085785A1 (PCT/JP2005/004259; JP2005004259W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- self
- size
- optical
- force
- sensor
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/081—Touching devices, e.g. pressure-sensitive
- B25J13/082—Grasping-force detectors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/081—Touching devices, e.g. pressure-sensitive
- B25J13/084—Tactile sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L5/00—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes
- G01L5/22—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes for measuring the force applied to control members, e.g. control members of vehicles, triggers
- G01L5/226—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes for measuring the force applied to control members, e.g. control members of vehicles, triggers to manipulators, e.g. the force due to gripping
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N19/00—Investigating materials by mechanical methods
- G01N19/02—Measuring coefficient of friction between materials
Definitions
- Optical tactile sensor, sensing method, sensing system, object manipulation force control method, object manipulation force control device, object grasping force control device, robot hand
- The present invention relates to an optical tactile sensor, a sensing method and sensing system using the optical tactile sensor, an object manipulation force control method and device, an object grasping force control device, and a robot hand.
- In a conventional sensor using this principle, a plurality of strain gauges are arranged inside a curved elastic body, and the pressure distribution or strain distribution inside the elastic body is measured from the output of each strain gauge.
- A tactile sensor that measures the state of sticking and slipping in the contact region has also been proposed.
- There is also an optical tactile sensor of a type in which deformation of a transparent elastic body is captured by an imaging means such as a CCD camera.
- In such a sensor, three-dimensional force vectors generated near the contact region are obtained by embedding spherical marker parts, separated from each other, inside a transparent elastic body and measuring the displacement of each marker part with a CCD camera.
- Patent document 1 Japanese Patent Laid-Open No. 2000-254884
- Patent Document 2 Japanese Patent Publication No. 2002-523568
- Non-Patent Document 1: Hiromitsu and Maeno, "Stick/Slip Distribution on the Human Finger Pad during Object Manipulation and Response of Tactile Receptors," Transactions of the Japan Society of Mechanical Engineers, March 2002, Vol. 68, No. 667, pp. 914-919
- Non-Patent Document 2: Kamiyama, Kajimoto, Inami, Kawakami, and Tachi, "Tactile Camera: Development of an Optical Three-Dimensional Tactile Sensor with an Elastic Body," IEEJ Transactions E, January 2003, Vol. 123, No. 1, pp. 16-22
- However, the strain-gauge tactile sensor has a problem in durability because it performs sensing by deforming strain gauges. Furthermore, it is necessary to arrange multiple strain gauges inside the elastic body, so the production procedure becomes complicated and wiring becomes difficult.
- In the optical tactile sensor described above, the surface of the transparent elastic body that contacts the object is flat, so it is difficult to detect the friction coefficient generated between the object and the tactile part. Moreover, even if the contact surface were made curved, production would be expected to be very difficult.
- A finger-shaped tactile sensor has also been proposed; however, it can measure only the contact state, and cannot simultaneously measure multidimensional mechanical quantities up to and including the friction coefficient.
- As a means for measuring multidimensional mechanical quantities, it has been proposed to use a combination of two sensors of different types (see Non-Patent Document 1). However, such a means requires two types of sensors, and downsizing is difficult to achieve.
- The present invention has been made in view of the above, and an object of the present invention is to provide an optical tactile sensor that can be easily manufactured and easily downsized. A further object of the present invention is to provide a sensing method, a sensing system, an object manipulation force control method, an object manipulation force control device, an object grasping force control device, and a robot hand, each using an optical tactile sensor capable of simultaneously measuring a plurality of mechanical quantities with a single type of sensor. Disclosure of the Invention
- A first aspect of the present invention for solving the above-mentioned problem is an optical tactile sensor comprising: a tactile portion formed of a transparent elastic body having a convex curved surface, with a marker portion arranged on the convex curved surface; and an imaging means for capturing the behavior of the marker portion when an object contacts the convex curved surface.
- According to the first aspect, the behavior of the marker portion is captured as image information by the imaging means, so a large amount of information can be processed with a relatively small and simple structure. Therefore, multiple mechanical quantities (for example, normal force, tangential force, friction coefficient and torque) can be measured at the same time, and downsizing of the optical tactile sensor is easy to realize. In addition, it is not necessary to provide a marker part inside the transparent elastic body or to arrange a plurality of strain gauges inside the elastic body, which facilitates production of the optical tactile sensor.
- The part that deforms most when an object makes contact is the convex curved surface.
- The amount of deformation decreases from the convex curved surface toward the inside of the tactile part.
- Since the marker portion is arranged on the convex curved surface, it deforms more readily than a marker portion arranged inside the tactile part. Therefore, the force acting on the tactile part can be accurately obtained by capturing the deformation of the marker portion with the imaging means. That is, the detection accuracy is higher than in the case where the strain generated inside the transparent elastic body is detected.
- The "transparent elastic body" is preferably formed of a silicone resin such as silicone rubber, but may also be formed of another rubber or elastomeric material.
- The transparent elastic body may be transparent or translucent.
- The "marker section" of this sensor is disposed only on the convex curved surface, not inside the transparent elastic body. In other words, it is preferable that only one layer of the marker portion is arranged on the convex curved surface. This is because arranging marker sections inside the elastic body makes it difficult to manufacture the optical tactile sensor.
- The marker section may be formed by attaching another material to the elastic body (for example, by coating, pasting, or printing), or may be formed on the elastic body itself; the latter is particularly preferred.
- This is because, when the marker part is formed by attaching another material to the elastic body, the marker part may fall off when an object contacts the convex curved surface.
- In addition, it is necessary to attach another material to the transparent elastic body, which may increase the manufacturing cost of the optical tactile sensor.
- Examples of the marker portion formed on the transparent elastic body itself include a groove, a ridge, a protrusion, and a dent. The marker portion may be colored or colorless.
- As the "imaging means", it is preferable to use a camera that outputs image information as electric signals, and it is particularly preferable to use a digital camera.
- Examples of the "digital camera" include a CCD camera and a digital camera using a CMOS image sensor.
- The marker section preferably comprises a plurality of grooves or a plurality of ridges arranged in a grid. With such a configuration, the deformation of the marker section caused by a force acting on the tactile part can easily be captured, and therefore the force acting on the tactile part can be easily obtained by capturing the deformation of the marker section with the imaging means.
- The tactile part is preferably formed by pouring an uncured elastic material onto the molding surface of a mold having a plurality of molding grooves or molding ridges, and curing it. In this case, at the same time as the tactile part as a whole is formed, a ridge is formed by each molding groove and a groove is formed by each molding ridge. For this reason, production of the tactile part becomes easy despite it having a convex curved surface. In addition, the manufacturing cost can be reduced because the step of attaching another material to the transparent elastic body can be omitted.
- A second aspect of the present invention is a sensing method for measuring a plurality of types of mechanical quantities using an optical tactile sensor that includes a tactile portion made of a transparent elastic body and an imaging means for capturing the behavior of the portion of the tactile part that contacts an object.
- The method comprises: processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region generated between the object and the tactile part, as well as information on the size of the stick region generated between the object and the tactile part within the contact region; obtaining the normal force from the size of the contact region; obtaining the tangential force from the shape and center of gravity of the contact region; and obtaining the friction coefficient from the ratio of the size of the stick region to the size of the contact region.
- According to the second aspect, when the image information is processed, information on the size, shape, and center of gravity of the contact region is extracted, as well as information on the size of the stick region. Then, based on this information, the normal force, tangential force, and friction coefficient are calculated. That is, a plurality of types of mechanical quantities can be simultaneously measured by a single type of sensor.
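As an illustration only (not part of the patent), the geometric measurements this sensing method relies on can be sketched in a few lines; the binary masks, array sizes, and function name below are assumptions made for the sketch.

```python
import numpy as np

def contact_features(contact_mask, stick_mask):
    """From two binary masks -- the contact region A1 and the stick
    region A2 -- compute the area of A1, its centroid (a stand-in for
    the shape/center-of-gravity information), and the stick-to-contact
    area ratio used for the friction-coefficient estimate."""
    area_a1 = int(contact_mask.sum())
    ys, xs = np.nonzero(contact_mask)
    centroid = (float(xs.mean()), float(ys.mean())) if area_a1 else (None, None)
    ratio = stick_mask.sum() / area_a1 if area_a1 else 0.0
    return area_a1, centroid, float(ratio)

# Toy example: a 5x5 contact patch whose central 3x3 block is stuck.
contact = np.zeros((9, 9), dtype=bool)
contact[2:7, 2:7] = True
stick = np.zeros_like(contact)
stick[3:6, 3:6] = True

area, centroid, ratio = contact_features(contact, stick)
print(area, centroid, round(ratio, 2))  # 25 (4.0, 4.0) 0.36
```

The normal force, tangential force, and friction coefficient would then be looked up from these three quantities, as the later embodiment describes.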
- A third aspect of the present invention is a sensing system comprising: an optical tactile sensor including a tactile part made of a transparent elastic body and an imaging means that captures the behavior of the portion of the tactile part that contacts an object; information extraction means for processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region between the object and the tactile part, as well as information on the size of the stick region between the object and the tactile part within the contact region; and mechanical quantity measuring means capable of obtaining the normal force from the size of the contact region, the tangential force from the shape and center of gravity of the contact region, and the friction coefficient from the ratio of the size of the stick region to the size of the contact region.
- According to the third aspect, when the image information from the imaging means is input, the information on the size, shape, and center of gravity of the contact region is extracted by the information extracting means, as well as the information on the size of the stick region. Then, the normal force, tangential force, and friction coefficient are calculated by the mechanical quantity measuring means on the basis of this information. That is, a plurality of mechanical quantities can be measured simultaneously by a single type of sensor.
- Preferably, a marker part is arranged on the surface of the portion of the tactile part that contacts the object, the imaging means captures the behavior of the marker part when the object makes contact, and the information extraction means extracts information on the deformation of the marker part by processing the image information from the imaging means.
- In this case, the mechanical quantity measuring means can obtain the torque from the information on the deformation of the marker section.
- The marker part is located on the surface of the tactile part, where deformation occurs most readily and where the object touches first, so it deforms almost simultaneously with the tactile part when a torque acts on the tactile part. The degree of deformation of the marker part is substantially the same as that of the tactile part. Therefore, not only can the sensor measure multiple mechanical quantities (normal force, tangential force, friction coefficient) at the same time, but the torque can also be determined accurately by capturing the deformation of the marker part with the imaging means.
- A fourth aspect of the present invention is an object manipulation force control method using an optical tactile sensor having a tactile portion made of a transparent elastic body and an imaging means for capturing the behavior of the portion of the tactile part that contacts an object. The method comprises: processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region generated between the object and the tactile part, as well as information on the size of the stick region between the object and the tactile part within the contact region; calculating the normal force from the size of the contact region; calculating the tangential force from the shape and center of gravity of the contact region; and calculating the friction coefficient from the ratio of the size of the stick region to the size of the contact region.
- According to the fourth aspect, when the image information from the imaging means is processed, information on the size, shape, and center of gravity of the contact region is extracted, as well as information on the size of the stick region. Then, based on this information, the normal force, tangential force, and friction coefficient are calculated. That is, a plurality of mechanical quantities can be simultaneously measured by a single type of sensor. An appropriate operating force to be applied to the object can then be determined based on the measured mechanical quantities. Thus, a desired operation can be performed by applying an appropriate force to the object.
- the “operating force” refers to a force that pushes, rotates, or grips an object in a state where the tactile part is in contact with the object.
- A fifth aspect of the present invention is an object manipulation force control device comprising: a sensor support; an actuator for driving the sensor support; an optical tactile sensor supported by the sensor support and including a tactile part made of a transparent elastic body and an imaging means for capturing the behavior of the portion of the tactile part that contacts an object; information extraction means for processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region, as well as information on the size of the stick region between the object and the tactile part within the contact region; mechanical quantity measuring means for obtaining the normal force, the tangential force, and the friction coefficient from that information; operating force calculating means for calculating an appropriate operating force to be applied to the object based on the measured mechanical quantities; and actuator drive control means for performing feedback control of the actuator so that the calculated operating force is applied to the object.
- According to the fifth aspect, the information extracting means extracts information on the size, shape, and center of gravity of the contact region, as well as information on the size of the stick region. Then, the normal force, tangential force, and friction coefficient are obtained by the mechanical quantity measuring means based on this information. In other words, a plurality of mechanical quantities can be measured at the same time by a single type of sensor. The operating force calculating means can then calculate an appropriate operating force to be applied to the object based on the measured mechanical quantities. Thus, a desired operation can be performed by applying an appropriate force to the object.
- Even if the appropriate operating force to be applied to the object changes partway through the operation, the feedback control performed by the actuator drive control means can maintain an appropriate operation on the object.
- A sixth aspect of the present invention is an object grasping force control device comprising: a sensor support; an actuator for driving the sensor support; an optical tactile sensor supported by the sensor support and including a tactile part made of a transparent elastic body and an imaging means for capturing the behavior of the portion of the tactile part that contacts an object; information extraction means for processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region generated between the object and the tactile part, as well as information on the size of the stick region within the contact region; mechanical quantity measuring means capable of obtaining the normal force from the size of the contact region, the tangential force from the shape and center of gravity of the contact region, and the friction coefficient from the ratio of the size of the stick region to the size of the contact region; grasping force calculating means for calculating an appropriate grasping force to be applied to the object based on the mechanical quantities obtained by the mechanical quantity measuring means; and actuator drive control means for performing feedback control of the actuator so that the grasping force calculated by the grasping force calculating means is applied to the object.
- According to the sixth aspect, when the image information from the imaging means is processed, information on the size, shape, and center of gravity of the contact region is extracted by the information extraction means, and at the same time, information on the size of the stick region is extracted. Then, based on this information, the mechanical quantity measuring means obtains the normal force, the tangential force, and the friction coefficient. That is, a plurality of mechanical quantities can be simultaneously measured by one type of sensor. The grasping force calculating means can then calculate an appropriate grasping force to be applied to the object based on the measured mechanical quantities. In this way, the object can be grasped so as not to be crushed and so as not to slip or fall. Moreover, even if the appropriate grasping force to be applied to the object changes partway through, the feedback control performed by the actuator drive control means allows the object to continue to be gripped without being crushed or dropped.
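As a minimal sketch of the feedback behaviour described for this aspect (the target stick ratio, gain, force limits, and function name below are illustrative assumptions, not values from the patent):

```python
def update_grip_force(current_force, stick_ratio,
                      target_ratio=0.8, gain=0.5, f_min=0.1, f_max=20.0):
    """One feedback step: if the stick region shrinks relative to the
    contact region (stick_ratio below target), the object is starting
    to slip, so the grip force is increased; otherwise it is relaxed so
    the object is not crushed. All constants are illustrative."""
    error = target_ratio - stick_ratio
    new_force = current_force + gain * error
    return min(max(new_force, f_min), f_max)

force = 5.0
for measured_ratio in (0.6, 0.7, 0.85):  # sensor readings over time
    force = update_grip_force(force, measured_ratio)
print(round(force, 3))  # 5.125
```

A real controller would derive the target from the measured friction coefficient and tangential force rather than a fixed ratio, but the loop structure is the same.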
- A seventh aspect of the present invention is a robot hand comprising: a plurality of fingers; an actuator that drives the plurality of fingers; an optical tactile sensor supported at the tip of at least one of the plurality of fingers and including a tactile portion made of a transparent elastic body and an imaging means for capturing the behavior of the portion of the tactile part that contacts an object; information extraction means for processing the image information from the imaging means to extract information on the size, shape, and center of gravity of the contact region generated between the object and the tactile part, as well as information on the size of the stick region between the object and the tactile part within the contact region; mechanical quantity measuring means for obtaining the normal force, the tangential force, and the friction coefficient from that information; grasping force calculating means for calculating an appropriate grasping force to be applied to the object based on the mechanical quantities obtained by the mechanical quantity measuring means; and actuator drive control means that performs feedback control of the actuator so that the grasping force calculated by the grasping force calculating means is applied by the plurality of fingers.
- According to the seventh aspect, when the image information from the imaging means is processed, the information extraction means extracts information on the size, shape, and center of gravity of the contact region, as well as information on the size of the stick region. The normal force, the tangential force, and the friction coefficient are then obtained by the mechanical quantity measuring means based on this information.
- That is, a plurality of mechanical quantities can be measured simultaneously by one type of sensor. Then, the grasping force calculating means can calculate an appropriate grasping force to be applied to the object based on the measured mechanical quantities.
- In this way, the object can be gripped by a plurality of fingers so that the object is not crushed and does not slip. Therefore, a humanoid robot having a robot hand closer to the human hand can be realized.
- FIG. 1 is an overall view showing an optical tactile sensor according to the present invention.
- FIG. 2 is a view of the touch pad.
- FIG. 3 is a block diagram showing a configuration of the sensing system.
- Fig. 4 is a diagram showing the state of the grid pattern when the touch pad is not in contact with an object.
- FIG. 5 is a diagram showing the state of the grid pattern when the touch pad is in contact with an object.
- FIG. 6 is a diagram showing the state of the grid pattern when no torque is acting on the touch pad.
- FIG. 7 is a view showing the state of the grid pattern when a torque acts on the touch pad.
- FIG. 8 is a flowchart showing an outline of processing by the sensing system.
- FIG. 9 is a block diagram showing the configuration of the robot hand.
- FIG. 10 is a flowchart showing an outline of processing by the robot hand. Best Mode for Carrying Out the Invention
- The tip side of the cylindrical casing 10 constituting the optical tactile sensor 11 is provided with a hemispherical touch pad 12.
- A CCD camera 13 as an imaging means is also arranged in the casing 10.
- The CCD camera 13 is arranged on the side of the touch pad 12 opposite to the side that contacts the object W1 (see FIG. 3).
- The CCD camera 13 captures, from the back side, the behavior (displacement and distortion) of the grid pattern 19 when the object W1 contacts the touch pad 12. Therefore, the focus of the CCD camera 13 is set on the convex curved surface 15 of the touch pad 12 on which the grid pattern 19 is formed.
- a ring-shaped illumination 14 for illuminating the grid pattern 19 is arranged in the casing 10.
- the illumination 14 is constituted by a plurality of light emitting diodes, but may be constituted by an optical fiber or the like.
- the touch pad 12, CCD camera 13 and illumination 14 are coaxially arranged.
- The touch pad 12 includes a transparent elastic body 17 having a convex curved surface 15 on the tip side and a flat surface 16 on the base end side.
- The base end side of the transparent elastic body 17 is fixed with a transparent adhesive to a holding plate 18 that is relatively harder than the transparent elastic body 17.
- The holding plate 18 is fixed to one end of the casing 10.
- a transparent acrylic plate is used as the holding plate 18.
- a touch pad 12 is provided on the holding plate 18.
- The CCD camera 13 is arranged on the holding plate 18 side.
- The transparent elastic body 17 is formed of transparent silicone rubber (YE5822).
- The height H1 of the transparent elastic body 17 is set to 13 mm, and the radius of curvature of the convex curved surface 15 of the transparent elastic body 17 is 20 mm to 30 mm (30 mm in this embodiment).
- the grid pattern 19 includes a plurality of grooves 20 arranged in a grid.
- Specifically, the grid pattern 19 is composed of a plurality of grooves 20 having a depth of 100 μm arranged in a lattice with a pitch of 300 μm. That is, the grid pattern 19 is made of the same material as the transparent elastic body 17.
- The depth of the groove 20 may be set to, for example, 50 μm.
- No material other than that of the transparent elastic body 17 is particularly required, but another material may be used.
- The touch pad 12 is formed by pouring uncured transparent silicone rubber (elastic material) into a mold (not shown) having a substantially hemispherical molding surface, letting the transparent silicone rubber spread over the molding surface, and curing it. A plurality of molding ridges having a height of 100 μm are arranged on the molding surface. Thus, as the touch pad 12 is formed, a plurality of grooves 20 are simultaneously formed on the convex curved surface 15 of the transparent elastic body 17. As shown in FIG. 3, the sensing system 21 including the optical tactile sensor 11 includes a control unit 22 for controlling the entire sensing system 21.
- The control unit 22 includes a CPU 23, and a ROM 24, a RAM 25, and an input/output port (I/O port) 26 are connected to the CPU 23.
- The CPU 23 executes each process for controlling the entire sensing system 21 and outputs the processing result as a predetermined control signal.
- the ROM 24 stores a control program for controlling the sensing system 21 and the like. Various information necessary for the operation of the sensing system 21 is temporarily stored in the RAM 25.
- The CCD camera 13 and the illumination 14 are connected to the input/output port 26. Image information obtained by the CCD camera 13 capturing the behavior of the grid pattern 19 is input to the CPU 23 via the input/output port 26.
- the CPU 23 outputs a signal for turning on the light 14 to the light 14 via the input / output port 26.
- The CPU 23 shown in FIG. 3 processes the image information from the CCD camera 13 input at regular intervals (every 33 ms in this embodiment) via the input/output port 26.
- As a result, the grid pattern 19 is recognized as a grid-like figure (see Fig. 4).
- The image information acquired at regular intervals is recorded in the recording area of the RAM 25, and is deleted sequentially starting from the oldest.
- The image processing is based on the commercially available image processing software HALCON (MVTec).
- The CPU 23 extracts information on the size (area), shape, and center of gravity of the contact region A1 (see FIG. 5) generated between the object W1 and the touch pad 12. The CPU 23 also extracts information (that is, geometric information) on the size (area) of the stick region A2 (see Fig. 5) between the object W1 and the touch pad 12 generated within the contact region A1. That is, the CPU 23 functions as information extracting means.
- The stick region A2 is a region in which the touch pad 12 is in contact with the object W1 but the grid pattern 19 does not move.
- The region in which the grid pattern 19 moves while the touch pad 12 is in contact with the object W1 is referred to as the slip region A3 (see FIG. 5).
- When the grid pattern 19 is recognized as a grid-like figure, the contact region A1 appears brighter than the rest. Therefore, the contact region A1 can be measured based on the luminance difference in the image.
- As the illumination 14 that illuminates the grid pattern 19, one that emits white light, with which the luminance difference between the contact region A1 and the non-contact region is large, is used. This makes it possible to extract the contact region A1 more clearly.
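As a rough illustration of this luminance-based segmentation (the threshold value and the synthetic frame below are assumptions for the sketch, not values from the patent):

```python
import numpy as np

def contact_region(gray_image, threshold=180):
    """Under white illumination the contact region A1 appears brighter
    than the non-contact region, so a simple luminance threshold yields
    a binary mask of A1 (threshold chosen per setup)."""
    return gray_image >= threshold

# Synthetic frame: dim background with a bright contact patch.
frame = np.full((6, 6), 100, dtype=np.uint8)
frame[1:4, 1:4] = 220
mask = contact_region(frame)
print(int(mask.sum()))  # 9 pixels in the contact region
```

In practice the threshold would be calibrated to the illumination 14 and the camera's exposure.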
- Specifically, the image captured one frame earlier (33 ms before) is subtracted from the current image, and the difference image is emphasized to increase the contrast.
- In the stick region A2, the grid pattern 19 does not move, so no grid-like figure appears in the difference image; in the slip region A3, the grid pattern 19 moves, so the difference image appears like white noise.
- As a result, the boundary between the stick region A2 and the slip region A3 becomes clear and can be obtained from the image.
- In this way, the stick region A2 and the slip region A3 can be measured separately.
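The frame-differencing separation of the stick and slip regions described above might be sketched as follows; the motion threshold and the synthetic frames are illustrative assumptions.

```python
import numpy as np

def stick_and_slip(prev_frame, curr_frame, contact_mask, motion_thresh=10):
    """Pixels inside the contact region where the grid pattern has not
    moved (small inter-frame difference) belong to the stick region A2;
    pixels where it has moved (noise-like difference) belong to the
    slip region A3."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    moving = diff > motion_thresh
    stick = contact_mask & ~moving
    slip = contact_mask & moving
    return stick, slip

contact = np.ones((4, 4), dtype=bool)
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[:, 2:] = 50          # right half of the pattern moved
stick, slip = stick_and_slip(prev, curr, contact)
print(int(stick.sum()), int(slip.sum()))  # 8 8
```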
- The CPU 23 shown in FIG. 3 measures the friction coefficient (slipperiness) between the object W1 and the transparent elastic body 17 from the ratio of the size of the stick region A2 to the size of the contact region A1.
- Specifically, the CPU 23 reads out from the ROM 24 data indicating the relationship between the friction coefficient and the ratio of the size of the stick region A2 to the size of the contact region A1. The CPU 23 then calculates the ratio of the measured size of the stick region A2 to the measured size of the contact region A1, and selects the friction coefficient data corresponding to that ratio. By this selection, the friction coefficient near the contact region A1 can be measured. Note that the higher the ratio of the size of the stick region A2 to the size of the contact region A1, the smaller the slip region A3 and therefore the larger the friction coefficient. The friction coefficient may also be determined by other means.
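The ROM lookup of the friction coefficient could, for illustration, be emulated by linear interpolation over a calibration table; the table values below are invented placeholders, and a real table would come from calibrating the sensor.

```python
import numpy as np

# Hypothetical calibration table: stick/contact area ratio -> friction
# coefficient. Monotonically increasing, consistent with the text's note
# that a larger stick fraction implies a larger friction coefficient.
RATIO_TABLE = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
MU_TABLE    = np.array([0.1, 0.3,  0.6, 0.9,  1.2])

def friction_coefficient(stick_area, contact_area):
    """Interpolate the friction coefficient from the measured ratio of
    the stick region A2 to the contact region A1."""
    ratio = stick_area / contact_area
    return float(np.interp(ratio, RATIO_TABLE, MU_TABLE))

mu = friction_coefficient(9, 25)   # ratio 0.36 falls between table rows
print(round(mu, 3))
```

The same table-lookup scheme applies to the normal-force and tangential-force data the embodiment reads from the ROM 24.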
- The CPU 23 also calculates the normal force from the size of the contact region A1.
- The normal force is a force that acts on the object W1 in the normal direction when the object W1 is pressed by the transparent elastic body 17 of the touch pad 12.
- Specifically, the CPU 23 reads out from the ROM 24 data indicating the relationship between the size of the contact region A1 and the normal force. Then, the CPU 23 selects the normal force data corresponding to the measured size of the contact region A1, so that the magnitude of the normal force acting near the contact region A1 can be measured. The normal force may also be determined by other methods. That is, the CPU 23 also functions as mechanical quantity measuring means.
- The CPU 23 shown in FIG. 3 obtains the tangential force from the shape and center of gravity of the contact region A1.
- the tangential force is a force that acts on the object W1 in the horizontal direction when the object W1 is pressed by the transparent elastic body 17.
- Specifically, the CPU 23 reads out from the ROM 24 data indicating the relationship between the tangential force and the shape and center of gravity of the contact region A1. The CPU 23 then selects the tangential force data corresponding to the measured shape and center of gravity of the contact region A1, thereby measuring the magnitude and direction of the tangential force acting in the vicinity of the contact region A1. The tangential force may also be determined by other methods.
- Further, the CPU 23 extracts information on the deformation of the grid pattern 19 by processing the image information from the CCD camera 13, and obtains the torque from the information on that deformation. Specifically, for example, the CPU 23 compares the image before the grid pattern 19 is deformed (before contact with the object W1; see FIG. 6) with the image after deformation (after contact with the object W1; see FIG. 7) and measures the twist of the lattice (angle θ). Next, the CPU 23 reads out from the ROM 24 data indicating the relationship between the angle θ and the torque. Then, the CPU 23 can measure the magnitude and direction of the torque acting in the vicinity of the contact region A1 by selecting the torque data corresponding to the measured angle θ. The torque may also be determined by other methods.
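The twist angle θ between the undeformed and deformed grid could, for example, be estimated from matched grid intersections with a standard two-dimensional least-squares rotation fit; the point coordinates below are illustrative, not taken from the patent.

```python
import numpy as np

def twist_angle(points_before, points_after):
    """Best-fit rotation angle (radians) between two matched 2-D point
    sets, after removing their centroids (2-D Procrustes fit)."""
    p = points_before - points_before.mean(axis=0)
    q = points_after - points_after.mean(axis=0)
    num = np.sum(p[:, 0] * q[:, 1] - p[:, 1] * q[:, 0])
    den = np.sum(p[:, 0] * q[:, 0] + p[:, 1] * q[:, 1])
    return np.arctan2(num, den)

# Grid intersections rotated by 5 degrees about their centroid.
theta_true = np.deg2rad(5.0)
rot = np.array([[np.cos(theta_true), -np.sin(theta_true)],
                [np.sin(theta_true),  np.cos(theta_true)]])
before = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
after = before @ rot.T
print(round(float(np.rad2deg(twist_angle(before, after))), 3))  # 5.0
```

The recovered angle would then index the θ-to-torque table read from the ROM 24.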
- when the touch pad 12 comes into contact with the object W1 (step S110), the CPU 23 causes the CCD camera 13 to capture an image of the grid pattern 19 (step S120) and performs image processing (step S130). Then, the CPU 23 measures the contact region A1 based on the brightness difference between the contact region A1 and the non-contact region displayed in the image, and extracts information on the size, shape, and center of gravity of the contact region A1 (step S140). Next, the CPU 23 superimposes the image one frame earlier (33 ms before) on the image displayed this time so as to increase the contrast, and measures the stick region A2 from the resulting white-noise-like image. The CPU 23 then extracts information on the size of the stick region A2 from the measurement result (step S150).
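A minimal sketch of the segmentation in steps S140 and S150 is given below, using only NumPy. The brightness and motion thresholds are assumptions, and the stick region is approximated as the contact pixels whose brightness barely changes between successive frames; the patent itself superimposes frames to raise contrast, which this frame-difference test only approximates.

```python
import numpy as np

def contact_and_stick_areas(prev_frame, cur_frame,
                            contact_thresh=60, motion_thresh=8):
    """Illustrative version of steps S140-S150: segment the contact
    region A1 by brightness, then take as the stick region A2 the
    contact pixels whose brightness barely changes between successive
    frames (33 ms apart). Thresholds are hypothetical."""
    contact = cur_frame > contact_thresh                          # region A1
    still = np.abs(cur_frame.astype(int) - prev_frame.astype(int)) < motion_thresh
    stick = contact & still                                       # region A2
    return int(contact.sum()), int(stick.sum())
```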
- the CPU 23 calculates the friction coefficient between the object W1 and the transparent elastic body 17 from the ratio of the size of the stick region A2 to the size of the contact region A1 (step S160).
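One textbook way to turn the stick/contact area ratio into a friction coefficient is the Hertzian incipient-slip relation, A2/A1 = (1 − Ft/(μ·Fn))^(2/3). This is an assumption for illustration; the patent does not state which relationship is stored in the ROM 24.

```python
def friction_coefficient(a1, a2, f_tangential, f_normal):
    """Estimate the friction coefficient mu from the stick/contact area
    ratio, assuming a Hertzian incipient-slip model (illustrative only):
        A2/A1 = (1 - Ft/(mu*Fn))**(2/3)
        =>  mu = Ft / (Fn * (1 - (A2/A1)**1.5))
    """
    ratio = a2 / a1
    if ratio >= 1.0:          # fully stuck: no slip information yet
        return float("inf")
    return f_tangential / (f_normal * (1.0 - ratio ** 1.5))
```

Note that when the stick region vanishes (A2 = 0), the estimate reduces to μ = Ft/Fn, i.e. gross slip is imminent.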
- since the behavior of the transparent elastic body 17 is captured as an image by the CCD camera 13, a large amount of information can be processed with a relatively small and simple configuration. Therefore, several mechanical quantities can be measured.
- the grid pattern 19 can be arranged on the convex curved surface 15 of the transparent elastic body 17.
- the main components constituting the optical tactile sensor 11 are the transparent elastic body 17, the illumination 14, and the CCD camera 13, and they are relatively few in number. Therefore, it is easy to manufacture the optical tactile sensor 11.
- the portion that is most likely to be deformed when contacting the object W1 is the convex curved surface 15. The amount of deformation becomes smaller toward the inside of the touch pad 12. For this reason, according to the present embodiment in which the grid pattern 19 is placed on the convex curved surface 15, a large deformation is more likely to occur than if the grid pattern 19 were disposed inside the touch pad 12. Therefore, the force applied to the touch pad 12 can be accurately obtained by imaging the deformation of the grid pattern 19 with the CCD camera 13. That is, the detection accuracy is higher than in the case where the distortion generated inside the transparent elastic body 17 is detected.
- the CPU 23 extracts information on the size, shape, and center of gravity of the contact region A1 and, at the same time, information on the size of the stick region A2. Then, based on this information, the CPU 23 calculates the mechanical quantities.
- since the grid pattern 19 easily catches on the surface of the object W1 when the transparent elastic body 17 contacts the object W1, the coefficient of friction in the vicinity of the grid pattern 19 on the convex curved surface 15 is large. Therefore, the grip of the touch pad 12 on the object W1 is improved.
- the rear side of the transparent elastic body 17 is a flat surface (flat surface 16), and a pressing plate 18 is provided on that side. Therefore, deformation of the transparent elastic body 17 on that side is prevented, so that the measurement of the mechanical quantities by the optical tactile sensor 11 can be performed more accurately. Further, the touch pad 12 can be stably supported by the pressing plate 18. In addition, since the touch pad 12 is stably held, photographing of the grid pattern 19 by the CCD camera 13 becomes easy.
- the optical tactile sensor 11 of the first embodiment constitutes a sensing system.
- the optical tactile sensor 11 of the present embodiment is used for a robot hand 31 (object operating force control device, object gripping force control device).
- the hand main body 32 constituting the robot hand 31 has a pair of fingers 34, 35 (sensor supports).
- the hand main body 32 is provided with two fingers 34 and 35 here; however, at least two fingers suffice, and for example five may be provided, like a human hand.
- the hand main body 32 is provided with a first servomotor 36 (actuator), a second servomotor 37 (actuator), and a third servomotor 38 (actuator).
- the fingers 34 and 35 are opened and closed by the drive of the first servomotor 36, that is, moved in the horizontal direction in FIG. 9.
- the fingers 34 and 35 are moved by the drive of the second servomotor 37 in a direction perpendicular to the direction in which the fingers 34 and 35 are opened and closed and horizontal (that is, the front-back direction in FIG. 9). Further, the fingers 34 and 35 are moved in the vertical direction (that is, the up-down direction in FIG. 9) by the third servomotor 38.
- one finger 35 supports the aforementioned optical tactile sensor 11.
- the other finger 34 supports a finger member 39 having substantially the same outer shape as the optical tactile sensor 11.
- the finger member 39 and the optical tactile sensor 11 are arranged so as to face each other.
- an optical tactile sensor 11 may be provided instead of the finger member 39. That is, an optical tactile sensor 11 may be provided on only one of the fingers 34 and 35, or an optical tactile sensor 11 may be provided on each of the fingers 34 and 35.
- the control unit 22 for controlling the entire robot hand 31 has the configuration of the first embodiment and, in addition, has a motor driver 41 for driving each of the servomotors 36 to 38 and controlling the positions, speeds, and forces of the fingers 34 and 35.
- the motor driver 41 is controlled by the CPU 23.
- the motor driver 41 supplies a current having a predetermined waveform to each of the servomotors 36 to 38 based on the drive signal output from the CPU 23.
- the CPU 23 shown in FIG. 9 calculates the magnitude and direction of the appropriate gripping force (operating force) to be applied to the object W1 based on the normal force, the pseudo tangential force, and the friction torque determined by the CPU 23.
- the CPU 23 instructs the motor driver 41 to drive the first to third servomotors 36 to 38 based on the calculated appropriate gripping force.
- each of the fingers 34 and 35 grips the object W1 with the appropriate gripping force. That is, the CPU 23 performs feedback control of the gripping force. In other words, the CPU 23 has a function as an actuator operation control means.
- when the processing from step S110 to step S190 is completed and the normal force, the pseudo tangential force, and the friction torque have been measured, the CPU 23 proceeds to step S200.
- the CPU 23 calculates the appropriate gripping force to be applied to the object W1 based on the obtained normal force, pseudo tangential force, and friction torque. Then, the CPU 23 instructs the motor driver 41 to drive the first to third servomotors 36 to 38 so that the fingers 34 and 35 grip the object W1 with the calculated appropriate gripping force.
- the CPU 23, which has completed the processing of step S210, performs the processing of steps S120 to S210 again. This processing is repeated at regular intervals while the fingers 34 and 35 are gripping the object W1. That is, the CPU 23 feedback-controls the gripping force.
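The feedback loop of steps S120 to S210 can be sketched as follows. The `sensor` and `driver` objects are hypothetical interfaces standing in for the CCD measurement pipeline and the motor driver 41, and the slip-margin rule for choosing the grip force is an assumed example, not the patent's stored control law.

```python
import time

def grip_control_loop(sensor, driver, period_s=0.033, margin=1.5):
    """Sketch of the gripping-force feedback loop: every frame, measure
    the mechanical quantities (steps S120-S190), compute a gripping
    force just above the slip limit (step S200), and command the motor
    driver (step S210), repeating while contact persists."""
    while sensor.touching():
        f_n, f_t, torque, mu = sensor.measure()
        # Minimum normal force that prevents slip, with a safety margin.
        target = margin * (f_t / mu if mu > 0 else f_n)
        driver.set_grip_force(target)
        time.sleep(period_s)          # one frame period (~33 ms)
```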
- the CPU 23 extracts information on the size, shape, and center of gravity of the contact region A1 and information on the size of the stick region A2. Then, based on the information, the CPU 23 calculates the normal force, the pseudo tangential force, and the friction torque. That is, a plurality of mechanical quantities can be measured simultaneously by one sensor. An appropriate gripping force to be applied to the object W1 can then be calculated based on the measured mechanical quantities. Thus, the object W1 can be gripped by the fingers 34 and 35 so as not to be crushed or dropped, and can be moved smoothly.
- since the touch pad 12 of the optical tactile sensor 11 has a hemispherical shape, the object W1 can be gripped without being affected by the shape of the object W1. Therefore, the robot hand 31, which is close to a human hand, can be used in a humanoid robot.
- the feedback control performed by the CPU 23 makes it possible to continue gripping the object W1 without crushing it and without letting it slip. Since the optical tactile sensor 11 captures the behavior of the transparent elastic body 17 as image information with the CCD camera 13, a large amount of information can be processed relatively quickly. Therefore, even if the appropriate operating force to be applied to the object W1 changes abruptly, the object W1 can be held without delay so as not to be crushed or to slip.
- the embodiment of the present invention may be modified as follows.
- the grid pattern 19 need not have a grid shape, and may have another shape such as a triangular mesh or a hexagonal mesh (honeycomb).
- the grid pattern 19 may be covered with a cover layer made of a conductive material.
- a plurality of grooves 20 are formed on the convex curved surface 15 of the transparent elastic body 17.
- the plurality of grooves 20 may be formed (for example, by cutting) after the formation of the touch pad 12.
- the illumination 14 emits white light, but may emit other light such as blue light or red light.
- the first to third servomotors 36 to 38 are used as the actuators for the fingers 34 and 35, but hydraulic cylinders, pneumatic cylinders, ultrasonic motors, or the like may be used as the actuators.
- with the optical tactile sensor of the present invention and a sensing system using it, there is a possibility that tactile information can easily be exchanged bidirectionally between a human and a robot.
- it can be applied to a robot teaching machine that makes a robot learn delicate fingertip movements and sensations, such as throwing a ball.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Robotics (AREA)
- Human Computer Interaction (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Analytical Chemistry (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Chemical & Material Sciences (AREA)
- Health & Medical Sciences (AREA)
- Automation & Control Theory (AREA)
- Force Measurement Appropriate To Specific Purposes (AREA)
- Manipulator (AREA)
- Geophysics And Detection Of Objects (AREA)
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/592,243 US7707001B2 (en) | 2004-03-09 | 2005-03-04 | Control of object operating force, object gripping force and robot hands |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-066401 | 2004-03-09 | ||
JP2004066401A JP4621827B2 (ja) | 2004-03-09 | 2004-03-09 | 光学式触覚センサ、光学式触覚センサを利用したセンシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置及びロボットハンド |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005085785A1 true WO2005085785A1 (ja) | 2005-09-15 |
Family
ID=34918329
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/004259 WO2005085785A1 (ja) | 2004-03-09 | 2005-03-04 | 光学式触覚センサ、センシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置、ロボットハンド |
Country Status (3)
Country | Link |
---|---|
US (1) | US7707001B2 (ja) |
JP (1) | JP4621827B2 (ja) |
WO (1) | WO2005085785A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108519325A (zh) * | 2018-05-07 | 2018-09-11 | 河北工业大学 | 一种研究手与物体接触间摩擦系数和接触面积之间关系的方法与装置 |
Families Citing this family (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090132088A1 (en) * | 2007-04-24 | 2009-05-21 | Tairob Ltd. | Transfer of knowledge from a human skilled worker to an expert machine - the learning process |
CA2683566C (en) * | 2007-05-30 | 2017-08-22 | Martin Pointing Devices | Touch-sensitive pointing device with guiding lines |
JP5003970B2 (ja) * | 2008-03-17 | 2012-08-22 | 国立大学法人東京農工大学 | 接触面積測定装置および接触面積測定方法 |
JP5003974B2 (ja) * | 2008-08-01 | 2012-08-22 | 国立大学法人東京農工大学 | 接触面積測定装置および接触面積測定方法 |
GB2470537B (en) * | 2008-03-17 | 2012-12-12 | Univ Tokyo Agriculture | Contact area measuring apparatus and contact area measuring method |
WO2009155501A2 (en) * | 2008-06-19 | 2009-12-23 | Massachusetts Institute Of Technology | Tactile sensor using elastomeric imaging |
US9052710B1 (en) * | 2009-03-20 | 2015-06-09 | Exelis Inc. | Manipulation control based upon mimic of human gestures |
JP5239987B2 (ja) * | 2009-03-24 | 2013-07-17 | 株式会社豊田自動織機 | ロボットハンド用撮像装置内蔵フィンガ |
JP5239986B2 (ja) * | 2009-03-24 | 2013-07-17 | 株式会社豊田自動織機 | ロボットハンド用フィンガ |
DE102009031385A1 (de) * | 2009-07-01 | 2011-01-05 | Giesecke & Devrient Gmbh | Verfahren, tragbarer Datenträger und System zum Freigeben einer Transaktion |
JP5660531B2 (ja) * | 2010-08-12 | 2015-01-28 | 国立大学法人名古屋大学 | 形状計測装置、及び形状計測方法 |
US20130220032A1 (en) | 2010-10-26 | 2013-08-29 | Muthukumaran Packirisamy | System For Sensing a Mechanical Property of a Sample |
JP5834478B2 (ja) | 2011-05-10 | 2015-12-24 | セイコーエプソン株式会社 | ロボット |
KR20140041890A (ko) | 2011-07-28 | 2014-04-04 | 메사추세츠 인스티튜트 오브 테크놀로지 | 고해상도 표면 측정 시스템 및 방법 |
HUP1100633A2 (en) * | 2011-11-17 | 2013-06-28 | Pazmany Peter Katolikus Egyetem | Device with optical feedback for measuring force and pressure |
EP2793688A4 (en) * | 2011-12-19 | 2015-05-06 | Univ California | SYSTEM AND METHOD FOR QUANTIFYING BODILY PALPATION FOR THE IMPROVEMENT OF MEDICAL DIAGNOSIS |
JP5516610B2 (ja) * | 2012-01-19 | 2014-06-11 | 株式会社安川電機 | ロボット、ロボットハンドおよびロボットハンドの保持位置調整方法 |
TR201208054A2 * | 2012-07-11 | 2012-12-21 | BÜYÜKŞAHİN Utku | Cihaz ve robotlara çok noktalı, yüksek hassasiyetli dokunma hissi sağlayan modül. |
TWI482956B (zh) * | 2012-07-24 | 2015-05-01 | Univ Nat Cheng Kung | 物體特性感測系統及其控制方法 |
KR101979680B1 (ko) | 2012-12-05 | 2019-05-20 | 삼성전자주식회사 | 촉각센서 |
US10574944B2 (en) | 2013-03-08 | 2020-02-25 | Gelsight, Inc. | Continuous contact-based three-dimensional measurement |
US9452538B2 (en) | 2013-03-13 | 2016-09-27 | Disney Enterprises, Inc. | Selectively modifiable layer for alteration of appearance or texture |
JP6161157B2 (ja) * | 2013-07-16 | 2017-07-12 | 国立大学法人広島大学 | 印加力推定方法、印加力推定プログラム、およびコンピュータ入力デバイス |
DE102013017007B4 (de) * | 2013-10-14 | 2015-09-10 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Roboter mit einem Endmanipulatorarm mit Endeffektor sowie Verfahren zur Bestimmung eines Kraft- und Drehmomenteintrages auf einen Endeffektor eines Roboters |
US9381659B2 (en) * | 2014-04-16 | 2016-07-05 | The Boeing Company | Automated apparatus for use in selectively cutting side walls of a honeycomb core |
CN105313127A (zh) * | 2014-06-02 | 2016-02-10 | 精工爱普生株式会社 | 机器人、机器人的控制方法以及机器人的控制装置 |
US9757862B2 (en) | 2014-10-16 | 2017-09-12 | Technische Universität München | Tactile sensor |
KR101684237B1 (ko) * | 2014-10-28 | 2016-12-20 | 한국기계연구원 | 생체내 이물질 측정 장치 |
CN104589358A (zh) * | 2014-12-03 | 2015-05-06 | 安徽省库仑动力自动化科技有限公司 | 一种基于力传感与数据库对比的工业机器人拆解倾倒方法 |
JP6468871B2 (ja) * | 2015-02-03 | 2019-02-13 | キヤノン株式会社 | ロボットハンド制御方法及びロボット装置 |
CN104932382A (zh) * | 2015-06-24 | 2015-09-23 | 哈尔滨工业大学 | 微创环境下用于触摸诊断的三维微型力传感器 |
US9889564B2 (en) * | 2015-07-08 | 2018-02-13 | Empire Technology Development Llc | Stable grasp point selection for robotic grippers with machine vision and ultrasound beam forming |
FR3038721B1 (fr) * | 2015-07-09 | 2018-12-07 | Centre National De La Recherche Scientifique | Tribometre pour la mesure de champs de deplacements a l'interface de deux elements |
RU2704897C2 (ru) * | 2015-09-16 | 2019-10-31 | Филип Моррис Продактс С.А. | Картридж с частью для хранения жидкости с гибкой стенкой |
EP3355778B1 (en) * | 2015-09-30 | 2020-12-09 | 3M Innovative Properties Company | System and method for optimizing body and object interactions |
US10456910B2 (en) * | 2016-01-14 | 2019-10-29 | Purdue Research Foundation | Educational systems comprising programmable controllers and methods of teaching therewith |
CN109074153A (zh) * | 2016-03-29 | 2018-12-21 | 斋藤创造研究所株式会社 | 一种输入装置及图像显示系统 |
DE102016108966B4 (de) * | 2016-05-13 | 2017-11-30 | Technische Universität München | Visuell-haptischer Sensor für 6D-Kraft/Drehmoment |
TR201606363A2 (tr) * | 2016-05-13 | 2017-11-21 | Sensobright Ind Llc | Çok işlevli bir algılama sistemi. |
DK3455599T3 (da) * | 2016-05-13 | 2022-03-21 | Sensobright Ind Llc | Praktisk sensorsystem |
JP6729930B2 (ja) * | 2016-07-08 | 2020-07-29 | 国立大学法人広島大学 | 触覚評価方法および触覚評価システム |
US20200139543A1 (en) * | 2017-06-21 | 2020-05-07 | Saito Inventive Corp. | Manipulator and robot |
KR102013992B1 (ko) * | 2017-08-04 | 2019-08-23 | 한국해양대학교 산학협력단 | 영상기반 비접촉식 촉각 센싱 시스템 |
US11945098B2 (en) | 2017-08-14 | 2024-04-02 | Contactile Pty Ltd | Friction-based tactile sensor for measuring grip security |
US10372155B2 (en) * | 2017-08-20 | 2019-08-06 | Pixart Imaging Inc. | Joystick and related control method |
US10814494B2 (en) * | 2017-09-26 | 2020-10-27 | Toyota Research Institute, Inc. | Robotic gripper fingers |
US10668627B2 (en) * | 2017-09-26 | 2020-06-02 | Toyota Research Institute, Inc. | Deformable sensors and methods for detecting pose and force against an object |
DE102018123546A1 (de) * | 2017-09-26 | 2019-03-28 | Toyota Research Institute, Inc. | Verformbare sensoren und verfahren zur erfassung der pose und kraft an einem objekt |
US11110603B2 (en) | 2018-10-02 | 2021-09-07 | Toyota Research Institute, Inc. | Systems and methods for naïve physics for contact and contact-awareness in robotic teleoperation |
TWI699511B (zh) * | 2018-11-12 | 2020-07-21 | 國立中央大學 | 觸覺感測器 |
JP7280032B2 (ja) * | 2018-11-27 | 2023-05-23 | ローム株式会社 | 入力デバイス、自動車 |
GB2579846A (en) * | 2018-12-18 | 2020-07-08 | Univ Bristol | Improvements in or relating to tactile sensing |
EP3693139A1 (en) | 2019-02-11 | 2020-08-12 | Université d'Aix-Marseille | Optical tactile sensor |
JP7396357B2 (ja) * | 2019-06-05 | 2023-12-12 | ソニーグループ株式会社 | 制御装置および方法、並びに、プログラム |
PL430286A1 (pl) * | 2019-06-19 | 2020-12-28 | Innovative Radar Technologies Laboratory Spółka Z Ograniczoną Odpowiedzialnością | Układ do pomiaru odkształceń oraz sposób pomiaru odkształceń |
US11836823B2 (en) | 2019-07-04 | 2023-12-05 | Fingervision Co., Ltd. | Tactile sensor, tactile sensor system, and program |
JP6873439B2 (ja) * | 2019-07-18 | 2021-05-19 | 株式会社齋藤創造研究所 | マニピュレーターおよびロボット |
WO2021081084A1 (en) * | 2019-10-21 | 2021-04-29 | The Regents Of The University Of California | Multi-directional high-resolution optical tactile sensors |
WO2021085098A1 (ja) * | 2019-10-30 | 2021-05-06 | ソニー株式会社 | 光学式センサおよび光学式センサモジュール |
WO2021124388A1 (ja) * | 2019-12-16 | 2021-06-24 | 国立大学法人東北大学 | 把持装置、制御方法及びプログラム |
US11584026B2 (en) * | 2020-02-17 | 2023-02-21 | Toyota Research Institute, Inc. | Robot arm assemblies including fingers having deformable sensors |
TWI767264B (zh) * | 2020-06-24 | 2022-06-11 | 財團法人工業技術研究院 | 受壓狀態量測方法及受壓狀態量測系統 |
GB202012448D0 (en) * | 2020-08-11 | 2020-09-23 | Ocado Innovation Ltd | Object presence sensing |
CN112077882A (zh) * | 2020-09-18 | 2020-12-15 | 马鞍山迈若斯机器人科技有限公司 | 一种基于触觉的避障机器人 |
US20220143821A1 (en) * | 2020-11-11 | 2022-05-12 | Sony Interactive Entertainment Inc. | Method for robotic training based on randomization of surface stiffness |
CN112744604B (zh) * | 2020-12-11 | 2022-01-28 | 珠海格力电器股份有限公司 | 一种码垛机器人及其控制方法、装置、存储介质及处理器 |
CN112857630B (zh) * | 2021-01-15 | 2022-10-14 | 之江实验室 | 一种软体机器人手的三维凸面柔性触觉传感器及制造方法 |
US11472039B1 (en) * | 2021-04-23 | 2022-10-18 | Toyota Research Institute, Inc. | Deformable sensor with rotatable sensing components for detecting deformation levels |
CN113223377A (zh) * | 2021-04-25 | 2021-08-06 | 浙江理工大学 | 一种盲文阅读笔及盲文读取方法 |
CN113681585B (zh) * | 2021-09-13 | 2022-09-13 | 清华大学深圳国际研究生院 | 一种具备人工痛觉的变刚度柔性夹爪 |
CN114113008B (zh) * | 2021-10-22 | 2023-12-22 | 清华大学深圳国际研究生院 | 一种基于结构光的人工触觉设备和方法 |
WO2023127302A1 (ja) * | 2021-12-28 | 2023-07-06 | ソニーグループ株式会社 | センサ装置、およびロボット |
US20230251149A1 (en) * | 2022-02-04 | 2023-08-10 | Massachusetts Institute Of Technology | Flexible optical tactile sensor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0726805B2 (ja) * | 1992-10-30 | 1995-03-29 | 工業技術院長 | 触覚センサ |
JPH11304602A (ja) * | 1998-04-15 | 1999-11-05 | Matsushita Electric Works Ltd | 半導体チップの応力分布検出方法 |
JP3047021B1 (ja) * | 1999-04-05 | 2000-05-29 | 工業技術院長 | 触覚センサ |
JP2000254884A (ja) * | 1999-03-10 | 2000-09-19 | Keiogijuku | ハンド又はマニピュレータによる物体把持制御方法 |
WO2002018893A1 (fr) * | 2000-08-31 | 2002-03-07 | Center For Advanced Science And Technology Incubation, Ltd. | Détecteur tactile optique |
JP2005114715A (ja) * | 2003-09-16 | 2005-04-28 | Toudai Tlo Ltd | 光学式触覚センサを用いた力ベクトル再構成法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61205831A (ja) * | 1985-03-08 | 1986-09-12 | Nippon Telegr & Teleph Corp <Ntt> | マトリツクス触覚センサ |
US4599908A (en) * | 1985-03-18 | 1986-07-15 | Sheridan Thomas B | Opto-mechanical touch sensor |
JPS6225519A (ja) * | 1985-07-26 | 1987-02-03 | Nippon Telegr & Teleph Corp <Ntt> | 同一周波干渉量検出方式 |
JPH0650966B2 (ja) | 1989-07-14 | 1994-07-06 | 日本製紙株式会社 | 組識培養によるメロン苗の増殖方法 |
JPH0726805A (ja) | 1993-07-13 | 1995-01-27 | Nissan Motor Co Ltd | キイシリンダ構造 |
JPH07128163A (ja) * | 1993-11-08 | 1995-05-19 | Fuji Electric Co Ltd | 触覚センサ |
JP2000227371A (ja) * | 1999-02-05 | 2000-08-15 | Masahiko Matsubara | 面圧力分布検出装置 |
CN1853093A (zh) * | 2003-09-16 | 2006-10-25 | 株式会社东京大学Tlo | 光学式触觉传感器和使用该传感器的力矢量分布再构成法 |
JP2006003137A (ja) * | 2004-06-16 | 2006-01-05 | Toudai Tlo Ltd | 光学式触覚センサ及び該センサにおける情報取得方法 |
-
2004
- 2004-03-09 JP JP2004066401A patent/JP4621827B2/ja not_active Expired - Lifetime
-
2005
- 2005-03-04 US US10/592,243 patent/US7707001B2/en active Active
- 2005-03-04 WO PCT/JP2005/004259 patent/WO2005085785A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0726805B2 (ja) * | 1992-10-30 | 1995-03-29 | 工業技術院長 | 触覚センサ |
JPH11304602A (ja) * | 1998-04-15 | 1999-11-05 | Matsushita Electric Works Ltd | 半導体チップの応力分布検出方法 |
JP2000254884A (ja) * | 1999-03-10 | 2000-09-19 | Keiogijuku | ハンド又はマニピュレータによる物体把持制御方法 |
JP3047021B1 (ja) * | 1999-04-05 | 2000-05-29 | 工業技術院長 | 触覚センサ |
WO2002018893A1 (fr) * | 2000-08-31 | 2002-03-07 | Center For Advanced Science And Technology Incubation, Ltd. | Détecteur tactile optique |
JP2005114715A (ja) * | 2003-09-16 | 2005-04-28 | Toudai Tlo Ltd | 光学式触覚センサを用いた力ベクトル再構成法 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108519325A (zh) * | 2018-05-07 | 2018-09-11 | 河北工业大学 | 一种研究手与物体接触间摩擦系数和接触面积之间关系的方法与装置 |
CN108519325B (zh) * | 2018-05-07 | 2023-08-04 | 河北工业大学 | 一种研究手与物体接触间摩擦系数和接触面积之间关系的方法与装置 |
Also Published As
Publication number | Publication date |
---|---|
US7707001B2 (en) | 2010-04-27 |
JP2005257343A (ja) | 2005-09-22 |
JP4621827B2 (ja) | 2011-01-26 |
US20080027582A1 (en) | 2008-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005085785A1 (ja) | 光学式触覚センサ、センシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置、ロボットハンド | |
Lepora | Soft biomimetic optical tactile sensing with the TacTip: A review | |
Chen et al. | Tactile sensors for friction estimation and incipient slip detection—Toward dexterous robotic manipulation: A review | |
Low et al. | Hybrid tele-manipulation system using a sensorized 3-D-printed soft robotic gripper and a soft fabric-based haptic glove | |
Chorley et al. | Development of a tactile sensor based on biologically inspired edge encoding | |
Dahiya et al. | Tactile sensing—from humans to humanoids | |
Qu et al. | Recent progress in advanced tactile sensing technologies for soft grippers | |
Lin et al. | Sensing the frictional state of a robotic skin via subtractive color mixing | |
CN109176590B (zh) | 一种具有压滑觉感知的柔性指尖、装置及方法 | |
Stepp et al. | Relative to direct haptic feedback, remote vibrotactile feedback improves but slows object manipulation | |
JP6587195B2 (ja) | 触覚情報推定装置、触覚情報推定方法、プログラム及び非一時的コンピュータ可読媒体 | |
KR20190080802A (ko) | 가상 물체의 터칭 및 파지와 관련된 햅틱 효과를 제공하는 시스템 및 방법 | |
Kappassov et al. | Color-coded fiber-optic tactile sensor for an elastomeric robot skin | |
US20180011538A1 (en) | Multimodal haptic effects | |
JP2010221358A (ja) | ロボットハンド用撮像装置内蔵フィンガ | |
JP2005165670A (ja) | ヒューマンインタフェース装置及びヒューマンインタフェースシステム | |
Zhang et al. | Vision-based sensing for electrically-driven soft actuators | |
Battaglia et al. | ThimbleSense: an individual-digit wearable tactile sensor for experimental grasp studies | |
KR101879811B1 (ko) | 수직 전단력 촉각센서, 이의 제조 방법 및 촉각 센서 시스템 | |
JP2001265522A (ja) | 爪に装着するセンサ | |
Jin et al. | Progress on flexible tactile sensors in robotic applications on objects properties recognition, manipulation and human-machine interactions | |
JP2009198475A (ja) | 弾性体特性を利用した3次元触覚センサ及び3次元触覚センシング方法 | |
Sakuma et al. | A wearable fingernail deformation sensing system and three-dimensional finite element model of fingertip | |
JP2020023050A (ja) | 触覚情報推定装置、触覚情報推定方法及びプログラム | |
TWI396835B (zh) | Piezoelectric tactile sensor and its manufacturing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10592243 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10592243 Country of ref document: US |