CN111805562B - Tactile sensor and robot - Google Patents

Tactile sensor and robot

Info

Publication number
CN111805562B
CN111805562B (application CN202010503562.3A)
Authority
CN
China
Prior art keywords
unit
photoelectric detection
imaging
tactile sensor
sensor according
Prior art date
Legal status
Active
Application number
CN202010503562.3A
Other languages
Chinese (zh)
Other versions
CN111805562A (en)
Inventor
姜峣
张伦玮
眭若旻
李铁民
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202010503562.3A
Publication of CN111805562A
Application granted
Publication of CN111805562B
Legal status: Active (current)
Anticipated expiration

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081Touching devices, e.g. pressure-sensitive
    • B25J13/084Tactile sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a tactile sensor and a robot, and relates to the technical field of sensors. The tactile sensor includes a photoelectric detection structure and a touch structure arranged on the photoelectric detection structure. The photoelectric detection structure includes a photoelectric detection unit having a photosensitive surface; the touch structure includes an elastic light guide unit having a plurality of mark points and a light shielding unit having a plurality of imaging apertures. The light shielding unit is arranged between the elastic light guide unit and the photoelectric detection unit, the imaging apertures are configured to image the mark points onto the photosensitive surface to generate light spot information, and the photoelectric detection unit is configured to convert the light spot information into an electrical signal. Because the imaging apertures in the light shielding unit serve as the imaging elements, the requirement of convex-lens imaging for an object-to-image distance of at least four focal lengths is avoided, so the overall thickness of the tactile sensor and the installation space it requires are reduced, making it suitable for the application requirements of a manipulator.

Description

Tactile sensor and robot
Technical Field
The invention relates to the technical field of sensors, and in particular to a tactile sensor and a robot.
Background
Today, as robot technology develops rapidly, more and more manual labor can be completed by robots of various kinds, which greatly improves production efficiency, reduces production costs, and lightens the labor intensity borne by people. As robot applications become widespread, their application scenarios are gradually expanding from predictable structured scenarios (such as production lines and automated warehouses) to unpredictable unstructured scenarios (such as homes and the outdoors).
In production and daily life, grasping objects is an important operation that a robot must complete with a manipulator in many application scenarios. However, the grasping performance of manipulators still falls far short of the human hand, especially when grasping objects whose physical properties, such as shape, mass, and surface friction coefficient, are unknown. Research shows that the rich information provided by the sense of touch plays an important role in regulating the grasping force and selecting the grasping point when a human hand grasps an object, which is what makes reliable grasping possible; for a manipulator, therefore, the performance of the tactile sensor directly affects the reliability of grasping.
For a manipulator, the small volume of the mechanical fingers severely limits the installation position and wiring space available for a tactile sensor. At present, tactile sensors that use a photoelectric detection element as the sensitive element directly adopt a miniature camera as the image acquisition device. Because the camera lens images with a convex lens, the imaging principle requires an object-to-image distance of at least four focal lengths, so the occupied space is large and such sensors are difficult to apply to a manipulator.
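For reference, the "at least four focal lengths" figure follows from the thin-lens equation; a standard derivation is sketched below (added for illustration, not part of the original text):

```latex
\frac{1}{u} + \frac{1}{v} = \frac{1}{f}
\;\;\Rightarrow\;\;
L = u + v = u + \frac{uf}{u-f},
\qquad
\frac{\mathrm{d}L}{\mathrm{d}u} = 1 - \frac{f^{2}}{(u-f)^{2}} = 0
\;\;\Rightarrow\;\;
u = v = 2f,\quad L_{\min} = 4f .
```

The minimum object-to-image distance of a single convex lens is therefore four focal lengths, which is what forces a camera-based tactile sensor to be thick.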
Disclosure of Invention
Embodiments of the invention provide a tactile sensor and a robot, solving the problem that existing tactile sensors occupy a large installation space.
To solve the above problem, an embodiment of the present invention provides a tactile sensor, including a photoelectric detection structure and a touch structure arranged on the photoelectric detection structure. The photoelectric detection structure includes a photoelectric detection unit having a photosensitive surface; the touch structure includes an elastic light guide unit having a plurality of mark points and a light shielding unit having a plurality of imaging apertures; the light shielding unit is arranged between the elastic light guide unit and the photoelectric detection unit; the imaging apertures are configured to image the mark points onto the photosensitive surface to generate light spot information; and the photoelectric detection unit is configured to convert the light spot information into an electrical signal.
An embodiment of the invention also provides a robot, which includes a manipulator having mechanical fingers and the above tactile sensor, the tactile sensor being arranged on the inner side of the surface of a mechanical finger.
In the tactile sensor and robot provided by the embodiments of the invention, the imaging apertures in the light shielding unit serve as the imaging elements, which avoids the requirement of convex-lens imaging for an object-to-image distance of at least four focal lengths, reduces the overall thickness of the tactile sensor and the volume it occupies, and makes it suitable for the application requirements of a manipulator.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and together with the description serve to explain the principles of the invention, not to limit the invention.
FIG. 1 is a schematic view of a robot configured to grasp an object with a robot hand;
FIG. 2 is a block diagram of a tactile sensor according to an embodiment of the invention;
FIG. 3 is an exploded view of a tactile sensor according to an embodiment of the invention;
FIG. 4 is an assembled structural view of a tactile sensor according to an embodiment of the present invention;
FIG. 5A is a top view of an array of marker points according to an embodiment of the present invention;
FIG. 5B is an RGB layout diagram of the mark points according to the embodiment of the present invention;
FIG. 6 is a schematic diagram of the working principle of a robot applying the tactile sensor according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be noted that the embodiments and features of the embodiments in the present application may be arbitrarily combined with each other without conflict.
In the description of the present invention, it should be noted that the terms "upper", "lower", "one side", "the other side", "one end", "the other end", "side", "opposite", "four corners", "periphery", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the structure referred to has a specific orientation, is constructed and operated in a specific orientation, and thus, is not to be construed as limiting the present invention.
In the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "connected," "directly connected," "indirectly connected," "fixedly connected," and "mounted" are to be construed broadly: a connection may be, for example, a fixed connection, a detachable connection, or an integral connection, and it may be a direct connection, an indirect connection through an intervening medium, or the internal communication of two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
Fig. 1 is a schematic structural view of a robot while its manipulator grasps an object. As shown in Fig. 1, the robot uses a manipulator (or dexterous hand) 10 to grasp an object 20. The contact area between a mechanical finger 11 and the object 20 is defined as the grasping area, and a tactile sensor 30 for detecting the three-dimensional stress state of the mechanical finger is arranged on the surface of the grasping area of the mechanical finger 11; according to the information obtained by the tactile sensor 30, the grasping force of the manipulator 10 can be adjusted and a suitable grasping position can be selected. Since the mechanical finger 11 is small, the installation position and wiring space of the tactile sensor 30 are greatly limited. However, current tactile sensors that use a photoelectric detection element (for example, a CMOS-based photoelectric detection element) as the sensitive element directly adopt a miniature camera as the image acquisition device; because the camera lens images with a convex lens, the imaging principle requires an object-to-image distance of at least four focal lengths, the occupied space is large, and application to a manipulator is difficult. A small-sized tactile sensor 30 has therefore become an urgent need in the development of the robot field.
To solve the problem that existing tactile sensors occupy a large installation space, an embodiment of the invention provides a tactile sensor, including a photoelectric detection structure and a touch structure arranged on the photoelectric detection structure. The photoelectric detection structure includes a photoelectric detection unit with a photosensitive surface; the touch structure includes an elastic light guide unit with a plurality of mark points and a light shielding unit with a plurality of imaging apertures; the light shielding unit is arranged between the elastic light guide unit and the photoelectric detection unit; the imaging apertures are arranged to image the mark points onto the photosensitive surface to generate light spot information; and the photoelectric detection unit is arranged to convert the light spot information into an electrical signal.
The technical solution of the embodiments of the present invention is specifically described below with reference to the accompanying drawings.
Fig. 2 is a structural diagram of a tactile sensor according to an embodiment of the present invention.
As shown in Fig. 2, the tactile sensor 30 includes a photoelectric detection structure 31 and a touch structure 32 disposed on the photoelectric detection structure 31. The touch structure 32 is configured to deform when it is pressed against an object under the action of the manipulator and to convert the deformation information into light spot information on the photosensitive surface of the photoelectric detection structure; the light spot information may include spot position and size information. The photoelectric detection structure 31 is configured to detect the light spot information, convert it into an electrical signal, and upload the electrical signal to a computer system. The computer system processes and converts the electrical signals, compares the light spot images before and after deformation, calculates the three-dimensional stress state of the mechanical-finger grasping area while the manipulator grasps the object, and obtains measurement information; based on this measurement information, the computer system selects the grasping point and regulates the grasping force of the manipulator.
Fig. 3 is an exploded view of a tactile sensor according to an embodiment of the present invention, and fig. 4 is an assembled structural view of the tactile sensor according to the embodiment of the present invention.
As shown in Figs. 3-4, the photoelectric detection structure 31 includes a photoelectric detection unit 311 having a photosensitive surface 3111, and the photosensitive surface 3111 is located on the side of the photoelectric detection unit 311 facing the touch structure 32. The photoelectric detection unit 311 includes a connection terminal 3112 and converts the spot information into an electrical signal that is transmitted to the computer system through the connection terminal 3112. The photosensitive surface can be rectangular or circular and is chosen according to the distribution of the mark points. In this embodiment, the photoelectric detection unit 311 may include a complementary metal oxide semiconductor (CMOS) sensor module; the CMOS module has color channels, such as red, green, and blue channels, and can identify the spot color of a mark point.
As shown in Figs. 3-4, the touch structure 32 includes an elastic light guide unit 321 having a plurality of mark points 3211 and a light shielding unit 322 having a plurality of imaging apertures 3221. The light shielding unit 322 is disposed between the elastic light guide unit 321 and the photoelectric detection unit 311, and the mark points 3211 are disposed directly above the imaging apertures 3221 in one-to-one correspondence. Each imaging aperture 3221 is configured to image the corresponding mark point onto the photosensitive surface 3111 to generate spot information. The elastic light guide unit 321 is both light-guiding and elastic, so light emitted from a mark point can pass through it. The elastic light guide unit deforms when squeezed between an object and a mechanical finger; when it deforms, the position of each mark point relative to its imaging aperture changes, specifically the position and shape of the spot that the mark point images onto the photosensitive surface change, and the deformation information is thereby converted into spot information. In one example, the orthographic projection of a mark point 3211 on the photosensitive surface 3111 includes the orthographic projection of the corresponding imaging aperture 3221 on the photosensitive surface 3111. Here "includes" is to be understood as meaning that the orthographic projection of the mark point on the photosensitive surface completely coincides with the orthographic projection of the corresponding imaging aperture, or that the orthographic projection of the imaging aperture lies within the orthographic projection of the corresponding mark point.
In this embodiment, the material of the elastic light guide unit may be a transparent silicone such as polydimethylsiloxane (PDMS). The thickness of the elastic light guide unit is 0.5 mm to 2.0 mm, and the mark points can be arranged inside the elastic light guide unit or on the side of the elastic light guide unit facing away from the light shielding unit. The mark points can be actively luminous, passively luminous, or reflective.
In the present embodiment, the material of the light shielding unit includes a metal foil, which may include, but is not limited to, at least one of an aluminum foil, a copper foil, and a silver foil. The thickness of the metal foil is 0.06 mm to 0.1 mm, the diameter of the imaging apertures may be 0.06 mm to 0.173 mm, and the spacing between imaging apertures is 0.5 mm to 2 mm. The length of an imaging aperture equals the thickness of the metal foil. In one example, the aspect ratio (length to diameter) of the imaging aperture is 1.0 to 1.9, optionally 1.73. In another example, the distance between the imaging aperture and the mark point is 0.5 mm to 2.0 mm, and the distance between the imaging aperture and the photosensitive surface is 0.5 mm to 2.0 mm; optionally the two distances are equal. Here the distance between the imaging aperture and the mark point may be taken from the upper surface of the light shielding unit to the mark point, and the distance between the imaging aperture and the photosensitive surface from the lower surface of the light shielding unit to the photosensitive surface.
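For a rough sense of the resulting spot geometry, a minimal sketch under assumptions (the 0.5 mm mark-point diameter below is assumed, and the formula is the standard pinhole-camera approximation rather than anything stated in the patent):

```python
# Sketch: approximate size of a mark point's spot on the photosensitive surface
# for pinhole imaging. All values in mm; the mark-point diameter is an assumption.
def pinhole_spot_size(marker_diameter, aperture_diameter, u, v):
    """u = mark point to aperture distance, v = aperture to photosensitive surface distance."""
    m = v / u                                       # pinhole magnification
    geometric_image = marker_diameter * m           # projected image of the mark point
    aperture_blur = aperture_diameter * (1.0 + m)   # blur contributed by the finite aperture
    return geometric_image + aperture_blur

# With an aperture of 0.1 mm, equal distances of 1.0 mm, and an assumed 0.5 mm
# mark point, the nominal spot is about 0.7 mm across; equal distances give unit
# magnification, matching the "optionally equal" case described above.
print(pinhole_spot_size(marker_diameter=0.5, aperture_diameter=0.1, u=1.0, v=1.0))  # 0.7
```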
In this embodiment, when the touch structure 32 is not pressing an object, the elastic light guide unit 321 is not deformed, and each mark point 3211 is imaged onto the photosensitive surface 3111 through its corresponding imaging aperture 3221 to generate a reference spot. When the touch structure 32 presses an object, the elastic light guide unit 321 deforms, the position of each mark point 3211 relative to its imaging aperture 3221 changes, and the image formed on the photosensitive surface 3111 through the corresponding imaging aperture 3221 changes accordingly; the mark point 3211 now images onto the photosensitive surface 3111 as a measurement spot. The computer system compares the spot positions and sizes of the reference spot image and the measurement spot image, from which the three-dimensional stress state of the surface of the mechanical-finger grasping area can be estimated, forming measurement information; according to this measurement information, the grasping force of the manipulator is adjusted and the grasping position is selected.
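The comparison of reference and measurement spot images can be pictured with a minimal sketch such as the following (assuming grayscale spot images as NumPy arrays; the helper names are hypothetical and this is not the computer-system implementation described by the patent):

```python
import numpy as np
from scipy import ndimage

def spot_centroids(image, threshold=0.5):
    """Segment bright spots above threshold * max and return their centroids as (row, col)."""
    mask = image > threshold * image.max()
    labels, n = ndimage.label(mask)                 # label connected bright regions
    return np.array(ndimage.center_of_mass(image, labels, list(range(1, n + 1))))

def spot_displacements(ref_centroids, meas_centroids):
    """Match each measured spot to the nearest reference spot and return the shifts."""
    shifts = []
    for c in meas_centroids:
        nearest = ref_centroids[np.argmin(np.linalg.norm(ref_centroids - c, axis=1))]
        shifts.append(c - nearest)
    return np.array(shifts)

# Usage: ref = spot_centroids(reference_image); meas = spot_centroids(measurement_image)
# shifts = spot_displacements(ref, meas)   # in-plane shift per spot; a change in spot
#                                          # size would additionally indicate normal loading
```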
In the embodiment of the invention, the imaging apertures in the light shielding unit serve as the imaging elements, so no convex lens lies in the imaging light path of the mark points; the requirement of convex-lens imaging for at least four focal lengths is avoided, the overall thickness of the tactile sensor is reduced, the volume it occupies is reduced, and it is better suited to the manipulator of a robot. Studies by the applicant show that the thickness of the tactile sensor of the embodiment of the invention can be reduced to within 5 mm.
In an exemplary embodiment, the mark points include fluorescent mark points arranged on the side of the elastic light guide unit facing away from the light shielding unit. A fluorescent mark point converts absorbed light into light of a specific wavelength, and the fluorescent mark points are imaged onto the photosensitive surface through the imaging apertures to form colored spot information, which may include red, green, or blue spot information.
Because light from one fluorescent mark point can pass through the imaging apertures corresponding to adjacent fluorescent mark points and form multiple spots on the photosensitive surface of the photoelectric detection unit, the spot information might not be distinguished effectively. In an exemplary embodiment, the fluorescent mark points include red, green, and blue fluorescent mark points arranged in multiple rows and columns; adjacent rows or columns are staggered, and any fluorescent mark point differs in color from the fluorescent mark points in the adjacent columns and rows. In one example, as shown in Figs. 5A and 5B, the fluorescent mark points include red fluorescent mark points 3211a, green fluorescent mark points 3211b, and blue fluorescent mark points 3211c arranged in multiple rows and columns; the fluorescent mark points in adjacent columns are staggered, and any three adjacent columns include a column of red fluorescent mark points 3211a, a column of green fluorescent mark points 3211b, and a column of blue fluorescent mark points 3211c. In another example, the fluorescent mark points are arranged in multiple rows and columns with adjacent rows staggered, and any three adjacent rows include a row of red fluorescent mark points 3211a, a row of green fluorescent mark points 3211b, and a row of blue fluorescent mark points 3211c. Specifically, as shown in Fig. 5B, the fluorescent mark points in adjacent columns are staggered to form a mosaic arrangement: the first column is a column of red fluorescent mark points 3211a, the second column is a column of blue fluorescent mark points 3211c, the third column is a column of green fluorescent mark points 3211b, and this color sequence repeats, so that any three adjacent columns include a red column, a blue column, and a green column. In this embodiment the chosen fluorescent colors, red, green, and blue, correspond to the three color channels of the photoelectric detection unit (for example, a CMOS module). When the distance between the imaging aperture and the fluorescent mark point equals the distance between the imaging aperture and the photosensitive surface, light emitted from a first red fluorescent mark point 3211a1 may be imaged onto the photosensitive surface through the imaging aperture corresponding to an adjacent first green fluorescent mark point 3211b1 and coincide with the spot of the first green fluorescent mark point 3211b1; the spots of the first green fluorescent mark point 3211b1 and the first red fluorescent mark point 3211a1 can nevertheless be distinguished through the red and green color channels of the photoelectric detection unit, and the same holds at the other positions, which solves the problem of a mark point being imaged through the imaging apertures corresponding to adjacent mark points.
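A minimal sketch of how the color channels separate coincident spots (assuming an H x W x 3 RGB frame from the detector; this is an illustration, not the CMOS module's actual interface):

```python
# Sketch: reading coincident spots per color channel. A red mark point's spot is
# extracted only from the red channel, so it can be told apart from a green mark
# point's spot even if the two spots overlap on the photosensitive surface.
import numpy as np

def spots_by_colour(rgb_frame: np.ndarray) -> dict:
    """Split an H x W x 3 RGB frame into one single-channel image per marker color."""
    return {
        "red": rgb_frame[..., 0],
        "green": rgb_frame[..., 1],
        "blue": rgb_frame[..., 2],
    }

# Each channel image can then be fed separately to a spot-centroid routine
# such as the one sketched earlier.
```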
In this embodiment, only interference between each fluorescent mark point and the adjacent fluorescent mark points in neighboring columns and rows needs to be considered; fluorescent mark points farther away can be ignored, because the aspect ratio of the imaging aperture ensures that only rays whose angle to the aperture axis is smaller than a certain value can pass through the aperture, while rays at larger angles to the aperture axis cannot pass. Fig. 5A is a top view of an array of fluorescent mark points according to an embodiment of the present invention, and Fig. 5B is an RGB layout of the fluorescent mark points according to the embodiment of the present invention.
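As an illustrative calculation (standard geometry, not stated in the original text), a cylindrical aperture of diameter d and length l passes only rays whose angle to the aperture axis is below roughly

```latex
\theta_{\max} \approx \arctan\!\left(\frac{d}{l}\right) = \arctan\!\left(\frac{1}{\mathrm{AR}}\right),
\qquad \mathrm{AR} = \frac{l}{d},
```

so the aspect ratio of 1.73 mentioned above corresponds to a maximum acceptance half-angle of about 30 degrees, and an aspect ratio of 1.9 to roughly 28 degrees.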
Considering that differences in the color-resolving performance of the photoelectric detection unit and the limited purity of the light emitted by the fluorescent mark points may prevent overlapping spots from being distinguished effectively in the image, in an exemplary embodiment the aspect ratio of the imaging aperture is increased appropriately so that the aperture is more selective about the direction of transmitted light, and the spacing between adjacent apertures is increased appropriately so that the light emitted by each fluorescent mark point cannot pass through other, adjacent apertures, thereby avoiding the formation of multiple spots. In this case the mark points need not be distinguished by color. In one example, the aspect ratio of the imaging aperture is 1.5 to 1.9.
In an exemplary embodiment, as shown in Figs. 3-4, the touch structure 32 further includes an elastic light shield 323 and a light source 324. Light emitted by the light source 324 is transmitted through the elastic light guide unit 321 to the fluorescent mark points, which absorb it and re-emit fluorescence at a specific wavelength. The elastic light shield 323 prevents external light from interfering with the imaging of the mark points on the photosensitive surface, and absorbs light from the light source that strikes its inner surface directly, reducing reflections and improving the accuracy and reliability of the spot information of the touch structure. In one example, the light shielding unit 322 covers the side surface of the photoelectric detection structure 31 close to the touch structure 32, the elastic light shield 323 is disposed on the light shielding unit 322, the light source 324 and the elastic light guide unit 321 are located in a cavity enclosed by the elastic light shield 323 and the light shielding unit 322, and the light source 324 is disposed on one side of the elastic light guide unit 321. In another example, the elastic light shield is disposed on the photoelectric detection structure; the light shielding unit, the light source, and the elastic light guide unit are located in a cavity enclosed by the elastic light shield and the photoelectric detection structure, and the light source is disposed on one side of the elastic light guide unit. In this embodiment, the material of the elastic light shield includes silica gel mixed with an appropriate amount of carbon black, and its mechanical properties are the same as or similar to those of the elastic light guide unit. The elastic light shield is bonded to the light shielding unit or the photoelectric detection structure. The light source 324 may be a white LED light source, and the power line 3241 of the light source may pass through the elastic light shield 323 to connect to an external power source. It should be noted that this embodiment is also applicable to mark points imaged by reflected light.
In the foregoing embodiment, a gap exists between the elastic light shield and the elastic light guide unit; when the manipulator grasps an object, this gap prevents the deformation of the elastic light shield from being transmitted effectively to the elastic light guide unit and thus affects the ability of the tactile sensor to acquire accurate three-dimensional stress information of the mechanical-finger grasping area. In an exemplary embodiment, the touch structure therefore further includes a filling medium that fills the gap between the elastic light shield and the elastic light guide unit and transmits the deformation. In this embodiment, the filling medium may be transparent silica gel.
In an exemplary embodiment, as shown in Figs. 3-4, the photoelectric detection structure 31 further includes a substrate 312, a supporting frame 313, and a transparent cover plate 314. The transparent cover plate 314 and the substrate 312 are embedded in the supporting frame 313 and, together with the supporting frame 313, enclose a receiving cavity; the photoelectric detection unit 311 is disposed on the substrate 312 inside the receiving cavity, and the transparent cover plate 314 is opposite the photosensitive surface 3111. The surfaces of the supporting frame 313 and the transparent cover plate 314 on the side close to the touch structure 32 are flush. In another example, the inner side of the supporting frame 313 is provided with an annular protrusion 3131; the transparent cover plate 314 rests on the side of the annular protrusion 3131 close to the touch structure, and the substrate 312 rests on the side of the annular protrusion 3131 away from the touch structure. In this embodiment, the transparent cover plate must not only transmit light well but also have a certain rigidity to support the light shielding unit and the elastic light guide unit. The material of the transparent cover plate includes, but is not limited to, glass, organic glass (PMMA), or polyethylene terephthalate (PET), and the light transmittance of the transparent cover plate 314 is not less than 80%. The substrate may be a PCB.
In an exemplary embodiment, as shown in Figs. 3-4, the light shielding unit 322 includes a metal foil and covers the side of the transparent cover plate 314 facing away from the photoelectric detection unit. In one example, the light shielding unit 322 may cover only the transparent cover plate 314. In another example, the light shielding unit 322 may cover the transparent cover plate 314 and the side of the supporting frame 313 near the touch structure. In this embodiment, the metal foil may be bonded to the transparent cover plate and/or the supporting frame.
The computer system referred to in the foregoing embodiments may include an image signal conversion unit that converts the electrical signal of the photoelectric detection unit into a computer-readable signal and a central processing unit that processes the readable signal, wherein the computer-readable signal includes a digital signal.
The technical solution of the embodiment of the present invention is further explained through the working principle of a robot applying the tactile sensor. Fig. 6 is a schematic diagram of the working principle of a robot applying the tactile sensor according to an embodiment of the present invention. As shown in Fig. 6:
The supporting frame of the tactile sensor is fixed to the surface of a mechanical finger of the robot, within the mechanical-finger grasping area, so that the touch structure of the tactile sensor can directly contact the object to be grasped. Before the manipulator grasps an object, the photoelectric detection unit (for example, a CMOS module) arranged in the imaging light path acquires the reference spots generated by imaging the mark points onto the photosensitive surface through the imaging apertures and converts the reference spot information into an electrical signal; the image signal conversion unit converts the electrical signal into a computer-readable signal, and the central processing unit receives and stores the readable signal to generate a reference spot image. When the manipulator grasps the object, the elastic light shield and the elastic light guide unit of the touch structure bear normal and tangential forces under the squeezing of the mechanical finger and the object and deform accordingly; during this deformation, the position of each mark point relative to its corresponding imaging aperture changes, and hence the image of the mark point on the photosensitive surface changes, specifically in spot position and size. The central processing unit receives and stores the measurement spot information converted by the image signal conversion unit to generate a measurement spot image, compares the measurement spot image with the reference spot image to estimate the stress state at the position corresponding to each mark point, and controls the selection of the grasping point and the adjustment of the grasping force of the manipulator.
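To make the change in spot position concrete, a minimal self-contained sketch (assumed numbers; the pinhole geometry matches the earlier sketch and is not a statement of the patent's estimation algorithm):

```python
# Sketch: lateral displacement of a mark point produces a magnified shift of its
# spot on the photosensitive surface, in the opposite direction. Values in mm.
def spot_centre(marker_x, u, v):
    """Marker at lateral offset marker_x and height u above an aperture at x = 0;
    the photosensitive surface lies v below the aperture."""
    return -marker_x * (v / u)   # image is inverted and magnified by v / u

print(spot_centre(0.00, 1.0, 1.0))   # 0.0     -> reference spot, centred under the aperture
print(spot_centre(0.05, 0.9, 1.0))   # -0.0556 -> after shear and compression the spot shifts
                                     #            opposite to the marker, amplified by v / u
```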
As the working principle above shows, the tactile sensor of the embodiment of the invention adopts pinhole imaging through the imaging apertures and obtains the spot information formed by the mark points through the imaging apertures before and after the elastic light guide unit deforms; it is therefore not subject to the four-focal-length limitation of convex-lens imaging, which reduces both the thickness and the volume of the tactile sensor.
An embodiment of the invention also provides a robot, which includes a manipulator having mechanical fingers and the above tactile sensor, the tactile sensor being arranged on the inner side of the surface of a mechanical finger.
In the robot, a tactile sensor is arranged on a mechanical finger, and the touch structure of the tactile sensor is in contact with the object when the mechanical finger grasps it.
Although the embodiments of the present invention have been described above, the above description is only for the purpose of understanding the present invention, and is not intended to limit the present invention. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (13)

1. A tactile sensor, comprising: a photoelectric detection structure and a touch structure arranged on the photoelectric detection structure, wherein the photoelectric detection structure comprises a photoelectric detection unit having a photosensitive surface, the touch structure comprises an elastic light guide unit having a plurality of mark points and a light shielding unit having a plurality of imaging apertures, the light shielding unit is arranged between the elastic light guide unit and the photoelectric detection unit, the imaging apertures are arranged to image the mark points onto the photosensitive surface to generate light spot information, and the photoelectric detection unit is arranged to convert the light spot information into an electrical signal.
2. The tactile sensor according to claim 1, wherein: the mark points are in one-to-one correspondence with the imaging apertures, and the orthographic projection of each mark point on the photosensitive surface comprises the orthographic projection of the corresponding imaging aperture on the photosensitive surface.
3. The tactile sensor according to claim 1, wherein: the length of the imaging apertures is 0.06 mm to 0.10 mm, the diameter of the imaging apertures is 0.06 mm to 0.173 mm, and the spacing between the imaging apertures is 0.5 mm to 2 mm.
4. The tactile sensor according to claim 3, wherein: the aspect ratio of the imaging apertures is 1.0 to 1.9.
5. The tactile sensor according to claim 3, wherein: the distance between an imaging aperture and the corresponding mark point is 0.5 mm to 2.0 mm, and the distance between the imaging aperture and the photosensitive surface is 0.5 mm to 2.0 mm.
6. The tactile sensor according to claim 1, wherein: the mark points comprise fluorescent mark points, and the fluorescent mark points are arranged on the side of the elastic light guide unit facing away from the light shielding unit.
7. The tactile sensor according to claim 6, wherein: the fluorescent marker points comprise red fluorescent marker points, green fluorescent marker points and blue fluorescent marker points, the fluorescent marker points are arranged in multiple rows and multiple columns, adjacent rows or columns are staggered, and the color of any fluorescent marker point is different from that of the fluorescent marker points in the adjacent rows and columns.
8. The tactile sensor according to claim 6 or 7, wherein: the touch structure further comprises an elastic light shield and a light source, the light shielding unit covers the side of the photoelectric detection structure facing the touch structure, the elastic light shield is arranged on the light shielding unit, the light source and the elastic light guide unit are located in a cavity defined by the elastic light shield and the light shielding unit, and the light source is arranged on one side of the elastic light guide unit; or,
the elastic light shield is arranged on the photoelectric detection structure, the light shielding unit, the light source and the elastic light guide unit are arranged in a cavity enclosed by the elastic light shield and the photoelectric detection structure, and the light source is arranged on one side of the elastic light guide unit.
9. The tactile sensor according to claim 8, wherein: the touch structure further comprises a filling medium for filling a gap between the elastic light shield and the elastic light guide unit.
10. The tactile sensor according to any one of claims 1 to 7, wherein: the photoelectric detection structure further comprises a substrate, a supporting frame, and a transparent cover plate, the substrate and the transparent cover plate are embedded in the supporting frame and, together with the supporting frame, enclose a containing cavity, the photoelectric detection unit is arranged on the substrate and located in the containing cavity, and the transparent cover plate is opposite the photosensitive surface.
11. The tactile sensor according to claim 10, wherein: the light shielding unit comprises a metal foil and covers the side of the transparent cover plate facing away from the photoelectric detection unit.
12. The tactile sensor according to any one of claims 1 to 7, wherein: the photodetection unit includes a CMOS module.
13. A robot, comprising a manipulator having a mechanical finger and the tactile sensor according to any one of claims 1 to 12, the tactile sensor being disposed on the inner side of a surface of the mechanical finger.
CN202010503562.3A 2020-06-05 2020-06-05 Tactile sensor and robot Active CN111805562B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010503562.3A CN111805562B (en) 2020-06-05 2020-06-05 Tactile sensor and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010503562.3A CN111805562B (en) 2020-06-05 2020-06-05 Tactile sensor and robot

Publications (2)

Publication Number Publication Date
CN111805562A (en) 2020-10-23
CN111805562B (en) 2023-03-10

Family

ID=72848652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010503562.3A Active CN111805562B (en) 2020-06-05 2020-06-05 Tactile sensor and robot

Country Status (1)

Country Link
CN (1) CN111805562B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114113008B (en) * 2021-10-22 2023-12-22 清华大学深圳国际研究生院 Structured light-based artificial haptic device and method
WO2023100483A1 (en) * 2021-11-30 2023-06-08 ソニーグループ株式会社 Haptic sensor device and robot arm device
CN114714354B (en) * 2022-04-12 2023-10-03 清华大学 Vision module device and mechanical arm

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006173177A (en) * 2004-12-13 2006-06-29 Toshiba Matsushita Display Technology Co Ltd Flat surface imaging device and liquid crystal display
CN101395718A (en) * 2006-03-06 2009-03-25 美光科技公司 Image sensor light shield
EP3517888A1 (en) * 2018-01-29 2019-07-31 Sick Ag Tactile sensor system
CN110838874A (en) * 2019-10-15 2020-02-25 同济大学 Mobile optical communication device supporting high-speed multi-beam tracking
CN111093914A (en) * 2017-08-14 2020-05-01 新南创新私人有限公司 Friction-based tactile sensor for measuring clamping safety
CN210571102U (en) * 2019-08-28 2020-05-19 华南理工大学 Photoelectric multipoint array sensing type touch sensor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2419252C (en) * 2000-08-31 2011-03-29 Center For Advanced Science And Technology Incubation, Ltd. Optical tactile sensor
JP4492533B2 (en) * 2005-12-27 2010-06-30 船井電機株式会社 Compound eye imaging device
WO2015132694A1 (en) * 2014-03-07 2015-09-11 Semiconductor Energy Laboratory Co., Ltd. Touch sensor, touch panel, and manufacturing method of touch panel
DE102018217285A1 (en) * 2017-10-11 2019-04-11 Carl Zeiss Industrielle Messtechnik Gmbh Touch probe for optical and tactile measurement of at least one DUT

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006173177A (en) * 2004-12-13 2006-06-29 Toshiba Matsushita Display Technology Co Ltd Flat surface imaging device and liquid crystal display
CN101395718A (en) * 2006-03-06 2009-03-25 美光科技公司 Image sensor light shield
CN111093914A (en) * 2017-08-14 2020-05-01 新南创新私人有限公司 Friction-based tactile sensor for measuring clamping safety
EP3517888A1 (en) * 2018-01-29 2019-07-31 Sick Ag Tactile sensor system
CN210571102U (en) * 2019-08-28 2020-05-19 华南理工大学 Photoelectric multipoint array sensing type touch sensor
CN110838874A (en) * 2019-10-15 2020-02-25 同济大学 Mobile optical communication device supporting high-speed multi-beam tracking

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Circuit design of an optics-based tactile sensor; Zhang Guixiang et al.; Transducer and Microsystem Technologies; 2016-04-30; Vol. 35, No. 04; pp. 70-72 *
Analysis method for CCD oversaturation aberration based on a pinhole diaphragm; Han Guoxia et al.; College Physics; 2014-04-30; Vol. 33, No. 04; pp. 33-36 *

Also Published As

Publication number Publication date
CN111805562A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
CN111805562B (en) Tactile sensor and robot
US11605238B2 (en) Fingerprint identification module, fingerprint identification method, and display apparatus
JP6590804B2 (en) Compact optoelectronic module
CN106055172B (en) Optical navigation chip, optical navigation module and optical encoder
EP0965098B1 (en) User input device for a computer system
US9501686B2 (en) Multi-purpose thin film optoelectric sensor
WO2018040774A1 (en) Substrate, display panel and display device
JP2013175773A (en) Optical sensor and method of optically detecting object
US20090147245A1 (en) System and method for measuring optical resolution of lens
WO2018171174A1 (en) Display panel and display apparatus
CN212624070U (en) Fingerprint sensing module
KR20090118192A (en) Position sensing apparatus and lens driving device using the same
CN111668277A (en) Display device
KR100832073B1 (en) Optical sensor module
US11258933B2 (en) Light source module and display module
CN110059562B (en) Display device and electronic device
CN210015428U (en) Image sensing device and display device
JP2006129121A (en) Imaging unit and image reader
KR102393910B1 (en) Tiled image sensor
CN111721233B (en) Three-dimensional sensing device, light emitting module and control method thereof
CN212515220U (en) Display panel with eye tracking function
JPH0658272B2 (en) Tactile sensor
JP5514685B2 (en) Imaging module and electronic information device
CN215769884U (en) Contact image sensor
CN217904512U (en) Image sensor with a plurality of pixels

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant