CN109239388B - Electronic skin touch dynamic sensing method - Google Patents
- Publication number: CN109239388B (application CN201811049863.2A)
- Authority
- CN
- China
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS › G01—MEASURING; TESTING › G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT › G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds › G01P3/64—Devices characterised by the determination of the time taken to traverse a fixed distance › G01P3/68—Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T3/00—Geometric image transformation in the plane of the image › G06T3/40—Scaling the whole image or part thereof
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T3/00—Geometric image transformation in the plane of the image › G06T3/60—Rotation of a whole image or part thereof
Abstract
The invention discloses a tactile dynamic sensing method for electronic skin, comprising the following steps: S1, acquiring a first tactile image and a second tactile image of an object at two different time points during its motion; S2, translating the first tactile image so that the centroids of the two tactile images coincide, taking the centroid as the reference point, and calculating the change of the centroid coordinates before and after translation to obtain the translation amount of the object on the plane; S3, rotating the first tactile image to align its direction with that of the second tactile image, the measured rotation angle giving the rotation angle of the object. The method of the invention can accurately measure the motion of an object through electronic skin perception, thereby realizing dynamic tactile perception.
Description
Technical Field
The invention relates to a touch dynamic perception method of electronic skin.
Background
Although vision has been widely applied in robotic sensing research, biomimetic tactile sensing and control remain a major challenge in robotics. Tactile perception is the basis of dexterous robotic manipulation: it allows a robot to sense contact, detect and control slippage, and operate safely and stably when entering unstructured or dynamic environments.
A robot with electronic skin sensing can capture rich features of its contact with an object. Electronic skin is an integrated tactile sensor consisting of a large-area flexible substrate and a multifunctional, compact sensor array, and has seen significant research progress in recent years. Equipped with a high-density sensor array, electronic skin can collect more detailed tactile information than a sensor that measures only the total force/torque applied to an object: for example, the contact force distribution and the relative motion between the robot and a target object. This enables a more comprehensive perception of the target object, making electronic skin an ideal sensor for robotic tactile perception.
Herein, we refer to the act of perceiving moving objects as tactile dynamic perception. Acquiring the motion state of an object is an important component of, and a prerequisite for, perceiving a moving object. Previous studies have extracted tactile information from static objects or objects with predefined motion. For example, high-frequency tactile features have been extracted from a tactile sensor array to determine the internal state of an object, such as the presence of liquid in a container; or the frequency and magnitude of the contact force have been used, together with machine learning, to distinguish contact geometries such as a plane, an object at one edge, an object at two edges, or a cylinder. These methods address the robot's lack of touch, but the open problem in the prior art is how to accurately measure the motion of an object with electronic skin perception so as to realize dynamic tactile perception.
Disclosure of Invention
To solve this problem, the invention provides an electronic-skin-based method for sensing the motion state of an object, to assist dynamic tactile perception. The method acquires tactile images of a moving object through the electronic skin, and obtains the motion state of the object by analyzing these images.
The main purpose of the present invention is to overcome the disadvantages of the prior art and to provide a method for dynamically sensing the touch of electronic skin.
In order to achieve the purpose, the invention adopts the following technical scheme:
a method for electronic skin tactile dynamic perception, comprising the steps of:
s1, acquiring a first tactile image and a second tactile image of an object at two different time points in the motion process;
s2, translating the first tactile image to enable the centroids of the first tactile image and the second tactile image to be at the same position, taking the centroid of the translated tactile image as a reference point, and calculating the change of the centroid coordinates before and after translation to obtain the translation amount of the object on the plane;
s3, rotating the first tactile image to align the directions of the first tactile image and the second tactile image, and measuring the angle of rotation to measure the angle of rotation of the object.
Further:
in step S2, the centroid coordinates are calculated based on equation (1):

p_center = (1/n) · Σ_{k=1}^{n} p_k   (1)

where n is the number of pixels in the image, p_k is the coordinate of each pixel in the image, and p_center is the coordinate of the centroid;
calculating the difference between the centroid coordinates before and after translation;
and multiplying the difference of the centroid coordinates by the unit spacing of the sensor array to obtain the translation amount.
Step S3 includes:
remapping the tactile image to log-polar space by the log-polar transformation based on equations (2) and (3):

ρ = M · log√((x − x_c)² + (y − y_c)²)   (2)
θ = atan2(y − y_c, x − x_c)   (3)

where x and y are the coordinates of a pixel in the image, x_c and y_c are the coordinates of the centroid, M is the magnitude factor, ρ is the logarithmic distance between the pixel and the centroid, and θ is the direction of the pixel relative to the center; ρ and θ constitute the abscissa and ordinate of the log-polar space;
converting, through the log-polar transformation, the rotational motion of the tactile image in the Cartesian coordinate system into translational motion in log coordinates, and converting the full search over rotations into a full search for the optimal translation in log-polar space;
based on equations (4) to (6), calculating the errors between all translation-transformed images and the reference image:

I′_lp1 = M_translation × I_lp1   (4)

where I_lp1 and I_lp2 are the log-polar images, M_translation is the translation transformation that converts I_lp1 into I′_lp1, E is the error between I′_lp1 and I_lp2, and f(·) is a function that makes the method robust, where a is the outlier threshold, taken as the average value of the e_k;
after the search, converting the M_translation that minimizes E into a rotation angle, which is determined as the optimal rotation angle between the two images.
The method further comprises the steps of:
s4, obtaining the tactile image of the moving object with the sampling frequency f, taking two continuous tactile images separated by one sampling period, calculating to obtain the translation distance d and the rotation angle theta of the object in the interval, wherein the instantaneous speed is equal to the distance/angle divided by the time interval, namely the distance/angle is multiplied by the sampling frequency, and the calculation is shown in formulas (7) and (8):
v = f × d   (7)
ω = f × θ   (8)
where v and ω are the instantaneous speeds of translation and rotation.
The method further comprises the steps of:
S4', comparing each subsequent tactile image with the first sampled tactile image to obtain its translation and rotation relative to the initial position, selecting an appropriate mathematical model to fit the translations and rotations according to the type of motion, and calculating the speed to be measured from the fitted model.
Step S4' is used to measure an object moving at a uniform speed.
In step S4', a linear regression model is used to fit the translations and rotations; the model fits the translation and rotation speeds by minimizing the residual sum of squares.
Before translation and rotation, the tactile image is scaled by bilinear interpolation to enhance image resolution, and dilation and erosion in morphological transformation are used to eliminate noise.
The invention has the following beneficial effects:
the motion perception is the premise of dynamic touch perception, and the method provided by the invention can accurately sense translation, rotation and speed.
1. A method for measuring the 3-degree-of-freedom motion (x, y, θ) of an object with electronic skin tactile perception is proposed: tactile images at different times are obtained with the electronic skin, and the translation and rotation amounts are measured by image translation alignment and image rotation alignment.
2. Preferably, before translation and rotation are performed, scaling (e.g., bilinear interpolation) is applied to the tactile image to enhance image resolution, and morphological transformations (e.g., dilation and erosion) are applied to remove image noise. The invention scales the image with 5× bilinear interpolation and eliminates noise with dilation and erosion.
3. Preferably, the rotation measurement in motion perception employs a full search method to avoid falling into a locally optimal solution.
4. Preferably, in the rotation measurement the tactile images are remapped to log-polar space by the log-polar transformation, so that the full search over rotations becomes a full search over translations, improving the efficiency of the electronic skin's dynamic perception.
5. Preferably, two methods for measuring the moving speed of an object with electronic skin tactile perception are further provided on the basis of translation and rotation perception, so that the speed of an object can be measured accurately. One method takes two consecutive tactile images separated by one sampling period and computes the translation and rotation at that moment; the maximum translation error does not exceed 1.0 mm, and the average rotation error does not exceed 1.946°. The other method compares each tactile image with the first image to obtain the translation and rotation relative to the initial position and orientation; the maximum relative error of translation is 0.06 and the maximum relative error of rotation is 0.04.
Drawings
Fig. 1 is a schematic process diagram of position alignment (a) and direction alignment (b);
FIG. 2 is an example of remapping a tactile image (a) to log-polar space (b) by the log-polar operation (M = 13);
FIGS. 3a and 3b show the translational fitting velocity in the X and Y directions, respectively;
figure 4 shows the fitting speed of the rotation.
Detailed Description
The embodiments of the present invention will be described in detail below. It should be emphasized that the following description is merely exemplary in nature and is not intended to limit the scope of the invention or its application.
In one embodiment, a method for sensing motion of an object by using electronic skin is provided to realize dynamic sensing of touch of a robot. The specific method can comprise the following steps:
(1) translation and rotation sensing
Although the position and direction of the moving object are always changed, the robot can obtain a tactile image using the electronic skin in contact with the object. The signal on the electronic skin sensor may generate a tactile image via the scanning circuit. The sensor can convert the force into a grey scale value ranging from 0 to 255 so that a tactile image of pixel grey scale values can represent the distribution of pressure.
By acquiring haptic images at two different points in time during the motion of an object, the amount of translation and rotation of the moving object is calculated in two steps. As shown in fig. 1, (a) and (b) represent the processes of position alignment and direction alignment, respectively.
The first step is translation alignment (d_x, d_y): this step brings the tactile image centroids to the same position and measures the shift of the centroid.
The translation measurement uses the centroid of the tactile image as the reference point, and obtains the translation of the object on the plane by calculating the change of the centroid coordinates before and after the movement. The centroid is calculated by equation (1):

p_center = (1/n) · Σ_{k=1}^{n} p_k   (1)

where n is the number of pixels in the image, p_k is the coordinate of each pixel, and p_center is the coordinate of the centroid. The difference between the centroid coordinates represents the translation of the object. Since the pitch of the sensor array is 2.5 mm, the translation is obtained by multiplying the coordinate difference by the pitch.
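As a minimal illustration (not part of the patent), the centroid-based translation measurement can be sketched as follows. The zero contact threshold and the equal weighting of contact pixels are assumptions, since the text does not specify which pixels enter the centroid sum:

```python
import numpy as np

def centroid(img):
    """Centroid of a tactile image: mean coordinate of the pixels
    in contact (assumed here to be all pixels with nonzero pressure)."""
    ys, xs = np.nonzero(img > 0)
    return np.array([xs.mean(), ys.mean()])

def translation_mm(img1, img2, pitch_mm=2.5):
    """Translation between two tactile frames: centroid difference
    multiplied by the sensor-array pitch (2.5 mm in the patent)."""
    return (centroid(img2) - centroid(img1)) * pitch_mm
```

For a contact patch that moves two sensor cells in x, this returns a 5.0 mm translation, consistent with the pitch-times-difference rule above.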
Prior to alignment, scaling and morphological transformations are preferably employed to enhance image resolution and eliminate noise. Preferably, the image is scaled with 5× bilinear interpolation, and noise is removed with dilation and erosion in the morphological transformation.
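A sketch of this preferred preprocessing, using SciPy as an assumed stand-in for whatever image library the authors used; the 3×3 structuring element and the erosion-then-dilation order (a morphological opening) are illustrative choices, as the patent names only "dilation and erosion":

```python
import numpy as np
from scipy import ndimage

def preprocess(img, scale=5):
    """Upscale the tactile image 5x with bilinear interpolation
    (order=1), then apply grey erosion followed by grey dilation
    to suppress isolated noise pixels."""
    up = ndimage.zoom(img.astype(float), scale, order=1)
    opened = ndimage.grey_erosion(up, size=(3, 3))
    return ndimage.grey_dilation(opened, size=(3, 3))
```

A 44 × 44 raw tactile frame becomes the 220 × 220 image the description mentions before translation alignment.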
The second step is direction alignment (d_θ), which measures the rotation.
The direction alignment searches for the best rotation angle, i.e., the angle at which the two images differ least after rotational alignment. A full search over all angles avoids being trapped in local optima, but takes a long time, limiting its application in robots that require fast response. Therefore, a log-polar transformation is preferably introduced to solve this problem. The tactile image is remapped to log-polar space by the log-polar transformation, implemented by equations (2) and (3):

ρ = M · log√((x − x_c)² + (y − y_c)²)   (2)
θ = atan2(y − y_c, x − x_c)   (3)

where x and y are the coordinates of a pixel in the image, x_c and y_c are the coordinates of the centroid, M is the magnitude factor, ρ is the logarithmic distance between the pixel and the centroid, and θ is the direction of the pixel relative to the center. ρ and θ constitute the abscissa and ordinate of the log-polar space.
Fig. 2 shows an example of remapping a tactile image (a) to log-polar space (b) by the log-polar operation (M = 13).
The log-polar transformation converts the rotational motion of the tactile image in the Cartesian coordinate system into translational motion in log coordinates, and converts the full search over rotations into a full search for the optimal translation in log-polar space. This is efficient because an image translation search takes much less time than a rotation search. The errors between all translation-transformed images and the reference image are then calculated; the image transformation and error calculation are given by equations (4) to (6):
I′_lp1 = M_translation × I_lp1   (4)
where I_lp1 and I_lp2 are the log-polar images, M_translation is the translation transformation that converts I_lp1 into I′_lp1, E is the error between I′_lp1 and I_lp2, and f(·) is a function that makes the method robust, where a is the outlier threshold, taken as the average value of the e_k. After the search, the M_translation that minimizes E is converted into a rotation angle, which is taken as the optimal rotation angle between the two images.
Scaling is employed again: the log-polar image is scaled in the orientation (θ) dimension to improve the accuracy of the algorithm. Before translation alignment, the original image was scaled by a factor of 5 to 220 × 220, and after the log-polar transformation the new log-polar image is also 220 × 220. With the second scaling, the log-polar image becomes 1760 × 220, after which the angular resolution is enhanced to 360°/1760 ≈ 0.205°. In addition, if the pressure distribution is not uniform, image binarization is a good way to enhance robustness.
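The rotation search just described can be sketched in plain NumPy. The bin counts, the choice of magnitude factor M, and the sum-of-absolute-differences error (used here in place of the patent's unspecified robust function f) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def log_polar(img, n_rho=64, n_theta=360):
    """Remap img to log-polar space about its centroid, per eqs. (2)-(3):
    rho = M*log(r), theta = atan2(dy, dx)."""
    ys, xs = np.nonzero(img > 0)
    xc, yc = xs.mean(), ys.mean()
    r_max = np.hypot(img.shape[1], img.shape[0])
    M = n_rho / np.log(r_max)                      # magnitude factor
    thetas = np.linspace(-np.pi, np.pi, n_theta, endpoint=False)
    r = np.exp(np.arange(n_rho) / M)               # radii for each rho bin
    # source pixel for every (rho, theta) bin (nearest-neighbour sampling)
    x = (xc + r[:, None] * np.cos(thetas)[None, :]).round().astype(int)
    y = (yc + r[:, None] * np.sin(thetas)[None, :]).round().astype(int)
    valid = (x >= 0) & (x < img.shape[1]) & (y >= 0) & (y < img.shape[0])
    out = np.zeros((n_rho, n_theta))
    out[valid] = img[y[valid], x[valid]]
    return out

def best_rotation(img1, img2, n_theta=360):
    """Full search over rotations as circular shifts along the theta axis
    of the log-polar images; returns the best angle in degrees."""
    lp1 = log_polar(img1, n_theta=n_theta)
    lp2 = log_polar(img2, n_theta=n_theta)
    errs = [np.abs(np.roll(lp1, s, axis=1) - lp2).sum() for s in range(n_theta)]
    angle = int(np.argmin(errs)) * 360.0 / n_theta
    return angle if angle <= 180 else angle - 360
```

Because a rotation about the centroid is a pure circular shift along θ in log-polar space, the inner loop is an image translation search, which is the efficiency gain the description claims.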
With accurate translation and rotation perception, the robot can also perceive speed. In the invention, the electronic skin acquires tactile images of the moving object at a sampling frequency f. Given the tactile images, there are two ways to obtain the speed. If the robot must track a moving object by touch, the instantaneous velocity of the object is needed. In this case, two consecutive tactile images separated by one sampling period are taken, and the translation distance d and rotation angle θ of the object within the interval are calculated; the instantaneous speed equals the distance/angle divided by the time interval, i.e., the distance/angle multiplied by the sampling frequency, as in equations (7) and (8):
v = f × d   (7)
ω = f × θ   (8)
where v and ω are the instantaneous speeds of translation and rotation.
If the type of motion of the target object is known and the robot needs to measure the velocity accurately, in this case, to avoid large random errors, each tactile image acquired by the electronic skin during contact is compared with the first image to obtain the amount of translation and rotation relative to the initial position and direction.
Depending on the different movements, a suitable mathematical model may be selected to accommodate the translation and rotation, from which the velocity to be measured is calculated. If the motion is uniform, a linear fitting method can be used to obtain the speed, and if the motion is variable, a second-order or higher-order fitting method can be used to obtain the speed.
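The two speed estimates above can be sketched as follows; the use of `np.polyfit` for the least-squares line is an assumed implementation of the linear-fit case, valid only under the stated uniform-motion assumption:

```python
import numpy as np

def instantaneous_velocity(d, f):
    """Eq. (7)/(8): displacement (or angle) per frame times the
    sampling frequency f."""
    return f * d

def fitted_velocity(times, displacements):
    """Uniform-motion case: the speed is the least-squares slope of
    displacement vs. time, minimizing the residual sum of squares."""
    t = np.asarray(times, dtype=float)
    x = np.asarray(displacements, dtype=float)
    slope, _intercept = np.polyfit(t, x, 1)  # degree-1 fit: [slope, intercept]
    return slope
```

For variable motion, the same idea applies with a second-order or higher-order polynomial fit, differentiated to obtain the velocity.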
(2) Experiment of dynamic sense of touch
The experiment was aimed at testing the performance of the method. A parallel displacement platform (PI Hexapod M850, Germany) was used to provide precise translation and rotation amounts. Foamed silicone gel 2.0 mm thick was used as a substrate to improve the tactile perception of the electronic skin, and, since object movement must be sensed accurately, the sensor array was placed on top of the substrate.
The parallel displacement platform translates the object a specified distance or rotates it a specified angle; the tactile images obtained by the electronic skin are then analyzed to calculate the distance or angle. The experiment was repeated multiple times, and the results are shown in Table 1.
TABLE 1 Performance of dynamic sensing method
Table 1 shows that the method has good accuracy. In translational alignment, the maximum error across all experiments was 1.0 mm, smaller than the pitch of the sensor array (2.5 mm), which also demonstrates the benefit of image scaling; the average translation error is also very small. In rotational alignment, the electronic skin is less sensitive to rotation because of the low resolution of the sensor array, so the rotational accuracy is lower than the translational accuracy. Although the maximum rotational-alignment error was relatively large in some experiments, the average error is still acceptable.
The second method of sensing velocity is used to measure uniform motion in the experiment. The displacement platform translates at 0.5mm/s, 1.0mm/s, 1.5mm/s, 2.0mm/s, rotates at 0.25 °/s, 0.50 °/s, 0.75 °/s, 1.00 °/s. The tactile image is sampled by the electronic skin at a frequency of 1 Hz. These images are used to calculate translations and rotations with respect to the initial position and orientation. Thereafter, a linear regression model is used to fit the translation and rotation. Linear regression fits the velocities of translation and rotation with a minimized residual sum of squares. The fitting results are shown in fig. 3a, 3b and 4. Fig. 3a and 3b show the translational fitting velocities in the X-direction (left) and Y-direction (right). Figure 4 shows the fitting speed of the rotation.
The fitting results show that the fitted speeds are close to the actual speeds: the maximum relative error of translation is 0.06 and the maximum relative error of rotation is 0.04, which shows that the speed of an object can be accurately measured through robotic touch.
The foregoing is a more detailed description of the invention in connection with specific/preferred embodiments and is not intended to limit the practice of the invention to those descriptions. It will be apparent to those skilled in the art that various substitutions and modifications can be made to the described embodiments without departing from the spirit of the invention, and these substitutions and modifications should be considered to fall within the scope of the invention.
Claims (7)
1. A method for dynamically sensing the touch of electronic skin is characterized by comprising the following steps:
s1, acquiring a first tactile image and a second tactile image of an object at two different time points in the motion process;
s2, translating the first tactile image to enable the centroids of the first tactile image and the second tactile image to be at the same position, taking the centroid of the translated tactile image as a reference point, and calculating the change of the centroid coordinates before and after translation to obtain the translation amount of the object on the plane;
s3, rotating the first tactile image to align the directions of the first tactile image and the second tactile image, measuring the rotating angle, and measuring the rotating angle of the object;
step S3 includes:
remapping the tactile image to log-polar space by the log-polar transformation based on equations (2) and (3):

ρ = M · log√((x − x_c)² + (y − y_c)²)   (2)
θ = atan2(y − y_c, x − x_c)   (3)

where x and y are the coordinates of a pixel in the image, x_c and y_c are the coordinates of the centroid, M is the magnitude factor, ρ is the logarithmic distance between the pixel and the centroid, and θ is the direction of the pixel relative to the center; ρ and θ constitute the abscissa and ordinate of the log-polar space,
converting, through the log-polar transformation, the rotational motion of the tactile image in the Cartesian coordinate system into translational motion in log coordinates, and converting the full search over rotations into a full search for the optimal translation in log-polar space;
based on equations (4) to (6), calculating the errors between all translation-transformed images and the reference image:

I′_lp1 = M_translation × I_lp1   (4)

where I_lp1 and I_lp2 are the log-polar images, M_translation is the translation transformation that converts I_lp1 into I′_lp1, E is the error between I′_lp1 and I_lp2, and f(·) is a function that makes the method robust, where a is the outlier threshold, taken as the average value of the e_k;
m that will minimize E after searchtanslationThe transformation determines the rotation angle, which is determined as the optimal rotation angle between the two images.
2. A tactile dynamic perception method according to claim 1, wherein in step S2, the centroid coordinates are calculated based on equation (1):

p_center = (1/n) · Σ_{k=1}^{n} p_k   (1)

where n is the number of pixels in the image, p_k is the coordinate of each pixel in the image, and p_center is the coordinate of the centroid;
calculating the difference between the centroid coordinates before and after translation;
and multiplying the difference of the centroid coordinates by the unit spacing of the sensor array to obtain the translation amount.
3. A method of haptic dynamic perception according to claim 1 or 2, further comprising the steps of:
S4, acquiring tactile images of the moving object at a sampling frequency f; taking two consecutive tactile images separated by one sampling period and calculating the translation distance d and the rotation angle θ of the object within that interval; the instantaneous speed equals the distance/angle divided by the time interval, i.e., the distance/angle multiplied by the sampling frequency, as shown in equations (7) and (8):
v = f × d   (7)
ω = f × θ   (8)
where v and ω are the instantaneous speeds of translation and rotation.
4. A method of haptic dynamic perception according to claim 1 or 2, further comprising the steps of:
S4', comparing each subsequent tactile image with the first sampled tactile image to obtain its translation and rotation relative to the initial position, selecting an appropriate mathematical model to fit the translations and rotations according to the type of motion, and calculating the speed to be measured from the fitted model.
5. A method for dynamically sensing tactile sensation according to claim 4, wherein step S4' is used for measuring objects moving at a constant speed.
6. A method for haptically dynamic sensing according to claim 4, wherein in step S4', a linear regression model is used for fitting the translation and rotation, and the linear regression model is used for fitting the velocity of the translation and rotation by minimizing the sum of squared residuals.
7. A method of tactile dynamic perception according to claim 1 or 2, wherein before translation and rotation, the tactile image is scaled by bilinear interpolation to enhance image resolution, and dilation and erosion in morphological transformation are used to remove noise.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811049863.2A CN109239388B (en) | 2018-09-10 | 2018-09-10 | Electronic skin touch dynamic sensing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109239388A CN109239388A (en) | 2019-01-18 |
CN109239388B true CN109239388B (en) | 2020-09-25 |
Family
ID=65060132
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811049863.2A Active CN109239388B (en) | 2018-09-10 | 2018-09-10 | Electronic skin touch dynamic sensing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109239388B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004103038A (en) * | 2003-11-14 | 2004-04-02 | Fujitsu Ltd | Image processing device |
CN1888913A (en) * | 2006-07-27 | 2007-01-03 | 上海交通大学 | Rotating speed measuring method based on rotary blurred image |
KR20090022393A (en) * | 2007-08-30 | 2009-03-04 | 삼성테크윈 주식회사 | Apparatus and method for processing moving picture |
CN101518482A (en) * | 2009-03-18 | 2009-09-02 | 东南大学 | Touch graphic context display device and display method |
CN101950419A (en) * | 2010-08-26 | 2011-01-19 | 西安理工大学 | Quick image rectification method in presence of translation and rotation at same time |
CN102120326A (en) * | 2011-01-14 | 2011-07-13 | 常州大学 | Manipulator arm grabbing and sliding detection method and device based on image processing technology |
CN103034998A (en) * | 2012-12-04 | 2013-04-10 | 中国科学院自动化研究所 | Detection method capable of detecting center and rotation angle of rotational symmetry figure and device thereof |
CN107322601A (en) * | 2017-08-14 | 2017-11-07 | 山东大学 | The attitudes vibration detection means and method of a kind of object gripped by manipulator |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010278931A (en) * | 2009-05-29 | 2010-12-09 | Toshiba Corp | Image processing apparatus |
CN103198465A (en) * | 2013-04-19 | 2013-07-10 | 中国石油大学(华东) | Rotation error correcting method of CT (Computerized Tomography) scanned images |
US9171211B2 (en) * | 2013-09-20 | 2015-10-27 | Rapsodo Pte. Ltd. | Image processing for launch parameters measurement of objects in flight |
WO2017011801A1 (en) * | 2015-07-16 | 2017-01-19 | Digimarc Corporation | Signal processors and methods for estimating geometric transformations of images for digital data extraction |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004103038A (en) * | 2003-11-14 | 2004-04-02 | Fujitsu Ltd | Image processing device |
CN1888913A (en) * | 2006-07-27 | 2007-01-03 | 上海交通大学 | Rotating speed measuring method based on rotary blurred image |
KR20090022393A (en) * | 2007-08-30 | 2009-03-04 | 삼성테크윈 주식회사 | Apparatus and method for processing moving picture |
CN101518482A (en) * | 2009-03-18 | 2009-09-02 | 东南大学 | Touch graphic context display device and display method |
CN101950419A (en) * | 2010-08-26 | 2011-01-19 | 西安理工大学 | Quick image rectification method in presence of translation and rotation at same time |
CN102120326A (en) * | 2011-01-14 | 2011-07-13 | 常州大学 | Manipulator arm grabbing and sliding detection method and device based on image processing technology |
CN103034998A (en) * | 2012-12-04 | 2013-04-10 | 中国科学院自动化研究所 | Detection method capable of detecting center and rotation angle of rotational symmetry figure and device thereof |
CN107322601A (en) * | 2017-08-14 | 2017-11-07 | 山东大学 | Attitude change detection device and method for an object gripped by a manipulator |
Non-Patent Citations (4)
Title |
---|
"Robotic tactile perception of object properties: A review"; Shan Luo et al.; Mechatronics; 2017-11-14; No. 48; pp. 54-67 * |
"Research on judging the degree of touch-slip sensation from changes in the image centroid"; Bai Yu et al.; Machine Tool & Hydraulics; 2007-04-30; Vol. 35, No. 4; Section 1 on p. 41 to Section 3 on p. 42 * |
"Research on tactile image recognition technology based on sensor arrays"; Ji Jinjun; Robot Technology; 2013-10-30; Vol. 40, No. 10; pp. 67-71 * |
"Research on slip information detection and attitude monitoring techniques for gripped soft objects"; Zhao Kai; China Masters' Theses Full-text Database, Information Science and Technology; 2017-09-15; No. 9; pp. I138-227 * |
Also Published As
Publication number | Publication date |
---|---|
CN109239388A (en) | 2019-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104820439B (en) | Follow-up control apparatus and method for a parallel platform using a vision device as the sensor | |
Solomon et al. | Extracting object contours with the sweep of a robotic whisker using torque information | |
Matungka et al. | Image registration using adaptive polar transform | |
Croom et al. | Visual sensing of continuum robot shape using self-organizing maps | |
Chakravarthy et al. | Noncontact level sensing technique using computer vision | |
US20110285648A1 (en) | Use of fingerprint scanning sensor data to detect finger roll and pitch angles | |
JP2007519064A (en) | System and method for generating rotational input information | |
Song et al. | DOE-based structured-light method for accurate 3D sensing | |
Chen et al. | A noise-tolerant algorithm for robot-sensor calibration using a planar disk of arbitrary 3-D orientation | |
Wang et al. | Data-driven glove calibration for hand motion capture | |
US8109007B2 (en) | Object profile sensing | |
CN112836558A (en) | Method, apparatus, system, device and medium for adjusting the end effector of a mechanical arm | |
CN106462268B (en) | Execute the electronic pen of sensor drift compensation | |
CN116051600A (en) | Optimizing method and device for product detection track | |
Zhang et al. | Superellipse fitting to partial data | |
CN109239388B (en) | Electronic skin touch dynamic sensing method | |
WO2008032375A1 (en) | Image correcting device and method, and computer program | |
Lim et al. | Camera-based hand tracking using a mirror-based multi-view setup | |
Zhang et al. | Human deep squat detection method based on MediaPipe combined with Yolov5 network | |
JP7136737B2 (en) | Three-dimensional position measuring device, three-dimensional position measuring method and program | |
Yii et al. | Distributed visual processing for augmented reality | |
Yuan et al. | Identify finger rotation angles with ArUco markers and action cameras | |
CN115049744A (en) | Robot hand-eye coordinate conversion method and device, computer equipment and storage medium | |
Liang et al. | An integrated camera parameters calibration approach for robotic monocular vision guidance | |
CN114266776A (en) | Digital image correlation method applying composite crack displacement field function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||