WO2009147904A1 - Finger shape estimation device, finger shape estimation method, and program - Google Patents
Finger shape estimation device, finger shape estimation method, and program
- Publication number
- WO2009147904A1 (PCT/JP2009/057851)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- finger
- shape
- data
- estimation
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/2431—Multiple classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
Definitions
- the present invention relates to a finger shape estimation apparatus suitable for estimating a finger shape from a finger image captured by a camera or the like, a finger shape estimation method, and a program for executing the same.
- In Patent Documents 1 and 2 and Non-Patent Document 1, a finger shape estimation method has been proposed in which a database combining low-order image feature amounts of finger images with joint angle data is built in advance, and a similar-image search is performed by matching an unknown finger image input from a camera against the data in this database.
- Non-Patent Document 2 proposes a method in which the similar-image search is performed by adding position information of the nails in the finger image to the image feature amounts used in the finger shape estimation method of Patent Documents 1 and 2 and Non-Patent Document 1.
- Patent Document 2 and Non-Patent Document 1 further propose a method for rapidly searching a large-scale database for a finger shape similar to an unknown input image.
- FIG. 41 is a schematic configuration diagram of a database
- FIGS. 42A and 42B and FIGS. 43A and 43B are diagrams showing a search procedure for input unknown continuous images.
- the database is illustrated in two layers for simplification.
- In these methods, a multi-tier database as shown in FIG. 41 is constructed. Here, the multi-tier database is built with a self-organizing map having self-proliferation and self-annihilation, so that similar hand-shape images gather close to each other and the number of data sets belonging to each class becomes approximately equal.
- In addition, the search space is narrowed by restricting the search target to the data in the neighborhood of the previous search result, which shortens the processing time.
- However, the method of searching for similar images in a database with such a multi-layered structure has the following problems. (1) Because the search target is limited to the neighborhood of the previous search result, when the finger shape changes drastically between successive finger images, the finger shape most similar to the one being searched for may fall outside the search area; in that case, the most similar image cannot be found. (2) Even when the finger shape changes slowly between successive finger images, errors may enter the estimation. Once a dissimilar image has been output, subsequent searches are performed in the neighborhood classes of the class containing that dissimilar image, so the probability of outputting dissimilar images repeatedly increases.
- Furthermore, in these methods the database is hierarchized effectively using a priori knowledge, not simply according to the degree of statistical similarity. It is therefore difficult to decide what kind of data corresponding to the finger shape of a finger image (image feature amounts, joint angle information, nail position information, and so on) should be assigned to each class of each hierarchy having representative values. As a result, constructing the database takes time.
- The present invention has been made to solve the above problems, and an object of the present invention is to provide a finger shape estimation device, a finger shape estimation method, and a program for executing the method that can estimate the image closest to a finger image at high speed and with high accuracy, and that simplify the construction of the database.
- Another object of the present invention is to provide a finger shape estimation device, a finger shape estimation method, and a program for executing the method that can estimate the finger shape and the forearm inclination even when the forearm extension direction in the finger image whose shape is to be estimated does not point in a predetermined direction, that is, even when the user moves the upper limbs freely.
- Still another object of the present invention is to provide a finger shape estimation device, a finger shape estimation method, and a program for executing the method that enable stable shape estimation regardless of whether the finger being imaged is near to or far from the camera.
- To achieve the above objects, the finger shape estimation device of the present invention comprises: a finger image acquisition unit that acquires a first finger image; a shape data calculation unit that calculates first shape data concerning the vertical and horizontal dimensions of the first finger image; a matching unit that reads second shape data in a predetermined data set from a database holding a plurality of data sets, each combining finger angle data, second shape data concerning the vertical and horizontal dimensions of a second finger image obtained by imaging a finger, and a second image feature amount of the second finger image, and that compares the second shape data with the first shape data; and an estimation unit that compares the second image feature amount of the data set whose second shape data matched in the comparison by the matching unit with the first image feature amount, and thereby estimates the finger shape.
- In the present invention, the matching unit first compares the first shape data of the acquired finger image with the second shape data read from the database (this process is also referred to as the first estimation process).
- Next, the second image feature amount stored with the second shape data that matched in the first estimation process is compared with the first image feature amount of the first finger image to estimate the finger shape (this process is also referred to as the second estimation process). That is, in the present invention, the finger shape is estimated by a two-step estimation process.
- The first estimation process performs a similar-image search using features of the overall shape of the finger image, narrowing the number of matching image data (the number of data sets) that become similar-image candidates down to a certain number. This reduces the search amount (processing amount) required for the detailed similarity matching of the second estimation process. Therefore, according to the present invention, the most similar image can be searched for at high speed even when a large-scale database is used.
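The two-step estimation described above can be sketched as follows. The data-set field names, the per-ratio threshold test, and the Euclidean distance used for the detailed matching are illustrative assumptions, not details fixed by the patent.

```python
import math

def estimate(first_shape, first_feature, database, thresholds):
    """Return the angle data of the data set most similar to the input.

    first_shape   : image shape ratios of the input (first) finger image
    first_feature : image feature amounts of the input finger image
    database      : list of dicts with keys 'shape', 'feature', 'angles'
    thresholds    : allowed |difference| per shape ratio (first step)
    """
    # First estimation process: narrow the candidates using the
    # overall-shape features (image shape ratios) only.
    candidates = [
        d for d in database
        if all(abs(s - t) <= th
               for s, t, th in zip(d["shape"], first_shape, thresholds))
    ]
    if not candidates:
        return None
    # Second estimation process: detailed matching by distance between
    # image feature amounts, over the narrowed candidates only.
    best = min(candidates,
               key=lambda d: math.dist(d["feature"], first_feature))
    return best["angles"]
```

Because the costly feature-distance computation runs only over the candidates that pass the cheap shape-ratio filter, the search amount of the second step shrinks with the tightness of the thresholds.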
- Furthermore, according to the present invention, when searching successive finger images, the most similar image can be found without using the search result for the previous finger image. Therefore, regardless of how quickly the finger shape changes between successive finger images, and even when an error has entered a past estimation result, the search is not affected by that error, and the most similar image can be found with high estimation accuracy.
- In the present invention, it suffices to construct a database storing a plurality of data sets, each combining finger angle data, second shape data indicating features of the overall shape of a second finger image obtained by imaging a finger, and a second image feature amount of that image. Therefore, unlike the prior art, it is not necessary to build a multi-layered database in which similar finger-shape images gather close to each other, and construction of the database is simplified.
- Furthermore, when the device includes an inclination calculation unit that calculates the inclination of the forearm in the original image of the first finger image, and an image correction unit that rotates the first finger image based on the inclination calculated by the inclination calculation unit so that the extension direction of the forearm points in a predetermined direction, the finger shape and the forearm inclination can be estimated even when the forearm extension direction in the finger image whose shape is to be estimated does not point in the predetermined direction, that is, even when the user moves the upper limbs freely.
- In addition, when the extracted finger image is normalized to an image having a predetermined number of pixels, stable shape estimation can be performed regardless of whether the finger being imaged is near to or far from the camera.
- FIG. 1 is a block diagram of a system to which a finger shape estimation apparatus according to an embodiment of the present invention is applied.
- FIG. 2 is a block diagram of threshold detection means according to an embodiment of the present invention.
- FIG. 3 is a flow chart showing the procedure for constructing a database.
- FIG. 4 is a block diagram of a finger image extraction unit.
- FIG. 5 is a flowchart showing the calculation procedure of the image shape ratio and the image feature amount.
- FIG. 6A is a diagram showing how to determine a base point in a finger image
- FIG. 6B is an enlarged view of a broken line area A in FIG. 6A.
- FIGS. 7A to 7C are diagrams showing an example of cutting out an extracted hand image.
- FIGS. 8A to 8C are diagrams showing examples of extracted hand images.
- FIG. 9 is a diagram specifically showing the definition of the shape parameter of the extraction hand image.
- FIG. 10A is a diagram showing a state of extraction processing of an image feature amount of an extracted finger image
- FIG. 10B is a high-order autocorrelation function pattern diagram.
- FIG. 11 is a configuration example of a data set.
- FIG. 12 is a flowchart showing the operation procedure of the entire system.
- FIG. 13 is a flowchart showing a procedure of finger shape estimation processing.
- FIG. 14 is a flow chart showing the procedure of determining the threshold regarding the image shape ratio.
- FIG. 15 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 16 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 17 is a diagram showing the relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 18 is a diagram showing the relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 19 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 20 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 21 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the average value and standard deviation of estimation errors.
- FIG. 22 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 23 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 24 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 25 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 26 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 27 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree, and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 28 is a diagram showing a relationship between three threshold values regarding vertical degree, upper vertical degree and right longitudinal degree, and the number of data sets passing the first estimation process.
- FIG. 29 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 30 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 31 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 32 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 33 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 34 is an image showing the state of the estimation operation by the finger shape estimation device of the first modification.
- FIG. 35 is a schematic configuration diagram of a main part that performs correction processing of an original image according to the second modification.
- FIG. 36 is a flowchart showing the procedure of the correction process of the original image of the second modification.
- FIG. 37 is a diagram showing an outline of the correction process of the original image of the second modification.
- FIG. 38 is a diagram showing an outline of how to determine the inclination of the contour.
- FIG. 39 is a diagram showing a change in the inclination of the contour along the contour.
- FIG. 40 is a diagram showing a change in the standard deviation of the slope of the contour along the contour.
- FIG. 41 is a diagram showing the structure of a database of a conventional finger shape estimation apparatus.
- FIGS. 42A and 42B are diagrams showing the state of the estimation operation of the finger shape of the conventional finger shape estimation device.
- FIGS. 43A and 43B are diagrams showing the state of the estimation operation of the finger shape of the conventional finger shape estimation device.
- FIG. 1 is a configuration example of a system to which a finger shape estimation apparatus of the present embodiment is applied.
- the system shown in FIG. 1 is a system using the finger shape estimation apparatus of this embodiment for controlling a robot hand 34 and a three-dimensional CG (Computer Graphics) drawing hand 35.
- the system of FIG. 1 includes a database construction device 10 that calculates and stores various data related to the finger shape, and a finger shape estimation device 20 that estimates the finger shape from the captured image.
- The database construction device 10 includes sampling means 11 and 13, time-series joint angle data storage means 12, and time-series rotation angle data storage means 14. It further includes an image data storage unit 15, an image feature quantity extraction unit 16, and an image shape ratio calculation unit 17. It also includes a storage unit 18 (hereinafter also referred to as the database 18), in which the angle data, the image feature amount (second image feature amount), the image shape ratio (second shape data), and the operation commands are stored, and threshold determination means 19 (threshold calculation device). The function of each means constituting the database construction device 10 is briefly described below.
- the sampling means 11 samples the angle data of each joint of each finger outputted from the data glove 30 at a predetermined cycle, and outputs it to the time-series joint angle data storage means 12. Then, the time-series joint angle data storage means 12 stores the angle data of each joint of each sampled finger.
- the sampling unit 13 samples the rotation angle data of the forearm (wrist) output from the forearm rotation angle measurement unit 31 at a predetermined cycle, and outputs the sampled data to the time-series rotation angle data storage unit 14. Then, the time-series rotational angle data storage means 14 stores the rotational angle data of the sampled forearm (wrist).
- The image data storage unit 15 not only stores the image captured by the camera 32, but also extracts from the captured image a finger image (second finger image) in the predetermined range necessary to calculate the image shape ratios and the image feature amount of the finger.
- The image feature quantity extraction unit 16 divides the finger image extracted by the image data storage unit 15 into a predetermined number of regions and calculates the feature quantity of each divided image (specifically, a higher-order local autocorrelation function described later).
- The image shape ratio calculation unit 17 calculates, from the finger image extracted by the image data storage unit 15, image shape ratios (specifically, the vertical degree, the upper-length degree, the right-length degree, and so on, described later) indicating features of the overall shape of the finger image.
- the database 18 stores a data set in which finger joint angle data, forearm (wrist) rotation angle data, an image shape ratio, and an image feature amount obtained for one finger shape are combined.
- The database 18 stores data sets respectively corresponding to various finger shapes. In the present embodiment, more than 20,000 data sets are stored in the database 18.
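A single data set as described above might be represented as follows; the field names, types, and list lengths are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FingerDataSet:
    joint_angles: List[float]   # 24 finger-joint angles from the data glove
    forearm_rotation: float     # forearm (wrist) rotation angle
    shape_ratios: List[float]   # e.g. vertical, upper-length, right-length degree
    feature: List[float]        # higher-order local autocorrelation features

# In the embodiment, the database holds more than 20,000 such entries.
database: List[FingerDataSet] = []
```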
- The threshold determination means 19 calculates the determination parameter (threshold) used in the first estimation process of the finger shape estimation device 20, described later, to compare the image shape ratios in the data sets with the image shape ratios of the finger image acquired by the finger shape estimation device 20. Specifically, the threshold determination means 19 determines a threshold for judging whether the difference between the image shape ratio (second shape data) in a data set and the image shape ratio (first shape data) of the finger image acquired by the finger shape estimation device 20 is within a predetermined range.
- FIG. 2 is a block diagram of the threshold determination means 19.
- The threshold determination means 19 includes a multiple regression equation calculation unit 41, a correlation coefficient calculation unit 42, an image shape ratio selection unit 43 (selection unit), a finger shape estimation unit 44, and a threshold determination unit 45.
- the functions of each part are as follows.
- The multiple regression equation calculation unit 41 creates, for each image shape ratio, a multiple regression equation using the image shape ratios stored in the database 18 as objective variables and the joint angle data (finger joint angle data and forearm (wrist) rotation angle data) as explanatory variables.
- The correlation coefficient calculation unit 42 calculates the multiple correlation coefficient of each image shape ratio using the multiple regression equations calculated by the multiple regression equation calculation unit 41. Based on this result, the image shape ratio selection unit 43 selects the image shape ratios that have a large influence on the estimation of the finger shape.
- the finger shape estimation unit 44 variously changes the threshold related to the image shape ratio selected by the image shape ratio selection unit 43 to estimate the finger shape.
- the finger shape estimation unit 44 has the same function as the estimation processing function (second estimation processing) of the finger shape estimation apparatus 20 described later. Then, based on the estimation result of the finger shape estimation unit 44, the threshold determination unit 45 determines a threshold related to each image shape ratio.
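The regression step performed by units 41 to 43 can be sketched as follows: fit one multiple regression per image shape ratio with the joint angle data as explanatory variables, then rank the ratios by their multiple correlation coefficient. The array shapes, function names, and the cutoff value `r_min` are assumptions for illustration only.

```python
import numpy as np

def multiple_correlation(angles, ratio):
    """angles: (n_samples, n_joints) explanatory variables;
    ratio: (n_samples,) objective variable. Returns the multiple
    correlation coefficient R of the fitted regression."""
    X = np.hstack([angles, np.ones((angles.shape[0], 1))])  # add intercept
    coef, *_ = np.linalg.lstsq(X, ratio, rcond=None)
    pred = X @ coef
    # R equals the correlation between prediction and observation.
    return np.corrcoef(pred, ratio)[0, 1]

def select_ratios(angles, ratios_by_name, r_min=0.7):
    """Keep the shape ratios whose multiple correlation with the joint
    angles exceeds r_min, i.e. those that reflect hand posture strongly."""
    return [name for name, r in ratios_by_name.items()
            if multiple_correlation(angles, r) >= r_min]
```

A ratio that is well explained by the joint angles (R close to 1) carries posture information, so restricting the first estimation process to such ratios is what makes the coarse filtering meaningful.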
- The finger shape estimation device 20 includes an image data storage unit 21 (finger image acquisition unit), an image shape ratio calculation unit 22 (shape data calculation unit), and an image feature quantity extraction unit 23.
- the finger shape estimation apparatus 20 further includes an image shape ratio specification unit 24 (collation unit), an image feature amount specification and operation command generation unit 25 (estimation unit), and a drive command unit 26.
- The image data storage unit 21 (finger image acquisition unit) not only stores the image captured by the camera 33, but also extracts from the captured image a finger image (first finger image) in the predetermined range necessary to calculate the image shape ratios (first shape data) and the image feature amount (first image feature amount) of the finger.
- The image shape ratio calculation unit 22 calculates, from the finger image extracted by the image data storage unit 21, image shape ratios (specifically, the vertical degree, the upper-length degree, the right-length degree, and so on, described later) indicating features of the shape of the finger image.
- The image feature quantity extraction unit 23 divides the finger image extracted by the image data storage unit 21 into a predetermined number of regions and calculates the feature quantity of each divided image (specifically, a higher-order local autocorrelation function described later).
- The image shape ratio specification unit 24 (hereinafter also referred to as the matching unit 24) reads the image shape ratio data in the data sets stored in the database 18 and compares it with the image shape ratio data calculated by the image shape ratio calculation unit 22. That is, the matching unit 24 performs the first estimation process. When the image shape ratio data read from the database 18 matches the image shape ratio data calculated by the image shape ratio calculation unit 22, the matching unit 24 outputs the number of the data set containing the matched image shape ratio data.
- The image feature amount specification and operation command generation unit 25 (hereinafter also referred to as the estimation unit 25) reads the image feature amount (second image feature amount) in the data set of that number and compares it with the image feature amount (first image feature amount) of the input image extracted by the image feature quantity extraction unit 23. That is, the estimation unit 25 performs the second estimation process. Through this process, the finger shape (finger joint angles and forearm (wrist) rotation angle) of the data set most similar to the finger shape of the input image is specified. The estimation unit 25 then outputs the specified finger joint angles and forearm (wrist) rotation angle to the drive command unit 26 as an operation command for the robot hand 34 or the CG drawing hand 35.
- the drive command means 26 sends the motion command inputted from the estimation means 25 to the robot hand 34 or the CG drawing hand 35, and drives the robot hand 34 or the CG drawing hand 35.
- the finger shape estimation device 20 and the database 18 are separately provided, but the finger shape estimation device 20 may include the database 18. Furthermore, in the present embodiment, the finger shape estimation device 20 and the threshold value determination means 19 are separately provided, but the finger shape estimation device 20 may include the threshold value determination means 19. In this case, the estimation processing function of the threshold value determination means 19 and the estimation processing function of the finger shape estimation device 20 may be common.
- FIG. 3 is a diagram showing the overall flow of the construction procedure of the database 18. The processing in steps S1 to S3 in FIG. 3 may be performed in the order shown in FIG. 3, but may be performed in parallel.
- First, the time-series joint angle data storage means 12 acquires time-series data of the finger joint angles from the data glove 30 (Cyber Glove (registered trademark), manufactured by Virtual Technologies) through the sampling means 11 (step S1).
- Data on the finger joint angle is acquired by attaching the data glove 30 to the hand.
- In the data glove 30, a sensor for detecting the joint angle is provided at a location corresponding to each joint of the finger.
- a sensor is also provided on the palm portion.
- a strain sensor is used as the sensor.
- The data glove 30 used in the present embodiment can output 24 types of finger joint angle information. Specifically, the following finger joint angle information can be output.
- Angle data of the wrist can also be output by combining the glove with magnetic motion capture.
- Flexion and extension of the three joints of the thumb, namely the CM (carpometacarpal) joint at the soft part of the base of the thumb on the palm, the MP (metacarpophalangeal) joint, and the IP (interphalangeal) joint: 3 types in total.
- Flexion and extension of the three joints of each of the four fingers other than the thumb, namely the MP joint at the base of the finger, the PIP (proximal interphalangeal) joint, and the DIP (distal interphalangeal) joint: 12 types in total.
- Adduction and abduction (tilting toward the little finger side or the thumb side) of the joints at the base of the four fingers other than the middle finger (the MP joints of three fingers and the CM joint of the thumb): 4 types in total.
- the time series rotation angle data storage unit 14 acquires time series data of the forearm (wrist) joint angle from the forearm rotation angle measurement unit 31 through the sampling unit 13 (step S2).
- In the present embodiment, an optical index (for example, a lightweight rod) is attached to the wrist portion of the data glove 30, and the hand wearing the data glove 30 is imaged by a USB (Universal Serial Bus) camera installed above it.
- The forearm rotation angle is then measured based on the rotation angle of the optical index in the captured image.
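The measurement just described can be sketched as follows, assuming the two endpoints of the optical index (rod) have already been detected in the image; the function name and the image coordinate convention are illustrative assumptions.

```python
import math

def rod_angle(p1, p2):
    """Angle, in degrees, of the line through the two detected rod
    endpoints p1 = (x1, y1) and p2 = (x2, y2); taken here as the
    forearm rotation angle in the image plane."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
```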
- the present invention is not limited to this, and for example, a sensor of a magnetic motion capture may be attached to the data glove 30 to obtain data of the forearm rotation angle.
- the image data storage unit 15 acquires an image captured by the camera 32, and stores the image (step S3).
- a USB camera is used as the camera 32, and the finger on which the data glove 30 is fitted is imaged by the USB camera.
- the resolution was set to 320 ⁇ 240 pixels, and the finger was imaged in a state in which the finger appears in a sufficient size in the screen.
- In the present embodiment, a thin white glove is put over the data glove 30 for imaging.
- a black screen is used as a background when acquiring a finger image, a finger joint angle, and a forearm rotation angle.
- Also in step S3, a finger image (second finger image) in the predetermined range required to calculate the image shape ratios and the image feature amount of the finger is extracted from the captured image.
- Next, using the finger image acquired (extracted) in step S3, the image feature amount extraction unit 16 and the image shape ratio calculation unit 17 calculate the image feature amount and the image shape ratios of the finger image, respectively (step S4).
- the process of extracting a finger image (second finger image) from the captured image and the process of step S4 will be described in detail later with reference to FIGS. 4 and 5.
- Next, a data set combining the finger joint angles, the forearm (wrist) rotation angle, the image shape ratios, and the image feature amount acquired in steps S1 to S4 is stored in the database 18 (step S5).
- Then, it is determined whether the number of data sets stored in the database 18 is equal to or greater than a desired number (the number required for finger shape estimation) (step S6).
- FIG. 4 is a block diagram of a finger image extraction unit for extracting a finger image from a captured image
- FIG. 5 is a flowchart showing the procedure from the finger image extraction processing to the image feature amount and image shape ratio calculation processing.
- the configuration of the finger image extraction unit 50 for extracting a finger image from the captured image will be briefly described with reference to FIG.
- As described above, the image data storage unit 15 (finger image acquisition unit) extracts, from the image captured by the camera 32, a finger image in the predetermined range required to calculate the image shape ratios and the image feature amount of the finger. The finger image extraction unit 50 is therefore included in the image data storage unit 15.
- The finger image extraction unit 50 includes a smoothing processing unit 51, a binarization processing unit 52, a base point calculation unit 53, and a finger image cutout unit 54, and these units are connected in this order from the input side of the original image (captured image).
- the function of each part is as follows.
- the smoothing processing unit 51 removes noise from the captured image.
- the binarization processing unit 52 (the outermost extraction unit) binarizes the noise-removed original image into a finger area and a background.
- The base point calculation unit 53 (base point extraction unit) obtains the reference point (base point) in the finger image used when obtaining the image shape ratios in step S4 of FIG. 3.
- The pixel serving as the base point in the finger image (base point pixel) is determined by a labeling process that assigns label numbers to pixels sequentially inward from the outermost pixels of the finger area.
- The finger image cutout unit 54 cuts out a finger image of the determined range from the original image, based on the data of the outermost pixels of the finger area obtained by the binarization processing unit 52 and the base point in the finger image obtained by the base point calculation unit 53.
- the image data storage unit 21 of the finger shape estimation apparatus 20 also includes the finger image extraction unit 50 as described above, similarly to the image data storage unit 15.
- First, the captured image acquired in step S3 of FIG. 3 is subjected to smoothing (filtering) processing in the smoothing processing unit 51 of the finger image extraction unit 50 to remove noise (step S11).
- After step S11, the binarization processing unit 52 binarizes the captured image into the finger area and the background (step S12). The outermost pixels of the finger area (finger part) can thereby be obtained.
- the base point calculation unit 53 assigns a label number “1” to the pixel (outermost pixel) of the finger area adjacent to the background pixel in the captured image obtained in step S12 (step S13).
- a label number "2" is assigned to a pixel of the finger area adjacent to the pixel of label number "1" and to which a label number is not assigned (step S14).
- step S15 If there is a pixel to which a label number is not assigned in the finger area (Yes in step S15), the label number is incremented by 1 (step S16), and the process in step S14 is performed.
- When all pixels in the finger area have been labeled (No in step S15), the position of the pixel labeled last is used as the base point of the finger image, which is required to obtain the image shape ratios (vertical degree, upper length, right length, etc.) (step S17).
- the base point calculation process of steps S13 to S17 is performed in the base point calculation unit 53.
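The labeling of steps S13 to S17 is, in effect, a breadth-first peeling of the finger region inward from its contour, with the base point being the last pixel labeled. A minimal sketch assuming 4-connectivity (the connectivity is not specified in the text; function and variable names are ours):

```python
from collections import deque
import numpy as np

def base_point(binary):
    """Label finger pixels inward from the contour; return (base point, label map).
    binary: 2-D array, 1 = finger region, 0 = background."""
    h, w = binary.shape
    label = np.zeros((h, w), dtype=int)
    q = deque()
    # Step S13: label "1" for finger pixels adjacent to a background pixel
    for y in range(h):
        for x in range(w):
            if binary[y, x] == 1:
                nb = [binary[y + dy, x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else 0
                      for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))]
                if 0 in nb:
                    label[y, x] = 1
                    q.append((y, x))
    base = q[-1] if q else None
    # Steps S14-S16: labels 2, 3, ... peel one layer inward at a time (BFS order)
    while q:
        y, x = q.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] == 1 and label[ny, nx] == 0:
                label[ny, nx] = label[y, x] + 1
                q.append((ny, nx))
                base = (ny, nx)  # step S17: the last-labeled pixel is the base point
    return base, label
```

For a convex finger-like blob this converges on an interior pixel of maximum label, as described for FIG. 6A.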
- FIGS. 6A and 6B show the state of the processing operation of the above steps S13 to S17.
- FIG. 6A is a diagram when pixels having the same label number in the captured image are connected by a line
- FIG. 6B is an enlarged view of a broken line area A in FIG. 6A.
- the label numbers of the pixels are given as 1, 2, 3,... Sequentially from the outermost pixels of the finger to the pixels inside the finger region by the processing operation of steps S13 to S17.
- As the label number increases, the areas surrounded by the connecting lines are formed progressively further toward the inner side of the finger area, narrowing in range.
- Finally, the areas formed by connecting pixels of the same label number converge to a single pixel, namely the pixel with the largest label number (the pixel of label number L in FIG. 6A).
- the position of this pixel is used as a base point of the finger image.
- Next, based on the base point of the finger image obtained in step S17 and the outermost part of the finger region (the finger contour), a finger image of the certain range required by the image shape ratio calculation means 17 to calculate the image shape ratios (second finger image; hereinafter also referred to as the extracted hand image) is cut out from the captured image as follows.
- the upper end, the left end, and the right end of the extracted finger image are positions of the uppermost end pixel, the leftmost pixel, and the rightmost pixel of the finger contour, respectively.
- the lower end of the extracted hand image is determined as follows. First, among the outermost pixels of the finger, the number M of pixels from the base point to the closest pixel is determined. Then, the position of the pixel on the lower side by the number of pixels M from the base point is set as the lower end of the extraction hand image. Based on the upper and lower ends and the left and right ends of the extracted hand image obtained in this manner, the extracted hand image is cut out from the captured image. Examples of extracted hand images cut out for various finger shapes are shown in FIGS. 7A-7C and 8A-8C.
- Regions surrounded by white frames in FIGS. 7A to 7C and 8A to 8C are the range of the extracted hand image.
- black square marks in the finger area in FIGS. 7A to 7C and 8A to 8C indicate the positions of the base points of the finger images. Note that the extraction range of the extraction hand image is not limited to the above example, and can be appropriately changed in consideration of the application, necessary accuracy, and the like.
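The cut-out rule described above can be sketched as follows; the text does not specify the distance metric used for M, so the Chebyshev (pixel-ring) distance is assumed here, and the function name is ours:

```python
def crop_bounds(contour, base):
    """Crop rectangle (top, bottom, left, right) of the extracted hand image.
    contour: list of (y, x) outermost finger pixels; base: (y, x) base point."""
    ys = [y for y, _ in contour]
    xs = [x for _, x in contour]
    # upper, left, and right ends come from the extreme contour pixels
    top, left, right = min(ys), min(xs), max(xs)
    # M: pixel count from the base point to the closest outermost pixel
    # (Chebyshev distance is an assumption)
    M = min(max(abs(y - base[0]), abs(x - base[1])) for y, x in contour)
    bottom = base[0] + M  # lower end: M pixels below the base point
    return top, bottom, left, right
```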
- Next, from the acquired extracted finger image and base point, the image shape ratio calculation means 17 calculates the total number of pixels H in the vertical direction of the extracted hand image, the total number of pixels W in the horizontal direction, the number of pixels H_u from the base point to the upper end of the extracted hand image, and the number of pixels W_r from the base point to the right end of the extracted hand image (step S18).
- the definition of these pixel number parameters is concretely shown in FIG.
- the image shape ratio calculation unit 17 calculates shape data indicating the features of the entire shape of the extracted finger image using the shape parameters H, H u , W, and W r of the extracted finger image (step S19).
- the following three parameters are used as shape data indicating the characteristics of the entire shape of the extracted finger image.
- Vertical degree: R_t[j] = H[j] / W[j]
- Upper length: R_th[j] = H_u[j] / H[j]
- Right length: R_rb[j] = W_r[j] / W[j]
- the variable j in parentheses is the number of the data set stored in the database 18. That is, for example, R t [j] is the vertical degree in the data set number j.
- That is, two kinds of shape ratios are used: a shape ratio determined from the total number of pixels H in the vertical direction and the total number of pixels W in the horizontal direction of the extracted finger image, such as the vertical degree R_t (first shape ratio), and shape ratios determined from the positional relationship between the base point and the pixels at the outer ends of the extracted finger image, such as the upper length R_th and the right length R_rb (second shape ratios).
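Of the three shape ratios, only the right-length formula survives explicitly in the text above (R_rb = W_r / W); assuming the other two follow the same pattern (vertical degree as the ratio of H to W, upper length as the ratio of H_u to H), the computation is a one-line sketch each:

```python
def image_shape_ratios(H, W, H_u, W_r):
    """Three image shape ratios of an extracted hand image.
    R_rb = W_r / W is given in the text; the forms of R_t and R_th are assumptions."""
    R_t = H / W     # vertical degree (first shape ratio, from H and W)
    R_th = H_u / H  # upper length (base point vs. upper end) -- assumed form
    R_rb = W_r / W  # right length (base point vs. right end) -- given in the text
    return R_t, R_th, R_rb
```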
- the shape data indicating the feature of the entire shape of the extracted finger image is not limited to the above three parameters, and the following parameters (4) to (7) may be used.
- Furthermore, the shape data indicating the feature of the entire shape of the extracted finger image is not limited to the parameters (1) to (7) above; any parameter may be used as long as it represents a feature of the entire shape of the extracted finger image.
- The image feature quantity extraction unit 16 binarizes the extracted finger image extracted by the finger image extraction unit 50 in the image data storage unit 15 into the finger contour and the remaining part, and reduces (normalizes) the extracted finger image to an image of 64 × 64 pixels (step S20). The finger images shown in separate frames at the upper left of FIGS. 7A to 7C, 8A to 8C, and 9 are these reduced images. Next, the image feature quantity extraction unit 16 divides the reduced extracted finger image into eight vertical and eight horizontal divisions (64 divisions in total), as shown in FIG. 10A (step S21).
- the image feature quantity extraction unit 16 calculates an image feature quantity in each divided image divided in step S21 (step S22).
- In this embodiment, a high-order local autocorrelation function (high-order local autocorrelation features), widely used for image analysis such as image recognition and measurement, is used as the image feature quantity of the finger for the similar-image search.
- The higher-order local autocorrelation function computes correlations between a reference point and its neighborhood. With reference point r and pixel value f(r) at that point, the N-th order autocorrelation function x_N for the N displacement directions a_1, a_2, ..., a_N around the reference point r is defined by the following equation:
- x_N(a_1, a_2, ..., a_N) = Σ_r f(r) f(r + a_1) ... f(r + a_N)
- the order N of the high-order local autocorrelation function is 2.
- the displacement direction is limited to a local 3 ⁇ 3 pixel area around the reference point r.
- Excluding feature quantities that are equivalent under parallel translation, the image feature quantity is expressed by 25 types of local patterns such as points, straight lines, and broken lines (M1 to M25 in FIG. 10B), as shown in FIG. 10B.
- black squares in FIG. 10B indicate the arrangement of pixels corresponding to the local pattern.
- each feature amount is obtained by adding the product of pixel values corresponding to the local pattern to the entire image.
- For each divided screen, the image feature quantity extraction unit 16 obtains the feature quantities of the 25 patterns shown in FIG. 10B by evaluating the high-order local autocorrelation function with every pixel as a reference point. Each divided screen is thus expressed in 25 dimensions (feature-quantity conversion, i.e., dimension reduction, of the divided screen), and one extracted hand image is represented by 64 divided screens × 25 patterns = 1600 dimensions in total.
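Steps S20 to S22 above (normalize to 64 × 64, split into an 8 × 8 grid of blocks, compute the 25 autocorrelation patterns per block) can be sketched as follows. The way the 25 patterns are enumerated and the restriction of reference points to the block interior are our implementation choices; only the 25-pattern count and the 1600-dimension total come from the text:

```python
import numpy as np
from itertools import product

def hlac_masks():
    """Enumerate the 25 HLAC patterns of order <= 2 in a 3x3 window.
    Displacement sets are deduplicated under parallel translation; for binary
    images duplicate displacements collapse (f^2 = f), leaving 25 patterns."""
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    seen, masks = set(), []
    for a1, a2 in product(offsets, repeat=2):
        s = {(0, 0), a1, a2}
        my, mx = min(s)  # canonical form: shift the lex-smallest offset to (0, 0)
        canon = frozenset((y - my, x - mx) for y, x in s)
        if canon not in seen:
            seen.add(canon)
            masks.append(sorted(s))
    return masks

def hlac_features(block, masks):
    """25-dim feature of one binary block: for each mask, sum over reference
    points of the product of the pixels at the mask offsets (reference points
    restricted to the block interior for simplicity)."""
    h, w = block.shape
    feats = []
    for mask in masks:
        acc = np.ones((h - 2, w - 2), dtype=np.int64)
        for dy, dx in mask:
            acc = acc * block[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        feats.append(int(acc.sum()))
    return feats

def extract_feature_vector(img64):
    """64x64 binary contour image -> 8x8 grid of blocks x 25 patterns = 1600 dims."""
    masks = hlac_masks()
    vec = []
    for by in range(8):
        for bx in range(8):
            vec.extend(hlac_features(img64[by * 8:(by + 1) * 8,
                                           bx * 8:(bx + 1) * 8], masks))
    return np.array(vec)
```

The enumeration-plus-dedup step reproduces the classical count of 25 binary HLAC patterns without hard-coding the mask table.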
- In this embodiment, the dimension reduction of the image feature amount is performed using the high-order local autocorrelation function, but the present invention is not limited to this; any method capable of reducing the dimension of the image feature amount can be used.
- Also, in this embodiment, the image feature amount is extracted from the divided images, but the present invention is not limited to this; the image feature amount of the entire extracted finger image may be extracted without dividing it.
- The data sets generated in this way are stored in the database 18. A specific example of the configuration of the data sets stored in the database 18 is shown in FIG.
- FIG. 12 is a diagram showing an overall flow of finger shape estimation processing according to the present embodiment.
- FIG. 13 is a flow chart showing the process contents of step S33 in FIG.
- the database 18 is created (step S31).
- a database is created according to the above-described procedure of database construction.
- the image data storage unit 21 acquires and stores a captured image (input image) captured by the camera 33 (step S32).
- In step S32, the image data storage unit 21 processes the input image for calculating its image shape ratios and image feature amount in the same manner as the image data storage unit 15 in the database construction device 10 (see steps S11 to S17 in FIG. 5): the base point of the finger image is extracted from the input image using the labeling technique described with FIGS. 6A and 6B, and, based on that base point, an extracted hand image of the predetermined range (for example, the area enclosed by the white frames in FIGS. 7A to 7C) is cut out from the input image.
- the finger shape estimation device 20 estimates the finger shape of the input image using the extracted finger image and the base point of the image output from the image data storage unit 21 (step S33). The process of step S33 will be described in detail later with reference to FIG.
- the finger shape estimation device 20 determines whether the end flag is input (step S34). If the end flag is input (in the case of Yes determination in step S34), the estimation process is ended. On the other hand, when the estimation process is continued (No in step S34), the process returns to step S32, and the processes of steps S32 to S34 are repeated.
- In step S33, first, the image shape ratio calculation unit 22 and the image feature quantity extraction unit 23 acquire the extracted finger image and the base point of the image output from the image data storage unit 21 (step S41).
- Next, using the acquired extracted hand image and its base point, the image shape ratio calculation means 22 and the image feature quantity extraction means 23 respectively calculate the image shape ratios (vertical degree, upper length, and right length) of the extracted hand image and its image feature amount (high-order local autocorrelation function) (step S42).
- That is, the image shape ratios and the image feature quantity of the extracted hand image are calculated by performing the same processing as the calculation of the image shape ratios and the image feature quantities at the time of construction of the database 18 described above.
- Specifically, for the image shape ratios (vertical degree, upper length, and right length), the shape parameters H, H_u, W, and W_r of the extracted finger image are obtained, and the vertical degree R_t, the upper length R_th, and the right length R_rb are calculated from these values. For the image feature amount, the extracted hand image is first reduced (normalized) to an image of 64 × 64 pixels; the reduced (normalized) image is then divided into eight vertical and eight horizontal divisions (64 divisions in total), and the image feature amount is calculated in each divided image.
- Next, the collation unit 24 acquires the image shape ratios of the extracted hand image calculated by the image shape ratio calculation unit 22 (vertical degree R_tc, upper length R_thc, and right length R_rbc). The collation unit 24 also reads out the image shape ratios in the data set of data set number j (vertical degree R_t[j], upper length R_th[j], and right length R_rb[j]).
- Then, the collation unit 24 determines whether the absolute value of the difference between each image shape ratio of the extracted finger image calculated by the image shape ratio calculation unit 22 and the corresponding image shape ratio of data set number j is equal to or less than a predetermined threshold (step S46: first estimation process). If the finger shape of the input image is similar to that of data set number j, these absolute differences are small and fall at or below the thresholds.
- Specifically, in step S46 it is determined whether the image shape ratios satisfy the following three conditional expressions: |R_tc − R_t[j]| ≤ Th_t, |R_thc − R_th[j]| ≤ Th_th, and |R_rbc − R_rb[j]| ≤ Th_rb.
- In this embodiment, the absolute value of the difference between an image shape ratio of the extracted finger image and the corresponding image shape ratio of data set number j is used as the determination parameter, but the present invention is not limited to this.
- Any parameter can be used as long as it is a parameter relating to the difference between the image shape ratio of the extracted hand image and the image shape ratio of the data set number j.
- the square of the difference between the image shape ratio of the extracted hand image and the image shape ratio of the data set number j may be used as a parameter.
- In the case of a No determination in step S46, the overall shape of the input finger image is not similar to that of the data image of data set number j, so the process returns to step S44, the data set number j is updated, and steps S45 and S46 (the first estimation process) are repeated with another data set number.
- In the case of a Yes determination in step S46, the data set number j is output to the estimation means 25.
- the estimation unit 25 reads out the image feature amount in the data set corresponding to the input data set number j. Further, the estimation unit 25 acquires an image feature amount of the finger image extracted by the image feature amount extraction unit 23. Then, the estimation unit 25 collates the image feature quantity of the data set number j with the image feature quantity of the finger image extracted by the image feature quantity extraction unit 23, and estimates the finger shape of the input image (second estimation Processing) is performed (step S47).
- Specifically, a similarity search is performed based on the Euclidean distance between the image feature amount x_lch[j] of data set number j and the image feature amount of the input image, denoted here xin_lch. The Euclidean distance E[j] between the two is calculated by the following equation:
- E[j] = sqrt( Σ_l Σ_c Σ_h ( x_lch[j] − xin_lch )² )
- Here, the subscripts l, c, and h of the image feature x are the row number (1 to 8), the column number (1 to 8), and the higher-order local autocorrelation pattern number (1 to 25) of the divided image, respectively (see FIGS. 10A and 10B).
- the estimating means 25 compares the Euclidean distance E [j] calculated by the above equation with the Euclidean distance E min which is the smallest of the Euclidean distances E calculated previously (step S48).
- If the Euclidean distance E[j] is smaller than E_min (Yes in step S48), E_min is updated to E[j], the data set number j is stored in the storage unit (not shown) of the estimation means 25, and the process returns to step S44. On the other hand, if the Euclidean distance E[j] is equal to or greater than E_min (No in step S48), neither E_min nor the stored data set number is updated, and the process returns directly to step S44.
- When the above processing has been performed for all data sets, the finger joint angles and the forearm rotation angle of the data set number j stored in the storage unit (not shown) of the estimation means 25 are output (step S50).
- the finger shape of the input image is estimated, and the finger joint angle and the forearm rotation angle of the most similar finger shape are output.
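The two-stage estimation of steps S44 to S50 can be condensed into the following sketch; the data-set layout and the default threshold values are illustrative, not from the source:

```python
import numpy as np

def estimate_shape(in_ratios, in_feat, datasets, thresholds=(0.011, 0.011, 0.011)):
    """First estimation: gate each data set on the three shape-ratio differences.
    Second estimation: nearest HLAC feature vector by (squared) Euclidean distance.
    Returns the index j of the most similar data set, or None if none passes."""
    best_j, e_min = None, float("inf")
    for j, d in enumerate(datasets):
        # first estimation process (steps S45-S46): all three differences within thresholds
        if any(abs(a - b) > th for a, b, th in zip(in_ratios, d["ratios"], thresholds)):
            continue
        # second estimation process (steps S47-S48): update the running minimum distance
        e = float(np.sum((np.asarray(in_feat) - np.asarray(d["feature"])) ** 2))
        if e < e_min:
            best_j, e_min = j, e
    return best_j
```

The finger joint angles and forearm rotation angle stored under the returned index are then what step S50 outputs.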
- As described above, in the first estimation process a similar-image search is performed based on the features of the overall shape of the finger image, so the number of image data that become similar-image candidates can be limited to a certain amount. As a result, the amount of processing in the second estimation process can be kept small. Therefore, in the present embodiment, the most similar image can be searched at high speed even when a large-scale database is used.
- the most similar image is found without using the search result of the finger image of the previous time. Therefore, it is possible to retrieve the most similar image more reliably and with high estimation accuracy regardless of the speed and the size of the change in finger shape between successive finger images.
- Furthermore, in the present embodiment, the base point of the finger image is extracted from the captured image (original image) using the labeling technique described with FIGS. 6A and 6B, and, based on that base point, the extracted hand images used to obtain the various parameters required for shape estimation (image shape ratios and image feature amounts) are cut out (extracted) from the captured image. This method has the following advantages.
- the information stored in the database of the present embodiment is “image information of only fingers”.
- an image obtained by imaging with a camera is at least an “image of a finger including an arm (forearm)”. Therefore, in order to obtain high estimation accuracy in the method of performing similarity estimation with low-order image feature quantities such as high-order local autocorrelation as in the finger shape estimation method of the present embodiment, an image captured by a camera From this, it is necessary to extract "an image of only the finger area”.
- the following method can be considered as a method other than the present invention.
- the “neck portion” of the contour of the finger and arm appearing in the finger image (original image) including the forearm is detected, and the portion is regarded as a wrist.
- an image on the tip end side of the “neck portion” is cut out from the captured image as an “image of only the finger region”.
- In contrast, in the present embodiment, the “image of only the finger area” is cut out from the captured image (original image) based on the base point extracted from the finger image using the labeling technique, as described above. Therefore, the extracted finger image can be cut out from the captured image without the problem of the “neck portion” detection method described above, and stable shape estimation becomes possible.
- the above-mentioned labeling processing technology is a processing with very light processing load.
- In addition, a finger image of the same form as the extracted finger images generated at database construction time can be extracted from the captured image.
- Moreover, in the present embodiment, the extracted finger image is normalized to an image of a predetermined size (64 × 64 pixels in the above example). Therefore, even when the distance between the camera and the imaged finger changes and the size of the extracted hand image changes with it, the image feature amount is always calculated from an image of the same size (the normalized image). Consequently, within the range of the captured image in which estimation is possible (for example, 64 × 64 pixels or more), the shape of the imaged finger can be estimated stably regardless of whether the finger is near to or far from the camera.
- In general, the estimation error of the finger shape decreases and converges to a certain value as the threshold Th_t for the vertical degree, the threshold Th_th for the upper length, and the threshold Th_rb for the right length increase. Conversely, if these three thresholds are reduced, the number of finger images (data sets) passing the first estimation process decreases, and so does the number of finger images whose similarity is calculated in the second estimation process (the image feature matching process). That is, the processing speed can be increased by reducing the three thresholds. Accordingly, when determining the three thresholds, the balance between estimation accuracy and processing speed must be considered.
- the three thresholds are changed to various values respectively, the finger image estimation processing is performed on each combination of the three thresholds obtained, and the balance between the estimation error and the processing speed A combination of threshold values that gives the best results may be obtained.
- For example, each threshold value is varied over the seven values 0.001, 0.011, 0.021, 0.031, 0.041, 0.051, and 0.061, giving 343 combinations of the three thresholds in total. The average and standard deviation of the estimation error for each combination, and the number of data sets selected (adapted) in the first estimation process, are then plotted on a graph, and threshold values that appropriately satisfy both the estimation error and the processing speed may be determined.
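The exhaustive sweep just described is a plain grid search over 7 × 7 × 7 = 343 threshold combinations; a minimal harness follows, where the evaluation routine (returning, e.g., mean error, error standard deviation, and the count of data sets passing the first estimation process) is caller-supplied and hypothetical:

```python
from itertools import product

# Candidate values for each threshold (the seven values named in the text)
VALUES = (0.001, 0.011, 0.021, 0.031, 0.041, 0.051, 0.061)

def sweep_thresholds(evaluate, values=VALUES):
    """Evaluate every combination of (Th_t, Th_th, Th_rb) -- 7^3 = 343 in all.
    `evaluate` is a caller-supplied routine; this harness only enumerates."""
    return {combo: evaluate(*combo) for combo in product(values, repeat=3)}
```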
- FIG. 14 is a flowchart showing the procedure of the process of determining three thresholds in the present embodiment.
- First, for each of the three image shape ratios (vertical degree R_t, upper length R_th, and right length R_rb), the multiple regression equation calculation unit 41 of the threshold value determination means 19 creates, over all data sets in the database, a multiple regression equation with the image shape ratio as the objective variable and the joint angle data as the explanatory variables (step S61).
- As the joint angle data used in the multiple regression equations, the finger joint angle data (17 types) measured with the data glove 30 and the forearm rotation angle data (1 type) measured by the forearm rotation angle measurement means 31 are used. The number of explanatory variables in each multiple regression equation is therefore 18.
- the correlation coefficient calculation unit 42 of the threshold value determination means 19 calculates a multiple correlation coefficient (correlation coefficient between predicted value by multiple regression equation and actual value) for each image shape ratio (step S62).
- The larger this coefficient is for an image shape ratio, the stronger the correlation between that image shape ratio and the finger shape (finger joint angle data). That is, step S62 identifies the image shape ratios that have a large influence on the estimation of the finger shape.
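Steps S61 and S62 amount to an ordinary least-squares fit of each image shape ratio on the 18 joint-angle variables, followed by correlating predicted against actual values. A sketch under that reading (function names ours):

```python
import numpy as np

def multiple_correlation(angles, ratio):
    """Multiple regression of one image shape ratio on joint-angle data, returning
    the multiple correlation coefficient (correlation of predicted vs. actual).
    angles: (n_datasets, 18) explanatory variables; ratio: (n_datasets,) objective."""
    X = np.hstack([np.ones((angles.shape[0], 1)), angles])  # add intercept column
    coef, *_ = np.linalg.lstsq(X, ratio, rcond=None)        # overdetermined LSQ (SVD-based)
    pred = X @ coef
    return float(np.corrcoef(pred, ratio)[0, 1])
```

The `lstsq` call also yields the partial regression coefficients (`coef[1:]`), including the one applied to the forearm rotation angle, which the alternative selection criterion below uses.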
- Next, based on the multiple correlation coefficients calculated in the above step, the image shape ratio selection unit 43 of the threshold value determination means 19 selects the image shape ratio(s) whose threshold value is to be varied in the finger shape estimation processing described later (step S63). Specifically, the image shape ratio with the largest multiple correlation coefficient, or the image shape ratios with the largest and second-largest multiple correlation coefficients, is selected.
- Alternatively, the image shape ratio may be selected based on the partial regression coefficient applied to the forearm rotation angle. Since forearm rotation is a factor that greatly changes how the hand appears in the image, the correlation between the forearm rotation angle and the image shape ratio is important in finger shape estimation; therefore, in step S63, the image shape ratio may be selected on the basis of this partial regression coefficient.
- The partial regression coefficients applied to the joint angle data in a multiple regression equation can be obtained by solving the multiple regression equations created for the data sets as simultaneous equations. Since the number of data sets normally exceeds the number of explanatory variables, the system of equations is overdetermined and must be solved using singular value decomposition or a similar method.
- In the present embodiment, since the multiple correlation coefficient and the forearm-rotation partial regression coefficient of the right length R_rb are smaller than those of the other two image shape ratios, the vertical degree R_t and the upper length R_th are selected in step S63.
- Next, the threshold for each image shape ratio not selected in step S63 is fixed to a predetermined value (step S64). Specifically, in the present embodiment, the threshold for the right length R_rb is fixed at 0.011. The threshold for an image shape ratio not selected in step S63 may also simply not be used in the first estimation process; when it is used, it is preferably fixed at a value that is not extremely small.
- the image shape ratio selecting unit 43 of the threshold value determining unit 19 sets the change width and the step width of the threshold (one or two) regarding the image shape ratio selected in step S63 (step S65).
- step S65 the change width of the thresholds Th t and Th th regarding the vertical length R t and the upper length R th is set to 0.001 to 0.061, and the step size is set to 0.01.
- Next, the finger shape estimation unit 44 of the threshold determination means 19 performs finger shape estimation for each combination of the three thresholds determined through step S65, and actually determines the average value and standard deviation of the estimation errors and the number of data sets passing (adapted in) the first estimation process (step S66).
- In the finger shape estimation process of step S66, the same processing as the estimation process performed by the finger shape estimation device 20 described above (see FIG. 13) is carried out.
- In step S66, the true values (actual measurement values) of the finger shape must be compared with the estimated values. Therefore, in step S66, a thin white glove is mounted over the data glove 30 and the true values of the finger shape are acquired in the same manner as when constructing the database.
- Alternatively, half of the data sets in the database may be selected at random and used as estimation data, with the other half used as true values (data for the input images), and the average value and standard deviation of the estimation errors and the number of data sets passing the first estimation process determined from these.
- FIGS. 15 to 21 show changes in the average value and standard deviation of the estimation error for combinations of the thresholds Th_t and Th_th for the vertical degree R_t and the upper length R_th.
- the average value and the standard deviation of the estimation error are taken on the vertical axis
- the threshold Th t regarding the longitudinal degree R t is taken on the horizontal axis.
- FIGS. 22 to 28 are diagrams showing the number of adapted data sets with respect to the combination of the threshold values Th t and Th th regarding the vertical length R t and the upper length R th .
- the vertical axis represents the number of data sets sorted in the first estimation process
- the horizontal axis represents the threshold Th t for the vertical length R t .
- Next, from the measurement results, the threshold value determination unit 45 of the threshold value determination means 19 selects a combination of thresholds at which both the average value and the standard deviation of the estimation error have substantially converged to constant values, choosing thresholds that are as small as possible (step S67: tentative determination).
- From the measurement results of FIGS. 15 to 21, the preferred thresholds Th_t and Th_th for the vertical degree R_t and the upper length R_th at the stage of step S67 are both found to be 0.011.
- Next, the threshold value determination unit 45 of the threshold value determination means 19 determines whether, with the image shape ratio thresholds tentatively determined in step S67, the number of data sets selected in the first estimation process is equal to or less than a predetermined number (step S68).
- the determination value (the above-mentioned predetermined number) of the number of data sets used for the determination of this step is appropriately set in accordance with the processing capacity of the apparatus.
- step S68 If the number of sorted data sets is less than or equal to the predetermined number (Yes in step S68), the threshold determination unit 45 outputs the threshold temporarily determined in step S67 as the final determined threshold (step S69: Final decision).
- On the other hand, if the number of selected data sets is larger than the predetermined number (No in step S68), it is determined whether the threshold(s) (one or two) for the image shape ratio(s) selected in step S63 have reached the maximum value (step S70).
- If the determination in step S70 is No, a threshold value slightly larger than the tentatively determined one is selected (step S71), and the process returns to step S67 to repeat the subsequent processing.
- On the other hand, if the determination in step S70 is Yes, the threshold of an image shape ratio not selected in step S63 is changed (step S72); specifically, the threshold of the image shape ratio whose multiple correlation coefficient or forearm partial regression coefficient is smallest is slightly increased. Thereafter, the process returns to step S65 and the subsequent processing is repeated.
- The purpose of the first estimation process using the image shape ratios is to narrow down, to some extent, the number of data sets that become similarity candidates, and thereby reduce the search (processing) amount of the detailed similarity check in the second estimation process. Therefore, the thresholds need only be set large enough that the image that would be selected as most similar in a full search is not missed in the first estimation process.
- Although an example using three image shape ratios has been described in the above embodiment, the present invention is not limited to this.
- For example, an image shape ratio whose multiple correlation coefficient and forearm-rotation partial regression coefficient are found to be small in the threshold determination step described above need not be used in the threshold setting processing or in the first estimation process of finger shape estimation. That is, the finger shape estimation process may be performed using only the image shape ratios that have a high correlation with the finger shape.
- Poser 5 (manufactured by Curious Labs Incorporated; Poser may be a registered trademark) may be used to generate a finger image. More specifically, a finger image may be generated as follows.
- Predetermined CG editing software is stored in the image data storage means 15 of the database construction device 10 in FIG. 1.
- The image data storage unit 15 acquires time-series data of the finger joint angles and the forearm (wrist) joint angles from the time-series joint angle data storage unit 12 and the time-series rotation angle data storage unit 14, respectively.
- Using the acquired data, the image data storage unit 15 creates, with the CG editing software, the extracted finger images necessary for calculating the image shape ratios and the image feature amounts.
- Alternatively, the various finger image data described above may be acquired by the data glove 30, and in the acquired data the lengths and thicknesses of the bones and the movable ranges of the joints may be adjusted by the CG editing software before the finger images are stored in the database. In this case, it is possible to cope with a wide variety of finger shapes: thick fingers, long fingers, fingers that bend far back, or fingers that are short relative to the palm. This makes it possible to estimate the finger shape regardless of age, sex, race, and so on.
- The finger shape estimation device and the finger shape estimation process of the present embodiment described above provide the following effects. Even when a large database is used to obtain high estimation accuracy, the most similar image can be searched for at high speed. The most similar image can be found without being influenced by the previous search result, so even if the finger shape changes drastically during estimation, the most similar image can be found without lowering the estimation accuracy. In addition, database construction becomes easy. The present invention is therefore suitable, for example, for the following applications.
- Since the shape of the fingers can be estimated at high speed and with high accuracy by a single camera, it is possible to realize an information input device that can be used, for example, even while lying down, in environments where a keyboard or mouse is difficult to use. More specifically, the content displayed on the screen of an HMD (Head Mounted Display) fitted with a small camera can be changed according to the movement of the user's hand. Virtual keyboard operation is also possible. As another example, in combination with a large-screen display, the user can operate icons and the like without touching the display, simply by moving the fingers over the desktop screen. Furthermore, even for the input of three-dimensional structure data such as clay work, the user only needs to move the hands and arms to form the shape.
- The present invention can also be applied to a virtual-space game or the like in which the user can enjoy manipulating virtual objects by kneading, twisting, or crushing them with the fingers.
- FIGS. 29 to 34 are snapshots taken while the finger shape estimation device of the first modification actually estimates the finger shape during three-dimensional motion. As is clear from FIGS. 29 to 34, the three-dimensional motion of the human fingers is reproduced stably and with high precision by the robot hand.
- In the embodiment and the first modification, the number of images for collation is narrowed by the first estimation process from roughly 20,000 to 30,000 down to about 150, whereas the number of images used for collation in the conventional example is about 200. Therefore, in the embodiment and the first modification, at least the same estimation speed as that of the conventional example can be obtained. Moreover, the estimation speed can be further increased by further devising or improving the estimation program.
- In the second modification, the contour of the finger is rotated so that the extending direction of the forearm in the original image is oriented in a predetermined direction, for example, a direction orthogonal to the direction along the lower end of the original image.
- FIG. 35 shows an example of the configuration of the main part that performs image correction in the finger shape estimation device of the second modification.
- The main part that performs the correction processing of the original image comprises the inclination calculation unit 48 and the finger contour correction unit 49 (image correction unit). These are included in the image data storage unit 21 (finger image acquisition unit) of the finger shape estimation device (FIG. 1) of the above embodiment.
- the tilt calculating unit 48 obtains the tilt of the forearm in the original image from the original image captured by the camera.
- The finger contour correction unit 49 rotates the contour of the finger based on the inclination of the forearm calculated by the inclination calculation unit 48, so that the extending direction of the forearm is oriented in a predetermined direction (for example, a direction orthogonal to the direction along the lower end of the original image).
- the data on the inclination of the forearm portion calculated by the inclination calculation unit 48 is directly output to the robot hand or the CG drawing hand.
- FIG. 36 is a flowchart showing a series of procedures from the calculation of the inclination of the forearm in the original image to the correction of the contour line of the finger.
- image correction is performed such that the extending direction of the forearm portion in the original image is orthogonal to the direction along the lower end of the original image.
- the direction (predetermined direction) in which the extending directions of the forearm parts are aligned can be appropriately changed in consideration of the specification of the shape estimation system, the application, the convenience of control, and the like.
- the inclination calculation unit 48 acquires an original image (captured image), and extracts outline images of the forearm and the finger (step S81).
- FIG. 37 shows an example of an original image to be acquired.
- Next, the inclination calculation unit 48 searches the edge of the original image (step S82). Specifically, for example, the search proceeds along the lower end from the pixel at the lower left end of the original image of FIG. 37. The inclination calculation unit 48 then extracts the start point pixel of the contour image (the start point in FIG. 37) by this edge search (step S83). At this time, whether or not a pixel belongs to the contour image is determined from its pixel value (brightness value).
- Next, the inclination calculation unit 48 calculates the inclination of the contour at each pixel on the contour image (hereinafter referred to as a contour pixel) while tracing the contour pixels from the start point pixel (step S84).
- FIG. 38 shows a schematic view of a method of calculating the inclination of the outline pixel.
- the thick solid line in FIG. 38 indicates the outline of the outline image, and the broken line arrow indicates the direction in which the outline pixel is traced in step S84.
- The inclination of the contour at each contour pixel P(i) is calculated from the coordinate positions (x(i+ε), y(i+ε)) and (x(i−ε), y(i−ε)) of the contour pixels P(i+ε) and P(i−ε), located ε pixels before and after P(i) along the contour.
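The equation itself is elided in this text. A common chord-based form, sketched below under that assumption, takes the arctangent of the chord between P(i−ε) and P(i+ε); the function name and the sign convention for image coordinates (y grows downward) are hypothetical:

```python
import math

def contour_inclination(contour, i, eps=10):
    """Inclination (degrees) of the contour at pixel P(i), computed from the
    chord between P(i-eps) and P(i+eps) -- an assumed form of the elided
    equation, not the patent's exact formula."""
    x1, y1 = contour[i - eps]
    x2, y2 = contour[i + eps]
    # image y grows downward; negate dy so counterclockwise angles are positive
    return math.degrees(math.atan2(-(y2 - y1), x2 - x1))
```

On a horizontal run of contour pixels this yields 0 degrees, matching the angle convention described for FIG. 37 (0 degrees along the lower edge, counterclockwise positive).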
- The calculation of the contour inclination at each contour pixel in step S84 is carried out up to the end point pixel of the contour image (the end point in FIG. 37).
- FIG. 39 shows a change in the inclination of the contour at each contour pixel calculated in step S84.
- the horizontal axis in FIG. 39 is the distance from the starting point pixel of the outline pixel, and the vertical axis is the inclination of the outline.
- The inclination angle of the contour is defined as 0 degrees in the direction from the lower left end toward the lower right end of the original image shown in FIG. 37, and angles measured counterclockwise from that direction are positive.
- the range from about 160 to about the 420th contour pixel in FIG. 39 corresponds to the region from around point A to the contour pixel near point B in FIG. 37, that is, the region of the finger portion.
- the region from the approximately 420th contour pixel in FIG. 39 to the contour pixel at the end point corresponds to the region from near point B in FIG. 37 to the contour pixel at the end point, that is, the region of the forearm.
- In the region of the forearm, the variation of the inclination of the contour is small.
- the inclination calculating unit 48 calculates the standard deviation of the inclination of the contour calculated in step S84 (step S85).
- The standard deviation at a given contour pixel is determined over a range of predetermined contour pixels including that pixel. More specifically, for example, the standard deviation at contour pixel P(i) in FIG. 38 is determined over the range between the contour pixels P(i+ε) and P(i−ε), which lie ±ε pixels away from P(i) along the contour. For example, 10 pixels is selected as the value of ε.
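Step S85 can then be sketched as a windowed standard deviation over the inclination sequence. Whether the patent intends the population or the sample standard deviation is not stated, so the choice of `pstdev` below, like the function name, is an assumption:

```python
import statistics

def inclination_std(inclinations, i, eps=10):
    """Standard deviation of the contour inclination over the window
    [i - eps, i + eps] centered at contour pixel P(i) (step S85 sketch)."""
    window = inclinations[i - eps : i + eps + 1]
    return statistics.pstdev(window)
```

In the straight forearm region the window values are nearly constant, so this quantity stays close to zero, while in the finger region it fluctuates strongly, which is exactly the contrast exploited in FIG. 40.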
- the results are shown in FIG.
- The horizontal axis in FIG. 40 is the distance of each contour pixel from the start point pixel, and the vertical axis is the standard deviation of the inclination of the contour.
- It can be seen that the standard deviation fluctuates largely in the range from about the 160th to about the 420th contour pixel (the region of the fingers), and that its fluctuation is small in the other ranges (the region of the forearm).
- Next, the inclination calculation unit 48 specifies the positions of the contour pixels of point A and point B in FIG. 37 from the characteristics of the standard deviation of the contour inclination obtained in step S85. Specifically, first, as shown in FIG. 40, an appropriate threshold value is set and its intersection points with the standard deviation curve are obtained. Among these intersection points, the intersection at a rising portion of the standard deviation located closest to the start point is considered to correspond to the contour pixel near point A in FIG. 37; in this example, this intersection is taken as point A. Similarly, the intersection at a falling portion of the standard deviation located closest to the end point is considered to correspond to the contour pixel near point B in FIG. 37; in this example, this intersection is taken as point B. The contour pixels of point A and point B in FIG. 37 and their coordinate positions are extracted in this manner (step S86).
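Locating points A and B as the first rising crossing and the last falling crossing of the standard-deviation curve (step S86) can be sketched as follows; the function name and the handling of curves with no crossing are hypothetical:

```python
def find_forearm_boundaries(stds, threshold):
    """Return indices of the first rising crossing (point A candidate) and
    the last falling crossing (point B candidate) of the std curve."""
    rising = [i for i in range(1, len(stds))
              if stds[i - 1] < threshold <= stds[i]]
    falling = [i for i in range(1, len(stds))
               if stds[i - 1] >= threshold > stds[i]]
    a = rising[0] if rising else None    # crossing nearest the start point
    b = falling[-1] if falling else None  # crossing nearest the end point
    return a, b
```

For a toy curve `[0, 0, 5, 5, 5, 0, 0]` with threshold 1, the rising crossing is at index 2 and the falling crossing at index 5.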
- Using the position coordinates (xA, yA) of the contour pixel at point A extracted in step S86, the position coordinates (xB, yB) of the contour pixel at point B, and the positions of the start point and end point pixels, the inclination calculation unit 48 calculates the inclination θ of the forearm in the original image (step S87). Here, θa is the inclination, relative to the lower end of the image, of the contour segment from the start point to point A, and θb is the inclination of the contour segment from the end point to point B (see FIG. 37).
- The inclination calculation unit 48 outputs the inclination θ of the forearm calculated in step S87 to the finger contour correction unit 49, and also outputs it to the robot hand or the CG drawing hand.
- Then, the finger contour correction unit 49 rotates the contour image of the finger so that the extending direction of the forearm coincides with the direction orthogonal to the lower end of the original image, and generates a corrected image (step S88).
- The original image is corrected in this manner, and shape estimation is then performed on the corrected image in the same manner as in the embodiment and the first modification.
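Step S88 can be sketched as a plain 2D rotation of the contour points. Rotating by (90° − θ) so that a forearm tilted at θ ends up vertical, and the sign convention used, are assumptions that depend on how the image axes are defined:

```python
import math

def rotate_points(points, theta_deg, center=(0.0, 0.0)):
    """Rotate contour points so a forearm tilted at theta_deg becomes
    vertical (step S88 sketch): rotate by (90 - theta) degrees about center."""
    phi = math.radians(90.0 - theta_deg)
    cx, cy = center
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        # standard 2D rotation matrix applied about the chosen center
        out.append((cx + dx * math.cos(phi) - dy * math.sin(phi),
                    cy + dx * math.sin(phi) + dy * math.cos(phi)))
    return out
```

In practice the rotation center would be chosen inside the hand region (for example, the base point pixel) so the corrected contour stays within the image frame.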
- In the above example, the position of point A in FIG. 37 is in the vicinity of the wrist; however, in an original image in which the hand extends substantially straight from the forearm to the tip of the little finger, for example, the position of point A is in the vicinity of the tip of the little finger. In such a case as well, the original image can be corrected by the same procedure as described above.
- The present invention is not limited to this; the processing of the present invention may also be implemented as a program.
- The program for executing the processing of the present invention may be distributed via a medium such as an optical disk or a semiconductor memory, or may be downloaded via transmission means such as the Internet.
Abstract
Description
(1) Since the search targets data in the neighborhood of the search result from one time step earlier, when the finger shape changes drastically between consecutive finger images, a finger shape similar to the one to be searched for may fall outside the search region. In that case, the most similar image may not be found.
(2) Even when the finger shape changes slowly between consecutive finger images, errors can be mixed into the estimation. Once a dissimilar image is output, the search for the finger images at subsequent times is performed in classes neighboring the class containing that dissimilar image, so the probability of outputting dissimilar images in succession increases.
(3) The conventional method above does not simply multi-layer the database according to statistical similarity, but multi-layers it effectively using a priori knowledge. It is therefore difficult to decide which data corresponding to the finger shape of a finger image — image feature amounts, joint angle information, nail position information, and so on — should be assigned to each class of each layer having representative values. As a result, constructing the database is laborious.
FIG. 1 shows an example configuration of a system to which the finger shape estimation device of this embodiment is applied. The system of FIG. 1 uses the finger shape estimation device of this embodiment to control a robot hand 34 and a three-dimensional CG (Computer Graphics) drawing hand 35.
The database construction procedure of this embodiment will be described with reference to FIGS. 3 to 11. The reference numerals of the devices and means in the following description are the same as those assigned to the respective parts in FIG. 1. First, the overall procedure of database construction in this embodiment will be described using FIG. 3. FIG. 3 shows the overall flow of the construction procedure of the database 18. The processing of steps S1 to S3 in FIG. 3 may be performed in the order shown in FIG. 3, or may be performed in parallel.
- Flexion/extension of the three joints of the thumb — the CM (carpometacarpal) joint at the thenar eminence (the soft part at the base of the thumb on the palm), the MP (metacarpophalangeal) joint, and the IP (interphalangeal) joint: 3 in total
- Flexion/extension of the three joints of each of the four fingers other than the thumb — from the base of the finger, the MP joint, the PIP (proximal interphalangeal) joint, and the DIP (distal interphalangeal) joint: 12 in total
- Adduction/abduction (tilting toward the little finger side or the thumb side) of the base joints of the four digits other than the middle finger (the MP joints of three fingers and the CM joint of the thumb): 4 in total (the middle finger is assumed not to adduct or abduct)
- Adduction/abduction of the wrist (tilting toward the thumb side or the little finger side) and flexion/extension of the wrist (bending toward the palm side or the back of the hand): 2 in total
- Unused information: 3
Next, the processing of extracting a finger image from the captured image in step S3 of FIG. 3, and the processing of calculating the image feature amount and the image shape ratios in step S4, will be described using FIGS. 4 and 5. FIG. 4 is a block diagram of the finger image extraction unit that extracts a finger image from the captured image, and FIG. 5 is a flowchart showing the procedure from the finger image extraction processing to the calculation processing of the image feature amount and the image shape ratios.
(1) Vertical length ratio: Rt[j] = H[j] / (H[j] + W[j])
(2) Upper length ratio: Rth[j] = Hu[j] / H[j]
(3) Right length ratio: Rrb[j] = Wr[j] / W[j]
Here, the variable j in brackets is the number of a data set stored in the database 18; for example, Rt[j] is the vertical length ratio in data set number j.
(4) The ratio of the number of pixels from the base point to the left end of the extracted finger image to the total number of pixels W in the horizontal direction of the extracted finger image (left length ratio)
(5) The ratio of the total number of pixels H in the vertical direction of the extracted finger image to the total number of pixels W in the horizontal direction (aspect ratio)
(6) The ratio of the number of pixels from the base point to the upper end of the extracted finger image to the number of pixels from the base point to the lower end (upper-lower ratio)
(7) The ratio of the number of pixels from the base point to one side end of the extracted finger image to the number of pixels from the base point to the other side end (left-right ratio)
The shape data indicating the features of the overall shape of the extracted finger image are not limited to the parameters (1) to (7) above; any shape parameter indicating the features of the overall shape of the extracted finger image may be used.
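A minimal sketch of computing ratios (1) to (3) above follows. The interpretation of Hu and Wr as pixel counts measured from the base point toward the upper and right ends is an assumption, and all names are hypothetical:

```python
def image_shape_ratios(h, w, hu, wr):
    """Three image shape ratios of an extracted finger image (items (1)-(3)).

    h  -- total pixel height H of the extracted finger image
    w  -- total pixel width W of the extracted finger image
    hu -- pixel count from the base point to the upper end (Hu, assumed)
    wr -- pixel count from the base point to the right end (Wr, assumed)
    """
    rt = h / (h + w)   # (1) vertical length ratio  Rt = H / (H + W)
    rth = hu / h       # (2) upper length ratio     Rth = Hu / H
    rrb = wr / w       # (3) right length ratio     Rrb = Wr / W
    return rt, rth, rrb
```

For a 100x100-pixel extracted image with the base point at its center, this yields (0.5, 0.5, 0.5)-like values; the ratios are dimensionless, which is what makes them usable as scale-invariant screening keys.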
Next, the finger shape estimation processing in the finger shape estimation device 20 of this embodiment will be described with reference to FIGS. 12 and 13. The reference numerals of the devices and means in the following description are the same as those assigned to the respective parts in FIG. 1. FIG. 12 shows the overall flow of the finger shape estimation processing of this embodiment, and FIG. 13 is a flowchart showing the processing content of step S33 in FIG. 12.
(1) Threshold for the vertical length ratio: Tht ≥ |Rt[j] − Rtc|
(2) Threshold for the upper length ratio: Thth ≥ |Rth[j] − Rthc|
(3) Threshold for the right length ratio: Thrb ≥ |Rrb[j] − Rrbc|
How to determine these thresholds will be described in detail later. In this embodiment, the absolute value of the difference between the image shape ratio of the extracted finger image and the image shape ratio of data set number j is used as the determination parameter, but the present invention is not limited to this. Any parameter relating to the difference between the image shape ratio of the extracted finger image and that of data set number j may be used; for example, the square of the difference between the two image shape ratios may be used as the parameter.
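The first-stage screening with the three thresholds above can be illustrated as follows: a data set j survives only if all three absolute differences are within their thresholds. Function and variable names are hypothetical:

```python
def first_stage_candidates(dataset_ratios, query, thresholds):
    """First estimation step: keep data set j only if
    |Rt[j]-Rtc| <= Tht, |Rth[j]-Rthc| <= Thth, and |Rrb[j]-Rrbc| <= Thrb.

    dataset_ratios -- list of (Rt, Rth, Rrb) tuples, one per data set
    query          -- (Rtc, Rthc, Rrbc) of the extracted finger image
    thresholds     -- (Tht, Thth, Thrb)
    """
    tht, thth, thrb = thresholds
    rtc, rthc, rrbc = query
    return [j for j, (rt, rth, rrb) in enumerate(dataset_ratios)
            if abs(rt - rtc) <= tht
            and abs(rth - rthc) <= thth
            and abs(rrb - rrbc) <= thrb]
```

Only the surviving indices are passed to the second estimation process, where the more expensive image-feature collation is performed.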
Next, an example of a method of determining the threshold Tht for the vertical length ratio, the threshold Thth for the upper length ratio, and the threshold Thrb for the right length ratio used in step S45 of FIG. 13 will be described.
With the data glove 30 used when constructing the database in the above embodiment, data on the adduction/abduction of the wrist (the wrist tilting toward the thumb side or the little finger side) and on its flexion/extension (the wrist bending toward the palm side or the back of the hand) can also be obtained, as described above. Therefore, in the above embodiment, wrist adduction/abduction and/or flexion/extension data may additionally be included when constructing the data sets. In this case, data sets including three-dimensional motion can be generated easily. Moreover, shape estimation of three-dimensional motion including wrist adduction/abduction and/or flexion/extension becomes possible without adding a new shape estimation algorithm.
In the above embodiment, an example was described in which the extending direction of the forearm in the original image (captured image) of the extracted finger image is substantially orthogonal to the direction along the lower end (lower side) of the original image, but the present invention is not limited to this. The extending direction of the forearm in the original image may not be orthogonal to the lower end of the original image, or the forearm in the original image may touch (extend from) a side end of the original image.
Claims (15)
- A finger shape estimation device comprising: a finger image acquisition unit that acquires a first finger image; a shape data calculation unit that calculates first shape data relating to the vertical and horizontal dimensions of the first finger image; an image feature amount extraction unit that extracts a first image feature amount of the first finger image; a collation unit that reads second shape data in a predetermined data set from a database having a plurality of data sets, each of which combines angle data of a finger, second shape data relating to the vertical and horizontal dimensions of a second finger image of the finger, and a second image feature amount of the second finger image, and collates the read second shape data with the first shape data; and an estimation unit that collates the second image feature amount of the data set containing the second shape data matched by the collation of the collation unit with the first image feature amount, and estimates the finger shape of the first finger image.
- The finger shape estimation device according to claim 1, further comprising the database.
- The finger shape estimation device according to claim 1 or 2, wherein the image feature amount is an image feature amount of a divided image of the finger image.
- The finger shape estimation device according to any one of claims 1 to 3, wherein the image feature amount is an image feature amount of an image obtained by converting the finger image into a predetermined number of pixels.
- The finger shape estimation device according to any one of claims 1 to 4, wherein the shape data include a first shape ratio of the finger shape determined from the total number of pixels in the vertical direction and the total number of pixels in the horizontal direction of the finger image.
- The finger shape estimation device according to any one of claims 1 to 5, wherein the shape data include a second shape ratio of the finger shape determined based on the positional relationship between a predetermined base point pixel in the finger image and pixels at the outer edge of the finger image.
- The finger shape estimation device according to any one of claims 1 to 6, wherein the angle data include finger joint angle data and wrist rotation angle data.
- The finger shape estimation device according to any one of claims 1 to 7, wherein the angle data include wrist flexion/extension data and wrist adduction/abduction data.
- The finger shape estimation device according to any one of claims 1 to 8, wherein the estimation unit outputs the finger angle data in the data set containing the second image feature amount most similar to the first image feature amount.
- The finger shape estimation device according to any one of claims 1 to 9, wherein the collation unit collates the first finger image with the second finger image based on the difference between the first shape data and the second shape data.
- The finger shape estimation device according to claim 10, further comprising a threshold calculation device that calculates a threshold used when the collation unit performs collation based on the difference between the first shape data and the second shape data.
- The finger shape estimation device according to any one of claims 1 to 11, wherein the finger image acquisition unit includes: an inclination calculation unit that calculates the inclination of the forearm in the original image of the first finger image; and an image correction unit that rotates the first finger image, based on the inclination of the forearm calculated by the inclination calculation unit, so that the extending direction of the forearm is oriented in a predetermined direction.
- The finger shape estimation device according to any one of claims 1 to 12, wherein the finger image acquisition unit includes: an outermost contour extraction unit that obtains the outermost contour pixels of the finger portion in the finger image from the original image of the finger image; a base point extraction unit that obtains a base point pixel from the outermost contour pixels of the finger portion by labeling processing; and a finger image cropping unit that determines, based on the outermost contour pixels and the base point pixel, the range over which the finger image is cropped from the original image.
- A finger shape estimation method comprising the steps of: acquiring a first finger image; calculating first shape data relating to the vertical and horizontal dimensions of the first finger image and a first image feature amount of the first finger image; reading second shape data in a predetermined data set from a database having a plurality of data sets, each of which combines angle data of a finger, second shape data relating to the vertical and horizontal dimensions of a second finger image of the finger, and a second image feature amount of the second finger image; collating the first shape data with the second shape data; reading the second image feature amount of the data set containing the second shape data matched in the collating step; and collating the first image feature amount with the second image feature amount to estimate the finger shape of the first finger image.
- A program implemented on a computer device to cause the computer device to execute predetermined processing, the program causing the computer device to execute: processing of acquiring a first finger image; processing of calculating first shape data relating to the vertical and horizontal dimensions of the first finger image and a first image feature amount of the first finger image; processing of reading second shape data in a predetermined data set from a database having a plurality of data sets, each of which combines angle data of a finger, second shape data relating to the vertical and horizontal dimensions of a second finger image of the finger, and a second image feature amount of the second finger image; processing of collating the first shape data with the second shape data; processing of reading the second image feature amount of the data set containing the second shape data matched in the collation processing; and processing of collating the first image feature amount with the second image feature amount to estimate the finger shape of the first finger image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010515804A JP5403699B2 (ja) | 2008-06-04 | 2009-04-20 | 手指形状推定装置、手指形状の推定方法及びプログラム |
CN200980130236.4A CN102113012B (zh) | 2008-06-04 | 2009-04-20 | 手指形状推定装置、手指形状的推定方法及程序 |
US12/995,912 US9002119B2 (en) | 2008-06-04 | 2009-04-20 | Device method and program for human hand posture estimation |
EP09758171.4A EP2302581A4 (en) | 2008-06-04 | 2009-04-20 | FINGER SHAPE ESTIMATING DEVICE AND FINGER SHAPE ESTIMATING METHOD AND PROGRAM |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008147409 | 2008-06-04 | ||
JP2008-147409 | 2008-06-04 | ||
JP2008-306854 | 2008-12-01 | ||
JP2008306854 | 2008-12-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009147904A1 true WO2009147904A1 (ja) | 2009-12-10 |
Family
ID=41397983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/057851 WO2009147904A1 (ja) | 2008-06-04 | 2009-04-20 | 手指形状推定装置、手指形状の推定方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9002119B2 (ja) |
EP (1) | EP2302581A4 (ja) |
JP (1) | JP5403699B2 (ja) |
CN (1) | CN102113012B (ja) |
WO (1) | WO2009147904A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013051681A1 (ja) * | 2011-10-07 | 2013-04-11 | 国立大学法人筑波大学 | 手指形状推定装置、手指形状推定方法、及び手指形状推定プログラム |
JP2015518378A (ja) * | 2012-05-03 | 2015-07-02 | ゼネラル・エレクトリック・カンパニイ | 細胞運動の自動セグメンテーション及び特徴付け |
JP2016014954A (ja) * | 2014-07-01 | 2016-01-28 | 国立大学法人 筑波大学 | 手指形状の検出方法、そのプログラム、そのプログラムの記憶媒体、及び、手指の形状を検出するシステム。 |
KR101631025B1 (ko) * | 2015-04-03 | 2016-06-16 | 충북대학교 산학협력단 | 깊이 카메라를 이용한 손 관절 데이터 추출 방법 및 이를 기록한 기록매체 |
JP2017227687A (ja) * | 2016-06-20 | 2017-12-28 | 聖 星野 | カメラアセンブリ、そのカメラアセンブリを用いる手指形状検出システム、そのカメラアセンブリを用いる手指形状検出方法、その検出方法を実施するプログラム、及び、そのプログラムの記憶媒体 |
CN109359566A (zh) * | 2018-09-29 | 2019-02-19 | 河南科技大学 | 利用手指特征进行层级分类的手势识别方法 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012176315A1 (ja) * | 2011-06-23 | 2012-12-27 | 富士通株式会社 | 情報処理装置、入力制御方法及び入力制御プログラム |
JP5607012B2 (ja) * | 2011-11-04 | 2014-10-15 | 本田技研工業株式会社 | 手話動作生成装置及びコミュニケーションロボット |
JP2013206412A (ja) * | 2012-03-29 | 2013-10-07 | Brother Ind Ltd | ヘッドマウントディスプレイ及びコンピュータプログラム |
US9041689B1 (en) * | 2012-08-02 | 2015-05-26 | Amazon Technologies, Inc. | Estimating fingertip position using image analysis |
FR2997210B1 (fr) * | 2012-10-18 | 2015-12-11 | Morpho | Procede de segmentation de doigts |
KR101436050B1 (ko) * | 2013-06-07 | 2014-09-02 | 한국과학기술연구원 | 손모양 깊이영상 데이터베이스 구축방법, 손모양 인식방법 및 손모양 인식 장치 |
US9772679B1 (en) * | 2013-08-14 | 2017-09-26 | Amazon Technologies, Inc. | Object tracking for device input |
US9649558B2 (en) * | 2014-03-14 | 2017-05-16 | Sony Interactive Entertainment Inc. | Gaming device with rotatably placed cameras |
US9413376B2 (en) | 2014-11-11 | 2016-08-09 | Helio Technology Inc. | Angle encoder and a method of measuring an angle using same |
CN105701806B (zh) * | 2016-01-11 | 2018-08-03 | 上海交通大学 | 基于深度图像的帕金森震颤运动特征检测方法及系统 |
KR102509067B1 (ko) * | 2016-04-08 | 2023-03-13 | 삼성디스플레이 주식회사 | 사용자 인증장치, 그것의 입력 센싱 모듈 및 사용자 인증방법 |
US9958951B1 (en) * | 2016-09-12 | 2018-05-01 | Meta Company | System and method for providing views of virtual content in an augmented reality environment |
US10444865B2 (en) * | 2017-05-01 | 2019-10-15 | Google Llc | Tracking of position and orientation of objects in virtual reality systems |
US10701247B1 (en) | 2017-10-23 | 2020-06-30 | Meta View, Inc. | Systems and methods to simulate physical objects occluding virtual objects in an interactive space |
US10229313B1 (en) | 2017-10-23 | 2019-03-12 | Meta Company | System and method for identifying and tracking a human hand in an interactive space based on approximated center-lines of digits |
WO2019159332A1 (ja) * | 2018-02-16 | 2019-08-22 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理システム、コントローラデバイス、情報処理方法、及びプログラム |
CN108764127A (zh) * | 2018-05-25 | 2018-11-06 | 京东方科技集团股份有限公司 | 纹理识别方法及其装置 |
CN111050071A (zh) * | 2019-12-23 | 2020-04-21 | 维沃移动通信有限公司 | 一种拍照方法及电子设备 |
CN111192314B (zh) * | 2019-12-25 | 2024-02-20 | 新绎健康科技有限公司 | 一种确定gdv能量图像中手指的内外轮廓比率的方法及系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005031612A1 (ja) * | 2003-09-26 | 2005-04-07 | Nikon Corporation | 電子画像蓄積方法、電子画像蓄積装置、及び電子画像蓄積システム |
WO2005046942A1 (ja) | 2003-11-13 | 2005-05-26 | Japan Science And Technology Agency | ロボットの駆動方法 |
JP2006294018A (ja) | 2005-03-17 | 2006-10-26 | Japan Science & Technology Agency | データベースの高速検索方法及び該高速検索方法を用いたロボットの駆動方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6002808A (en) * | 1996-07-26 | 1999-12-14 | Mitsubishi Electric Information Technology Center America, Inc. | Hand gesture control system |
JP4332649B2 (ja) * | 1999-06-08 | 2009-09-16 | 独立行政法人情報通信研究機構 | 手の形状と姿勢の認識装置および手の形状と姿勢の認識方法並びに当該方法を実施するプログラムを記録した記録媒体 |
FR2858073B1 (fr) * | 2003-07-24 | 2007-08-10 | Adentis | Procede et systeme de commande gestuelle d'un appareil |
JP3752246B2 (ja) * | 2003-08-11 | 2006-03-08 | 学校法人慶應義塾 | ハンドパターンスイッチ装置 |
US7308112B2 (en) * | 2004-05-14 | 2007-12-11 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
FR2911983B1 (fr) * | 2007-01-25 | 2009-05-29 | St Microelectronics Sa | Procede de suivi automatique des mouvements de la mains dans une sequence d'images. |
JP5228439B2 (ja) * | 2007-10-22 | 2013-07-03 | 三菱電機株式会社 | 操作入力装置 |
-
2009
- 2009-04-20 WO PCT/JP2009/057851 patent/WO2009147904A1/ja active Application Filing
- 2009-04-20 CN CN200980130236.4A patent/CN102113012B/zh not_active Expired - Fee Related
- 2009-04-20 US US12/995,912 patent/US9002119B2/en active Active
- 2009-04-20 JP JP2010515804A patent/JP5403699B2/ja active Active
- 2009-04-20 EP EP09758171.4A patent/EP2302581A4/en not_active Ceased
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005031612A1 (ja) * | 2003-09-26 | 2005-04-07 | Nikon Corporation | 電子画像蓄積方法、電子画像蓄積装置、及び電子画像蓄積システム |
WO2005046942A1 (ja) | 2003-11-13 | 2005-05-26 | Japan Science And Technology Agency | ロボットの駆動方法 |
JP2006294018A (ja) | 2005-03-17 | 2006-10-26 | Japan Science & Technology Agency | データベースの高速検索方法及び該高速検索方法を用いたロボットの駆動方法 |
Non-Patent Citations (4)
Title |
---|
EMI TAMAKI; KIYOSHI HOSHINO: "3D Estimation of human hand posture with Wrist Motions", IEICE REPORT WIT, vol. 107, no. 179, 2007, pages 59 - 62 |
K. HOSHINO; E. TAMAKI; T. TANIMOTO: "Copycat hand - Robot hand imitating human motions at high speed and with high accuracy", ADVANCED ROBOTICS, vol. 21, no. 15, 2007, pages 1743 - 1761, XP055302442, DOI: doi:10.1163/156855307782506183 |
See also references of EP2302581A4 |
TAKANOBU TANIMOTO ET AL.: "Ningen-Robot-kan Communication no Tameno Jitsujikan · Koseido Hito Shushi Keijo Suitei", JOURNAL OF HUMAN INTERFACE SOCIETY, vol. 7, no. 4, 25 September 2005 (2005-09-25), pages 535 - 540, XP008142970 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013051681A1 (ja) * | 2011-10-07 | 2013-04-11 | 国立大学法人筑波大学 | 手指形状推定装置、手指形状推定方法、及び手指形状推定プログラム |
JPWO2013051681A1 (ja) * | 2011-10-07 | 2015-03-30 | 国立大学法人 筑波大学 | 手指形状推定装置、手指形状推定方法、及び手指形状推定プログラム |
JP2015518378A (ja) * | 2012-05-03 | 2015-07-02 | ゼネラル・エレクトリック・カンパニイ | 細胞運動の自動セグメンテーション及び特徴付け |
JP2016014954A (ja) * | 2014-07-01 | 2016-01-28 | 国立大学法人 筑波大学 | 手指形状の検出方法、そのプログラム、そのプログラムの記憶媒体、及び、手指の形状を検出するシステム。 |
KR101631025B1 (ko) * | 2015-04-03 | 2016-06-16 | 충북대학교 산학협력단 | 깊이 카메라를 이용한 손 관절 데이터 추출 방법 및 이를 기록한 기록매체 |
JP2017227687A (ja) * | 2016-06-20 | 2017-12-28 | 聖 星野 | カメラアセンブリ、そのカメラアセンブリを用いる手指形状検出システム、そのカメラアセンブリを用いる手指形状検出方法、その検出方法を実施するプログラム、及び、そのプログラムの記憶媒体 |
CN109359566A (zh) * | 2018-09-29 | 2019-02-19 | 河南科技大学 | 利用手指特征进行层级分类的手势识别方法 |
CN109359566B (zh) * | 2018-09-29 | 2022-03-15 | 河南科技大学 | 利用手指特征进行层级分类的手势识别方法 |
Also Published As
Publication number | Publication date |
---|---|
US20110142353A1 (en) | 2011-06-16 |
JPWO2009147904A1 (ja) | 2011-10-27 |
CN102113012A (zh) | 2011-06-29 |
EP2302581A1 (en) | 2011-03-30 |
US9002119B2 (en) | 2015-04-07 |
EP2302581A4 (en) | 2013-05-22 |
CN102113012B (zh) | 2016-10-12 |
JP5403699B2 (ja) | 2014-01-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009147904A1 (ja) | 手指形状推定装置、手指形状の推定方法及びプログラム | |
JP3863809B2 (ja) | 手の画像認識による入力システム | |
JP6066093B2 (ja) | 手指形状推定装置、手指形状推定方法、及び手指形状推定プログラム | |
CN111488824A (zh) | 运动提示方法、装置、电子设备和存储介质 | |
US20100103092A1 (en) | Video-based handwritten character input apparatus and method thereof | |
JP2016091108A (ja) | 人体部位検出システムおよび人体部位検出方法 | |
JP6487642B2 (ja) | 手指形状の検出方法、そのプログラム、そのプログラムの記憶媒体、及び、手指の形状を検出するシステム。 | |
KR101559502B1 (ko) | 실시간 손 포즈 인식을 통한 비접촉식 입력 인터페이스 방법 및 기록 매체 | |
CN113065505A (zh) | 身体动作快速识别方法及系统 | |
JP6915786B2 (ja) | 学習装置、認識装置、学習方法及びコンピュータプログラム | |
JP3182876B2 (ja) | 画像信号処理方法とその装置 | |
Pamplona Berón et al. | Human activity recognition using penalized support vector machines and hidden Markov models. | |
JP4410208B2 (ja) | 高速検索方法を用いたロボットの駆動方法 | |
KR101869304B1 (ko) | 컴퓨터를 이용한 수화어 인식시스템, 방법 및 인식프로그램 | |
WO2018135326A1 (ja) | 画像処理装置、画像処理システム、画像処理プログラム、及び画像処理方法 | |
JP2019159470A (ja) | 推定装置、推定方法、及び推定プログラム | |
Osimani et al. | Point Cloud Deep Learning Solution for Hand Gesture Recognition | |
CN115798030A (zh) | 基于旋量的手势识别方法、装置、电子设备和存储介质 | |
JP2022121257A (ja) | 周期動作検知装置、周期動作検知方法及び周期動作検知プログラム | |
Shankar et al. | Sketching in three dimensions: A beautification scheme | |
JP2009151516A (ja) | 情報処理装置および情報処理装置用操作者指示点算出プログラム | |
Farouk | Principal component pyramids using image blurring for nonlinearity reduction in hand shape recognition | |
CN114816054B (zh) | 一种基于物联网的显示器手势动态控制系统及方法 | |
JP7376446B2 (ja) | 作業分析プログラム、および、作業分析装置 | |
Zhang et al. | A Non-parametric RDP Algorithm Based on Leap Motion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980130236.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09758171 Country of ref document: EP Kind code of ref document: A1 |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2010515804 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009758171 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12995912 Country of ref document: US |