WO2011158542A1 - Gesture recognition device, gesture recognition method and program - Google Patents
Gesture recognition device, gesture recognition method and program
- Publication number
- WO2011158542A1 (PCT/JP2011/057944, JP2011057944W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- gesture
- captured image
- gesture recognition
- front surface
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the present invention relates to a gesture recognition device, a gesture recognition method, and a program.
- Gesture recognition is usually performed by detecting the shape of a detection target from a captured image of an imaging sensor and applying pattern recognition processing to the detection result (see Non-Patent Document 1). For this reason, even if an attempt is made to recognize a gesture that shields the front surface of the imaging sensor with an object such as a hand, the shape of an object located in the immediate vicinity of the imaging sensor cannot be detected, so the gesture cannot be recognized appropriately.
- gesture recognition may be performed using an infrared light emitting element and a light receiving element (see Patent Document 1 above).
- in that case, infrared light emitted from the light-emitting element is reflected by the detection target and the reflected infrared light is received by the light-receiving element, whereby an object that shields the elements is recognized.
- a gesture cannot be properly recognized unless special devices such as an infrared light emitting element and a light receiving element are used.
- the present invention is intended to provide a gesture recognition device, a gesture recognition method, and a program capable of recognizing a gesture that shields the sensor surface of an imaging sensor without using a special device.
- a gesture recognition device for recognizing a gesture that shields the front surface of an imaging sensor is provided, comprising: a first detection unit that detects a change in a captured image between a state in which the front surface of the imaging sensor is not shielded and a state in which the front surface of the imaging sensor is shielded; and a second detection unit that detects a region where the gradient of the luminance value of the captured image is less than a threshold in a captured image in a state where the front surface of the imaging sensor is shielded.
- the first detection unit may detect a change in the captured image based on the result of tracking the feature points in the captured image.
- the first detection unit may detect that feature points tracked in a captured image in a state where the front surface of the imaging sensor is not shielded disappear in a captured image in which the front surface of the imaging sensor is covered with a hand.
- the first detection unit may determine whether the ratio of feature points lost during tracking among feature points tracked in a plurality of captured images included in a predetermined period is equal to or greater than a threshold value.
- the gesture recognition device may further include a movement determination unit that determines the movement of the imaging sensor based on the movement tendency of a plurality of feature points, and the predetermined period may be set as a period during which the imaging sensor is not moving.
- the second detection unit may detect an area where the gradient of the brightness value of the captured image is less than a threshold based on the calculation result of the brightness value histogram relating to the captured image.
- the second detection unit may use luminance value histograms relating to a plurality of captured images included in a predetermined period to determine whether a value obtained by normalizing the sum of frequencies near the maximum frequency by the total sum of frequencies is equal to or greater than a threshold over the predetermined period.
- the second detection unit may detect a region where the gradient of the brightness value of the captured image is less than a threshold based on the edge image related to the captured image.
- the second detection unit may determine whether the ratio of the edge regions in the edge image is less than a threshold value over a predetermined period using edge images relating to a plurality of captured images included in the predetermined period.
- the first and second detection units may perform processing on a partial area of the captured image instead of the captured image.
- the first and second detection units may perform processing on a grayscale image generated from a captured image with a coarser resolution than the captured image.
- the gesture recognition device may recognize a gesture composed of a combination of a gesture that blocks the front surface of the image sensor and a gesture that opens the front surface of the image sensor.
- the gesture recognition device may further include a photographing sensor that captures a front image.
- a gesture recognition method for recognizing a gesture that shields the front surface of an imaging sensor is also provided, including a step of detecting a change in a captured image between a state where the front surface of the imaging sensor is shielded and a state where it is not shielded, and a step of detecting a region where the gradient of the luminance value of the captured image is less than a threshold in a captured image in a state where the front surface of the imaging sensor is shielded.
- a program for causing a computer to execute a step of detecting a change in a captured image between the shielded and unshielded states and a step of detecting a region where the gradient of the luminance value of the captured image is less than a threshold in the captured image in the shielded state is also provided; the program may be provided using a computer-readable recording medium, or may be provided via a communication means or the like.
- as described above, according to the present invention, it is possible to provide a gesture recognition device, a gesture recognition method, and a program capable of recognizing a gesture that shields the sensor surface of an imaging sensor without using a special device.
- the gesture recognition apparatus 1 can recognize a gesture that blocks the sensor surface of the imaging sensor 3 without using a special device.
- a sensor surface is provided on the front surface of the imaging sensor 3, but the sensor surface may be provided on another surface.
- the gesture recognition device 1 is an information processing device such as a personal computer, a television receiver, a portable information terminal, or a mobile phone.
- a video signal is input to the gesture recognition device 1 from an imaging sensor 3 such as a video camera mounted on or connected to the gesture recognition device 1.
- the gesture recognition device 1 recognizes the gesture based on the video signal input from the image sensor 3.
- examples of the gesture include a shielding gesture that shields the front surface of the image sensor 3 with an object such as a hand, and a flick gesture that moves the object left and right in front of the image sensor 3.
- for example, when the gesture recognition device 1 is applied to a music playback application, the shielding gesture corresponds to a playback stop operation, and the left and right flick gestures correspond to playback forward and back operations, respectively.
- when a gesture is recognized, the gesture recognition device 1 notifies the user U of the gesture recognition result and executes a process corresponding to the recognized gesture.
- the gesture recognition device 1 recognizes the shielding gesture by the following procedure.
- first, a change in the captured image between the captured image Pa in a state where the front surface of the imaging sensor 3 is not shielded (before the gesture) and the captured image Pb in a state where it is shielded (during the gesture) is detected (first detection). Further, a region where the gradient of the luminance value i of the captured image is less than a threshold is detected in the captured image Pb in the shielded state (during the gesture) (second detection).
- when the front surface of the imaging sensor 3 is shielded, the captured image changes greatly from the image Pa capturing the scene in front of the imaging sensor 3 to the image Pb capturing the shielding object, so this change in the captured image is detected.
- since the gradient of the luminance value i is small in the captured image Pb capturing the object that shields the front surface of the imaging sensor 3 at close range, a region where the gradient of the luminance value i is less than the threshold is detected.
- because the change in the captured image and the gradient of the luminance value i are detected, it is not necessary to detect the shape of an object located directly in front of the imaging sensor 3.
- the gesture is recognized based on the captured image of the image sensor 3, it is not necessary to use a special device.
- the gesture recognition apparatus 1 includes a frame image generation unit 11, a grayscale image generation unit 13, a feature point detection unit 15, a feature point processing unit 17, a sensor movement determination unit 19, a histogram calculation unit 21, a histogram processing unit 23, a gesture determination unit 25, a motion region detection unit 27, a motion region processing unit 29, a recognition result notification unit 31, a feature point storage unit 33, a histogram storage unit 35, and a motion region storage unit 37.
- the frame image generation unit 11 generates a frame image based on the video signal input from the imaging sensor 3. Note that the frame image generation unit 11 may be provided in the imaging sensor 3.
- the grayscale image generation unit 13 generates a grayscale image M (generic term for grayscale images) with a coarser resolution than the frame image, based on the frame image supplied from the frame image generation unit 11.
- the grayscale image M is generated, for example, as a monochrome image obtained by compressing the frame image to 1/256 of its resolution.
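As a rough illustration of this preprocessing step, the sketch below converts a frame to grayscale and shrinks it to a coarse resolution. It is only a minimal example assuming OpenCV and NumPy are available; the 1/16-per-axis scale factor (roughly 1/256 of the pixel count) is an assumption about how the "1/256 resolution" figure is to be read, not a value stated as such in the text.

```python
import cv2

def make_coarse_gray(frame, scale=1 / 16):
    """Convert a BGR frame to grayscale and shrink it to a coarse resolution.

    Shrinking each axis by 1/16 reduces the pixel count to roughly 1/256,
    one possible reading of the "1/256 resolution" mentioned in the text.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    coarse = cv2.resize(gray,
                        (max(1, int(w * scale)), max(1, int(h * scale))),
                        interpolation=cv2.INTER_AREA)
    return coarse
```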
- the feature point detector 15 detects a feature point in the gray image M based on the gray image M supplied from the gray image generator 13.
- the feature point in the grayscale image M means a pixel pattern corresponding to a characteristic part such as a corner portion of an object captured by the image sensor 3.
- the feature point detection result is temporarily stored in the feature point storage unit 33 as feature point data.
- the feature point processing unit 17 processes the feature point data for a plurality of grayscale images M included in a determination period corresponding to the immediately preceding several frames to several tens of frames.
- the feature point processing unit 17 tracks feature points in the grayscale image M based on the feature point data read from the feature point storage unit 33. Then, the movement vectors of the feature points are calculated, and the movement vectors are clustered according to the movement direction of the feature points.
- the feature point processing unit 17 also calculates, for a plurality of grayscale images M included within a predetermined period, the ratio of feature points that disappeared during tracking (disappearing feature points) among the feature points tracked in the grayscale images M, and compares the ratio with a predetermined threshold.
- the predetermined period is set as a period shorter than the determination period.
- a disappearing feature point means a feature point that is lost during tracking within the predetermined period and can no longer be tracked.
- the comparison result of the disappearance feature points is supplied to the gesture determination unit 25.
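A minimal sketch of how this disappearance ratio might be computed is shown below. It assumes OpenCV's Lucas-Kanade optical flow for the tracking and uses the 0.8 threshold given later in the text as an example value; the function name and parameters are illustrative, not the patent's implementation.

```python
import cv2
import numpy as np

def disappearance_ratio(gray_seq):
    """Track corners through a sequence of coarse grayscale images and return
    the fraction of initially detected feature points that were lost."""
    first = gray_seq[0]
    pts = cv2.goodFeaturesToTrack(first, maxCorners=200,
                                  qualityLevel=0.01, minDistance=5)
    if pts is None or len(pts) == 0:
        return 1.0  # no trackable structure at all
    alive = np.ones(len(pts), dtype=bool)
    prev, prev_pts = first, pts
    for cur in gray_seq[1:]:
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, cur, prev_pts, None)
        alive &= status.reshape(-1).astype(bool)  # once lost, counted as lost
        prev, prev_pts = cur, nxt
    return 1.0 - alive.mean()

# feature-point side of the shielding condition (threshold value from the text):
# shielded_hint = disappearance_ratio(gray_seq) >= 0.8
```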
- the sensor movement determination unit 19 determines the movement of the imaging sensor 3 (or the gesture recognition device 1 equipped with the imaging sensor 3) based on the clustering result supplied from the feature point processing unit 17.
- the sensor movement determination unit 19 calculates the ratio of movement vectors representing movement in the same direction among the movement vectors of the feature points, and compares the ratio with a predetermined threshold. It then determines that the imaging sensor 3 has moved when the ratio is equal to or greater than the predetermined threshold, and that the imaging sensor 3 has not moved when the ratio is less than the predetermined threshold.
- the determination result of the sensor movement is supplied to the gesture determination unit 25.
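The dominant-direction test could look roughly like the following sketch. Binning the motion vectors into eight directions is an assumption about how "movement in the same direction" is grouped, and the 0.8 ratio is the example value given later in the text.

```python
import numpy as np

def sensor_moved(motion_vectors, ratio_threshold=0.8, bins=8):
    """motion_vectors: array of shape (N, 2) with (dx, dy) per tracked feature point.
    Returns True when one direction bin dominates, suggesting global camera motion."""
    vecs = np.asarray(motion_vectors, dtype=float)
    if len(vecs) == 0:
        return False
    angles = np.arctan2(vecs[:, 1], vecs[:, 0])                  # -pi .. pi
    bin_idx = ((angles + np.pi) / (2 * np.pi) * bins).astype(int) % bins
    counts = np.bincount(bin_idx, minlength=bins)
    return counts.max() / len(vecs) >= ratio_threshold
```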
- the histogram calculation unit 21 calculates a histogram H (generic name of histogram) indicating the frequency distribution of the luminance value i for the pixels constituting the gray image M based on the gray image M supplied from the gray image generation unit 13.
- the calculation result of the histogram H is temporarily stored in the histogram storage unit 35 as histogram data.
- the histogram processing unit 23 calculates the ratio of pixels with a certain luminance value i for a plurality of grayscale images M included within a predetermined period based on the histogram data read from the histogram storage unit 35. Then, the histogram processing unit 23 determines whether the ratio of pixels with a certain luminance value i is greater than or equal to a predetermined threshold over a predetermined period.
- the predetermined period is set as a period shorter than the determination period. The determination result of the pixel ratio is supplied to the gesture determination unit 25.
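A sketch of this histogram-side test follows. It assumes the normalized index r described in the detailed operation below (sum of frequencies within a range w of the peak bin divided by the total frequency), with the half-width w chosen arbitrarily here and the 0.7 threshold taken from the example value in the text.

```python
import numpy as np

def normalized_index(gray, w=8):
    """Compute r: the fraction of pixels whose luminance lies within +/- w
    of the most frequent luminance value in a coarse grayscale image."""
    hist = np.bincount(gray.reshape(-1), minlength=256).astype(float)
    imax = int(hist.argmax())
    lo, hi = max(0, imax - w), min(256, imax + w + 1)
    return hist[lo:hi].sum() / hist.sum()

def flat_over_period(gray_seq, r_threshold=0.7, w=8):
    """True when every image in the period is dominated by nearly uniform
    luminance, i.e. the gradient of the luminance value is small everywhere."""
    return all(normalized_index(g, w) >= r_threshold for g in gray_seq)
```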
- the gesture determination unit 25 is supplied with the comparison result of the lost feature points from the feature point processing unit 17 and is supplied with the determination result of the pixel ratio from the histogram processing unit 23.
- the gesture determination unit 25 determines whether the ratio of disappearing feature points is equal to or greater than a predetermined threshold and whether the ratio of pixels with a constant luminance value i is equal to or greater than a predetermined threshold over a predetermined period; if both determinations are affirmative, a shielding gesture is recognized.
- the result of the shielding determination is supplied to the recognition result notification unit 31.
- the gesture determination unit 25 recognizes the shielding gesture only when the imaging sensor 3 is not moving based on the determination result of the sensor movement supplied from the sensor movement determination unit 19.
- the motion region detection unit 27 detects a motion region based on the frame difference of the gray image M supplied from the gray image generation unit 13.
- the detection result of the motion area is temporarily stored in the motion area storage unit 37 as motion area data.
- the movement area is an area representing an object moving in the grayscale image M.
- the motion region processing unit 29 processes the motion region data of the grayscale image M for a plurality of grayscale images M included in a predetermined period.
- the center of gravity position of the motion region is calculated based on the motion region data read from the motion region storage unit 37, and the movement locus of the motion region in the continuous grayscale image M is calculated.
- the predetermined period is set as a period shorter than the determination period.
- the gesture determination unit 25 described above calculates the movement amount (and, if necessary, the speed) of the motion region based on the movement locus supplied from the motion region processing unit 29. The gesture determination unit 25 then determines whether the movement amount (and, if necessary, the speed) of the motion region satisfies a predetermined criterion. If the determination result is affirmative, a flick gesture is recognized. The result of the flick determination is supplied to the recognition result notification unit 31.
- the recognition result notification unit 31 notifies the user U of the gesture recognition result based on the determination result supplied from the gesture determination unit 25.
- the gesture recognition result is notified, for example, as character information, image information, audio information, or the like through a display, a speaker, or the like connected to the gesture recognition device 1.
- predetermined periods used in the feature point process, the histogram process, and the motion region process may be set as the same period or may be set as periods slightly shifted from each other.
- predetermined threshold values used in the feature point process, the histogram process, and the movement determination process are set for each according to the required detection accuracy.
- the feature point detection unit 15 and the feature point processing unit 17 function as a first detection unit, and the histogram calculation unit 21 and the histogram processing unit 23 function as a second detection unit.
- the feature point storage unit 33, the histogram storage unit 35, and the motion region storage unit 37 are configured as, for example, an internal storage device or an external storage device controlled by a processor or the like.
- the motion region processing unit 29 and the recognition result notification unit 31 are configured as an information processing apparatus with a processor such as a CPU or DSP, for example.
- the functions of the above constituent elements may be at least partially realized as hardware such as a circuit, or may be realized as software such as a program.
- when each component is realized as software, the function of each component is realized through a program executed on the processor.
- the gesture recognition device 1 performs a recognition process for recognizing the shielding gesture and the flick gesture (step S1). Details of the recognition process will be described later.
- when a shielding gesture or a flick gesture is recognized ("Yes" in S3 or S5), the recognition result is notified to the user U (S7) and processing corresponding to the recognized gesture is executed (S8).
- the recognition process is repeated until it is terminated (S9). Note that the recognition result may also be notified when no gesture is recognized.
- next, the recognition process for recognizing the shielding gesture will be described.
- when the recognition process is started, the frame image generation unit 11 generates a frame image based on the video signal input from the imaging sensor 3 (S11).
- the frame image may be generated for each frame, or may be generated at intervals of several frames by thinning out the video signal.
- the gray image generation unit 13 generates a gray image M based on the frame image supplied from the frame image generation unit 11 (S13).
- by performing the detection processes on the grayscale image M, which has a coarser resolution than the frame image, the change in the frame image and the gradient of the luminance value i can be detected efficiently. Further, by using a monochrome image, the change in the frame image and the gradient of the luminance value i can be detected with relatively high accuracy even in an environment with relatively poor contrast.
- the feature point detection unit 15 detects feature points in the gray image M based on the gray image M supplied from the gray image generation unit 13 (S15).
- the feature point detection result is temporarily stored in the feature point storage unit 33 in association with the frame number as feature point data including a pixel pattern of the feature point, a detection position, and the like (S15).
- FIG. 5 shows the detection result of the feature points before the gesture.
- a marker C indicating a plurality of feature points detected from the image is displayed together with a grayscale image M1 including an image capturing the upper body and background of the user U.
- pixel patterns corresponding to characteristic portions of the user U and the background are detected as feature points.
- the histogram calculation unit 21 calculates the histogram H of the luminance value i for the pixels constituting the gray image M based on the gray image M supplied from the gray image generation unit 13 (S17).
- the calculation result of the histogram H is temporarily stored in the histogram storage unit 35 in association with the frame number as histogram data indicating the frequency distribution of the luminance value i.
- the histogram H may be calculated when the grayscale image M is generated (S13).
- FIG. 6 shows the calculation results of the gray image M1 and the luminance value histogram H1 before the gesture.
- the histogram H represents the frequency distribution of the luminance value i with the horizontal axis representing the luminance value i (class value) and the vertical axis representing the frequency hi of the luminance value i.
- the distribution of the luminance value i can be expressed using a normalized index r of the following equation.
- the sum of the frequencies hi is hsum
- the maximum frequency luminance value i is imax
- a predetermined range near the maximum frequency luminance value imax is w.
- the predetermined range w is set according to the required detection accuracy.
- the normalized index r is an index obtained by normalizing the sum of the frequencies hi in the predetermined range w near the luminance value imax having the maximum frequency by the sum total hsum of the frequencies.
- the normalized index r takes a larger value as the grayscale image M is composed of more pixels with a nearly constant luminance value i, that is, as more of the image consists of regions where the gradient of the luminance value i is small.
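The equation referred to above is not reproduced in this text. Based on the definitions of hsum, imax, and w given above, one consistent reconstruction (treating w as a half-width around imax) would be:

$$ r = \frac{\displaystyle\sum_{i = i_{\max} - w}^{\,i_{\max} + w} h_i}{h_{\mathrm{sum}}}, \qquad h_{\mathrm{sum}} = \sum_{i} h_i $$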
- the processing in steps S11 to S17 is performed on a plurality of frame images included in a determination period (0.5 seconds or the like) corresponding to the immediately preceding several frames to several tens of frames; for example, the processing for frame numbers 1 to 10 in the first determination period and the processing for frame numbers 2 to 11 in the second determination period are executed sequentially.
- feature point data and histogram data are temporarily stored so as to correspond to at least the determination period.
- the feature point processing unit 17 first tracks feature points in the plurality of gray images M based on the feature point data read from the feature point storage unit 33 (S19).
- the tracking of the feature points is performed by specifying the same feature points in the continuous grayscale image M based on the pixel pattern.
- the tracking result of the feature point can be expressed as a movement locus of the feature point. Note that feature points that have disappeared from the grayscale image M during tracking of feature points are considered disappearance feature points.
- the movement vector of the feature point is calculated, and the movement vector is clustered according to the movement direction of the feature point (S21).
- the feature point movement vector is represented as a straight line or a curve connecting the movement start point and the movement end point of the feature point tracked in the plurality of grayscale images M included in the determination period.
- the sensor movement determination unit 19 determines the movement of the imaging sensor 3 based on the clustering result supplied from the feature point processing unit 17 (S23). First, the ratio of movement vectors representing movement in the same direction among the movement vectors of the feature points is calculated and compared with a predetermined threshold (a ratio of 0.8, etc.). When the ratio is equal to or greater than the predetermined threshold, it is determined that the imaging sensor 3 has moved; when it is less than the predetermined threshold, it is determined that the imaging sensor 3 has not moved.
- FIG. 7 shows the detection results of feature points when the image sensor 3 moves.
- the gray image M3 shown in FIG. 7 is the gray image M several frames after the gray image M1 shown in FIG.
- the image sensor 3 moves in the lower right direction, so that the feature point in the grayscale image M3 moves in the upper left direction.
- the movement of the feature points is displayed by markers C representing the movement loci of the feature points, together with the grayscale image M3.
- when it is determined that the imaging sensor 3 has moved, the gesture determination unit 25 determines that the imaging sensor 3 is not shielded (S25), in order to prevent the shielding gesture from being erroneously recognized due to erroneous detection of disappearing feature points while the imaging sensor 3 is moving.
- based on the feature point data read out from the feature point storage unit 33, the feature point processing unit 17 calculates, for a plurality of grayscale images M included within a predetermined period, the ratio of disappearing feature points among the feature points tracked in the grayscale images M, and compares it with a predetermined threshold (a ratio of 0.8, etc.) (S27). That is, it is determined whether the ratio of feature points that disappeared within the predetermined period to the feature points detected within the predetermined period (the total of feature points that remained detected over the predetermined period and feature points that disappeared during the predetermined period) is equal to or greater than the predetermined threshold.
- FIG. 8 illustrates a feature point detection result at the time of gesture.
- a grayscale image M2 that captures a hand that shields the front surface of the imaging sensor 3 is displayed.
- the image capturing the upper body and background of the user U is hidden by the shielding of the front surface of the imaging sensor 3, and the markers C indicating the feature points detected from that image disappear.
- the histogram processing unit 23 calculates the ratio of pixels with a certain luminance value i for a plurality of grayscale images M included within a predetermined period based on the histogram data read from the histogram storage unit 35.
- the ratio of pixels with a constant luminance value i can be expressed by the normalized index r described above. It is then determined whether this ratio is equal to or greater than a predetermined threshold (r > 0.7, etc.) over a predetermined period (S29).
- FIG. 9 shows calculation results of the grayscale image M2 and the luminance value histogram H2 at the time of the gesture.
- the grayscale image M2 shown in FIG. 9 captures a hand that shields the front surface of the image sensor 3, and includes many pixels with a certain luminance value i.
- the frequency hi is concentrated in the predetermined range w in the vicinity of the maximum frequency luminance value imax, and there is no large variation in the distribution of the luminance value i.
- a large normalized index r is calculated for the predetermined period. Therefore, it is determined that the ratio of pixels with a certain luminance value i is equal to or greater than a predetermined threshold over a predetermined period, that is, there are many regions over the predetermined period where the gradient of the luminance value i is smaller than the predetermined threshold.
- the gesture determination unit 25 is supplied with the comparison result of the disappearing feature points from the feature point processing unit 17 and with the determination result of the pixel ratio from the histogram processing unit 23. It then determines whether the ratio of disappearing feature points is equal to or greater than a predetermined threshold and the ratio of pixels with a constant luminance value i is equal to or greater than a predetermined threshold over a predetermined period.
- if the determination result is affirmative, it is determined that the imaging sensor 3 is shielded (S31), and the shielding gesture is recognized. If at least one of the conditions is not satisfied, it is determined that the imaging sensor 3 is not shielded (S25), and the shielding gesture is not recognized.
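Putting the pieces above together, the overall shielding decision for one determination period might be sketched as follows. The helper functions are the illustrative ones defined in the earlier sketches, and the threshold values are the examples given in the text.

```python
def shielding_gesture_recognized(gray_seq, motion_vectors):
    """Combine the three checks described above for one determination period:
    the sensor must not be moving (S23/S25), most tracked feature points must
    have disappeared (S27), and the luminance distribution must stay nearly
    flat over the period (S29)."""
    if sensor_moved(motion_vectors):
        return False                                         # camera motion, not a gesture
    lost_enough = disappearance_ratio(gray_seq) >= 0.8       # disappearing-feature condition
    flat_enough = flat_over_period(gray_seq, r_threshold=0.7)  # flat-histogram condition
    return lost_enough and flat_enough                       # S31: shielded
```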
- the recognition result notification unit 31 notifies the user U of the recognition result according to the result of the shielding determination supplied from the gesture determination unit 25. Then, when a shielding gesture is recognized, a corresponding process is executed.
- the motion region detection unit 27 detects a motion region based on the frame difference of the gray image M supplied from the gray image generation unit 13. That is, the motion region is detected by obtaining the change region included in the continuous grayscale image M. The detection result of the motion area is temporarily stored in the motion area storage unit 37 as motion area data.
- the motion region processing unit 29 processes the motion region data of the grayscale image M for a plurality of grayscale images M included in a predetermined period. Based on the motion region data read from the motion region storage unit 37, the center of gravity position of the motion region is calculated, and the movement locus of the motion region in the continuous grayscale image M is calculated.
- the gesture determination unit 25 calculates the movement amount (speed if necessary) of the movement region based on the calculation result of the movement locus supplied from the movement region processing unit 29.
- the gesture determination unit 25 first determines whether the size of the motion region is less than a predetermined threshold, so that movement caused by the movement of the imaging sensor 3 is not recognized as a flick gesture (because the whole image moves when the imaging sensor 3 moves). Next, it determines whether the movement amount of the motion region is equal to or greater than a predetermined threshold, so that motion with a very small movement amount is not recognized as a flick gesture.
- it is also determined whether the moving direction of the motion region is a predetermined direction. For example, when recognizing left/right flick gestures, it is determined whether the moving direction of the motion region can be regarded as the left/right direction with respect to the imaging sensor 3, allowing for a permissible error. If the determination result is affirmative, a flick gesture is recognized. The result of the flick determination is supplied to the recognition result notification unit 31 and notified to the user U, and processing corresponding to the flick gesture is executed according to the recognition result.
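A rough sketch of this flick decision on the centroid trajectory of the motion region is shown below. The three thresholds (maximum region size, minimum travel, allowed deviation from horizontal) are illustrative assumptions, not values from the text.

```python
import numpy as np

def flick_direction(centroids, region_sizes, image_area,
                    max_region_frac=0.5, min_travel=20.0, max_angle_deg=30.0):
    """centroids: list of (x, y) motion-region centroids over the period.
    region_sizes: motion-region pixel counts per frame.
    Returns 'left', 'right', or None."""
    if len(centroids) < 2:
        return None
    if max(region_sizes) / image_area >= max_region_frac:
        return None                      # whole image moving: likely camera motion
    start, end = np.asarray(centroids[0], float), np.asarray(centroids[-1], float)
    dx, dy = end - start
    if np.hypot(dx, dy) < min_travel:
        return None                      # too small a movement to count as a flick
    angle = abs(np.degrees(np.arctan2(dy, dx)))   # 0 .. 180
    if min(angle, 180 - angle) > max_angle_deg:
        return None                      # not close enough to the horizontal direction
    return 'right' if dx > 0 else 'left'
```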
- <Second Embodiment> [4. Configuration and Operation of Gesture Recognition Device 2]
- the gesture recognition device 2 according to the second embodiment recognizes the occlusion gesture using the edge region A in the edge image E (a generic name of the edge image) instead of the histogram H indicating the frequency distribution of the luminance value i.
- description overlapping with the first embodiment is omitted below.
- the gesture recognition device 2 includes an edge region extraction unit 41 and an edge region processing unit 43 instead of the histogram calculation unit 21 and the histogram processing unit 23.
- the edge region extraction unit 41 generates an edge image E based on the grayscale image M supplied from the grayscale image generation unit 13 and extracts the edge region A from the edge image E.
- the edge region A is extracted using, for example, a Sobel filter, a Laplacian filter, a LOG filter, a Canny method, or the like.
- the extraction result of the edge area A is temporarily stored in the edge area storage unit 45 as edge area data.
- the edge region processing unit 43 calculates the ratio of the edge region A in the edge image E for a plurality of edge images E included in a predetermined period, based on the edge region data read from the edge region storage unit 45. The edge region processing unit 43 then determines whether the ratio of the edge region A is less than a predetermined threshold (0.1 or the like) over the predetermined period.
- the predetermined period is set as a period shorter than the determination period, which corresponds to the immediately preceding several frames to several tens of frames.
- the determination result of the edge region A is supplied to the gesture determination unit 25.
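One way to realize this edge-based test is sketched below. It assumes OpenCV's Canny detector (one of the methods listed above) with illustrative hysteresis thresholds, and uses the 0.1 ratio as the example threshold from the text.

```python
import cv2
import numpy as np

def edge_ratio(gray, low=50, high=150):
    """Fraction of pixels classified as edges in a coarse grayscale image."""
    edges = cv2.Canny(gray, low, high)
    return np.count_nonzero(edges) / edges.size

def few_edges_over_period(gray_seq, ratio_threshold=0.1):
    """True when every image in the period contains almost no edge regions,
    i.e. the gradient of the luminance value is below the threshold nearly
    everywhere."""
    return all(edge_ratio(g) < ratio_threshold for g in gray_seq)
```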
- the gesture determination unit 25 is supplied with the comparison result of the lost feature point from the feature point processing unit 17 and is supplied with the determination result of the edge region A from the edge region processing unit 43. Then, the gesture determination unit 25 determines whether the ratio of the disappearing feature points is equal to or greater than a predetermined threshold and the ratio of the edge region A is less than the predetermined threshold over a predetermined period.
- the edge region extraction unit 41 generates an edge image E based on the grayscale image M supplied from the grayscale image generation unit 13, and extracts the edge region A (S41).
- the edge region A is temporarily stored in the edge region storage unit 45 as edge region data indicating the ratio of the edge region A in the edge image E in association with the frame number (S41).
- based on the edge region data read from the edge region storage unit 45, it is determined whether the ratio of the edge region A in the edge image E is less than a predetermined threshold over a predetermined period, for a plurality of edge images E included in the predetermined period (S43).
- FIG. 12 shows a gray image M1 and an edge image E1 before the gesture.
- the edge image E is an image showing an edge region A that forms a boundary between pixels having a large difference in luminance value i among the pixels constituting the grayscale image M.
- since the grayscale image M1 shown in FIG. 12 captures the upper body and background of the user U, it is composed of pixels with various luminance values i. For this reason, in the edge image E1 there are many pixels with differing luminance values i, and many edge regions A forming boundaries between pixels with large differences in luminance value i are recognized.
- FIG. 13 shows a grayscale image M2 and an edge image E2 at the time of gesture.
- since the grayscale image M2 shown in FIG. 13 captures the hand that shields the front surface of the imaging sensor 3, it includes many pixels with a nearly constant luminance value i.
- in the edge image E2, there are few pixels with differing luminance values i, and few edge regions A forming boundaries between pixels with large differences in luminance value i are recognized.
- an edge image E that does not include many edge regions A is generated for a predetermined period. Therefore, it is determined that the ratio of the edge area A is less than the predetermined threshold over a predetermined period, that is, there are many areas where the gradient of the luminance value i is smaller than the predetermined threshold over the predetermined period.
- the gesture determination unit 25 is supplied with the comparison result of the disappearing feature points from the feature point processing unit 17 and with the determination result of the edge region ratio from the edge region processing unit 43. It then determines whether the ratio of the disappearing feature points is equal to or greater than a predetermined threshold and the ratio of the edge region A is less than a predetermined threshold over a predetermined period.
- the determination result is affirmative, it is determined that the image sensor 3 is shielded (S31), and the shielding gesture is recognized.
- the shielding gesture is recognized using the grayscale image M corresponding to the partial area of the captured image instead of the grayscale image M corresponding to the entire captured image.
- description overlapping with the first and second embodiments is omitted below.
- the frame image generation unit 11 or the gray image generation unit 13 generates a frame image or gray image M corresponding to a partial area of the frame image.
- the partial area of the frame image means an area that is shielded by an object such as a hand during the shielding gesture in the front area of the imaging sensor 3.
- the partial area is set in advance as a predetermined range such as the upper part of the captured image.
- the first and second detection processes are performed on the partial region of the frame image (region F in FIG. 14) as in the first and second embodiments.
- that is, for the partial region it is determined whether the ratio of disappearing feature points is equal to or greater than a predetermined threshold, whether the ratio of pixels with a constant luminance value i is equal to or greater than a predetermined threshold over a predetermined period, or whether the edge region ratio is less than a predetermined threshold over the predetermined period.
- FIG. 14 shows the detection result of the feature points at the time of a gesture in which the upper part of the captured image is partially shielded; processing is performed on that region.
- a shielding gesture can thus be recognized by partially shielding a predetermined range. Further, even when the object that shields the imaging sensor 3 has slight shading due to the influence of illumination, lighting, or the like, the shielding gesture can be recognized.
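Restricting the detections to a preset sub-region could be as simple as the crop below. The choice of the upper half as the preset region F is an assumption based on the example of shielding the upper part of the image; the fraction is illustrative.

```python
def upper_partial_region(gray, frac=0.5):
    """Return the preset partial region (here: the upper part) of a coarse
    grayscale image; the first and second detection steps then run on this crop."""
    h = gray.shape[0]
    return gray[: max(1, int(h * frac)), :]

# e.g. shielded = shielding_gesture_recognized(
#          [upper_partial_region(g) for g in gray_seq], motion_vectors)
```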
- as described above, since the change in the captured image (grayscale image M) and the gradient of the luminance value i are detected, it is not necessary to detect the shape of an object located in the immediate vicinity of the front of the imaging sensor 3, and since the gesture is recognized based on the captured image (grayscale image M) of the imaging sensor 3, it is not necessary to use a special device. Therefore, it is possible to recognize a gesture that shields the sensor surface of the imaging sensor 3 without using a special device.
- in the above embodiments, a shielding gesture that shields the front surface of the imaging sensor 3 is recognized, but a gesture that opens the front surface of the imaging sensor 3 may also be recognized.
- a gesture that opens the front surface of the imaging sensor 3 is recognized by determining whether the ratio of newly detected feature points is equal to or greater than a predetermined threshold (a ratio of 0.8, etc.) and whether the ratio of pixels with a constant luminance value i is less than a predetermined threshold (a ratio of 0.2, etc.).
- the gesture recognition device 1 may be applied to, for example, an application that performs a toggle operation such as stopping playback of a moving image or a slide show or switching a menu display on and off, or an application that performs a mode operation such as changing a playback mode.
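A small state machine combining the shield and open detections into such a toggle operation might look as follows. The class, event names, and callback are illustrative assumptions, not part of the described device.

```python
class ShieldOpenToggle:
    """Turn a shield -> open gesture sequence into one toggle action,
    e.g. play/pause in a playback application (an illustrative use)."""

    def __init__(self, on_toggle):
        self.on_toggle = on_toggle
        self.shielded = False

    def update(self, shield_detected, open_detected):
        if not self.shielded and shield_detected:
            self.shielded = True          # front of the sensor has been covered
        elif self.shielded and open_detected:
            self.shielded = False         # front of the sensor opened again
            self.on_toggle()              # shield + open combination completed

# toggle = ShieldOpenToggle(on_toggle=lambda: print("toggle playback"))
```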
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
First, an overview of the gesture recognition device 1 according to an embodiment of the present invention will be described with reference to FIG. 1.
[2. Configuration of the gesture recognition device 1]
Next, the configuration of the gesture recognition device 1 according to the first embodiment will be described with reference to FIG. 2.
Next, the operation of the gesture recognition device 1 according to the first embodiment will be described with reference to FIGS. 3 to 9.
Next, the recognition process for recognizing the shielding gesture will be described.
Next, the recognition process for recognizing the flick gesture will be described.
[4. Configuration and operation of the gesture recognition device 2]
Next, the gesture recognition device 2 according to the second embodiment will be described with reference to FIGS. 10 to 13. The gesture recognition device 2 according to the second embodiment recognizes the shielding gesture using an edge region A in an edge image E (generic term for edge images) instead of the histogram H indicating the frequency distribution of the luminance value i. Description overlapping with the first embodiment is omitted below.
Next, a gesture recognition device according to a modification of the first and second embodiments will be described with reference to FIG. 14. In the modification, the shielding gesture is recognized using a grayscale image M corresponding to a partial region of the captured image instead of the grayscale image M corresponding to the entire captured image. Description overlapping with the first and second embodiments is omitted below.
As described above, according to the gesture recognition devices 1 and 2 and the gesture recognition method of the embodiments of the present invention, the change in the captured image (grayscale image M) and the gradient of the luminance value i are detected, so it is not necessary to detect the shape of an object located immediately in front of the imaging sensor 3, and since the gesture is recognized based on the captured image (grayscale image M) of the imaging sensor 3, no special device is required. Therefore, a gesture that shields the sensor surface of the imaging sensor 3 can be recognized without using a special device.
11 frame image generation unit
13 grayscale image generation unit
15 feature point detection unit
17 feature point processing unit
19 sensor movement determination unit
21 histogram calculation unit
23 histogram processing unit
25 gesture determination unit
27 motion region detection unit
29 motion region processing unit
31 recognition result notification unit
33 feature point storage unit
35 histogram storage unit
37 motion region storage unit
41 edge region extraction unit
43 edge region processing unit
45 edge region storage unit
Pa, Pb captured images
M1, M2, M3, M4 grayscale images
H1, H2 luminance value histograms
E1, E2 edge images
C feature point marker
A edge region
Claims (15)
- A first detection unit that detects a change in a captured image between a state in which the front surface of an imaging sensor is not shielded and a state in which it is shielded; and
a second detection unit that detects, in the captured image in the state in which the front surface of the imaging sensor is shielded, a region where the gradient of the luminance value of the captured image is less than a threshold;
a gesture recognition device comprising these. - The gesture recognition device according to claim 1, wherein the first detection unit detects the change in the captured image based on a result of tracking feature points in the captured image.
- The gesture recognition device according to claim 2, wherein the first detection unit detects that feature points tracked in the captured image in the state in which the front surface of the imaging sensor is not shielded disappear in the captured image in the state in which the front surface of the imaging sensor is covered with a hand.
- The gesture recognition device according to claim 3, wherein the first detection unit determines whether the ratio of feature points lost during tracking, among the feature points tracked in a plurality of the captured images included in a predetermined period, is equal to or greater than a threshold.
- The gesture recognition device according to claim 4, further comprising a movement determination unit that determines movement of the imaging sensor based on a movement tendency of a plurality of the feature points,
wherein the predetermined period is set as a period during which the imaging sensor is not moving. - The gesture recognition device according to claim 1, wherein the second detection unit detects a region where the gradient of the luminance value of the captured image is less than a threshold based on a calculation result of a luminance value histogram of the captured image.
- The gesture recognition device according to claim 6, wherein the second detection unit uses the luminance value histograms of a plurality of the captured images included in the predetermined period to determine whether a value obtained by normalizing the sum of frequencies near the maximum frequency by the total sum of frequencies is equal to or greater than a threshold over the predetermined period.
- The gesture recognition device according to claim 1, wherein the second detection unit detects a region where the gradient of the luminance value of the captured image is less than a threshold based on an edge image of the captured image.
- The gesture recognition device according to claim 8, wherein the second detection unit uses the edge images of a plurality of the captured images included in the predetermined period to determine whether the ratio of edge regions in the edge images is less than a threshold over the predetermined period.
- The gesture recognition device according to claim 1, wherein the first and second detection units perform processing on a partial region of the captured image instead of the entire captured image.
- The gesture recognition device according to claim 1, wherein the first and second detection units perform processing on a grayscale image generated from the captured image at a resolution coarser than the captured image.
- The gesture recognition device according to claim 1, which recognizes a gesture composed of a combination of a gesture that shields the front surface of the imaging sensor and a gesture that opens the front surface of the imaging sensor.
- The gesture recognition device according to claim 1, further comprising the imaging sensor, which captures an image of the scene in front.
- A step of detecting a change in a captured image between a state in which the front surface of an imaging sensor is shielded and a state in which it is not shielded; and
a step of detecting, in the captured image in the state in which the front surface of the imaging sensor is shielded, a region where the gradient of the luminance value of the captured image is less than a threshold;
a gesture recognition method including these steps. - A program for causing a computer to execute: a step of detecting a change in a captured image between a state in which the front surface of an imaging sensor is shielded and a state in which it is not shielded; and
a step of detecting, in the captured image in the state in which the front surface of the imaging sensor is shielded, a region where the gradient of the luminance value of the captured image is less than a threshold.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112012031335A BR112012031335A2 (pt) | 2010-06-15 | 2011-03-30 | dispositivo e método de reconhecimento de gesto, e, programa |
US13/702,448 US20130088426A1 (en) | 2010-06-15 | 2011-03-30 | Gesture recognition device, gesture recognition method, and program |
RU2012152935/08A RU2012152935A (ru) | 2010-06-15 | 2011-03-30 | Устройство распознавания жестов, способ распознавания жестов и программа |
CN2011800283448A CN102939617A (zh) | 2010-06-15 | 2011-03-30 | 姿势识别装置、姿势识别方法和程序 |
EP11795450.3A EP2584531A1 (en) | 2010-06-15 | 2011-03-30 | Gesture recognition device, gesture recognition method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010136399A JP5685837B2 (ja) | 2010-06-15 | 2010-06-15 | ジェスチャ認識装置、ジェスチャ認識方法およびプログラム |
JP2010-136399 | 2010-06-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011158542A1 true WO2011158542A1 (ja) | 2011-12-22 |
Family
ID=45347954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/057944 WO2011158542A1 (ja) | 2010-06-15 | 2011-03-30 | ジェスチャ認識装置、ジェスチャ認識方法およびプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20130088426A1 (ja) |
EP (1) | EP2584531A1 (ja) |
JP (1) | JP5685837B2 (ja) |
CN (1) | CN102939617A (ja) |
BR (1) | BR112012031335A2 (ja) |
RU (1) | RU2012152935A (ja) |
WO (1) | WO2011158542A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013235449A (ja) * | 2012-05-09 | 2013-11-21 | Kddi Corp | 情報管理装置、情報管理方法、及びプログラム |
US20140192245A1 (en) * | 2013-01-07 | 2014-07-10 | Samsung Electronics Co., Ltd | Method and mobile terminal for implementing preview control |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110181587A1 (en) * | 2010-01-22 | 2011-07-28 | Sony Corporation | Image display device having imaging device |
US20130211843A1 (en) * | 2012-02-13 | 2013-08-15 | Qualcomm Incorporated | Engagement-dependent gesture recognition |
US9600744B2 (en) * | 2012-04-24 | 2017-03-21 | Stmicroelectronics S.R.L. | Adaptive interest rate control for visual search |
US9791921B2 (en) * | 2013-02-19 | 2017-10-17 | Microsoft Technology Licensing, Llc | Context-aware augmented reality object commands |
JP5928386B2 (ja) * | 2013-03-22 | 2016-06-01 | カシオ計算機株式会社 | 表示制御装置、表示制御方法及びプログラム |
US9600993B2 (en) * | 2014-01-27 | 2017-03-21 | Atlas5D, Inc. | Method and system for behavior detection |
US9536136B2 (en) * | 2015-03-24 | 2017-01-03 | Intel Corporation | Multi-layer skin detection and fused hand pose matching |
JP7083809B2 (ja) | 2016-08-02 | 2022-06-13 | アトラス5ディー, インコーポレイテッド | プライバシーの保護を伴う人物の識別しおよび/または痛み、疲労、気分、および意図の識別および定量化のためのシステムおよび方法 |
DE112017004394T5 (de) * | 2016-09-01 | 2019-05-16 | Mitsubishi Electric Corporation | Gestenbeurteilungseinrichtung, Gestenbedienungseinrichtung und Gestenbeurteilungsverfahren |
US20200042105A1 (en) * | 2017-04-27 | 2020-02-06 | Sony Corporation | Information processing apparatus, information processing method, and recording medium |
CN107479712B (zh) * | 2017-08-18 | 2020-08-04 | 北京小米移动软件有限公司 | 基于头戴式显示设备的信息处理方法及装置 |
CN108288276B (zh) * | 2017-12-29 | 2021-10-19 | 安徽慧视金瞳科技有限公司 | 一种投影交互系统中触摸模式下的干扰滤除方法 |
CN109409236B (zh) * | 2018-09-28 | 2020-12-08 | 江苏理工学院 | 三维静态手势识别方法和装置 |
US20230252821A1 (en) * | 2021-01-26 | 2023-08-10 | Boe Technology Group Co., Ltd. | Control Method, Electronic Device, and Storage Medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07146749A (ja) | 1993-11-25 | 1995-06-06 | Casio Comput Co Ltd | スイッチ装置 |
JP2006302199A (ja) * | 2005-04-25 | 2006-11-02 | Hitachi Ltd | 部分的にウィンドウをロックする情報処理装置およびこの情報処理装置を動作させるプログラム |
JP2009530726A (ja) * | 2006-03-22 | 2009-08-27 | フオルクスワーゲン・アクチエンゲゼルシヤフト | 対話型操作装置および対話型操作装置を動作させるための方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6625216B1 (en) * | 1999-01-27 | 2003-09-23 | Matsushita Electic Industrial Co., Ltd. | Motion estimation using orthogonal transform-domain block matching |
US6346933B1 (en) * | 1999-09-21 | 2002-02-12 | Seiko Epson Corporation | Interactive display presentation system |
KR100444784B1 (ko) * | 2001-11-15 | 2004-08-21 | 주식회사 에이로직스 | 에지검출을 통한 경보발생방법 및 보안 시스템 |
DE602004006190T8 (de) * | 2003-03-31 | 2008-04-10 | Honda Motor Co., Ltd. | Vorrichtung, Verfahren und Programm zur Gestenerkennung |
US9317124B2 (en) * | 2006-09-28 | 2016-04-19 | Nokia Technologies Oy | Command input by hand gestures captured from camera |
JP4967666B2 (ja) * | 2007-01-10 | 2012-07-04 | オムロン株式会社 | 画像処理装置および方法、並びに、プログラム |
JP4898531B2 (ja) * | 2007-04-12 | 2012-03-14 | キヤノン株式会社 | 画像処理装置及びその制御方法、並びにコンピュータプログラム |
US20090174674A1 (en) * | 2008-01-09 | 2009-07-09 | Qualcomm Incorporated | Apparatus and methods for a touch user interface using an image sensor |
US8184196B2 (en) * | 2008-08-05 | 2012-05-22 | Qualcomm Incorporated | System and method to generate depth data using edge detection |
-
2010
- 2010-06-15 JP JP2010136399A patent/JP5685837B2/ja not_active Expired - Fee Related
-
2011
- 2011-03-30 BR BR112012031335A patent/BR112012031335A2/pt not_active IP Right Cessation
- 2011-03-30 EP EP11795450.3A patent/EP2584531A1/en not_active Withdrawn
- 2011-03-30 CN CN2011800283448A patent/CN102939617A/zh active Pending
- 2011-03-30 RU RU2012152935/08A patent/RU2012152935A/ru not_active Application Discontinuation
- 2011-03-30 WO PCT/JP2011/057944 patent/WO2011158542A1/ja active Application Filing
- 2011-03-30 US US13/702,448 patent/US20130088426A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07146749A (ja) | 1993-11-25 | 1995-06-06 | Casio Comput Co Ltd | スイッチ装置 |
JP2006302199A (ja) * | 2005-04-25 | 2006-11-02 | Hitachi Ltd | 部分的にウィンドウをロックする情報処理装置およびこの情報処理装置を動作させるプログラム |
JP2009530726A (ja) * | 2006-03-22 | 2009-08-27 | フオルクスワーゲン・アクチエンゲゼルシヤフト | 対話型操作装置および対話型操作装置を動作させるための方法 |
Non-Patent Citations (1)
Title |
---|
IKE ET AL.: "Hand Gesture User Interface Using Cell Broadband EngineTM", TOSHIBA REVIEW, vol. 62, no. 6, 2007, pages 52 - 55 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013235449A (ja) * | 2012-05-09 | 2013-11-21 | Kddi Corp | 情報管理装置、情報管理方法、及びプログラム |
US20140192245A1 (en) * | 2013-01-07 | 2014-07-10 | Samsung Electronics Co., Ltd | Method and mobile terminal for implementing preview control |
US9635267B2 (en) * | 2013-01-07 | 2017-04-25 | Samsung Electronics Co., Ltd. | Method and mobile terminal for implementing preview control |
Also Published As
Publication number | Publication date |
---|---|
JP2012003414A (ja) | 2012-01-05 |
JP5685837B2 (ja) | 2015-03-18 |
RU2012152935A (ru) | 2014-06-20 |
BR112012031335A2 (pt) | 2016-10-25 |
EP2584531A1 (en) | 2013-04-24 |
US20130088426A1 (en) | 2013-04-11 |
CN102939617A (zh) | 2013-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5685837B2 (ja) | ジェスチャ認識装置、ジェスチャ認識方法およびプログラム | |
US11138709B2 (en) | Image fusion processing module | |
US8498444B2 (en) | Blob representation in video processing | |
JP5445460B2 (ja) | なりすまし検知システム、なりすまし検知方法及びなりすまし検知プログラム | |
KR101173802B1 (ko) | 대상물 추적 장치, 대상물 추적 방법, 및 제어 프로그램이 기록된 기록 매체 | |
US8379987B2 (en) | Method, apparatus and computer program product for providing hand segmentation for gesture analysis | |
US9076257B2 (en) | Rendering augmented reality based on foreground object | |
JP5935308B2 (ja) | 利用者検知装置、方法及びプログラム | |
US10853927B2 (en) | Image fusion architecture | |
US9020210B2 (en) | Image processing system, image processing apparatus, image processing method, and program | |
US8254630B2 (en) | Subject extracting method and device by eliminating a background region using binary masks | |
JP5878924B2 (ja) | 画像処理装置、撮像装置および画像処理方法 | |
EP2079009A1 (en) | Apparatus and methods for a touch user interface using an image sensor | |
WO2007105768A1 (ja) | 顔画像登録装置、顔画像登録方法、顔画像登録プログラム、および記録媒体 | |
WO2011161579A1 (en) | Method, apparatus and computer program product for providing object tracking using template switching and feature adaptation | |
CN108776822B (zh) | 目标区域检测方法、装置、终端及存储介质 | |
JP6551226B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
WO2019134505A1 (zh) | 图像虚化方法、存储介质及电子设备 | |
JP4848521B2 (ja) | プロジェクタ投影映像のマウス代替操作方法およびそのマウス代替操作システム | |
CN113194253A (zh) | 去除图像反光的拍摄方法、装置和电子设备 | |
JP2010271876A (ja) | 移動体追跡装置、移動体追跡方法及びプログラム | |
CN112532884A (zh) | 识别方法、装置及电子设备 | |
CN109040604B (zh) | 拍摄图像的处理方法、装置、存储介质及移动终端 | |
CN114693702B (zh) | 图像处理方法、装置、电子设备及存储介质 | |
JP2006190106A (ja) | パターン検出プログラムおよびパターン検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180028344.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11795450 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011795450 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13702448 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2012152935 Country of ref document: RU Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10256/CHENP/2012 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012031335 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012031335 Country of ref document: BR Kind code of ref document: A2 Effective date: 20121207 |