WO2012144145A1 - Pointing control device, integrated circuit thereof, and pointing control method - Google Patents
Pointing control device, integrated circuit thereof, and pointing control method
- Publication number
- WO2012144145A1 (PCT/JP2012/002376)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- instructor
- face
- cursor
- fingertip
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
Definitions
- the present invention relates to a pointing control technique in which an operator recognizes an instruction operation indicating a position on a display screen.
- Patent Document 1, Patent Document 2, and Patent Document 4 disclose techniques that include a photographing unit such as a camera, detect the position of an instructor's fingertip from a plurality of images obtained by the photographing unit, and perform pointing.
- The techniques of Patent Document 1, Patent Document 2, and Patent Document 4 perform pointing with a single cursor, and are premised on the number of fingertips included in the image obtained by the photographing unit being the same as the number of cursors.
- the number of cursors used for pointing is not limited to one, but may be two or more.
- The fingertip that controls the cursor must be determined from among the photographed fingertips, so when more fingertips than cursors are photographed, there is a problem that the fingertip cannot be determined.
- An object of the present invention is to provide a pointing control technique in which, even when a plurality of operators are photographed, an instructor is determined from among the operators and only the determined instructor can perform a pointing operation.
- a pointing control device that determines an instructor who can specify a cursor position from among operators who perform an instruction operation.
- a position acquisition unit that acquires the position of each operator's face and the position of each operator's fingertip or its substitute member
- a distance calculation unit that associates each acquired face position with a fingertip (or substitute member) position and calculates the distance between the associated face position and fingertip (or substitute member) position, and an instructor determination unit that determines the instructor using the calculated distances
- a cursor position calculation unit that calculates a cursor position using the position of the instructor's fingertip or its substitute member and outputs the calculated cursor position.
- With this configuration, the distance between the face position and the fingertip (or substitute member) position is used to identify, from among a plurality of operators, the instructor who designates the position of the pointing target, and only the instructor can perform the pointing operation without being influenced by the other operators.
- FIG. 1 is a block diagram showing a configuration of a display control device including a pointing control device according to Embodiment 1 of the present invention.
- FIG. 2 is an image diagram showing the external appearance of the input/output terminal connected to the display control device in Embodiment 1 of the present invention.
- FIG. 3 is a flowchart showing the operation of the display control device in Embodiment 1 of the present invention.
- FIG. 4 is a diagram showing the positional relationship between the detected fingertip and the camera in Embodiment 1 of the present invention.
- FIG. 7 is a diagram showing an example of the camera viewing-angle frames, face positions, and fingertip positions seen from in front of the camera in Embodiment 1 of the present invention, and FIG. 8 is a flowchart showing an example of the operation of the face-to-finger distance calculation unit.
- the inventors have noted that the distance between the fingertip and the face is different between when the instruction operation is performed and when the instruction operation is not performed.
- A person who performs an instruction operation extends his hand and brings his fingertip closer to the screen, so the distance between the face and the fingertip is large and often fluctuates.
- A person who is not performing an instruction operation generally moves the hand little and hardly stretches the elbow, so the distance between the face and the fingertip is small and often does not fluctuate. Therefore, by using the distance between the face and the fingertip, it is possible to select only the person performing an instruction operation from among a plurality of persons.
- the distance between the face and the fingertip changes greatly before and after starting the instruction operation and before and after the end. Therefore, the distance between the face and the fingertip can be used to detect that the operator has started the instruction operation and that the operator has stopped the instruction operation. For this reason, when the instructor stops the instruction operation, the instructor can be excluded from selection targets, and when another operator starts the instruction operation, the operator can be selected.
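- The observation above can be illustrated with a minimal sketch, assuming a single arm-extension threshold on the face-to-finger distance; the threshold value and function name are illustrative assumptions, not taken from this document:

```python
# Illustrative only: an assumed threshold on the face-to-finger distance
# separating "pointing" from "not pointing". The value is hypothetical.
POINTING_THRESHOLD_M = 0.4

def is_pointing(face_to_finger_distance_m: float) -> bool:
    """A large face-to-finger distance suggests an extended arm (pointing)."""
    return face_to_finger_distance_m > POINTING_THRESHOLD_M
```

Crossing the threshold in either direction would then mark the start or end of an instruction operation.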
- FIG. 1 is a block diagram illustrating a configuration of a display control apparatus 100 including a pointing control apparatus according to an embodiment of the present invention.
- The display control device 100 is connected to an input/output terminal 200 and to a personal computer (not shown) that outputs a display image. A case will be described in which a plurality of operators perform pointing operations while viewing the screen displayed on one display device.
- the display control device 100 includes a pointing control device 101, a display image input unit 102, a cursor composition unit 103, and an image output unit 104.
- the display control device 100 is realized by, for example, a main device of a TV conference system, and the pointing control device 101, the display image input unit 102, the cursor composition unit 103, and the image output unit 104 are realized as part of an integrated circuit.
- the input / output terminal 200 includes a camera 201 and a display device 202.
- the display device 202 is realized by a large liquid crystal display, for example.
- the camera 201 captures an image of an operator who performs a pointing operation for designating a position on the screen of the display device, and outputs the image to the display control device 100.
- The camera image output from the camera 201 includes not only an image of the subject but also information from which the distance to the subject can be acquired.
- the camera 201 includes a pair of left and right cameras 201a and 201b, and outputs two captured images to the display control apparatus 100 as stereo images.
- the display control apparatus 100 calculates the distance from the camera to the subject using the parallax between the left image and the right image in the stereo images captured by the cameras 201a and 201b.
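- The parallax-to-distance step above can be sketched as follows, assuming a calibrated, rectified stereo pair; the focal length and baseline values are illustrative assumptions, not values from this document:

```python
def depth_from_disparity(x_left_px: float, x_right_px: float,
                         focal_px: float = 700.0,
                         baseline_m: float = 0.10) -> float:
    """Distance Z from the cameras to a point seen in both images.

    Z = f * B / d, where d is the horizontal parallax (disparity) between
    the left- and right-image positions of the same point.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must lie in front of the cameras")
    return focal_px * baseline_m / disparity
```

A larger disparity means the point (for example, a fingertip) is closer to the cameras.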
- The pointing control device 101 receives an image from the camera 201, and acquires the distance between the face and the fingertip of each operator photographed as a subject. When a plurality of operators are photographed, an instructor who designates the position of the pointing target is determined using the acquired plurality of face-to-finger distances. The pointing control device 101 calculates the position of the cursor to be displayed from the determined instructor's information, and outputs it as cursor position information.
- The pointing control device 101 is realized by a CPU (not shown) of the display control device 100 and a storage unit (not shown).
- the storage unit is realized by a nonvolatile memory such as an EEPROM, for example, and the CPU reads and executes a program stored in the storage unit.
- the display image input unit 102 has, for example, an HDMI (High-Definition Multimedia Interface) input terminal, and acquires information to be displayed on the screen as a display image from the personal computer through the input terminal.
- The cursor composition unit 103 receives the cursor position information from the pointing control device 101, performs composition to superimpose the cursor on the image acquired by the display image input unit 102, and generates a composite display image. Until the cursor composition unit 103 receives cursor position information from the pointing control device 101, it composes the cursor using a predetermined cursor position as the cursor position information.
- the predetermined cursor position is the upper left corner of the display image.
- the image output unit 104 receives the composite display image generated by the cursor composition unit 103 and outputs it to the display device 202.
- the display device 202 displays a composite display image on the screen.
- the pointing control device 101 includes a position acquisition unit 111, a face-to-finger distance calculation unit 112, an instructor determination unit 113, and a cursor position calculation unit 114.
- The position acquisition unit 111 receives an image from the camera 201, detects the position of the face and the position of the fingertip of each operator photographed as a subject, and outputs the detected position information to the face-to-finger distance calculation unit 112.
- the position acquisition unit 111 includes an image input unit 121, a face detection unit 122, and a fingertip detection unit 123.
- the image input unit 121 receives an image from the camera 201 and outputs it to the face detection unit 122 and the fingertip detection unit 123.
- the face detection unit 122 detects the face of the operator photographed as a subject from the camera image received from the image input unit 121 and outputs position information. When a person's face is shown in the camera image, the face detection unit 122 detects all of the person's faces as the operator's face. The face detection unit 122 outputs the detected face position information to the face-to-finger distance calculation unit 112.
- The position information consists of three pieces of information: the distance from the camera, the horizontal position, and the vertical position.
- the fingertip detection unit 123 detects the fingertip of the operator photographed as the subject from the camera image received from the image input unit 121 and outputs position information. When the fingertip of the person is shown in the camera image, the fingertip detection unit 123 detects all of the fingertips shown as the operator's fingertips. The fingertip detection unit 123 outputs the detected fingertip position information to the face-to-finger distance calculation unit 112.
- The position information consists of three pieces of information: the distance from the camera, the horizontal position, and the vertical position.
- The face-to-finger distance calculation unit 112 associates each operator's face with that operator's fingertip from the face positions and fingertip positions output by the position acquisition unit 111, and calculates the distance between the associated face and fingertip (the face-to-finger distance).
- the position acquisition unit 111 outputs a plurality of pieces of face position information as described above.
- The face-to-finger distance calculation unit 112 determines one corresponding fingertip for each of the plurality of face positions output by the position acquisition unit 111, and calculates the face-to-finger distance for each of the plurality of operators.
- The instructor determination unit 113 uses the face-to-finger distances calculated by the face-to-finger distance calculation unit 112 to determine the instructor, who has the authority to move the cursor and specifies the position of the pointing target.
- the instructor determination unit 113 outputs the position information of the face and the fingertip, which is information of the determined instructor, and the distance between the face and finger of the instructor.
- the instructor is the only operator who has the authority to move the cursor.
- When a plurality of operators are captured as subjects, the instructor determination unit 113 determines the operator having the largest face-to-finger distance as the instructor.
- When only one operator is photographed, the instructor determination unit 113 determines the operator whose face-to-finger distance has been acquired as the instructor. Further, the instructor determination unit 113 determines that there is no instructor when the face-to-finger distance calculation unit 112 has not calculated any face-to-finger distance.
- The cursor position calculation unit 114 calculates the position of the cursor to be displayed from the position information of the face and fingertip of the instructor received from the instructor determination unit 113 and the instructor's face-to-finger distance, and outputs it as cursor position information.
- The cursor position calculation unit 114 holds the position of the currently displayed cursor and the position information of the face and fingertip of the instructor who designated that cursor position. From the newly received position information of the instructor's face and fingertip, it determines whether the instructor has changed. When the instructor has changed, the cursor position calculation unit 114 outputs the initial cursor position as the current cursor position information.
- the initial cursor position is the center position of the image.
- the cursor position calculation unit 114 calculates the cursor position by a calculation method to be described later and outputs it as current cursor position information.
- FIG. 2 is an image diagram showing an appearance of the input / output terminal 200 connected to the display control device 100 in the present embodiment.
- the input / output terminal 200 includes one display device 202 for displaying a screen and a set of cameras 201 for photographing.
- the camera 201 is a stereo camera that includes a pair of left and right cameras 201a and 201b, and outputs a stereo image.
- the camera 201 is installed on the upper part of the display device 202.
- Since the camera 201 and the display device 202 are installed close to each other and facing the same direction, when an operator performs a pointing operation while looking at the screen displayed on the display device 202, the operator's face and fingertip are captured by the camera 201.
- FIG. 3 is a flowchart showing the operation of the display control apparatus 100 including the pointing control apparatus 101 in the present embodiment.
- the display control apparatus 100 acquires an image of the camera 201 (S10).
- the position acquisition unit 111 activates the face detection unit 122 and the fingertip detection unit 123.
- the face detection unit 122 analyzes the image acquired by the image input unit 121, detects all the faces of the person being shown, and outputs the respective position information to the face-to-finger distance calculation unit 112 (S20).
- the fingertip detection unit 123 analyzes the image acquired by the image input unit 121, detects all the fingertips of the person being shown, and outputs the respective position information to the face-to-finger distance calculation unit 112 (S30).
- a method for acquiring the position information of the face and fingertip will be described.
- feature points closest to the camera 201 are extracted from the detected area.
- When detecting a face, generally the outer frame (outline) of the face is detected, and the point within the outer frame of the face that is closest to the camera 201 is used as the feature point.
- Since this feature point is usually the tip of the nose, the tip of the nose will hereinafter be described as the position of the face.
- When detecting a fingertip, generally a contour line is detected, and the point closest to the camera is used as the feature point.
- Since this feature point is usually the tip of the index finger, the tip of the index finger will hereinafter be described as the position of the fingertip.
- FIG. 4 shows the positional relationship between the camera 201a and the fingertip.
- 301 is the position of the fingertip.
- the viewing angle frame 300 is a range of the viewing angle of the camera 201 a at a distance from the camera 201 a to the fingertip 301.
- the parallax of the fingertip 301 is detected from the image of the camera 201a and the image of the camera 201b, and the distance Z from the camera to the fingertip is calculated.
- a plane having a normal vector from the fingertip 301 toward the camera 201a is derived, and the camera viewing angle is projected onto the plane to obtain the viewing angle frame 300.
- The size of the viewing angle frame 300, that is, its length in the vertical direction and its length in the horizontal direction, is determined by the angle of view of the camera and the distance Z.
- the position (X, Y, Z) of the fingertip is acquired using the horizontal position X and the vertical position Y within the viewing angle frame 300. That is, X and Y are relative values depending on the size of the viewing angle frame 300.
- For example, the lower left corner of the viewing angle frame 300 is the origin (0, 0), and the upper right corner is (200, 150).
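- The relative (X, Y) coordinates within the viewing angle frame can be sketched as a mapping from image pixels; the image resolution used below is an assumed value, not one from this document:

```python
def pixel_to_frame(px: float, py: float,
                   img_w: int = 640, img_h: int = 480) -> tuple:
    """Map a pixel (top-left image origin) to viewing-angle-frame
    coordinates with origin (0, 0) at the lower left and (200, 150)
    at the upper right, as in the example above."""
    x = px / img_w * 200.0
    y = (img_h - py) / img_h * 150.0  # flip the vertical axis
    return (x, y)
```

Because the frame is projected at the subject's distance Z, the same pixel always maps to the same relative (X, Y) regardless of Z, which matches the description of X and Y as relative values.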
- the face-to-finger distance calculating unit 112 associates the face with the fingertip, and calculates the face-to-finger distance between the associated face and the fingertip (S40).
- FIG. 5 is a relational diagram showing the positional relationship between the camera 201a and the fingertip and face of the subject from the horizontal direction.
- Reference numeral 311 denotes a fingertip position
- reference numeral 321 denotes a face position
- 310 is a frame for the camera viewing angle at the fingertip position 311
- 320 is a frame for the camera viewing angle at the face position 321.
- the two camera viewing angle frames 310 and 320 are represented as straight lines when viewed from the side.
- FIG. 6 is a relational diagram showing the positional relationship between the camera 201a and the fingertips and faces of a plurality of subjects from the lateral direction.
- Reference numeral 331 denotes the fingertip position of the subject a
- 341 denotes the face position of the subject a
- 351 denotes the fingertip position of the subject b
- 361 denotes the face position of the subject b.
- Reference numerals 330, 340, 350, and 360 denote camera viewing angle frames at the fingertip position 331 of the subject a, the face position 341 of the subject a, the fingertip position 351 of the subject b, and the face position 361 of the subject b, respectively.
- FIG. 7 is a view of the positional relationship of FIG. 6 viewed from a point on the optical axis of the camera 201a.
- FIG. 8 is a flowchart of the operation of the face-to-finger distance calculation unit 112.
- The operation of the face-to-finger distance calculation unit 112 associating the operator's face with the fingertip and acquiring the face-to-finger distance from the face position and the fingertip position will be described with reference to FIG. 8.
- The face-to-finger distance calculation unit 112 matches faces and fingertips so that both “Condition 1: the fingertip position is closer to the camera than the face position” and “Condition 2: the associated face position and fingertip position are the closest to each other” are satisfied.
- the face-to-finger distance calculation unit 112 selects one face position that has not yet been selected from the face positions acquired from the position acquisition unit 111 and is closest to the camera 201a (S410).
- Next, the face-to-finger distance calculation unit 112 selects one fingertip position that is not associated with any face from the fingertip positions acquired from the position acquisition unit 111 (S420).
- the face-to-finger distance calculating unit 112 combines the selected face position and the selected fingertip position, and determines which is closer to the camera 201a (S430).
- When the fingertip position is closer to the camera 201a (S430), the distance between the combined face and fingertip is acquired (S440).
- When the face-to-finger distance calculation unit 112 does not yet hold a face-to-finger distance for the selected face (S450), it holds the acquired face-to-finger distance and associates the selected fingertip with the selected face (S460).
- When a face-to-finger distance is already held for the selected face, the face-to-finger distance calculation unit 112 compares the held face-to-finger distance with the newly acquired one (S450); if the new face-to-finger distance is smaller, it deletes the association and face-to-finger distance held for the selected face, holds the newly acquired face-to-finger distance, and associates the selected fingertip with the selected face (S460).
- When the face position is closer to the camera 201a, the face-to-finger distance calculation unit 112 determines that this combination does not satisfy the matching conditions.
- When there is no unselected face, the face-to-finger distance calculation unit 112 ends the association process.
- the face-to-finger distance calculation unit 112 sequentially associates the fingertip position at the most appropriate position with the selected one face position when a plurality of operators are captured as subjects. Thereby, the face-to-finger distance calculation unit 112 can associate the face with the fingertip and obtain the face-to-finger distance for the operator shown as the subject.
- The face-to-finger distance calculation unit 112 finds the correspondence between the detected face positions 341 and 361 and the fingertip positions 331 and 351 by combining each face position, in order, with the fingertip at the most appropriate position.
- the face-to-finger distance calculation unit 112 selects a face position 341 close to the camera among the face positions (S410).
- the face-to-finger distance calculating unit 112 selects the fingertip position 331 (S420).
- The face-to-finger distance calculation unit 112 determines which of the face position 341 and the fingertip position 331 is closer to the camera 201a (S430). Since the fingertip position 331 is closer to the camera, the distance between the face position 341 and the fingertip position 331 is acquired (S440). Since no fingertip is yet associated with the face position 341 (S450), the face-to-finger distance calculation unit 112 associates the face position 341 with the fingertip position 331 and holds the face-to-finger distance (S460).
- Next, the face-to-finger distance calculation unit 112 selects the fingertip position 351 (S420).
- The face-to-finger distance calculation unit 112 determines which of the face position 341 and the fingertip position 351 is closer to the camera 201a (S430). Since the face position 341 is closer to the camera, this combination is excluded. The unit then determines whether there is a fingertip that has not been selected (S470), and since there is none, determines whether there is an unselected face (S480).
- The face-to-finger distance calculation unit 112 selects the unselected face position 361 (S410).
- the face-to-finger distance calculation unit 112 selects the fingertip position 351 (S420).
- The face-to-finger distance calculation unit 112 determines which of the face position 361 and the fingertip position 351 is closer to the camera 201a (S430). Since the fingertip position 351 is closer to the camera, the distance between the face position 361 and the fingertip position 351 is acquired (S440). Since no fingertip is yet associated with the face position 361 (S450), the face-to-finger distance calculation unit 112 associates the face position 361 with the fingertip position 351 and holds the face-to-finger distance (S460).
- As a result, the face position 341 is associated with the fingertip position 331 and their face-to-finger distance is calculated, and the face position 361 is associated with the fingertip position 351 and their face-to-finger distance is calculated.
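- The association procedure of S410 to S480 can be condensed into the following sketch; positions are assumed to be (X, Y, Z) tuples with Z the distance from the camera, and all names are illustrative:

```python
import math

def associate(faces, fingertips):
    """Pair each face with at most one fingertip (sketch of S410-S480).

    faces, fingertips: lists of (X, Y, Z) tuples, Z = distance from camera.
    Returns a list of (face, fingertip, face_to_finger_distance).
    """
    pairs = []
    free = list(fingertips)                        # fingertips not yet associated
    for face in sorted(faces, key=lambda p: p[2]):  # face nearest the camera first (S410)
        best = None
        for tip in free:                            # S420
            if tip[2] >= face[2]:                   # Condition 1 violated: face is closer (S430)
                continue
            d = math.dist(face, tip)                # face-to-finger distance (S440)
            if best is None or d < best[1]:         # Condition 2: keep the smallest (S450/S460)
                best = (tip, d)
        if best is not None:
            free.remove(best[0])
            pairs.append((face, best[0], best[1]))
    return pairs
```

Applied to the example above, each face ends up paired with the nearest fingertip that lies in front of it.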
- Next, the instructor determination unit 113 determines the instructor from the face-to-finger distances (S50).
- FIG. 9 is a flowchart showing the operation of the instructor determination unit 113. The operation in which the instructor determination unit 113 determines the instructor will be described with reference to FIG. 9.
- The instructor determination unit 113 selects one pair from the face-fingertip correspondences (hereinafter referred to as pairs) created by the face-to-finger distance calculation unit 112.
- the selected pair is a comparison source pair (S510).
- The instructor determination unit 113 determines whether there is another pair created by the face-to-finger distance calculation unit 112 (Yes in S520), and if one exists, selects it (S530).
- the selected pair is a comparison target pair.
- The instructor determination unit 113 determines which of the comparison source pair and the comparison destination pair has the larger face-to-finger distance (S540). If the comparison destination pair's face-to-finger distance is larger, the instructor determination unit 113 sets the comparison destination pair as the new comparison source pair (S550).
- The instructor determination unit 113 determines whether there is another pair created by the face-to-finger distance calculation unit 112 (S520), and if one exists, repeats the process of selecting it as the comparison destination.
- When no other pair remains, the instructor determination unit 113 determines that the comparison source pair has the maximum face-to-finger distance, and determines the operator of that pair as the instructor (S560).
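- Since the loop of S510 to S560 always keeps whichever pair has the larger face-to-finger distance, it reduces to a maximum selection; the sketch below assumes pairs in the form produced by the association step:

```python
def determine_instructor(pairs):
    """pairs: list of (face, fingertip, face_to_finger_distance).

    Returns the pair with the maximum face-to-finger distance, or None
    when no pair exists (no instructor)."""
    if not pairs:
        return None
    return max(pairs, key=lambda p: p[2])
```

With a single pair, that pair's operator is trivially the instructor, and an empty list corresponds to the "no instructor" case.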
- the cursor position calculation unit 114 calculates and outputs the cursor position from the determined instructor information (S60).
- the cursor position calculation unit 114 receives the information of the instructor for one person from the instructor determination unit 113.
- The cursor position calculation unit 114 holds the currently displayed cursor position and the information of the instructor who designated that cursor position (face position information and fingertip position information). From the newly received instructor information, it determines whether the instructor has changed, and then calculates the cursor position. This is because the cursor position calculation method differs depending on whether the new instructor is the same as the previous instructor.
- the cursor position calculation unit 114 determines whether or not the instructor has changed based on whether or not the instructor's face position has moved significantly.
- The pointing control device 101 is based on the premise that each operator performs the pointing operation without moving from his or her seat. Therefore, the cursor position calculation unit 114 uses half of the distance between the centers of adjacent seats as a threshold value: when the distance between the face position of the immediately preceding instructor and the face position of the current instructor exceeds the threshold value, it determines that the instructor has changed; when it does not, it determines that the previous instructor and the new instructor are in the same seat and are therefore the same person.
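- The change-of-instructor test described above might be sketched as follows; the seat pitch below is an illustrative assumption, not a value from this document:

```python
import math

SEAT_PITCH_M = 0.6  # assumed distance between the centers of adjacent seats

def instructor_changed(prev_face_pos, new_face_pos,
                       threshold_m: float = SEAT_PITCH_M / 2) -> bool:
    """True when the instructor's face moved farther than half the
    seat spacing, i.e. a different person is presumed to be pointing."""
    return math.dist(prev_face_pos, new_face_pos) > threshold_m
```

Positions are the same (X, Y, Z) face positions used elsewhere in this description.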
- When the instructor has changed, the cursor position calculation unit 114 outputs the cursor initial position held in advance as the current cursor position information. As described above, the cursor initial position is the center position of the screen.
- When the instructor has not changed, the cursor position calculation unit 114 outputs, as the current cursor position, the position of the intersection of the straight line extending from the fingertip toward the display device and the display surface of the display device.
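- The text does not spell out which second point defines the straight line, so the sketch below assumes one plausible reading: a ray from the face through the fingertip, extended to the display plane at Z = 0 (Z here is the distance from the display surface, which roughly coincides with the camera distance since the camera sits on the display):

```python
def cursor_position(face, fingertip):
    """Intersect the face-through-fingertip ray with the plane Z = 0.

    face, fingertip: (X, Y, Z) tuples, Z = distance from the display.
    Returns the (X, Y) cursor position on the display surface.
    """
    fx, fy, fz = face
    tx, ty, tz = fingertip
    if fz == tz:
        raise ValueError("ray is parallel to the display surface")
    t = fz / (fz - tz)  # parameter value where Z reaches 0
    return (fx + t * (tx - fx), fy + t * (ty - fy))
```

Extending the arm (moving the fingertip closer to the screen) shifts the intersection point, and hence the cursor, in the pointed direction.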
- the pointing control device 101 outputs the cursor position based on the image acquired from the camera 201 by the operations of S10 to S60.
- The display image input unit 102 acquires the image to be displayed on the display device 202 from the personal computer, and the cursor composition unit 103 superimposes the cursor on the image acquired from the display image input unit 102 at the cursor position acquired by the pointing control device 101 and outputs the result to the image output unit 104 (S70).
- FIG. 10 is a flowchart showing the operation of the cursor composition unit 103.
- the cursor composition unit 103 receives a display image from the display image input unit 102 (S710). This display image is data of a display original image.
- the cursor composition unit 103 determines whether or not a cursor position is designated from the pointing control device 101 (S720).
- When no cursor position is designated, the cursor composition unit 103 composes the cursor at a predetermined position of the display image to generate a composite display image (S730).
- the predetermined position is the upper left corner of the screen as described above.
- When a position is designated (S720), the cursor composition unit 103 composes the cursor in the display image at the designated position indicated by the cursor position information, and generates a composite display image (S740).
- the cursor composition unit 103 outputs the generated composite display image to the image output unit 104 (S750).
- FIG. 11 shows an example of a composite display image generated by the cursor composition unit 103.
- Reference numeral 401 in FIG. 11A denotes a display image that the cursor composition unit 103 receives from the display image input unit 102.
- Reference numeral 402 denotes image data of the cursor.
- Reference numeral 403 denotes a designated position indicated by the cursor position information received from the pointing control apparatus 101 by the cursor composition unit 103.
- FIG. 11B shows a composite display image generated using each piece of information shown in FIG. 11A.
- the cursor composition unit 103 superimposes the cursor image data 402 on the designated position 403 indicated by the cursor position information in the display image 401 received from the display image input unit 102 and generates a composite display image 404.
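- Steps S710 to S750 can be sketched as a simple raster superimposition; images are represented here as nested lists of pixel values, a simplification of the actual composition:

```python
def compose_cursor(display, cursor, pos=None):
    """Return a copy of `display` with the `cursor` bitmap pasted at
    pos = (x, y); with no designated position, the cursor is placed at
    the predetermined default, the upper-left corner (0, 0)."""
    x0, y0 = pos if pos is not None else (0, 0)
    out = [row[:] for row in display]  # leave the original display image intact
    for dy, row in enumerate(cursor):
        for dx, px in enumerate(row):
            if 0 <= y0 + dy < len(out) and 0 <= x0 + dx < len(out[0]):
                out[y0 + dy][x0 + dx] = px
    return out
```

The bounds check simply clips the cursor at the image edge rather than failing for positions near the border.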
- As described above, the display control device including the pointing control device can determine the instructor who specifies the position of the pointing target from among a plurality of operators, and can display the cursor at the position specified by the determined instructor.
- The pointing control device determines the instructor having cursor movement authority from among a plurality of operators by using the face-to-finger distance of each operator captured as a subject.
- the operator can gain the authority to move the cursor by extending his / her hand toward the camera from the instructor currently operating.
- the operation of the operator extending his / her hand toward the camera from the instructor who is currently operating does not disturb the progress of the conference as compared with the case of moving to the operator area, so the efficiency of the conference can be improved. In particular, it is effective in a physically small room because difficult movement of the seat is unnecessary.
- the cursor moving authority as the instructor can be obtained by performing the operation of extending his hand toward the camera from the instructor currently operating.
- the pointing control device moves the cursor to the initial position, for example, the center of the screen when the instructor having the right to operate the cursor changes, so that the cursor is moved regardless of the position pointed to by the previous instructor. It becomes possible to start moving.
- the cursor position calculation unit 114 uses the position of the intersection of the straight line from the fingertip to the display device and the display surface of the display device as the cursor position.
- When the display surface is large, an area that the instructor cannot indicate may occur.
- When the cursor is moved using a mouse and the mouse reaches the edge of the surface it is moved on, the cursor can be moved further in the same direction by lifting the mouse once and repositioning it.
- In the pointing control device, however, it is impossible to distinguish fingertip movement intended to move the cursor from fingertip movement not intended to move it, so the movement range of the cursor cannot be expanded in this way.
- It is therefore conceivable to determine the moving distance of the cursor using the distance moved by the fingertip.
- However, when the moving distance of the cursor is a constant multiple of the moving distance of the fingertip, a small constant may leave a range in which the cursor cannot be moved.
- Conversely, when the constant is large, the cursor moves greatly even if the fingertip is moved only slightly, so it is difficult to move the cursor minutely and the cursor may not be operated appropriately.
- Therefore, in this modification, the cursor position calculation unit 114 determines the amount of cursor movement using the face-to-finger distance of the instructor. The instructor can thereby operate the cursor appropriately by switching between pointing with the fingertip close to the face and pointing with the fingertip away from the face.
- Since the pointing control device according to the present modification is the same as that of the first embodiment except for the cursor position calculation method, the description of the configuration and operation common to the first embodiment is omitted, and only the cursor position calculation method is described.
- FIG. 12 is an image diagram showing the distance between the face and the fingertip.
- FIG. 12A illustrates a case where the distance between the face and the fingertip is long: 501 is the face position, 502 is the fingertip position, and 503 is the face-to-finger distance between the face position 501 and the fingertip position 502.
- FIG. 12B illustrates a case where the distance between the face and the fingertip is short: 504 is the fingertip position, and 505 is the face-to-finger distance between the face position 501 and the fingertip position 504.
- FIG. 13 is a graph showing an example of the correspondence between the face-to-finger distance and the cursor movement rate.
- Reference numeral 601 is a graph showing the correspondence between the face-to-finger distance and the cursor movement rate.
- The cursor movement rate is set according to the given face-to-finger distance following the graph 601. For example, when the face-to-finger distance is 503, the cursor movement rate is 603; when the face-to-finger distance is 505, the cursor movement rate is 605.
- The cursor movement rate is set to increase as the face-to-finger distance decreases. Therefore, when the instructor wants to move the cursor greatly, the instructor only needs to bring the fingertip closer to his or her face. Conversely, the instructor can instruct a minute movement of the cursor by moving the fingertip away from his or her face.
- The cursor position calculation unit 114 calculates the direction and distance of the fingertip's movement, and outputs, as the current cursor position information, the position reached by moving the displayed cursor in the calculated direction by the calculated distance scaled by the cursor movement rate.
- The pointing control device according to this modification can thus control the moving distance of the cursor through the distance between the face and the fingertip, so the instructor can change the cursor's moving distance while keeping the fingertip's moving distance the same, and the cursor can be controlled as the instructor intends.
- In particular, the problem of the cursor being unable to reach part of the screen can be prevented, because the instructor can increase the cursor movement rate by controlling the face-to-finger distance.
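The modified calculation can be sketched as follows. The linear coefficients, the maximum distance, and the function names are assumed example values for illustration; the patent only specifies that the rate increases as the face-to-finger distance decreases (FIG. 13).

```python
# Sketch of the modified cursor position calculation: the cursor movement
# rate grows as the face-to-finger distance shrinks.  All constants below
# are illustrative assumptions, not values from the patent.
RATE_MAX = 4.0      # rate when the fingertip touches the face (distance 0)
RATE_MIN = 0.5      # rate at or beyond the largest modeled distance
MAX_DISTANCE = 60.0 # assumed arm's-reach distance (same units as input)

def cursor_movement_rate(face_finger_distance):
    """Linearly interpolate the rate; clamp outside the modeled range."""
    d = min(max(face_finger_distance, 0.0), MAX_DISTANCE)
    return RATE_MAX + (RATE_MIN - RATE_MAX) * d / MAX_DISTANCE

def next_cursor_position(cursor_pos, fingertip_delta, face_finger_distance):
    """Move the cursor by the fingertip delta scaled by the movement rate."""
    rate = cursor_movement_rate(face_finger_distance)
    return (cursor_pos[0] + fingertip_delta[0] * rate,
            cursor_pos[1] + fingertip_delta[1] * rate)
```

With this shape, the same fingertip movement produces a large cursor movement when the fingertip is near the face and a small one when it is far, matching the behavior described above.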
- the display control device including the pointing control device in this embodiment is characterized in that a plurality of cursors can be displayed simultaneously, and a plurality of instructors respectively operate the plurality of cursors.
- Since the display control apparatus is the same as that of the first embodiment except for the operations of the instructor determination unit, the cursor position calculation unit, and the cursor composition unit, the description of the configuration and operation common to the first embodiment is omitted, and only the differences are described.
- the display control device 150 includes a pointing control device 151 and a cursor composition unit 153 instead of the pointing control device 101 and the cursor composition unit 103, respectively.
- the pointing control device 151 includes an instructor determination unit 163 and a cursor position calculation unit 164 instead of the instructor determination unit 113 and the cursor position calculation unit 114, respectively.
- The instructor determination unit 163 has the same function as the instructor determination unit 113 except that it determines a plurality of instructors.
- the cursor position calculation unit 164 has the same function as the cursor position calculation unit 114 except that it obtains information of a plurality of instructors and outputs a plurality of cursor positions.
- the cursor synthesis unit 153 has the same function as the cursor synthesis unit 103 except that a plurality of cursor positions are obtained and a plurality of cursors are synthesized.
- FIG. 14 is an example of a composite display image generated by the cursor composition unit 153, and shows a case where a plurality of cursors are composed.
- In FIG. 14, reference numeral 700 denotes the composite display image, and reference numerals 701, 702, and 703 denote cursors.
- the instructor determination unit 163 determines the number of instructors corresponding to the number of cursors that can be displayed, using the face-to-finger distance acquired by the face-to-finger distance calculation unit 112.
- The instructor determination unit 163 outputs the information of the determined instructors (the face position information, the fingertip position information, and the face-to-finger distance).
- The instructor determination unit 163 selects instructors, up to the number of cursors that can be displayed, in descending order of face-to-finger distance.
- FIG. 16 is a flowchart illustrating an example of the operation of the instructor determination unit 163. The operation for selecting the instructors will be described with reference to FIG. 16.
- First, the instructor determination unit 163 selects one pair from the corresponding face and fingertip pairs received from the face-to-finger distance calculation unit 112, and makes the selected pair the comparison source pair (S810).
- Next, the instructor determination unit 163 selects the comparison source pair as an instructor (S820).
- The instructor determination unit 163 then determines whether there is another correspondence between a face and a fingertip (S830). If there is, it selects one such pair as the comparison destination pair (S840).
- Next, the instructor determination unit 163 determines whether or not the current number of instructors is less than the upper limit of the number of cursors (S850). When the current number of instructors has already reached the upper limit (No in S850), the instructor determination unit 163 compares the face-to-finger distances of the comparison source pair and the comparison destination pair (S860).
- When the face-to-finger distance of the comparison destination pair is larger, the comparison source pair is removed from the instructors and the comparison destination pair is selected as an instructor (S880). Subsequently, among the pairs selected as instructors, the pair with the smallest face-to-finger distance is selected as the new comparison source pair (S890). Thereafter, the instructor determination unit 163 processes the remaining pairs (return to S830).
- When the current number of instructors is less than the upper limit of the number of cursors (Yes in S850), the comparison destination pair is selected as an instructor, and then the pair with the smallest face-to-finger distance among the pairs selected as instructors is selected as the new comparison source pair (S890). Thereafter, the instructor determination unit 163 processes the remaining pairs (return to S830).
- When there is no other correspondence between a face and a fingertip (No in S830), the instructor determination unit 163 determines that the pairs currently selected as instructors are the instructors.
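The selection loop above amounts to keeping the pairs with the largest face-to-finger distances, bounded by the number of displayable cursors. A compact sketch using a min-heap follows; the `(pair_id, distance)` representation is an assumption for illustration, and the heap simply plays the role of the "comparison source" (the currently selected pair with the smallest distance, S890).

```python
import heapq

# Sketch of the multi-instructor selection (S810-S890): keep at most
# max_cursors pairs, preferring larger face-to-finger distances.
def select_instructors(pairs, max_cursors):
    """Return up to max_cursors (pair_id, distance) tuples, largest first.

    While fewer than max_cursors pairs are selected, every candidate is
    accepted (Yes in S850).  Once the limit is reached, a candidate with a
    larger distance evicts the selected pair with the smallest distance
    (S860/S880), which the heap root tracks (S890).
    """
    selected = []  # min-heap of (distance, pair_id)
    for pair_id, distance in pairs:
        if len(selected) < max_cursors:          # Yes in S850
            heapq.heappush(selected, (distance, pair_id))
        elif distance > selected[0][0]:          # S860: candidate is farther
            heapq.heapreplace(selected, (distance, pair_id))  # S880/S890
    return sorted(((pid, d) for d, pid in selected), key=lambda p: -p[1])
```

The flowchart expresses the same logic pair by pair; the heap formulation just makes the invariant explicit (the selected set always holds the largest distances seen so far).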
- The cursor position calculation unit 164 calculates the positions of the plurality of cursors to be displayed from the information on the plurality of instructors received from the instructor determination unit 163, and outputs the calculated positions as cursor position information.
- The cursor composition unit 153 receives the plurality of pieces of cursor position information output from the pointing control device 151, and generates a composite display image by superimposing the plurality of cursors on the display image.
- FIG. 15 shows an example of management information for managing a plurality of instructors.
- FIG. 15 shows an example in which the number of cursors that can be displayed is three as in FIG. 14 and the management information format is a table.
- The management information holds, for each instructor, an identifier (No.) for identifying the instructor, the face position, the fingertip position, and the shape of the cursor.
- The identifier (No.) for identifying the instructor is created based on the instructor identity determination described in the first embodiment.
- For example, when the face position of a new instructor is within the threshold distance of the face position of the immediately preceding instructor whose identifier is No. 1, the cursor composition unit 153 sets the identifier of the new instructor to No. 1.
- The cursor composition unit 153 assigns a different cursor shape to each identifier, and keeps the assigned cursor shape while the instructor holds cursor movement authority, so that the cursor shape does not change during the operation.
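The identifier and shape management of FIG. 15 can be sketched as below. The shape names, the threshold value, and the table layout are assumed example values; the identity rule (reuse an identifier when the face has moved less than a threshold) follows the description above.

```python
# Sketch of the FIG. 15 management information: identifier -> face position
# and cursor shape.  Shape names and threshold are illustrative assumptions.
CURSOR_SHAPES = ["arrow", "hand", "cross"]  # one shape per displayable cursor
SAME_FACE_THRESHOLD = 15.0                  # assumed identity threshold

def assign_identifier(table, face_pos):
    """table maps identifier -> {"face": (x, y), "shape": str}.

    Return the identifier of an entry whose face is within the threshold,
    so the cursor shape stays stable while that instructor keeps authority;
    otherwise create a new entry with the next unused shape.
    """
    for ident, entry in table.items():
        dx = face_pos[0] - entry["face"][0]
        dy = face_pos[1] - entry["face"][1]
        if (dx * dx + dy * dy) ** 0.5 <= SAME_FACE_THRESHOLD:
            entry["face"] = face_pos        # track the (same) moving face
            return ident
    ident = len(table) + 1                  # next identifier (No.)
    table[ident] = {"face": face_pos,
                    "shape": CURSOR_SHAPES[(ident - 1) % len(CURSOR_SHAPES)]}
    return ident
```

Because the shape is stored with the identifier, a face that moves slightly between frames keeps both its identifier and its cursor shape, which is exactly the stability property the composition unit needs.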
- the pointing control device can determine a plurality of instructors and assign an individual cursor to each instructor.
- the pointing control device does not assign a cursor to a subject person who is not pointing, and can prevent unnecessary cursor display.
- The display control device including the pointing control device in this embodiment is characterized in that it can be connected to a plurality of input/output terminals, so that the same screen is shared at a plurality of bases and the same cursor is operated.
- FIG. 17 is a diagram showing a system configuration of the display control apparatus according to the present embodiment.
- FIG. 17 shows an example of a conference at three sites (meeting rooms A, B, and C).
- The conference room A is provided with an input/output terminal 810a, the conference room B is provided with an input/output terminal 810b and the display control device 800 having the pointing control device, and the conference room C is provided with an input/output terminal 810c.
- the display control device 800 provided with the pointing control device is connected to the input / output terminal 810a of the conference room A via the network 821 and to the input / output terminal 810c of the conference room C via the network 822.
- the display control apparatus 800 provided with the pointing control apparatus inputs camera images taken by the respective cameras of the input / output terminals 810a and 810c at other bases via the networks 821 and 822.
- the display control device 800 detects a face position and a fingertip position from images captured by the respective cameras of the input / output terminals 810a, 810b, and 810c, determines an instructor, and calculates a cursor position.
- The display control apparatus 800 provided with the pointing control apparatus associates each face and fingertip with its base, for example by adding the information of the capturing input/output terminal to the face position information and the fingertip position information, which makes it possible to calculate the face-to-finger distance within each base.
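The terminal-tagging idea above can be sketched simply. The data shapes and function names are assumptions for illustration; the point is that pairing is restricted to positions captured by the same input/output terminal.

```python
# Sketch of associating faces and fingertips with their base: positions are
# tagged with the identifier of the capturing input/output terminal, so
# only same-terminal pairs are ever matched.  Data shapes are assumptions.
def tag_positions(terminal_id, faces, fingertips):
    """Attach the capturing terminal's identifier to every position."""
    return ([{"terminal": terminal_id, "pos": f} for f in faces],
            [{"terminal": terminal_id, "pos": t} for t in fingertips])

def candidate_pairs(tagged_faces, tagged_fingertips):
    """Enumerate face/fingertip pairs restricted to the same terminal."""
    return [(f, t) for f in tagged_faces for t in tagged_fingertips
            if f["terminal"] == t["terminal"]]
```

This prevents a face captured in conference room A from ever being paired with a fingertip captured in conference room C, while still letting a single face-to-finger distance calculation run over all bases.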
- the display control device 800 outputs the same screen to the display devices of the input / output terminals 810a and 810c at other bases via the networks 821 and 822.
- The display control apparatus 800 including the pointing control apparatus determines the instructor having cursor movement authority from among the operators participating at the multiple bases, and outputs the composite display image, generated by combining the cursor with the display image, to the display devices at the plurality of bases.
- As described above, the display control apparatus can share the same screen at a plurality of bases via a network when a conference is held at multiple bases.
- The pointing control device can determine the instructor who has cursor movement authority from among the operators participating at the multiple bases, and the instructor can be switched easily. Further, the instructor can appropriately move the cursor that specifies the position of the pointing target by performing an operation of moving the fingertip toward the camera, regardless of the distance from the camera.
- Since the pointing control device determines the instructor using the distance between the face and the fingertip, compared with determining the instructor based on the distance between the camera and the fingertip, the instructor can be determined appropriately even when the sizes and layouts of the conference rooms differ, regardless of such external factors.
- In the above embodiments, the camera 201 is a set of two cameras for acquiring the positions of the face and the fingertip, but the present invention is not necessarily limited to this case.
- For example, the camera 201 may be realized by a combination of one camera and distance detection means such as an infrared sensor camera.
- In that case, the face detection unit 122 and the fingertip detection unit 123 detect the horizontal and vertical positions of the face and the fingers using the image of a normal camera, while the infrared sensor camera detects the distance between each subject and the camera; in this way, the positions of the face and the fingertip can be detected.
- In the above embodiments, the case has been described where the input/output terminal 200 includes the display device 202, realized by a large liquid crystal display, and the camera 201 installed on the upper portion of the display device 202.
- the invention is not necessarily limited to this case.
- the display device 202 may be realized by, for example, a plasma display, an organic EL display, or a projector. Further, for example, the display device 202 may be realized by a television receiver, and the display control device 100 may be realized as a part of an integrated circuit included in the television receiver.
- the camera 201 is not limited to the upper part of the display device 202, and may be installed at an arbitrary position where the operator can be the subject, such as a wall surface or a ceiling of a conference room.
- the facial feature point is the point closest to the camera 201 within the range of the outer frame of the face, but the present invention is not necessarily limited to this case.
- The facial feature point may be, for example, the tip of the nose regardless of the distance from the camera 201, or, when the operator is wearing glasses, one point on the glasses. Anything that moves together with the face and does not separate from it during shooting may be regarded as part of the face.
- In the above embodiments, the feature point of the fingertip is the point closest to the camera 201 within the outline of the hand, but the present invention is not necessarily limited to this case.
- the feature point of the fingertip may be the tip of the index finger regardless of the distance from the camera 201.
- The pointing operation may also be performed with, for example, the tip of a writing instrument or a pointing stick held in the hand.
- This can be realized by having the fingertip detection unit 123 detect an object such as a hand or a writing instrument held in the hand by image recognition using its outline, color, and so on, and acquire the position of the tip of the writing instrument as the feature point, treating it as the position of the fingertip. In this way, the tip of a writing instrument or the like held in the hand can be used as a substitute member for the fingertip.
- The substitute member for the fingertip is not limited to a writing instrument or a pointing stick; any object that can indicate a position or direction and can easily be moved toward or away from the face can be used as a substitute member for the fingertip.
- the position acquisition unit 111 includes the image input unit 121, the face detection unit 122, and the fingertip detection unit 123 has been described, but the present invention is not necessarily limited to this case.
- the position acquisition unit 111 may acquire face position information and fingertip position information from, for example, another pointing control device or a device that detects the position of the face and the fingertip.
- In the above embodiments, the camera viewing angle frame has its lower left corner as the origin and its upper right corner at (200, 150).
- the lengths of the four sides of the camera viewing angle frame may be arbitrary values depending on the resolution of the camera, and are not limited to the rectangular shape.
- the position of the origin may be an arbitrary position such as the center of the frame or the upper left corner of the frame.
- In the above embodiments, the horizontal direction is the X-axis direction from left to right and the vertical direction is the Y-axis direction from bottom to top, but an arbitrary coordinate system may be used, such as one in which the X-axis runs from right to left or the Y-axis runs from top to bottom.
- the face-to-finger distance calculation unit 112 has been described as searching for the corresponding fingertip in order from the face closest to the camera, but the present invention is not necessarily limited to this case.
- The face-to-finger distance calculation unit 112 may search for the fingertip corresponding to each face in order from the face farthest from the camera, or may search for the fingertips corresponding to the faces in the order acquired from the position acquisition unit 111.
- the face-to-finger distance calculating unit 112 may search for a corresponding face after first selecting a fingertip.
- the face-to-finger distance calculation unit 112 associates the fingertip that is closer to the camera 201a than the face and closest to the face, but the present invention is not necessarily limited to this case.
- the face detection unit 122 and the fingertip detection unit 123 may output unique identification information for each face or each fingertip in addition to the position information, and perform association using the identification information.
- In addition, a threshold value may be provided for the face-to-finger distance, and combinations of a face and a fingertip whose face-to-finger distance exceeds the threshold value may be excluded.
- a combination having a difference greater than or equal to a threshold value may be excluded using a difference in Z value or a difference in X value.
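The association with threshold-based exclusion described above can be sketched as follows. The 3-D point representation (X, Y, Z with Z the distance from the camera), the greedy nearest-match strategy, and the threshold value are assumptions for illustration.

```python
# Sketch of face/fingertip association: each face, nearest first, is matched
# to the closest unmatched fingertip that is nearer the camera (smaller Z),
# and pairs whose face-to-finger distance exceeds the threshold are dropped.
def euclid(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def associate(faces, fingertips, max_distance):
    """Return a list of (face, fingertip, distance) associations."""
    remaining = list(fingertips)
    pairs = []
    for face in sorted(faces, key=lambda f: f[2]):  # face closest to camera first
        candidates = [t for t in remaining if t[2] < face[2]]  # nearer than face
        if not candidates:
            continue
        best = min(candidates, key=lambda t: euclid(face, t))
        d = euclid(face, best)
        if d <= max_distance:          # threshold: exclude implausible pairs
            pairs.append((face, best, d))
            remaining.remove(best)
    return pairs
```

A fingertip far from every face is simply left unmatched, which mirrors the text's point that implausible face/fingertip combinations are excluded rather than forced into pairs.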
- In the above embodiments, the cursor position calculation unit 114 uses half of the distance between the centers of adjacent seats as a threshold value, and compares the distance between the face position of the previous instructor and the face position of the new instructor against this threshold.
- The threshold value is not limited to 1/2 of the distance between the centers of adjacent seats; for example, the seat width, or an arbitrary value such as a predetermined value that does not depend on the conference room, may be used.
- The cursor position calculation unit 114 may also acquire the position information of faces other than the instructor's from the position acquisition unit 111, and may use 1/2 of the distance between the instructor's face and the other face closest to it as the threshold value.
- Alternatively, the face detection unit 122 may output identification information unique to each face together with the face position information, and the cursor position calculation unit 114 may determine whether the instructor has switched based on whether the face identification information output by the face detection unit 122 matches. In this way, the identity of the instructor can be determined more accurately using image recognition.
- the cursor position calculation unit 114 has been described with respect to the case where the cursor initial position is the center position of the screen, but the present invention is not necessarily limited to this case.
- the cursor initial position may be a specific position such as the upper left position of the screen.
- the initial cursor position may be the position of the intersection of the straight line from the fingertip to the display device and the display surface of the display device.
- In the above description, the predetermined cursor position in the cursor composition unit 103 is the upper left corner of the display screen, but the present invention is not necessarily limited to this case.
- the predetermined cursor position may be an arbitrary coordinate position such as the center position of the display image.
- the predetermined cursor position may be a position where the previous cursor was synthesized.
- the predetermined cursor position may be the same as or different from the cursor initial position held by the cursor position calculation unit 114.
- Alternatively, the cursor composition unit 103 may not hold a predetermined cursor position; when the pointing control device 101 does not output a cursor position, the cursor composition unit 103 may not compose the cursor, and the image output unit 104 may output the image input from the display image input unit 102 as it is. In this way, when there is no instructor, the cursor can be erased so as not to interfere with the display image.
- the display image input unit 102 may include an arbitrary interface capable of receiving a display image, such as an analog RGB input terminal, a DVI input terminal, a DisplayPort input terminal, or a composite input terminal.
- the display image acquisition source is not limited to a personal computer, and may be an arbitrary image output device such as a BD player.
- the display image input unit 102 may include an interface for acquiring a display image from a storage medium or a network, such as a memory card reader that acquires a display image as a file from a memory device such as an SDXC memory card.
- the cursor composition unit 103 synthesizes the cursor with the image input from the display image input unit 102 according to the cursor position information output from the cursor position calculation unit 114.
- The display control device 100 may also omit the display image input unit 102, the cursor composition unit 103, and the image output unit 104; in that case, the pointing control device 101 may output the cursor position information calculated by the cursor position calculation unit 114 to a personal computer through an HID (Human Interface Device) class USB interface or a PS/2 mouse interface, and the personal computer may compose the cursor with the display image using the cursor position information. In this way, the display control device need not compose the cursor, and by having the personal computer perform the cursor composition processing, the image can be output directly from the personal computer to the display device 202.
- In the above modification, the relationship between the face-to-finger distance and the cursor movement rate is expressed by a linear function, but the relationship may be inversely proportional, or may be expressed by other functions.
- Alternatively, the relationship between the face-to-finger distance and the cursor movement rate may be defined using a conversion table, and the face-to-finger distance may be converted into the cursor movement rate using the table to calculate the cursor movement distance.
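A table-driven alternative to the linear function can be sketched as below. The breakpoints and rate values are assumed examples; any monotonically decreasing table would serve the same purpose.

```python
import bisect

# Sketch of a table-driven conversion from face-to-finger distance to cursor
# movement rate.  Breakpoints (cm) and rates are illustrative assumptions.
DISTANCE_BREAKPOINTS = [10, 20, 40, 60]   # upper bound of each distance bin
RATES = [4.0, 2.0, 1.0, 0.5, 0.25]        # one more rate than breakpoints

def rate_from_table(face_finger_distance):
    """Look up the movement rate for the bin containing the distance."""
    i = bisect.bisect_right(DISTANCE_BREAKPOINTS, face_finger_distance)
    return RATES[i]

def cursor_move_distance(fingertip_move_distance, face_finger_distance):
    """Cursor movement = fingertip movement scaled by the looked-up rate."""
    return fingertip_move_distance * rate_from_table(face_finger_distance)
```

Compared with a closed-form function, a table lets the rate curve be tuned per deployment (for example, per conference room) without changing the calculation code.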
- In the above embodiment, the case has been described where the identifier (No.) for identifying the instructor is determined based on the instructor identity determination described in the first embodiment. However, the face detection unit 122 may output unique identification information for each face, and the identifier (No.) for identifying the instructor may be generated based on the identification information unique to the face.
- the cursor shape stored in the management information may include information for specifying the cursor other than the shape, such as the size and color of the cursor.
- The display control apparatus 800 including the pointing control apparatus has been described for the case where the information of the capturing input/output terminal is added to the face position information and the fingertip position information, but the invention is not necessarily limited to this case.
- For example, the coordinate system (origin position) may be determined for each base so that coordinate values do not overlap between bases, and the fingertip and face position information may be acquired in those coordinates. In this way, it is possible to prevent a face at one base from being associated with a fingertip at another base.
- In the above description, the display control apparatus 800 having the pointing control apparatus receives the camera images taken by the cameras of the input/output terminals 810a and 810c at other bases via the networks 821 and 822, but the present invention is not necessarily limited to this case.
- For example, the input/output terminal 810a may include the position acquisition unit 111 connected to its camera, and may output the face position information and the fingertip position information to the display control device 800 via the network 821.
- Alternatively, the input/output terminal 810a may include the position acquisition unit 111 and the face-to-finger distance calculation unit 112 connected to its camera, and may output the face-to-finger distance information to the instructor determination unit 113 of the display control device 800 via the network 821.
- With such configurations, the network 821 does not need to transmit the camera image itself, and network utilization efficiency can be improved.
- the input / output terminal 810c and the network 822 may have the same configuration.
- A display control device that is connected to the input/output terminal 810c but lacks the pointing control device 101 may be installed in the conference room C, and the display control device 800 may output the cursor position information to that display control device via the network 822.
- In this case, the display control device in the conference room C can output the cursor to the input/output terminal 810c using the cursor position information, so the network 822 does not need to transmit the display image after cursor composition, and network utilization efficiency can be improved.
- the input / output terminal 810a and the network 821 may have the same configuration.
- the input / output terminal 810a of the conference room A may be realized by a camera-equipped mobile phone terminal incorporating a camera, a display, and a network function.
- In this way, the conference room A can be realized using a room at the home of the operator who owns the mobile phone terminal, and the operator can participate in the conference from home or the like.
- All or part of the constituent elements of the pointing control device may be realized by an integrated circuit of one chip or a plurality of chips, by a computer program, or in any other form.
- the display control device including the pointing control device may be realized by one chip, or only the pointing control device may be realized by one chip, and the cursor composition unit or the like may be realized by another chip.
- The integrated circuit is typically realized as an LSI (Large Scale Integration). Depending on the degree of integration, it may also be called an IC, a system LSI, a super LSI, or an ultra LSI.
- the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- An FPGA (Field Programmable Gate Array), or a reconfigurable processor in which the connection and setting of circuit cells inside the LSI can be reconfigured, may be used.
- the display control device 100 is implemented by an integrated circuit. Since each part of the display control apparatus 100 is the same as that of the first embodiment, the same reference numerals are given, and the description of the same configuration is omitted.
- the display control apparatus 100 inputs a camera image from the camera image input unit 211 and outputs a composite display image to the image output unit 212.
- the camera image input unit 211 inputs a camera image to the image input unit 121.
- the image output unit 212 outputs an image output from the image output unit 104 (not shown).
- the pointing control device may be realized by a program written in a storage medium and a computer that reads and executes the program.
- the storage medium may be any recording medium such as a memory card or a CD-ROM.
- The pointing control device according to the present invention may also be realized by a program downloaded via a network and a computer that downloads and executes the program. This program causes a computer to execute pointing processing including: a position acquisition step of acquiring the position of each operator's face and the position of each operator's fingertip; a face-to-finger distance calculating step of associating the acquired face positions with the acquired fingertip positions and calculating the face-to-finger distance between each associated face position and fingertip position; an instructor determination step of determining an instructor using the face-to-finger distance; and a cursor position calculating step of calculating a cursor position using the position of the fingertip of the determined instructor and outputting the calculated cursor position.
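The sequence of program steps above can be sketched end to end. This is a simplified illustration only: the data shapes, the largest-distance selection rule, and the screen mapping (clamping the fingertip's X/Y, standing in for the straight-line intersection of the first embodiment) are all assumptions.

```python
# End-to-end sketch of the pointing process: acquire positions, compute
# face-to-finger distances, determine the instructor, calculate the cursor
# position.  All shapes and rules below are simplifying assumptions.
def face_finger_distance(face, fingertip):
    """Face-to-finger distance calculating step (3-D Euclidean distance)."""
    return sum((a - b) ** 2 for a, b in zip(face, fingertip)) ** 0.5

def pointing_process(operators, screen_size):
    """operators maps name -> (face, fingertip) 3-D positions.

    Returns (instructor_name, cursor_position).
    """
    # Face-to-finger distance calculating step
    distances = {name: face_finger_distance(face, tip)
                 for name, (face, tip) in operators.items()}
    # Instructor determination step: the largest distance wins authority
    instructor = max(distances, key=distances.get)
    # Cursor position calculating step: clamp the fingertip's X/Y onto the
    # screen (an assumed stand-in for the straight-line intersection)
    _, tip = operators[instructor]
    w, h = screen_size
    cursor = (min(max(tip[0], 0), w), min(max(tip[1], 0), h))
    return instructor, cursor
```

Each comment marks the step of the claimed process it corresponds to; a real implementation would replace the clamping with the intersection calculation and add the instructor identity tracking described earlier.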
- the personal computer that outputs the display image may be the same hardware as the computer that reads the program and executes the program.
- the present invention is useful not only in a specific place such as a conference room, but also as a pointing device for a home television, a personal computer, a mobile phone, a personal digital assistant, a video camera, a digital camera, and the like.
- the pointing control device is a pointing control device that determines, from among operators who perform an instruction operation, an instructor who can designate a cursor position, and includes: a position acquisition unit that acquires the position of each operator's face and the position of each operator's fingertip or a substitute member thereof; a distance calculation unit that associates each acquired face position with the corresponding position of the fingertip or its substitute member and calculates the distance between them; an instructor determination unit that determines an instructor using the distance; and a cursor position calculation unit that calculates a cursor position using the position of the determined instructor's fingertip or its substitute member and outputs the calculated cursor position.
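The four units above form a simple pipeline. As an illustrative sketch only (not the patented implementation — the coordinate tuples, the `screen_map` callback, and the face-to-fingertip association by operator index are assumptions for illustration):

```python
import math

def select_instructor_and_cursor(faces, fingertips, screen_map):
    """Sketch of the pipeline: associate faces with fingertips,
    pick the pair with the largest face-to-fingertip distance,
    and map that fingertip to a cursor position."""
    # Association: here simply by operator index.
    pairs = list(zip(faces, fingertips))
    # Distance calculation unit: Euclidean distance per pair.
    def dist(pair):
        face, tip = pair
        return math.dist(face, tip)
    # Instructor determination unit: largest distance wins.
    instructor_face, instructor_tip = max(pairs, key=dist)
    # Cursor position calculation unit: map the fingertip to screen coordinates.
    return screen_map(instructor_tip)

# Example: two operators; operator 1 extends the arm farther from the face.
faces = [(0.0, 1.6, 2.0), (1.0, 1.6, 2.5)]
fingertips = [(0.0, 1.4, 1.8), (1.0, 1.5, 1.7)]
cursor = select_instructor_and_cursor(
    faces, fingertips, screen_map=lambda tip: (tip[0], tip[1]))
print(cursor)  # cursor follows operator 1's fingertip
```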
- the integrated circuit according to the embodiment is an integrated circuit that determines, from among operators who perform an instruction operation, an instructor who can designate a cursor position, and includes: a position acquisition unit that acquires the position of each operator's face and the position of each operator's fingertip or a substitute member thereof; a distance calculation unit that associates each acquired face position with the corresponding position of the fingertip or its substitute member and calculates the distance between them; an instructor determination unit that determines an instructor using the distance; and a cursor position calculation unit that calculates a cursor position using the position of the determined instructor's fingertip or its substitute member and outputs the calculated cursor position.
- the pointing control method is a pointing control method for determining, from among operators who perform an instruction operation, an instructor who can designate a cursor position, and includes: a position acquisition step of acquiring the position of each operator's face and the position of each operator's fingertip or a substitute member thereof; a distance calculation step of associating each acquired face position with the corresponding position of the fingertip or its substitute member and calculating the distance between them; an instructor determination step of determining an instructor using the distance; and a cursor position calculation step of calculating a cursor position using the position of the determined instructor's fingertip or its substitute member and outputting the calculated cursor position.
- the position acquisition unit may acquire the position of each operator's face and the position of each operator's fingertip or its substitute member as three-dimensional information, and the distance calculation unit may create a plurality of pairs, each consisting of the position of one face and the position of one fingertip or its substitute member associated with that face, and calculate the distance for all of the pairs.
- the instructor determination unit may compare the magnitudes of the distances among the pairs to determine the instructor.
- with this configuration, any operator can become the instructor by changing the distance between his or her face and fingertip or its substitute member, so the instructor can be switched easily.
- the instructor determination unit may determine, among all the pairs, the pair having the largest distance as the instructor.
- the pointing control device according to embodiment (b) is a pointing control device that outputs a plurality of pieces of cursor position information at the same time: the instructor determination unit compares the magnitudes of the distances among the pairs and determines a plurality of pairs from among all the pairs as instructors, and the cursor position calculation unit may calculate a cursor position from the position of each instructor's fingertip or its substitute member and output cursor position information for each instructor.
- the instructor determination unit may determine a plurality of instructors from the plurality of pairs in descending order of the distance.
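This descending-order selection of multiple instructors can be sketched as follows (the per-operator distance dictionary is an assumed representation, not from the patent):

```python
def select_instructors(pair_distances, num_cursors):
    """Sort face-to-fingertip pairs by distance, largest first,
    and take one instructor per available cursor."""
    ranked = sorted(pair_distances.items(), key=lambda kv: kv[1], reverse=True)
    return [operator for operator, _ in ranked[:num_cursors]]

# Three operators, two cursors: the two largest distances win.
distances = {"A": 0.45, "B": 0.80, "C": 0.62}
print(select_instructors(distances, num_cursors=2))  # ['B', 'C']
```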
- the position acquisition unit includes an image acquisition unit that acquires a plurality of camera images, a face position detection unit that detects the position of an operator's face from the plurality of camera images, and a fingertip position detection unit that detects the position of an operator's fingertip or its substitute member from the plurality of camera images.
- with this configuration, the pointing control device can obtain the distance between the face and the fingertip or its substitute member from images captured by a plurality of cameras, and determine the instructor.
- the cursor position calculation unit holds information identifying the current instructor, judges whether a new instructor determined by the instructor determination unit is the same as the current instructor, and, if the result of the judgment is that they are the same, may calculate the cursor position by taking the direction in which the instructor's fingertip or its substitute member moved as the movement direction of the cursor.
- with this configuration, the movement direction of the cursor matches the movement direction of the fingertip or its substitute member, so the instructor can control the cursor position intuitively.
- the cursor position calculation unit may hold the current cursor position, calculate the movement distance of the cursor using the instructor's face-to-fingertip distance, and output, as a new cursor position, the position obtained by moving from the current cursor position by the movement distance of the cursor in the movement direction of the cursor.
- with this configuration, by moving the fingertip or its substitute member closer to or farther from the face, the instructor can control, as intended, the relationship between the movement distance of the fingertip or its substitute member and the movement distance of the cursor.
- as a result, the instructor can control the movement of the cursor in finer detail.
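A minimal sketch of this distance-scaled cursor movement, assuming a linear gain model; the `base_dist` and `gain` constants are illustrative values, since the patent does not specify the scaling function:

```python
def update_cursor(cursor, finger_delta, face_finger_dist,
                  base_dist=0.4, gain=500.0):
    """Move the cursor in the fingertip's movement direction, with the
    movement distance scaled by the current face-to-fingertip distance:
    extending the arm farther than base_dist speeds the cursor up."""
    scale = gain * (face_finger_dist / base_dist)
    return (cursor[0] + finger_delta[0] * scale,
            cursor[1] + finger_delta[1] * scale)

cur = (400.0, 300.0)
# The same 1 cm fingertip motion, at two different arm extensions:
print(update_cursor(cur, (0.01, 0.0), face_finger_dist=0.2))  # slower cursor
print(update_cursor(cur, (0.01, 0.0), face_finger_dist=0.6))  # faster cursor
```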
- the cursor position calculation unit may output an initial position as the cursor position when the new instructor determined by the instructor determination unit is not the same as the current instructor.
- the pointing control device has a function of determining, using the distance between an operator's face and fingertip or its substitute member, the instructor who designates the position of a pointing target, and is useful in a conference system. It can also be applied to a personal computer, a mobile phone, a portable information terminal, a video camera, a digital camera, and the like.
Description
The conventional method of pointing by capturing a fingertip with a camera assumes that only one fingertip is captured. Therefore, when two or more fingertips are captured at the same time, it cannot decide on the single fingertip that determines the cursor position. In other words, when more fingertips than cursors are captured simultaneously, it cannot select the fingertips to be associated one-to-one with the cursors.
Embodiments of the present invention will be described below with reference to the drawings.
FIG. 1 is a block diagram showing the configuration of a display control device 100 including a pointing control device according to an embodiment of the present invention. The display control device 100 is connected to an input/output terminal 200 and to a personal computer (not shown) that outputs a display image. The following describes a case where a plurality of operators perform pointing operations with their fingertips while viewing a screen shown on a single display device.
In FIG. 1, the display control device 100 includes a pointing control device 101, a display image input unit 102, a cursor composition unit 103, and an image output unit 104. The display control device 100 is realized, for example, as the main unit of a video conference system, and the pointing control device 101, display image input unit 102, cursor composition unit 103, and image output unit 104 are implemented as part of an integrated circuit. The input/output terminal 200 includes a camera 201 and a display device 202. The display device 202 is realized, for example, by a large liquid crystal display.
FIG. 3 is a flowchart showing the operation of the display control device 100 including the pointing control device 101 according to the present embodiment.
As described above, the display control device including the pointing control device can determine, from among a plurality of operators, the instructor who designates the position of the pointing target, and can display a cursor at the position designated by the determined instructor.
The cursor position calculation unit 114 in Embodiment 1 takes as the cursor position the intersection of a straight line extending from the fingertip toward the display device and the display surface of the display device. However, when the display surface is large relative to the range over which the fingertip can move, there may be regions that the instructor cannot point to. In such a case, with cursor movement using a mouse, for example, when the mouse reaches the edge of the surface it is moved on, it can be lifted and repositioned to continue moving in the same direction. With the pointing control device of Embodiment 1, however, fingertip movement intended to move the cursor cannot be distinguished from fingertip movement not intended to move the cursor, so the movement range of the cursor cannot be extended by such a method.
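The Embodiment 1 mapping described above can be sketched as a ray-plane intersection. This is an illustrative sketch only: it assumes the display surface lies in the plane z = 0 of the camera coordinate system and takes the pointing ray's origin (the fingertip) and direction as given.

```python
def ray_display_intersection(origin, direction, display_z=0.0):
    """Intersect the pointing ray with the display plane z = display_z.
    Returns the (x, y) point on the display, or None if the ray is
    parallel to the plane or points away from it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None  # ray parallel to the display plane
    t = (display_z - oz) / dz
    if t < 0:
        return None  # ray points away from the display
    return (ox + t * dx, oy + t * dy)

# Fingertip 1.5 m in front of the display, pointing slightly right and down.
print(ray_display_intersection((0.2, 1.3, 1.5), (0.1, -0.2, -1.0)))
```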
As described above, the pointing control device according to this modification allows the instructor to control the movement distance of the cursor by changing the distance between the face and the fingertip. The instructor can therefore change the cursor's movement distance while keeping the fingertip's movement distance constant, and can control the cursor more closely to the instructor's intent. In addition, when the range over which the instructor's fingertip can move is narrow, increasing the cursor movement ratio by controlling the face-to-fingertip distance prevents the problem of regions of the screen that the cursor cannot reach.
The display control device including the pointing control device according to the present embodiment is characterized in that a plurality of cursors can be displayed simultaneously and a plurality of instructors each operate their own cursor.
As described above, the pointing control device according to the present embodiment can determine a plurality of instructors and assign an individual cursor to each instructor.
The display control device including the pointing control device according to the present embodiment is characterized in that it can be connected to a plurality of input/output terminals, so that the same screen is shared and the same cursor is operated at a plurality of sites.
As described above, when a conference is held across multiple sites, the screen control device according to the present embodiment can share the same screen among the sites via a network. The pointing control device according to the present embodiment can also determine, from among the operators participating at the multiple sites, the instructor who has the authority to move the cursor, and can switch the instructor easily. Furthermore, regardless of the distance from the camera being used, the instructor can appropriately move the cursor designating the position of the pointing target by moving the fingertip toward the camera.
(1) In Embodiment 1, the camera 201 has been described as a set of two cameras in order to acquire the positions of the face and fingertip, but the present invention is not necessarily limited to this case. For example, the camera 201 may be realized by a combination of one camera and a distance detection means such as an infrared sensor camera. In this case, the face detection unit 122 and the fingertip detection unit 123 detect the horizontal and vertical positions of the face and finger using the image from the ordinary camera, and by detecting the distance between those subjects and the camera with the infrared sensor camera, the positions of the face and fingertip can be detected.
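Combining a pixel position from the ordinary camera with a depth reading from the infrared sensor camera amounts to standard pinhole back-projection. A sketch with illustrative intrinsic parameters (`fx`, `fy`, `cx`, `cy` are assumed values, not taken from the patent):

```python
def pixel_depth_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project an image pixel (u, v) with a depth reading (meters)
    into a 3D camera-space point using a pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Face pixel from the RGB camera plus depth from the IR sensor.
face_xyz = pixel_depth_to_3d(740, 310, depth=2.1,
                             fx=600.0, fy=600.0, cx=640.0, cy=360.0)
print(face_xyz)
```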
101 Pointing control device
102 Display image input unit
103 Cursor composition unit
104 Image output unit
111 Position acquisition unit
112 Face-to-fingertip distance calculation unit
113 Instructor determination unit
114 Cursor position calculation unit
121 Image input unit
122 Face detection unit
123 Fingertip detection unit
200, 810a, 810b, 810c Input/output terminal
201 Camera
202 Display device
211 Camera image input unit
212 Image output unit
Claims (12)
- A pointing control device that determines, from among operators who perform an instruction operation, an instructor who can designate a cursor position, the pointing control device comprising:
a position acquisition unit that acquires the position of each operator's face and the position of each operator's fingertip or a substitute member thereof;
a distance calculation unit that associates the acquired position of each operator's face with the position of each operator's fingertip or substitute member thereof, and calculates the distance between each associated face position and the position of the fingertip or substitute member thereof;
an instructor determination unit that determines an instructor using the distance; and
a cursor position calculation unit that calculates a cursor position using the position of the fingertip or substitute member thereof of the determined instructor, and outputs the calculated cursor position.
- The pointing control device according to claim 1, wherein
the position acquisition unit acquires the position of each operator's face and the position of each operator's fingertip or substitute member thereof as three-dimensional information, and
the distance calculation unit creates a plurality of pairs, each consisting of the position of one face and the position of one fingertip or substitute member thereof associated with the position of the one face, and calculates the distance for all of the pairs.
- The pointing control device according to claim 2, wherein the instructor determination unit compares the magnitudes of the distances among the pairs to determine the instructor.
- The pointing control device according to claim 3, wherein the instructor determination unit determines, among all the pairs, the pair having the largest distance as the instructor.
- The pointing control device according to claim 2, being a pointing control device that outputs a plurality of pieces of cursor position information at the same time, wherein
the instructor determination unit compares the magnitudes of the distances among the plurality of pairs and determines a plurality of pairs from among all the pairs as instructors, and
the cursor position calculation unit calculates a cursor position from the position of each instructor's fingertip or substitute member thereof, and outputs cursor position information for each of the instructors.
- The pointing control device according to claim 5, wherein the instructor determination unit determines a plurality of instructors from the plurality of pairs in descending order of the distance.
- The pointing control device according to claim 1, wherein the position acquisition unit includes:
an image acquisition unit that acquires a plurality of camera images;
a face position detection unit that detects the position of an operator's face from the plurality of camera images; and
a fingertip position detection unit that detects the position of an operator's fingertip or substitute member thereof from the plurality of camera images.
- The pointing control device according to claim 1, wherein the cursor position calculation unit holds information for identifying the current instructor, judges whether a new instructor determined by the instructor determination unit is the same as the current instructor, and, when the result of the judgment is that they are the same, calculates the cursor position by taking the direction in which the position of the instructor's fingertip or substitute member thereof moved as the movement direction of the cursor.
- The pointing control device according to claim 8, wherein the cursor position calculation unit holds the current cursor position, calculates the movement distance of the cursor using the distance of the instructor, and outputs, as a new cursor position, the position obtained by moving from the current cursor position by the movement distance of the cursor in the movement direction of the cursor.
- The pointing control device according to claim 8, wherein the cursor position calculation unit outputs an initial position as the cursor position when the new instructor determined by the instructor determination unit is not the same as the current instructor.
- An integrated circuit that determines, from among operators who perform an instruction operation, an instructor who can designate a cursor position, the integrated circuit comprising:
a position acquisition unit that acquires the position of each operator's face and the position of each operator's fingertip or a substitute member thereof;
a distance calculation unit that associates the acquired position of each operator's face with the position of each operator's fingertip or substitute member thereof, and calculates the distance between each associated face position and the position of the fingertip or substitute member thereof;
an instructor determination unit that determines an instructor using the distance; and
a cursor position calculation unit that calculates a cursor position using the position of the fingertip or substitute member thereof of the determined instructor, and outputs the calculated cursor position.
- A pointing control method for determining, from among operators who perform an instruction operation, an instructor who can designate a cursor position, the method comprising:
a position acquisition step of acquiring the position of each operator's face and the position of each operator's fingertip or a substitute member thereof;
a distance calculation step of associating the acquired position of each operator's face with the position of each operator's fingertip or substitute member thereof, and calculating the distance between each associated face position and the position of the fingertip or substitute member thereof;
an instructor determination step of determining an instructor using the distance; and
a cursor position calculation step of calculating a cursor position using the position of the fingertip or substitute member thereof of the determined instructor, and outputting the calculated cursor position.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013510860A JP5798183B2 (ja) | 2011-04-22 | 2012-04-05 | ポインティング制御装置とその集積回路、およびポインティング制御方法 |
US13/701,944 US9081430B2 (en) | 2011-04-22 | 2012-04-05 | Pointing control device, integrated circuit thereof and pointing control method |
CN201280001738.9A CN102959491B (zh) | 2011-04-22 | 2012-04-05 | 定位控制装置和定位控制装置的集成电路、以及定位控制方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-095745 | 2011-04-22 | ||
JP2011095745 | 2011-04-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012144145A1 true WO2012144145A1 (ja) | 2012-10-26 |
Family
ID=47041280
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/002376 WO2012144145A1 (ja) | 2011-04-22 | 2012-04-05 | ポインティング制御装置とその集積回路、およびポインティング制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9081430B2 (ja) |
JP (1) | JP5798183B2 (ja) |
CN (1) | CN102959491B (ja) |
WO (1) | WO2012144145A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014006661A (ja) * | 2012-06-22 | 2014-01-16 | Yahoo Japan Corp | 表示制御装置、表示制御方法、情報表示システム、およびプログラム |
JP2016015078A (ja) * | 2014-07-03 | 2016-01-28 | Necパーソナルコンピュータ株式会社 | 表示制御装置、表示制御方法、及び、プログラム |
JP2019087136A (ja) * | 2017-11-09 | 2019-06-06 | シャープ株式会社 | 画面表示制御方法および画面表示制御システム |
JP2022008717A (ja) * | 2016-08-19 | 2022-01-14 | ヒュンダイ・アイティー カンパニー リミテッド | 音声と動作認識に基づいたスマートボードを制御する方法およびその方法を使用した仮想レーザーポインター |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2851281B1 (en) * | 2013-09-19 | 2016-04-06 | Airbus Operations GmbH | System and method for interactive visualization of information in an aircraft cabin |
US10102423B2 (en) * | 2016-06-30 | 2018-10-16 | Snap Inc. | Object modeling and replacement in a video stream |
US10902243B2 (en) * | 2016-10-25 | 2021-01-26 | Deep North, Inc. | Vision based target tracking that distinguishes facial feature targets |
US10942575B2 (en) * | 2017-06-07 | 2021-03-09 | Cisco Technology, Inc. | 2D pointing indicator analysis |
CN109325394B (zh) * | 2017-08-01 | 2022-06-21 | 苹果公司 | 确定稀疏图案照明与密集图案照明 |
CN110008802B (zh) * | 2018-12-04 | 2023-08-29 | 创新先进技术有限公司 | 从多个脸部中选择目标脸部及脸部识别比对方法、装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0519957A (ja) * | 1991-07-15 | 1993-01-29 | Nippon Telegr & Teleph Corp <Ntt> | 情報の入力方法 |
JPH11134089A (ja) * | 1997-10-29 | 1999-05-21 | Takenaka Komuten Co Ltd | ハンドポインティング装置 |
JPH11327753A (ja) * | 1997-11-27 | 1999-11-30 | Matsushita Electric Ind Co Ltd | 制御方法及びプログラム記録媒体 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3114813B2 (ja) | 1991-02-27 | 2000-12-04 | 日本電信電話株式会社 | 情報入力方法 |
US6176782B1 (en) * | 1997-12-22 | 2001-01-23 | Philips Electronics North America Corp. | Motion-based command generation technology |
US6353764B1 (en) | 1997-11-27 | 2002-03-05 | Matsushita Electric Industrial Co., Ltd. | Control method |
EP0991011B1 (en) * | 1998-09-28 | 2007-07-25 | Matsushita Electric Industrial Co., Ltd. | Method and device for segmenting hand gestures |
JP3847753B2 (ja) * | 2004-01-30 | 2006-11-22 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス |
US7893920B2 (en) * | 2004-05-06 | 2011-02-22 | Alpine Electronics, Inc. | Operation input device and method of operation input |
JP4608326B2 (ja) | 2005-01-26 | 2011-01-12 | 株式会社竹中工務店 | 指示動作認識装置及び指示動作認識プログラム |
JP4575829B2 (ja) * | 2005-03-30 | 2010-11-04 | 財団法人エヌエイチケイエンジニアリングサービス | 表示画面上位置解析装置及び表示画面上位置解析プログラム |
JP2007226397A (ja) | 2006-02-22 | 2007-09-06 | Sharp Corp | ポインティング装置、ポインティング方法、ポインティングプログラムおよびポインティングプログラムを記録した記録媒体 |
US8589824B2 (en) * | 2006-07-13 | 2013-11-19 | Northrop Grumman Systems Corporation | Gesture recognition interface system |
- 2012-04-05 JP JP2013510860A patent/JP5798183B2/ja not_active Expired - Fee Related
- 2012-04-05 CN CN201280001738.9A patent/CN102959491B/zh not_active Expired - Fee Related
- 2012-04-05 US US13/701,944 patent/US9081430B2/en not_active Expired - Fee Related
- 2012-04-05 WO PCT/JP2012/002376 patent/WO2012144145A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN102959491B (zh) | 2016-03-23 |
US20130076625A1 (en) | 2013-03-28 |
US9081430B2 (en) | 2015-07-14 |
JP5798183B2 (ja) | 2015-10-21 |
JPWO2012144145A1 (ja) | 2014-07-28 |
CN102959491A (zh) | 2013-03-06 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 201280001738.9; Country of ref document: CN
| WWE | Wipo information: entry into national phase | Ref document number: 13701944; Country of ref document: US
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12773618; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2013510860; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12773618; Country of ref document: EP; Kind code of ref document: A1