EP2610714B1 - Pointing behaviour enabled by a depth camera - Google Patents

Pointing behaviour enabled by a depth camera

Info

Publication number
EP2610714B1
Authority
EP
European Patent Office
Prior art keywords
target area
pointing
marking
pointing object
pointed position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP12290009.5A
Other languages
German (de)
English (en)
Other versions
EP2610714A1 (fr)
Inventor
Zhe Lou
Sigurd Van Broeck
Marc Van Den Broeck
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcatel Lucent SAS
Original Assignee
Alcatel Lucent SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcatel Lucent SAS
Priority to EP12290009.5A
Publication of EP2610714A1
Application granted
Publication of EP2610714B1
Legal status: Not-in-force (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means

Definitions

  • The invention relates to a method of and a marking device for marking a pointing position of a pointing object in a target area, and to a computer program.
  • When a presenter is making a presentation in front of an audience in an auditorium, the presenter may be located between a projector screen showing slides and the audience. If the presenter intends to mark a portion of a slide on the projector screen, he may physically point to that portion by raising his outstretched arm and pointing with his finger or his hand to the desired portion of the slide. The presenter may also use a pole for pointing to the portion of the slide. Alternatively, the presenter may use a laser pointer which emits a light beam visualized by a red light spot on the desired portion of the slide to be marked.
  • Using the pole or the laser pointer for marking a desired portion of the slide may result in poor marking of that portion, since the presenter may move during the marking procedure.
  • Since the presenter may be far away from the projector screen when pointing with his hand or finger, the audience cannot recognize the portion of the slide to be highlighted.
  • The laser pointer also does not leave a track along the propagation path of the light beam towards the projector screen, so the presenter is not able to accurately select the portion to be highlighted in advance.
  • Further, the displayed light spot of the laser pointer may have a small cross section such that the audience cannot see the marked portion of the slide.
  • Thus, the user experience of the presenter and/or the audience during the marking of a desired portion of the screen may be poor.
  • EP 0 571 702 A2 discloses a hand pointing type input unit and wall computer module, wherein direction calculating means are provided to calculate a direction which is indicated by a hand or finger of the user.
  • WO 2011/018901 A1 discloses an image recognition device, operation determination method, and a program.
  • The object defined above is solved by a method of marking a pointed position of a pointing object in a target area, a marking device for marking a pointed position of a pointing object in a target area, and a computer program according to the independent claims.
  • A computer program, when being executed by a processor, is configured to carry out or control a method of marking a pointed position of a pointing object in a target area.
  • The term "pointing position" is used as tantamount to the term "pointed position".
  • The term "pointing position" of a pointing object in a target area may particularly denote a position or a place in the target area which is pointed at by the pointing object.
  • The pointing position may be defined by a location and/or an orientation of the pointing object.
  • The term "target area" may particularly denote a two dimensional area.
  • The target area may be disposed substantially in a plane parallel to the acquiring unit of the marking device.
  • Alternatively, the target area may be disposed substantially in a plane oriented at a non-zero angle relative to the acquiring unit of the marking device.
  • The term "depth image" may particularly denote a two dimensional array of a plurality of basic units, for example pixels, indicating depth information of objects in an acquired field of view associated with the depth image.
  • The depth information of the depth image may be visualized by a grey colour scale of the basic units of the acquired depth image.
  • Thus, the depth image may indicate a depth map of the acquired field of view.
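As an illustration only (not part of the patent text), such a depth image can be modelled as a two dimensional array of distances whose depth information is visualized on a grey colour scale; the array size and depth range in this Python sketch are arbitrary assumptions:

```python
import numpy as np

# Hypothetical 480 x 640 depth image: each pixel stores the distance
# (in metres) from the camera to the nearest surface along its ray.
depth_image = np.random.uniform(0.5, 5.0, size=(480, 640))

# Visualize the depth map on an 8-bit grey scale:
# near surfaces map to dark values, far surfaces to bright values.
span = depth_image.max() - depth_image.min()
grey = ((depth_image - depth_image.min()) / span * 255).astype(np.uint8)
```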
  • The phrase "the pointing object points to the pointing position in the target area" may particularly denote that a pointing axis of the pointing object essentially hits the pointing position being part of the target area.
  • Thus, a pointing position defined by a pointing object in a target area may be highlighted or emphasized by a displayed marking.
  • To this end, a depth image of the pointing object and the target area may be acquired.
  • Spatial information of the pointing object relative to the target area may be derived from the acquired depth image, in order to determine a location of the pointing position of the pointing object in the target area.
  • The determined location information may then be used as an input parameter to display the marking of the pointing position in the target area.
  • Using an acquired depth image may enable an accurate and reliable measure to determine the location of the pointing position in the target area, since information from a single time instance may be used for the marking of the pointing position in the target area. In particular, movements of the pointing object do not reduce the accuracy of the marking of the pointing position.
  • Thus, the method may be executed in an efficient and easy way.
  • Although the pointing object may be arranged distant from the target area, an observer of the target area may, in comparison to a person pointing with his finger or arm, easily recognize the pointing position, since a marking is displayed in the target area.
  • Thus, the user experience of a person associated with the pointing object and/or the marking device, and/or of a person observing the target area, may be enhanced.
  • The pointing object may comprise an arm and/or a hand of a person.
  • Alternatively, the pointing object may be configured as a wooden pole to be held by a person, or as any other elongated object held in the same manner.
  • The marking may be displayed at the determined location of the pointing position and/or at a predefined location distinct from the location of the pointing position.
  • Thus, the marking of the pointing position in the target area may be executed in a predictable way, such that a user associated with the pointing object and/or the marking device is enabled to adapt his behaviour to optimize the marking procedure.
  • The acquiring unit of the marking device and the target area may be fixed in a stationary position.
  • The pointing object may be located between the acquiring unit and the target area, and thus may be situated such that the pointing object has no physical contact with the target area.
  • The pointing object may also be essentially stationary relative to the marking device and the target area, thereby facilitating the determination of the location of the pointing position and thus enhancing the user experience.
  • The method may further comprise determining a distance between a reference point of the pointing object and the target area, and the determining of the location of the pointing position may comprise determining the location of the pointing position based on the determined distance.
  • The term "distance" may particularly denote the shortest connection between the reference point and the target area.
  • The distance may geometrically correspond to a distance line starting at the reference point and crossing the target area at a right angle at a perpendicular foot. Accordingly, in order to determine the distance between the reference point and the target area, a reference point may be selected in the depth image. A distance line may then be determined comprising the reference point and the perpendicular foot.
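A minimal sketch of this step, assuming the target area lies in a plane given by a point and a normal derived from the depth image; the function name and the example coordinates are invented for illustration:

```python
import numpy as np

def perpendicular_foot(reference_point, plane_point, plane_normal):
    """Drop a perpendicular from the reference point onto the plane
    and return the perpendicular foot together with the (shortest)
    distance between the reference point and the plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    signed_distance = np.dot(reference_point - plane_point, n)
    foot = reference_point - signed_distance * n
    return foot, abs(signed_distance)

# Example: target area in the plane z = 0, reference point 2.5 m away.
foot, distance = perpendicular_foot(np.array([1.2, 0.8, 2.5]),
                                    np.array([0.0, 0.0, 0.0]),
                                    np.array([0.0, 0.0, 1.0]))
print(foot, distance)  # -> [1.2 0.8 0. ] 2.5
```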
  • Thus, the accuracy of the marking procedure may be enhanced, since geometric information of the pointing object and the target area is used.
  • The reference point may correspond to any point of the pointing object.
  • For example, the reference point may be a point on a surface of the pointing object and/or a point on the pointing axis.
  • The method may further comprise determining a pointing angle defined between the pointing axis and a distance line associated with a distance between the pointing object and the target area, and the determining of the location of the pointing position may comprise determining the location of the pointing position based on the determined pointing angle.
  • The latter-mentioned distance and distance line may correspond to the distance and the distance line explained above.
  • The determination of the pointing angle may comprise evaluating a dot product of the distance line and a line representation of the pointing axis.
  • A suitable algorithm may be used for numerically determining the location of the pointing position.
  • Alternatively, the location of the pointing position in the target area may be analytically determined based, for example, on trigonometric identities.
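A hedged sketch of the dot-product evaluation, assuming the pointing axis is approximated by a line through two reference points of the pointing object (names and coordinates are illustrative, not taken from the patent):

```python
import numpy as np

def pointing_angle(reference_point, second_reference_point, foot):
    """Angle between the pointing axis (through the two reference
    points) and the distance line (from the reference point to the
    perpendicular foot), evaluated via the dot product."""
    axis = second_reference_point - reference_point
    distance_line = foot - reference_point
    cos_a = np.dot(axis, distance_line) / (
        np.linalg.norm(axis) * np.linalg.norm(distance_line))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))  # radians

angle = pointing_angle(np.array([1.2, 0.8, 2.5]),
                       np.array([1.3, 0.9, 2.0]),
                       np.array([1.2, 0.8, 0.0]))
print(np.degrees(angle))  # ~15.8 degrees for this example
```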
  • The acquiring of the depth image may comprise time of flight based sensing of the pointing object and the target area, or stereo matching based sensing of the pointing object and the target area.
  • The time of flight based sensing may comprise emitting light pulses, for example of infrared wavelengths, to all objects in a field of view of the marking device, and sensing the light reflected from a surface of each object, such that depth information may be acquired based on the measured propagation time of the emitted light from its emission instant to its sensing instant.
  • Stereo matching based sensing may employ correlating information of two separately acquired views of a same object point, in order to acquire depth information in a field of view of the marking device.
  • Thus, conventional procedures may be employed for acquiring the depth image, thereby facilitating the marking of the pointing position in the target area.
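For the time of flight case, the depth per basic unit follows from the round-trip propagation time of the light pulse; the patent does not spell the relation out, but it is the standard one:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_time_s):
    """The light pulse travels to the reflecting surface and back,
    so the one-way depth is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A measured round trip of 20 ns corresponds to a surface ~3 m away.
print(tof_depth(20e-9))  # -> 2.99792458 (metres)
```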
  • The method may further comprise generating a feedback signal indicating the determined location of the pointing position in the target area, and providing the generated feedback signal for adapting the location of the pointing position of the pointing object.
  • The steps of generating the feedback signal and providing the feedback signal may be executed prior to the step of displaying the marking of the pointing position in the target area.
  • The providing of the feedback signal may be accomplished, for example, by visualizing or displaying an indicator in the target area indicating the determined pointing position.
  • The indicator may, for example, be configured as a red light spot caused by a light beam emitted by the displaying unit of the marking device.
  • A location and/or an orientation of the pointing object may then be adapted based on the feedback signal, such that an optimum location of the pointing position may be determined.
  • Thus, the accuracy of the marking of the pointing position in the target area may be significantly increased, since the optimum location of the pointing position may be iteratively determined based on the feedback signal.
  • Further, the stability of an underlying algorithm may be increased, since the feedback signal may be used as a further input parameter.
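The resulting feedback loop might be pictured as below; the callables, the 2-D position tuples and the convergence criterion are assumptions for the sake of the sketch, not anything prescribed by the patent:

```python
def refine_pointing_position(acquire_depth_image, locate_position,
                             show_indicator, tolerance=0.01,
                             max_iterations=10):
    """Repeatedly show a feedback indicator at the estimated pointed
    position so the user can adjust the pointing object; stop once two
    successive 2-D estimates agree within `tolerance`."""
    previous = None
    position = None
    for _ in range(max_iterations):
        position = locate_position(acquire_depth_image())
        show_indicator(position)  # e.g. a small red light spot
        if previous is not None \
                and abs(position[0] - previous[0]) < tolerance \
                and abs(position[1] - previous[1]) < tolerance:
            break  # the user has settled on the desired position
        previous = position
    return position
```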
  • The method may further comprise determining a configuration of the pointing object, and the displaying of the marking may comprise displaying the marking based on the determined configuration of the pointing object.
  • The term "configuration" of the pointing object may particularly denote a shape of the pointing object at one time instance, for example during the acquisition of the depth image.
  • For example, the pointing object may comprise an elongated shape, or an angled shape in terms of an ending portion of the pointing object being angled with respect to the pointing axis defined by the essentially elongated shape of the pointing object.
  • Thus, a static characteristic of the marking to be displayed may be selected based on the determined configuration of the pointing object.
  • An association between a configuration of the pointing object and the selected characteristic of the marking may be predefined, for example by the user of the marking device. Therefore, the flexibility of the marking procedure may be increased.
  • The characteristic of the marking to be selected may comprise or may be configured as a shape and/or a colour of the marking.
  • For example, a shape of the marking may comprise at least one of a line, an arrow, an ovoid, a hand, a foot, and an animal.
  • The method may further comprise determining a time dependent change of a configuration of the pointing object, and the displaying of the marking may comprise displaying the marking based on the determined time dependent change of the configuration of the pointing object.
  • The latter-mentioned configuration may correspond to the above explained configuration of the pointing object.
  • The term "time dependent change of the configuration" of the pointing object may particularly denote a shape change of the pointing object during a time interval associated with a plurality of depth images.
  • Thus, a time-varying characteristic of the marking may be selected based on the time dependent change of the configuration of the pointing object. For example, the marking may then comprise at least one of a line being drawn, a colour change, a circle being drawn, and hand writing.
  • An association between a time dependent change of the configuration of the pointing object and the selected characteristic of the marking may be predefined, for example by the user of the marking device.
  • Thus, an interaction between a person observing the target area and a user associated with the marking device and/or the pointing object may be enabled, thereby enhancing the user experience of both persons.
  • For example, the time dependent change of the configuration of the pointing object may correspond to a hand gesture of the hand of the person. Determination of the time dependent change of the configuration of the pointing object may be based on pattern recognition, whereby a time dependent change of the configuration derived from a plurality of depth images may be compared to stored information of possible time dependent changes of the configuration of the pointing object or of a similarly shaped pointing object.
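Such predefined associations could be held in simple lookup tables; the posture and gesture labels below are invented, and only the ellipse and the yellow-to-green rotation example actually appear later in the description:

```python
# Hypothetical user-defined associations between recognized hand
# postures / gestures and the marking characteristic to display.
POSTURE_TO_SHAPE = {
    "forefinger_extended": "ellipse",
    "flat_hand": "arrow",
}
GESTURE_TO_EFFECT = {
    "rotation_about_pointing_axis": "colour change from yellow to green",
    "circular_motion": "circle being drawn",
}

def select_marking(posture, gesture=None):
    """Select a static shape and, optionally, a time-varying effect."""
    shape = POSTURE_TO_SHAPE.get(posture, "line")
    effect = GESTURE_TO_EFFECT.get(gesture) if gesture else None
    return shape, effect

print(select_marking("forefinger_extended", "rotation_about_pointing_axis"))
```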
  • The method may further comprise acquiring at least one further depth image of the pointing object and the target area, wherein the pointing object is arranged distant from the target area and points to the pointing position in the target area, and the determining of the location of the pointing position in the target area may be based on the at least one further acquired depth image.
  • Thus, a plurality of depth images may be acquired for the determining of the location of the pointing position in the target area.
  • The method may further comprise averaging the determined locations of the pointing position derived from the plurality of depth images, in order to increase the accuracy, reliability and stability of the marking procedure. In particular, unintentional movements of the pointing object may thereby be accounted for.
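The averaging itself is a one-liner; this sketch assumes the per-image estimates are 2-D coordinates in the plane of the target area:

```python
import numpy as np

def average_pointed_position(positions):
    """Average the pointed positions determined from a plurality of
    depth images, smoothing out unintentional movements."""
    return np.asarray(positions).mean(axis=0)

# Three estimates of (x3, y3) from successive depth images:
print(average_pointed_position([(1.02, 0.48), (0.98, 0.52), (1.00, 0.50)]))
# -> [1.   0.5]
```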
  • The displaying of the marking may comprise adapting a characteristic of the displayed marking based on the determined configuration of the pointing object and/or the determined time dependent change of the configuration of the pointing object.
  • Thus, a characteristic of the displayed marking may be changed in order to vary the emphasis to be put on the pointing position.
  • The target area may comprise a projector screen, a television screen or a board.
  • Thus, the method and the marking device may be used in conjunction with a presenter making a presentation using a projector screen. Further, displayed video information on a television screen may be marked during a video conference. Further, stationary information, for example adhesive tabs, fixed on a board may be marked. The method and the marking device are thus versatilely applicable.
  • The marking device may be configured to automatically execute one or more steps of the method.
  • Thus, the potential for failures owing to human error may be reduced.
  • The marking device may be configured as a depth camera, particularly a time of flight camera, a Zcam, or a stereo camera.
  • The arrangement 100 comprises a marking device 102 in the form of a time of flight camera, a target area 104 which is part of a projector screen, and a presenter 106.
  • For ease of explanation, the target area and the projector screen will be denoted in the following with the common reference numeral 104.
  • The projector screen 104 is configured to show time-varying information which is conveyed in successively displayed slides.
  • Fig. 1 shows a screen shot of a slide displayed on the projector screen 104.
  • The time of flight camera 102 and the projector screen 104 are arranged in a stationary way and are located at a known distance from one another. Further, the time of flight camera 102 comprises a field of view in which the projector screen 104 and the presenter 106 are located. The presenter 106 is standing between the projector screen 104 and the time of flight camera 102.
  • The arrangement 100 is associated with the presenter 106 making a presentation in front of an audience, and enables the presenter 106 to highlight or emphasize a particular portion of a slide displayed on the projector screen 104 by overlaying a marking over the slide.
  • In a step S0, the method starts.
  • The presenter 106 points to the particular portion using a pointing object 108.
  • The pointing object 108 comprises an essentially elongated shape along a pointing axis 110, and is configured as a right arm 112 and a right hand 114 of the presenter 106.
  • Alternatively, the pointing object 108 may be configured as a wooden pole or the like which may be held by the presenter 106.
  • In a step S2, the time of flight camera 102 acquires a depth image of the presenter 106 standing in front of the projector screen 104 and pointing to a pointing position 116 in the slide shown on the projector screen 104.
  • The pointing position 116 of the pointing object 108 is associated with the pointing axis 110 intersecting the projector screen 104.
  • Here, the word "Design" shown in the slide on the projector screen 104 represents the pointing position 116.
  • The image acquisition is based on time of flight based sensing of the field of view of the time of flight camera 102.
  • Alternatively, the marking device 102 may be configured as a stereo camera operating using stereo matching based sensing of the pointing object 108 and the target area 104.
  • The time of flight camera 102 associates the acquired depth image with a Cartesian coordinate system 320 comprising an x-axis 322, a y-axis 324 and a z-axis 326. Accordingly, the time of flight camera 102 attributes Cartesian coordinates to edge points of the projector screen 104, with the z-coordinate of the projector screen 104 being set to zero.
  • The edge points of the projector screen 104 comprise the coordinates (xp1, yp1, 0), (xp2, yp2, 0), (xp3, yp3, 0), and (xp4, yp4, 0).
  • A reference point 327 of the pointing object 108 is selected by the time of flight camera 102.
  • The reference point 327 is part of the pointing object 108, and is a point on a surface of the arm 112 of the presenter 106.
  • The reference point 327 comprises the coordinates (x1, y1, z1) in the coordinate system 320.
  • In a step S5, the time of flight camera 102 determines a distance x between the reference point 327 and the projector screen 104 by dropping a perpendicular line between the reference point 327 and a plane 328 comprising the projector screen 104, i.e. the edge points of the projector screen 104. Accordingly, a perpendicular foot 330 in the plane 328 is identified, which has the coordinates (x2, y2, 0). In a case in which the arm 112 of the presenter 106 is located in front of the projector screen 104, the perpendicular foot 330 may also be located within the area bordered by the edge points of the projector screen 104. The distance x is calculated by subtracting, coordinate by coordinate, the perpendicular foot 330 from the reference point 327.
  • The time of flight camera 102 then determines a pointing angle A which is defined between the pointing axis 110 and a distance line 332 associated with the distance x between the reference point 327 of the pointing object 108 and the perpendicular foot 330.
  • To this end, another reference point 334 of the arm 112 is selected, and the distance between the reference point 327 and the other reference point 334 is calculated using the coordinates of these two points 327, 334.
  • A line through the reference point 327 and the other reference point 334 approximates the pointing axis 110.
  • The pointing angle A is then determined from the line through the reference point 327 and the other reference point 334 and from the distance line 332, in that a dot product of these two lines is evaluated and normalized by the distance between the two points 327, 334 and the distance x.
  • Information about the coordinates of the edge points of the projector screen 104 is employed to determine the location of the pointing position 116.
  • In a step S7, a location of the pointing position 116 on the projector screen 104 is determined based on the coordinates of the reference point 327, the coordinates of the perpendicular foot 330, the determined pointing angle A, and the perpendicular angle B defined between the distance line 332 and the plane 328.
  • Trigonometric identities may, for example, be used for this determining step.
  • The coordinates of the pointing position 116 are identified to be (x3, y3, 0).
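To make the geometry concrete, here is a hedged numeric sketch of the whole chain (the patent only gives the symbolic coordinates (x1, y1, z1), (x2, y2, 0) and (x3, y3, 0); the numbers below are invented). The pointing position follows from intersecting the pointing axis 110 with the screen plane z = 0:

```python
import numpy as np

# Invented example coordinates in the coordinate system 320.
p327 = np.array([1.0, 1.2, 2.0])     # reference point 327 on the arm
p334 = np.array([1.1, 1.3, 1.5])     # other reference point 334
foot330 = np.array([1.0, 1.2, 0.0])  # perpendicular foot 330 in plane 328

# Distance x: subtract the perpendicular foot from the reference point.
x = np.linalg.norm(p327 - foot330)   # -> 2.0

# Pointing angle A via the dot product of the pointing axis 110 and
# the distance line 332, normalized by the two lengths.
axis = p334 - p327
A = np.arccos(np.dot(axis, foot330 - p327) / (np.linalg.norm(axis) * x))

# Pointing position 116: intersection of the pointing axis with the
# screen plane z = 0, i.e. solve p327[2] + t * axis[2] = 0 for t.
t = -p327[2] / axis[2]
x3, y3, _ = p327 + t * axis
print(x, np.degrees(A), (x3, y3))    # -> 2.0, ~15.8 deg, (1.4, 1.6)
```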
  • In a step S8, the time of flight camera 102 determines a configuration of the pointing object 108, namely a hand posture of the hand 114, using pattern recognition, in order to select a characteristic of a marking to be displayed at the pointing position 116. Since the hand 114 is straightened and exclusively a forefinger of the hand 114 is pointing to the pointing position 116, the time of flight camera 102 determines that an ellipse-type marking is to be selected.
  • In a step S9, a time dependent change of the configuration of the pointing object, namely a hand gesture of the hand 114 of the presenter 106, is determined within a predetermined time interval, in order to select a time-varying characteristic of the marking to be displayed.
  • To this end, a plurality of depth images is acquired, and pattern recognition is employed to identify the hand gesture. Assuming the hand to be rotating around the pointing axis 110, the time of flight camera 102 determines that the colour of the marking to be displayed is to be changed from yellow to green.
  • A first part of the marking may, for example, be a red circle, and a second part of the marking may, for example, be a yellow line.
  • In a step S10, the time of flight camera 102 displays the marking 436 in the form of a yellow ellipse around the location of the pointing position 116 in the slide of the projector screen 104, in order to mark the word "Design" at the pointing position 116.
  • The method then stops. Alternatively, the steps S1 to S10 are repeated for the complete time during which the presenter 106 gives his presentation.
  • Optionally, a plurality of depth images may be acquired by the time of flight camera 102. Coordinates of determined locations of the pointing position 116, and respective results of the determined configurations of the pointing object 108 and of the determined change of the configuration derived from the plurality of depth images, may then be averaged.
  • Optionally, the time of flight camera 102 may generate a feedback signal 438 indicating a determined pointing position 116 in the projector screen 104, and may provide the generated feedback signal 438 to the presenter 106 prior to the display of the marking 436 at the determined pointing position 116 in the step S10.
  • Thus, the presenter 106 is capable of determining whether the determined pointing position 116 coincides with a desired pointing position, such that the location of the marking 436 to be displayed may be adapted prior to the display of the marking 436.
  • The feedback signal 438 may be formed by a red laser spot of small cross section, which results from a laser beam emitted by the time of flight camera 102 and hitting the projector screen 104.
  • Since the laser spot has a small cross section, it may be visible only to the presenter 106.
  • When the presenter 106 recognizes the laser spot in the slide on the projector screen 104, he may adapt a location and/or an orientation of a part of the pointing object 108, namely the arm 112, to optimize the determined location of the pointing position 116.
  • The latter procedure may be executed prior to the step S10, and optionally also prior to the steps S8 and/or S9, and may be repeated several times until the desired location of the pointing position 116 coincides with the determined location of the pointing position 116.
  • The time of flight camera 102 may acquire a depth image for each generated feedback signal 438.
  • Further, the presenter 106 may change the posture of his hand 114 and/or the time dependent gesture of his hand 114.
  • The time of flight camera 102 may accordingly determine the change using successively acquired depth images, and the characteristic of the marking 436 may be adapted either during the display of the marking 436 or between successive displays of the marking 436.
  • The time of flight camera 102 comprises an acquiring unit 540 configured to acquire the depth image of the pointing object 108 and the target area 104. To this end, as explained above, the pointing object 108 is arranged distant from the target area 104 and points to the pointing position 116 in the target area 104. Further, the time of flight camera 102 comprises a first determining unit 542 configured to determine the location of the pointing position 116 in the target area 104 based on the acquired depth image, and a displaying unit 544 configured to display the marking 436 of the pointing position 116 in the target area 104 based on the determined location of the pointing position 116. The first determining unit 542 is also configured to determine the distance x and the pointing angle A based on the acquired depth image.
  • Further, the time of flight camera 102 comprises a second determining unit 546 configured to determine a configuration of the pointing object 108.
  • The second determining unit 546 represents a hand posture determining unit operating based on pattern recognition of the hand posture of the hand 114 of the presenter 106.
  • A third determining unit 548 of the time of flight camera 102 is configured to determine a time dependent change of a configuration of the pointing object 108, and represents a hand gesture determining unit operating based on pattern recognition of the gesture of the hand 114 of the presenter 106.
  • The displaying unit 544 is also configured to display or adapt the display of the marking 436 based on the determined configuration of the pointing object 108, i.e. the hand posture of the hand 114, and based on the determined time dependent change of the configuration of the pointing object 108, i.e. the hand gesture of the hand 114 of the presenter 106.
  • The acquiring unit 540 also comprises a light emitting subunit configured to emit light pulses.
  • The acquiring unit 540 additionally comprises a light receiving subunit configured to receive the emitted light pulses after a reflection of the emitted light pulses at the pointing object 108, the target area 104, and other objects in the field of view of the time of flight camera 102.
  • For example, the light emitting subunit is configured as a light emitting diode (LED), and the light receiving subunit as a photodiode.
  • At least two of the above mentioned units 540-548 of the time of flight camera 102 may be integrally formed and/or may be part of a processor. Further, the association between functional units and actual physical units of the time of flight camera 102 may differ from the above described embodiment.
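Viewed as software, the units 540-548 might be wired together roughly as follows; this is an architecture sketch under the assumption that each unit is a callable, not an implementation disclosed by the patent:

```python
class MarkingDevice:
    """Functional view of the marking device 102; two or more of
    these units may be integrally formed or share one processor."""

    def __init__(self, acquire, locate, recognize_posture,
                 recognize_gesture, display):
        self.acquire = acquire                      # acquiring unit 540
        self.locate = locate                        # first determining unit 542
        self.recognize_posture = recognize_posture  # second determining unit 546
        self.recognize_gesture = recognize_gesture  # third determining unit 548
        self.display = display                      # displaying unit 544

    def mark(self, num_images=5):
        """One marking cycle over a plurality of depth images."""
        images = [self.acquire() for _ in range(num_images)]
        position = self.locate(images)           # averaged pointed position
        shape = self.recognize_posture(images[-1])
        effect = self.recognize_gesture(images)  # time dependent change
        self.display(position, shape, effect)
```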

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Claims (8)

  1. A method of marking a pointed position (116) of a pointing object (108) in a target area (104) which comprises a projector screen, a television screen or a board, wherein the pointing object (108) comprises an elongated shape along a pointing axis (110), wherein the pointed position (116) is associated with the pointing axis (110) intersecting the target area (104), the method comprising the following steps:
    - acquiring (S2) a depth image of the pointing object (108) and of the target area (104) using a camera (102), wherein the pointing object (108) is located distant from the target area (104) and points to the pointed position (116) in the target area (104),
    - determining (S7) a location of the pointed position (116) in the target area (104) based on the acquired depth image,
    - generating (S10) a feedback signal (438) indicating the determined location of the pointed position (116) in the target area (104), wherein the feedback signal (438) results from a laser beam emitted by the camera (102),
    - determining (S8) a configuration of the pointing object (108), namely a posture of a hand (114), using pattern recognition, in order to select a characteristic of a marking to be displayed at the pointed position (116),
    - determining (S9) a time dependent change of the configuration of the pointing object, namely a gesture of the hand (114) of the presenter (106), within a predetermined time interval, in order to select a time-varying characteristic of a marking (436) to be displayed, wherein a plurality of depth images is acquired and pattern recognition is used to identify the hand gesture, characterized by the following steps:
    - providing the generated feedback signal (438) to a presenter (106) prior to the display of the marking (436) at the determined location of the pointed position (116), and
    - displaying (S10) the marking (436) of the pointed position (116) in the target area (104) based on the determined location of the pointed position (116).
  2. The method according to claim 1, the method further comprising the following step:
    - determining (S5) a distance (x) between the reference point (327) of the pointing object (108) and the target area (104), wherein the determining (S7) of the location of the pointed position (116) comprises determining (S7) the location of the pointed position (116) based on the determined distance (x).
  3. The method according to claim 1 or 2, the method further comprising the following step:
    - determining (S7) a pointing angle (A) defined between the pointing axis (110) and a distance line (332) associated with a distance (x) between the pointing object (108) and the target area (104),
    wherein the determining (S7) of the location of the pointed position (116) comprises determining (S7) the location of the pointed position (116) based on the determined pointing angle (A).
  4. The method according to any one of claims 1 to 3, wherein the acquiring (S2) of the depth image comprises time of flight based sensing of the pointing object (108) and the target area (104), or stereo matching based sensing of the pointing object (108) and the target area (104).
  5. The method according to any one of claims 1 to 4, wherein the displaying (S10) of the marking (436) comprises adapting a characteristic of the displayed marking (436) based on the determined configuration of the pointing object (108) and/or the determined time dependent change of the configuration of the pointing object (108).
  6. A marking device (102) for marking a pointed position (116) of a pointing object (108) in a target area (104) which comprises a projector screen, a television screen or a board, wherein the pointing object (108) comprises an elongated shape along a pointing axis (110), wherein the pointed position (116) is associated with the pointing axis (110) intersecting the target area (104), the marking device (102) comprising:
    - means for emitting a laser beam,
    - an acquiring unit (540) configured to acquire a depth image of the pointing object (108) and of the target area (104), wherein the pointing object (108) is arranged distant from the target area (104) and points to the pointed position (116) in the target area (104),
    - a first determining unit (542) configured to determine a location of the pointed position (116) in the target area (104) based on the acquired depth image,
    - the first determining unit (542) and a displaying unit (544) being configured to generate a feedback signal (438) indicating the determined location of the pointed position (116) in the target area (104), wherein the feedback signal (438) results from a laser beam emitted by the marking device (102),
    - a second determining unit (546) configured to determine a configuration of the pointing object (108), namely a posture of a hand (114), using pattern recognition, in order to select a characteristic of a marking (436) to be displayed at the pointed position (116),
    - a third determining unit (548) configured to determine a time dependent change of the configuration of the pointing object (108), representing a hand gesture determining unit operating based on pattern recognition of the gesture of the hand (114) of the presenter (106), characterized in that
    - the marking device (102) is configured to provide the generated feedback signal (438) to a presenter (106) prior to the display of the marking (436) at the determined location of the pointed position (116), and
    - the displaying unit (544) is configured to display a marking (436, 438) of the pointed position (116) in the target area (104) based on the determined location of the pointed position (116).
  7. The marking device (102) according to claim 6, wherein the marking device (102) is configured as a depth camera.
  8. A computer program which, when executed by a processor, is configured to carry out or control a method of marking a pointed position (116) of a pointing object (108) in a target area (104) according to any one of claims 1 to 5.
EP12290009.5A 2012-01-02 2012-01-02 Pointing behaviour enabled by a depth camera Not-in-force EP2610714B1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP12290009.5A EP2610714B1 (fr) 2012-01-02 2012-01-02 Pointing behaviour enabled by a depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP12290009.5A EP2610714B1 (fr) 2012-01-02 2012-01-02 Pointing behaviour enabled by a depth camera

Publications (2)

Publication Number Publication Date
EP2610714A1 (fr) 2013-07-03
EP2610714B1 (fr) 2016-08-10

Family

ID=45656765

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12290009.5A Not-in-force EP2610714B1 (fr) 2012-01-02 2012-01-02 Comportement de pointage activé par une caméra à profondeur

Country Status (1)

Country Link
EP (1) EP2610714B1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331192B (zh) * 2014-10-20 2017-12-08 深圳市天英联合教育股份有限公司 Electronic whiteboard display method
WO2016204743A1 (fr) 2015-06-17 2016-12-22 Hewlett Packard Enterprise Development Lp Pointing action

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0571702A3 (fr) * 1992-05-26 1994-10-12 Takenaka Corp Hand pointing type input unit and wall computer module.
JP4701424B2 (ja) * 2009-08-12 2011-06-15 島根県 Image recognition device, operation determination method, and program

Also Published As

Publication number Publication date
EP2610714A1 (fr) 2013-07-03

Similar Documents

Publication Publication Date Title
US11652965B2 (en) Method of and system for projecting digital information on a real object in a real environment
EP3283938B1 (fr) Gesture interface
KR101954855B1 (ko) Use of intensity variations of a light pattern for depth mapping of objects in a volume
US9430698B2 (en) Information input apparatus, information input method, and computer program
US9075455B2 (en) Method for determining the relative position of an object in an area, and optical input system
KR101560308B1 (ko) Method and electronic device for virtual handwriting input
US10878633B2 (en) Augmented reality-based measuring system
KR20020086931A (ko) Single camera system for gesture-based input and target indication
US10444344B2 (en) Optical sensor-based position sensing of a radio frequency imaging device
US20190242692A1 (en) Augmented reality-based system with perimeter definition functionality
CN108572730B (zh) 用于使用深度感知相机与计算机实现的交互式应用程序进行交互的系统和方法
EP2610714B1 (fr) Pointing behaviour enabled by a depth camera
JP6070211B2 (ja) Information processing apparatus, system, image projection apparatus, information processing method, and program
US20200391317A1 (en) Welding operation measurement system
US20140204361A1 (en) Laser range finding
US10310080B2 (en) Three dimensional manufacturing positioning system
US10372268B2 (en) Spatial image display apparatus and spatial image display method
US9860519B2 (en) Method for correcting image phase
JP5623966B2 (ja) Method and program for supporting installation of a retroreflective material in a portable electronic blackboard system
KR100932679B1 (ko) Probe pen capable of measuring length
JP2017040840A (ja) Information processing apparatus, information processing method, and projection apparatus
KR20100120556A (ko) Touch coordinate detection apparatus using a line beam
JP2011237290A (ja) Coordinate measuring apparatus and method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20140103

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

111Z Information provided on other rights and legal means of execution

Free format text: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

Effective date: 20140303

17Q First examination report despatched

Effective date: 20140324

D11X Information provided on other rights and legal means of execution (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ALCATEL LUCENT

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602012021432

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06F0003030000

Ipc: G06F0003010000

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/01 20060101AFI20160218BHEP

Ipc: G06F 3/03 20060101ALI20160218BHEP

INTG Intention to grant announced

Effective date: 20160323

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

RIN1 Information on inventor provided before grant (corrected)

Inventor name: LOU, ZHE

Inventor name: VAN BROECK, SIGURD

Inventor name: VAN DEN BROECK, MARC

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 819596

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602012021432

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20160810

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 819596

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160810

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161210

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161110

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161212

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161111

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602012021432

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161110

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20170511

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170131

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170131

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170102

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170102

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20180122

Year of fee payment: 7

Ref country code: GB

Payment date: 20180119

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20180119

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20120102

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602012021432

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20190102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190131

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190801

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160810