EP3281097A1 - Interactive electronic projection device - Google Patents

Interactive electronic projection device

Info

Publication number
EP3281097A1
Authority
EP
European Patent Office
Prior art keywords
image
projection
detector
frame
bracelet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16717309.5A
Other languages
English (en)
French (fr)
Inventor
Pascal Pommier
Guillaume POMMIER
Nicolas CRUCHON
Fabien VIAUT-NOBLET
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cn2p
Original Assignee
Cn2p
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cn2p filed Critical Cn2p
Publication of EP3281097A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1639 Details related to the display arrangement, including those related to the mounting of the display in the housing the display being based on projection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection

Definitions

  • the field of the invention relates to interactive projection devices for projecting an image and offering interactivity on an area of the projected image.
  • the invention aims to overcome the aforementioned drawbacks.
  • An object of the invention relates to a projection device for displaying an interactive digital content intended to be projected on a surface.
  • the device of the invention comprises a projection frame.
  • the projection frame includes: a transmitter of a light beam in a non-visible frequency band forming a light sheet to cover a first area of a surface;
  • a projector projecting an image onto a second area
  • a first detector capturing an image of the second zone
  • An object of the invention relates to a projection device for displaying an interactive digital content intended to be projected on a surface, characterized in that it comprises a projection frame comprising a part forming a holding base.
  • the projection frame includes:
  • a projector projecting an image onto a second area
  • a first sensor capturing an image of the second area
  • a computer determining at least a position of at least one interaction point of the light beam through the analysis of a trace of the image acquired by the first detector.
  • the transmitter and the first detector are arranged on a lower part of the projection frame and the image projector is arranged on an upper part of the projection frame, said projection frame comprising a rotation means for rotating the upper part of the projection frame about a vertical axis relative to the lower part of the projection frame.
  • the projection frame comprises a first side comprising the image projector on an upper part of the projection frame and a second side opposite to the first side on which the transmitter and the first detector are arranged on the lower part of the projection frame, said transmitter projecting the light sheet in a direction opposite to the image projection direction.
  • the projection frame comprises a first side comprising the image projector, the transmitter and the detector, the transmitter projecting the light sheet in a direction substantially parallel to the projection direction of the images, so that the first zone is substantially superimposed on the second zone.
  • the image projector is disposed on an upper part of the projection frame and the transmitter is arranged on a lower part of the projection frame.
  • the projection frame comprises a rotation means for pivoting an upper part of the projection frame along a vertical axis vis-à-vis the lower part of the projection frame.
  • the detector records images comprising at least one interaction trace when the beam is intercepted by a body, the computer generating interaction instructions modifying the projected image as a function of an interaction zone detected.
  • the upper part of the projection frame is removable from its lower part, the upper part of the projection frame being an electronic bracelet comprising an image projector and the lower part being a holding base for said bracelet.
  • the bracelet comprises a band intended to be held around a wrist, a power source and a bracelet frame arranged and held on an upper part of the bracelet, said bracelet frame comprising the transmitter, the projector, the first detector and the computer.
  • the projection device comprises:
  • the transformation factors are calculated after a calibration of the image. According to another embodiment, which can be combined with a calibration step, the transformation factors are automatically recalculated in real time according to image correction factors to be applied to them.
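The patent does not specify how the transformation factors are computed or applied; purely as an illustrative sketch, they can be modelled as a planar homography estimated at calibration time and applied to each point detected in the acquired image (the function name and the example matrix below are hypothetical, not from the patent):

```python
def apply_homography(H, x, y):
    """Map a detector-image point (x, y) into the original-image frame
    through a 3x3 homography H given as row-major nested lists."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w

# Toy "transformation factors" from a calibration: identity plus a
# horizontal shear (a real device would estimate H from calibration marks).
H = [[1.0, 0.5, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, 2.0, 4.0))  # → (4.0, 4.0)
```

Real-time correction, as in the second embodiment, would then amount to updating the coefficients of H as correction factors arrive.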
  • the emitter of the light beam is an infrared emitter.
  • the transmitter is a linear transmitter projecting a substantially plane light beam.
  • the transmitter is arranged on the lower part of the projection frame.
  • the lower part of the device is the base 5 and its upper part comprises the parts 2 and 3.
  • the transmitter is arranged on the bracelet frame on the first side, at a height between the projector and the band of the bracelet.
  • the first detector comprises a range of sensitivity for detecting a trace caused by interception of the light beam with a body.
  • the first detector is an infrared detector capturing an image in which the light beam forms an image whose longitudinal dimensions, that is to say in the projection direction, are identifiable.
  • the first detector is arranged on the lower part of the projection frame between the upper part of the projection frame and the transmitter.
  • the position of at least one interaction point is calculated from:
  • a transformation, using the transformation factors, of the acquired image and its trace into an original image comprising an image of the trace;
  • a geometric construction of an interaction point from at least one trace or image of the trace.
  • the computer compares the position of the interaction point with a matrix of points delimiting interaction zones in a reference frame linked to the original image, the computer deducing a probability of interaction with an interaction zone.
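The patent does not give the comparison rule; as a minimal sketch, assume rectangular interaction zones and a crude probability that is 1.0 at a zone's centre and falls to 0.5 at its border (the zone layout, names and formula are all illustrative):

```python
def zone_probability(point, zones):
    """Given an interaction point and rectangular zones (name, x0, y0, x1, y1),
    return the best-matching zone and a crude interaction probability based on
    the normalized distance from the point to the zone centre."""
    best = (None, 0.0)
    px, py = point
    for name, x0, y0, x1, y1 in zones:
        if not (x0 <= px <= x1 and y0 <= py <= y1):
            continue
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        # 1.0 at the centre, falling toward 0.5 at the zone border
        dx = abs(px - cx) / ((x1 - x0) / 2)
        dy = abs(py - cy) / ((y1 - y0) / 2)
        p = 1.0 - 0.5 * max(dx, dy)
        if p > best[1]:
            best = (name, p)
    return best

zones = [("ok", 0, 0, 10, 10), ("cancel", 10, 0, 20, 10)]
print(zone_probability((5, 5), zones))  # → ('ok', 1.0)
```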
  • the device further comprises a second detector capturing colorimetric images of the second zone.
  • the second detector may be arranged for example on the upper part of the projection frame. It can be co-located with the projector, that is to say substantially in the same vertical position. According to another embodiment, it can be arranged on the part forming the tower of the projection device or on the base of the latter.
  • the projection device comprises an image stabilizer, said image stabilizer comparing the images acquired by the second detector with the dimensions of a reference image and generating corrective factors to be applied to the image deformation factors according to the comparison of the images.
  • the image stabilizer compares the longitudinal dimensions of the second area of the images acquired by the first detector with the dimensions of the images acquired by the second detector to generate correction factors to be applied to the transformation factors.
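One plausible minimal reading of this comparison, sketched under the assumption that the correction reduces to per-axis scale ratios between acquired and reference image dimensions (the optional blending with the previous estimate is an added assumption to damp jitter):

```python
def correction_factors(acquired, reference, previous=None, alpha=0.5):
    """Return (sx, sy) scale corrections comparing acquired vs reference
    image dimensions; optionally blend with the previous estimate."""
    sx = reference[0] / acquired[0]
    sy = reference[1] / acquired[1]
    if previous is not None:
        sx = previous[0] + alpha * (sx - previous[0])
        sy = previous[1] + alpha * (sy - previous[1])
    return sx, sy

# Acquired image twice as large as the reference: shrink by half on each axis.
print(correction_factors((640, 480), (320, 240)))  # → (0.5, 0.5)
```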
  • the second detector performs a second calculation of an interaction point by analyzing a trace intercepting the image, said trace being obtained by analyzing a change in the color of the pixels in a portion of the acquired image.
  • a computer correlates the position of an interaction point obtained from an image acquired by the first detector with the position of an interaction point obtained from an image acquired by the second detector, the correlation of the positions generating a new position of the interaction point.
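The correlation method is left open by the text; one simple hypothetical realization is a confidence-weighted average of the positions seen by the infrared detector and by the colour detector (the function and weights below are illustrative):

```python
def fuse_points(p_ir, p_color, w_ir=0.7, w_color=0.3):
    """Blend the interaction point seen by the infrared detector with the
    one seen by the colour detector into a single new position."""
    total = w_ir + w_color
    return ((w_ir * p_ir[0] + w_color * p_color[0]) / total,
            (w_ir * p_ir[1] + w_color * p_color[1]) / total)

# The fused x lands between the two detections, closer to the IR one.
print(fuse_points((10.0, 20.0), (14.0, 20.0)))
```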
  • the image projector is a color pico-projector.
  • the image projector comprises a blue laser, a red laser and a green laser and a set of micro-mirrors that orient themselves so as to produce, at each projection point, a point whose color is generated by a combination of the three lasers as directed by the mirrors.
  • the image projector is an LCOS type projector.
  • the image projector emits an image whose resolution is 1920x1080.
  • the projection device comprises an accelerometer and a gyroscope for activating functions generating a modification or a change of the image projected by the projector.
  • the power source is a removable battery.
  • FIG. 1 a front view of an electronic bracelet comprising a projector of the invention
  • FIG. 2 an arm of a user wearing a bracelet of the invention
  • FIG. 3 a side sectional view of the bracelet of the invention and a projection mode of an image
  • FIG. 4 is a representation of a grid used to detect an interaction point on a projected image
  • FIG. 5 a superimposition of an original image and a matrix of delimitation points for the calculation of an interaction zone
  • FIG. 6 a projection device of the invention in a first configuration
  • FIG. 7 a projection device of the invention according to a second display configuration.
  • FIGS. 1 to 5 show a particular embodiment of the invention in which the projection device 1 of the invention comprises a removable upper part comprising an electronic bracelet 100.
  • This embodiment is a particular embodiment of the invention in that the electronic bracelet 100 can advantageously be worn by a user in a first use and positioned on a base 5 for a second use.
  • the bracelet of the invention 100 is intended to cooperate with a base 5 thus forming a projection frame 2, 3, 5.
  • FIGS. 6 and 7 represent a more general mode of the invention in which the frame 2, 3 forms a projection tower which may optionally comprise a base 5.
  • the object of the invention comprises, in the first place, the projection device 1.
  • a portion of its frame 2, 3 may be a bracelet cooperating with a base 5.
  • the description firstly describes the first use by a description of an electronic bracelet 100 which can be positioned on a base 5.
  • the common principles of operation including the detection of an interaction point or pairing with a Smartphone type equipment are described in the light of this first use and are applicable to the second use.
  • the first and second uses are jointly described for the common parts in the first use of the projection device of the invention.
  • FIG. 1 shows an embodiment of a bracelet 100 of the invention.
  • the bracelet comprises a frame 3, a band 2 and a power source 4.
  • the band 2 forms the part of the bracelet that holds said bracelet around the wrist of a person. It may include means for adjusting the fastening position of the band to accommodate different wrist circumferences. The means for adjusting the position of the bracelet 100 may also make it possible to fasten two parts of the band together around the wrist.
  • the band may be of flexible elastic material, fabric or rigid material such as a rigid plastic material, metal or foam or any other material for making a band.
  • the band 2 may be thin, of the size of a watch strap of a few millimeters, or thicker, of the order of 1 or 2 cm.
  • the lower part of the bracelet 100 is designated as a part opposite the part of the bracelet 100 comprising the frame 3 corresponding to the upper part of the bracelet 100.
  • a power source 4 is positioned in the lower part of the bracelet 100 so as to make the frame 3 less bulky and to balance the bracelet 100 of the invention in weight and/or aesthetics on either side.
  • the bracelet 100 can find a better balance when held around a wrist.
  • the supply connector(s) feeding the electronic components of the frame may be routed along the band 2, for example inside the band 2, so as to be hidden from the outside.
  • the power source 4 is arranged on the upper part of the bracelet 100.
  • the frame 3 may comprise the power source 4.
  • the power source may be included in another frame attached or juxtaposed to the frame 3 on the upper part.
  • the power source is a rechargeable battery. The battery can then be removable and thus be removed from the bracelet 100 to be recharged. Another solution is to place the bracelet 100 on a base comprising a power supply for charging the battery that remains in position in the bracelet.
  • the power source is an exchangeable battery.
  • the frame 3 comprises an infrared emitter 30.
  • the emitter 30 then emits a light beam 31 forming a light sheet.
  • the transmitter 30 is arranged in the lower part of the frame and can be a linear transmitter.
  • the lower part of the frame 3 is defined as the part closest to the skin of the wrist or forearm or the hand of a person wearing the bracelet 100.
  • the display can be made on the anterior or posterior side of the forearm.
  • the projection of images can be performed on the inside or outside of the hand.
  • One advantage is to achieve a substantially plane beam closer to the skin and substantially parallel to the surface of the wrist or forearm.
  • Figure 3 shows a sectional view in which the beam 31 is parallel to the surface of the forearm 101 and located at a height d.
  • the frame 3 comprises a projector 20 for projecting an image along an axis intercepting the wrist or forearm of a person wearing the bracelet 100.
  • FIG. 3 shows the projection cone 21 intercepting the forearm 101 to form an image 22.
  • the projector is arranged in the frame at a height, denoted HLONG, above the surface of the forearm 101.
  • the frame 3 comprises a first detector 10 for detecting, in the infrared range, the light variations related to the interactions with the beam emitted by the transmitter 30.
  • the frame 3 comprises a second detector 11 for detecting the images emitted in the visible frequency domain in order to adjust in real time the size of the image and/or to calculate in real time the deformation factors to apply to the projected image.
  • the frame 3 comprises calculation means, denoted M in FIG. 3, such as a computer that can be a microprocessor, a microcontroller or an electronic chip.
  • the calculating means may comprise, according to the embodiment chosen, one or more computers performing the various functions of image processing. Among these functions are the generation of images and the calculation of deformation factors and / or image correction.
  • the computer makes it possible to calculate interaction point positions and, by servo-control, to generate new images based on the detected commands.
  • the computer is therefore capable of generating the images to be projected according to the detected interactions, as well as any other functions necessary for carrying out the invention.
  • the frame 3 comprises one or more memories, denoted M in FIG. 3, for storing temporary computed values or for storing interaction information such as interaction point positions or for storing data for generating images or any other data necessary for carrying out the invention.
  • the frame 3 comprises an accelerometer and a gyroscope for measuring movements of a wrist and / or the forearm of a person wearing the bracelet 100.
  • the detected movements can be measured by comparing values of the accelerations with reference to known and recorded values which correspond to actions to be carried out.
  • switching on the bracelet can be performed by rotating the wrist twice about the axis of the forearm.
  • the acceleration values measured over a given period of time make it possible to determine which action must be taken according to the motion sequence detected.
  • a wakeup action can be engaged to turn on the bracelet 100 when a threshold of acceleration in rotation around the forearm has been crossed.
  • actions can be indicated according to acceleration values measured according to the three axes of a Cartesian reference system to generate specific actions such as: activate a detector or a projector, turn off a detector or a projector, generate an image or modify the generated image, activate a new image from a first image based on a browsing history.
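As an illustration only, the double wrist-twist switch-on described above could be detected by counting crossings of an angular-rate threshold about the forearm axis; the function, threshold and sample values below are hypothetical, not from the patent:

```python
def detect_wake_gesture(roll_rates, threshold=3.0, required=2):
    """Count distinct excursions of the angular rate about the forearm axis
    beyond the threshold; enough excursions within the sampled window
    trigger the wake-up action."""
    crossings, above = 0, False
    for rate in roll_rates:
        if abs(rate) > threshold and not above:
            crossings += 1
            above = True
        elif abs(rate) <= threshold:
            above = False
    return crossings >= required

samples = [0.1, 4.2, 4.0, 0.2, -3.8, -0.1]  # two distinct twists
print(detect_wake_gesture(samples))  # → True
```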
  • FIG. 2 shows a bracelet 100 of the invention positioned in the portion of the arm situated between the forearm 101 and the hand. This junction zone is called the wrist 102. This zone is advantageously intended for the wearing of the bracelet 100 of the invention.
  • the bracelet 100 is shown projecting an image 22 on the forearm 101 of a person.
  • the infrared light beam 31 is also represented superimposed on the image displayed on the forearm 101.
  • the display of the image 22 and the generation of the beam 31 can be made on the hand.
  • a mode makes it possible, for example, to turn the bracelet 100 around and invert the display in order to direct the projection of the image 22 towards the hand while keeping an image displayed in the reading direction.
  • the bracelet is configured for display towards the direction of the hand.
  • this display mode does not offer as large a projection surface as the forearm 101. The fingers and especially the carpal bones limit the display area and cause distortion of the displayed image.
  • the movements of the hand are often sharper and more sporadic than those of a forearm 101; the image stabilizer must therefore be more responsive and must be configured to take these hand movements into account.
  • the display mode in the direction of the forearm 101 is described and corresponds to a preferred embodiment of the invention.
  • the beam 31 is emitted preferentially in a non-visible frequency band so as not to alter the image 22 projected by the projector 20.
  • the light beam is emitted by a linear emitter generating a substantially plane beam in an infrared frequency range.
  • the beam is emitted in a plane substantially parallel to the surface of the skin, between 1 mm and 1 cm from the surface of the skin. A distance of between 1 mm and a few millimeters makes it possible to obtain good detection efficiency on the projection zone while limiting false detections.
  • a power management module of the transmitted beam can be integrated into the frame.
  • a control accessible to the user, which may be either software-based or a physical control, allows the power of the beam to be adjusted. This control selects, for example, a night mode or a day mode. By default, the power is configured to provide good detection by day and by night.
  • a detector 10 makes it possible to acquire, in a given frequency range, at least one trace 32 formed by the interception of the beam by a body.
  • the body intercepting the beam is generally the finger of a user who is positioned on an area of the displayed image.
  • An advantage of the bracelet 100 of the invention is to reproduce interactivity comparable to that of smartphones or tablets that include a touch screen but without the use and size of such a screen.
  • Another body can be used such as a stylus.
  • an advantage of the invention is that the interaction can be detected even when wearing a glove, which a touch screen does not allow.
  • When a body intercepts the light beam 31 at at least one point, the detector 10 captures a light variation which can result in the presence of a white spot when the beam is an infrared beam.
  • the detector 10 thus makes it possible to generate an image comprising a trace 32 having coordinates in the acquired image corresponding to the point of interaction that the user wishes to engage by an action of his finger. It is recalled that the user does not see the infrared beam 31 but only the projected image 22.
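A minimal sketch of how such a trace could be located in the acquired frame, assuming the trace appears as a bright spot whose centroid gives the interaction point (the threshold and the toy frame below are illustrative, not from the patent):

```python
def trace_centroid(frame, threshold=200):
    """Return the (x, y) centroid of pixels at or above the threshold in a
    grey-level frame (nested lists), or None if no trace is present."""
    xs = ys = n = 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs, ys, n = xs + x, ys + y, n + 1
    return (xs / n, ys / n) if n else None

frame = [[0,   0,   0, 0],
         [0, 250, 255, 0],
         [0, 240, 245, 0],
         [0,   0,   0, 0]]
print(trace_centroid(frame))  # → (1.5, 1.5)
```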
  • the bracelet offers transparent control interactivity for the user.
  • the interaction point therefore corresponds to an area of the image that the user wishes to activate.
  • the activation may correspond to the desire to navigate on another page by activating a link or may correspond to a choice among a list of choices or an option displayed and to be validated. Other examples of activations are possible according to the bracelet 100 of the invention.
  • An advantage of the arrangement of the detector 10, which is positioned at a greater height than the beam transmitter 30 on the frame of the bracelet 100 relative to the surface of the wrist, is to obtain a good image reflecting the traces related to the interception of the beam by a body.
  • Other systems exist that evaluate the depth position of an interaction, such as "radar"-type operation, but those solutions remain approximate and cannot discriminate many points in the interaction zone. Such devices generate many false detections because of the imprecision in evaluating the distance from the body to the detector.
  • An advantage of the bracelet of the invention is to provide an arrangement of the detector providing a perspective of detection of the displayed image. This configuration improves the detection of an interaction point and the accuracy of determining the coordinates of the center of the interaction point.
  • the determination of the interaction point can be performed in the reference frame linked to the projected image or in the reference frame of the acquired and transformed image. Both variants are substantially equivalent and offer comparable results. Either method has its own advantages and can be chosen according to the intended design. For example, when the position calculations are performed in the frame of the image displayed on the forearm, the calculations related to the transformation matrices of the detected spot are simplified. On the other hand, when the position computations are carried out in the reference frame of the acquired and transformed image, a gain in precision can be obtained in determining the center of the detected spot and the activated zone of the image.
  • the bracelet 100 or the projection device 1 of the invention therefore allows the analysis of at least one position of a user interaction point to determine which zone of the image will be activated.
  • the image may include areas of interaction.
  • a software cut of the image makes it possible to segment the image into different zones of interaction.
  • the invention then makes it possible to compare the position of the point of interaction of the beam with reference points and to identify to which interaction zone of the image this point corresponds.
  • the bracelet 100 of the invention makes it possible to transform the image acquired by the detector 10 into a known reference frame of an undeformed image.
  • the image projected before the application of deformation factors or the image acquired after the application of deformation factors is called the original image.
  • Different deformations can be applied to the acquired image to switch it to a format linked to the original image.
  • the deformations applied to the image during projection allow a user to view the image as if it were displayed undeformed, maintaining the proportions of an original image.
  • the deformations applied during the acquisition of images make it possible to take into account the differences related to the fact that the detection plane is not parallel to the plane of the image and the effects of perspective.
  • a first transformation may be applied to compensate for the lateral shift DLAT of the detector 10 vis-à-vis the central axis of projection.
  • a second transformation can be applied to compensate for the perspective effects of the image projected in depth and to apply a transformation aimed at restoring the image acquired in a 2D plane.
  • the perspective effects can take into account the height between the detector 10 and the plane of the projected image 22 substantially parallel to the plane forming the forearm 101.
  • the acquired image can then be transformed to compensate for this difference.
  • the lateral perspective effects can be taken into account by the transformation factors as well as the edge effects including in particular the part of the image closest to the bracelet as well as the farthest part.
  • a third transformation can be applied to compensate for the effects of surfaces related to the anatomy of the forearm or the hand that would be taken into account in projecting the projected image 22.
  • the trace 32 detected in the acquired image can be transformed so as to obtain a trace 32 'in an image reference frame that is not deformed by the projection of the latter on the forearm.
  • the undistorted image is named the original image as previously stated.
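The chain of transformations that maps the acquired image back to the reference frame of the original image can be modeled as a planar homography. The sketch below illustrates this under stated assumptions: the corner coordinates and image dimensions are hypothetical values for illustration, not measurements from the patent.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst from 4 point pairs,
    using the standard DLT linear system; src and dst are (4, 2) arrays."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def transform_points(H, pts):
    """Apply H to (N, 2) pixel coordinates, returning (N, 2) coordinates."""
    pts = np.asarray(pts, dtype=float)
    hom = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = hom @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Corners of the trapezoid seen by the detector (hypothetical values) ...
acquired_corners = np.array([[40, 10], [600, 25], [680, 470], [5, 455]])
# ... mapped to the corners of the undeformed original image.
original_corners = np.array([[0, 0], [1280, 0], [1280, 720], [0, 720]])

H = homography_from_points(acquired_corners, original_corners)
trace_pixels = [[320, 240], [322, 241]]       # spot detected in acquired image
trace_in_original = transform_points(H, trace_pixels)   # trace 32'
```

The same machinery covers the lateral-shift and perspective compensations described above, since their composition is itself a homography; anatomical corrections would require a non-planar model.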
  • the trace 32' in the original image comprises one or more pixels in the reference frame of the original image.
  • a step of determining the center 33 of this trace 32 ', or a center of gravity, can be engaged to determine the most likely point of interaction of the image that a user wished to activate.
  • The point thus calculated, called the "interaction center" 33, can be a pixel of the image.
  • a step of comparing the position of the interaction center 33 with the original image makes it possible to determine the action to be taken by the computer.
  • the interaction center can be calculated on the acquired image not yet corrected by the transformation factors.
  • the transform of the interaction center of the trace of the acquired image makes it possible to determine the position of the interaction center of the trace of the original image.
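The center of gravity of a trace can be sketched as an intensity-weighted centroid over the bright pixels of the infrared frame. This is a minimal illustration, not the patent's implementation; the threshold value and frame dimensions are assumptions.

```python
import numpy as np

def interaction_center(ir_frame, threshold=200):
    """Return the (x, y) center of gravity of the bright spot in an infrared
    frame, or None when no pixel exceeds the threshold (no interception)."""
    ys, xs = np.nonzero(ir_frame >= threshold)
    if xs.size == 0:
        return None
    # Intensity-weighted centroid of the trace pixels.
    w = ir_frame[ys, xs].astype(float)
    return (float((xs * w).sum() / w.sum()),
            float((ys * w).sum() / w.sum()))

frame = np.zeros((480, 640), dtype=np.uint8)   # simulated detector frame
frame[238:243, 318:325] = 255                  # simulated white spot
cx, cy = interaction_center(frame)
```

The centroid can equally be computed before or after the homography transform, matching the two variants described above.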
  • the position of the interaction center 33 is compared with a matrix of points delimiting zones 55 or 55', depending on whether the acquired image or the original image obtained by transforming the acquired image is considered.
  • zones 55 are shown in perspective in FIG. 4 and superimposed on the beam 31 and are represented in the original image in FIG. 5.
  • Delimiting points 50 are defined in order to delimit the interaction zones 55 '.
  • FIG. 5 represents the zones 55 'in a reference frame linked to the original image as well as the delimitation points 50.
  • the image is delimited into zones 55' by a grid in the reference frame linked to the original image.
  • the grid has identical rectangular surfaces. But any other grid is compatible with the bracelet 100 of the invention.
  • areas 55 'forming squares, diamonds or circles can be defined.
  • the position of the interaction center 33 is thus compared with the position of the boundary points 50 or the boundaries of the zones 55 '.
  • An algorithm for calculating the distances from the interaction center 33 to the closest delimitation points 50 makes it possible to estimate the zone 55 'which is activated by the user.
  • the bracelet of the invention is configured to determine the proportion of the trace in each zone.
  • the area 55 'comprising the largest number of pixels of the trace 32' is determined as the active area.
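Both zone-selection variants described above, locating the interaction center within the grid and counting the proportion of the trace per zone, can be sketched as follows. The grid and image dimensions are illustrative assumptions.

```python
from collections import Counter

def zone_of_center(point, image_w, image_h, cols, rows):
    """Index (col, row) of the grid zone containing a point of the original
    image; the grid has identical rectangular surfaces."""
    x, y = point
    col = min(int(x * cols / image_w), cols - 1)
    row = min(int(y * rows / image_h), rows - 1)
    return col, row

def active_zone_by_proportion(trace_pixels, image_w, image_h, cols, rows):
    """Variant counting the trace pixels per zone: the zone containing the
    largest number of pixels of the trace is the active one."""
    counts = Counter(zone_of_center(p, image_w, image_h, cols, rows)
                     for p in trace_pixels)
    return counts.most_common(1)[0][0]

# Hypothetical 4x3 grid over a 1280x720 original image:
zone = zone_of_center((321.0, 240.0), 1280, 720, cols=4, rows=3)
```

Non-rectangular zones (diamonds, circles) would replace the integer division with a point-in-shape test against the delimitation points.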
  • the display of the image comprises activatable zones whose activation is determined according to the calculation of the position of the interaction center 33 in the image.
  • Figure 5 shows icons 201 of the original image.
  • the computation of the trace 32', or of its center 33, in the reference frame of the original image can be combined with the use of interaction zones.
  • the use of the zones remains optional.
  • the use of the zones makes it possible to make the detection of interaction points more robust by making a simple comparison with the zone corresponding to the determined interaction point. The comparison makes it possible to arrive at an action related to the activation of said determined zone.
  • Other actions combining different interaction points 33 can be detected according to the same principle by the bracelet 100 of the invention.
  • two interaction points 33 can be detected.
  • motion detection includes detecting a set of interaction points.
  • Instant vectors can be deduced.
  • the computer makes it possible, for example, to enlarge an image portion or the entire image according to the position of the determined interaction centers.
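A two-point gesture such as pinch-to-zoom can be sketched from the instantaneous vectors of the tracked interaction points; the helper names below are hypothetical, introduced only for illustration.

```python
import math

def pinch_scale(p1_prev, p2_prev, p1_now, p2_now):
    """Zoom factor deduced from two tracked interaction points: the ratio of
    their current separation to their previous one (>1 means enlarge)."""
    d_prev = math.dist(p1_prev, p2_prev)
    d_now = math.dist(p1_now, p2_now)
    return d_now / d_prev if d_prev else 1.0

def instant_vector(p_prev, p_now):
    """Instantaneous displacement vector of one interaction point."""
    return (p_now[0] - p_prev[0], p_now[1] - p_prev[1])
```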
  • Figure 3 shows the projector 20 projecting an image on the forearm 101.
  • Projection makes it possible to distort an original image by applying deformation factors to the image.
  • deformation factors take into account perspective effects.
  • These perspective effects can take into account in particular the depth of field, that is to say the delimitation of the image to be displayed on the part farthest from the bracelet 100, and take into account the height of the projector vis-à-vis the projection plane located on the skin of the forearm 101.
  • the image transformation factors compensating for the perspective effects take into account the lateral deformation of the image, i.e. the points of the image farthest from the main optical axis.
  • the deformations take into account the anatomy of the surface of the forearm or the hand of a user according to his morphology. For example, an average or standard morphology is applied to an image projected by the bracelet 100 and can be adjusted according to the different morphologies of users. Correction factors can be applied to the transformation factors to modify the transformations applied to the projected image.
  • An objective of the image transformation factors is to display on the forearm of a user an image that is close to an original image for the user. It is then necessary to compensate for some natural deformations related to the image projection mode and the projector itself.
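The compensation principle can be illustrated with a hedged sketch: if the projection geometry is modeled as a homography D applied to the original image, pre-warping the image with the inverse of D cancels the deformation at projection time. The matrix coefficients below are illustrative, not measured values.

```python
import numpy as np

# Assumed forward deformation: a homography D modeling what the projection
# geometry does to the original image (lateral keystone plus a depth-related
# vertical stretch). Coefficients are illustrative only.
D = np.array([[1.00, 0.12,   0.0],
              [0.00, 1.30,   0.0],
              [0.00, 0.0008, 1.0]])

def apply_h(H, x, y):
    """Apply a 3x3 homography to one point in homogeneous coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Pre-warping the original image with D^-1 means the projection then applies
# D, so the user sees the image undeformed: D composed with D^-1 is identity.
prewarp = np.linalg.inv(D)
```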
  • the calculator of the bracelet of the invention or the projection device 1 may make it possible to perform the image processing calculations, in particular the application of the deformation and / or correction factors.
  • Another calculator can be used to generate the images.
  • the same computer generates the images and transforms the images from the deformation and / or correction factors.
  • the projector can be a color pico-laser type projector.
  • the image projector comprises a blue laser, a red laser and a green laser and a set of micro-mirrors oriented so as to produce at each projection point a point whose color is generated by a combination of the three lasers oriented by the mirrors.
  • the image projector may for this purpose be an LCOS ("Liquid Crystal on Silicon") type projector.
  • This technology combines a light source with a liquid crystal matrix.
  • the light source can be generated by one or more laser (s) or one or more diode (s).
  • a liquid crystal display can be directly mounted on an integrated component.
  • a prism can also be used.
  • the image projector emits an image whose resolution is 1920x2080.
  • the bracelet 100 or the projection device 1 comprises a second detector 11 for acquiring color images.
  • the second detector makes it possible to apply corrective factors to the transformations to be applied to the projected image.
  • an analysis of the contours of the projected image makes it possible to readjust the transformation factors to be applied to the projected image. For this, corrective factors are applied to the transformation factors to take into account the actual display detected on the forearm.
  • the second detector 11 makes it possible to improve, in particular, the image stabilizer function.
  • the contours of the displayed image 22 can be regularly compared with a control image having the desired nominal dimensions, whose characteristics are recorded in a memory M. This real-time comparison of the dimensions of the displayed image and the recorded one makes it possible to generate corrective factors. The correction factors can therefore be generated according to the differences calculated by a calculator between two image dimensions.
  • correction factors compensate for wrist, hand and forearm movements.
  • correction factors can also compensate for an inclination of the frame when the bracelet has play around the wrist.
  • the correction factors make it possible to offer a user an image stabilizer function.
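A minimal sketch of the dimension comparison, assuming simple scale corrections derived from the measured versus nominal image dimensions (the numeric values are hypothetical):

```python
def correction_factors(displayed_w, displayed_h, nominal_w, nominal_h):
    """Scale corrections from the difference between the dimensions of the
    displayed image (as measured by the second detector) and the nominal
    control image recorded in memory."""
    return nominal_w / displayed_w, nominal_h / displayed_h

# Hypothetical measurement: the displayed image has drifted smaller than
# the nominal control image, e.g. because the wrist moved.
sx, sy = correction_factors(displayed_w=600, displayed_h=680,
                            nominal_w=640, nominal_h=720)
```

Applying (sx, sy) to the transformation factors re-expands the projected image toward its nominal dimensions, which is the stabilizer behavior described above in its simplest form.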
  • the dimensions of the images captured by the two detectors 10 and 11 can also be compared to ensure coherence of the image stabilization and the detection of interactive area of the image.
  • the second detector 11 also makes it possible to perform a second detection calculation of an interaction center or an interaction zone.
  • the positions obtained by the two detectors coupled to one or more computers can be correlated to reduce the rate of false detections.
  • An image calibration can be defined by means of the bracelet.
  • the image calibration consists in projecting a reference image and applying transformation factors according to the wishes of a user.
  • the bracelet of the invention comprises an interface comprising buttons or contactors for applying changes in correction factors or image deformation.
  • the calibration makes it possible to ensure that the image is displayed appropriately for a user, that is to say in a substantially rectangular format compensating for the perspective effects.
  • the calibration makes it possible to adapt the format of the image to a given morphology of a user.
  • One advantage is to determine a display format and then make corrections to compensate for movements when using the bracelet.
  • Another advantage is to allow the calibration phase at any time to compensate for drifts or changes in anatomy. It is also possible to customize the display according to users, the bracelet can then be preconfigured according to different calibrations and thus be used by different people.
  • the bracelet 100 comprises a fabric that can be deployed on the forearm to form a screen.
  • the fabric can be integrated in the strip 2 or in the frame 3 or in the compartment 4 which comprises the battery.
  • the fabric can be held at its end by an elastic band or a second bracelet that attaches to the arm to stretch it.
  • the bracelet 100 extends longitudinally for example by means of overlapping concentric rings.
  • a locking and unlocking system makes it possible to switch to an extended mode of the bracelet 100 and to lock it.
  • the rings are then designed to hold, for example, by means of cooperating diameters at the ends of two overlapping rings.
  • the elongation device of the bracelet 100 is then similar to a deployment device of the "fishing rod" type.
  • the rings may be portions of unclosed rings that cover a part of the forearm.
  • One advantage is to form a screen superimposed on the skin of the forearm. This embodiment makes it possible in particular to be independent of the surface topology of a person's forearm, its hairs and varying thickness.
  • the bracelet 100 of the invention or the projection device 1 can be paired with equipment connected to a mobile or terrestrial network such as a smartphone, tablet or computer.
  • a wireless link is advantageously used to pair the devices together.
  • the wireless link can be established using a Bluetooth or Wifi protocol or any other protocol that allows such a link to be established.
  • the bracelet of the invention comprises for this purpose a radio component or network for establishing such a connection.
  • the computer present in the bracelet of the invention is configured to process the data received from the equipment and process the images to project them.
  • the image projected by the bracelet of the invention is an image generated by the equipment and transmitted by the wireless link to the bracelet.
  • when an interaction on the image displayed by the bracelet is detected, the bracelet handles the interaction independently of the equipment, for example by offering an interactive menu and validating a choice of the user, or by actuating a button generating a second image.
  • the bracelet is for example able to directly generate a request to a network equipment via access to a network independently of the equipment.
  • the bracelet includes a memory in which the data to be displayed are saved. Other interactions are possible according to alternative embodiments that can be combined with each other.
  • the bracelet of the invention is configured to generate a request to the equipment in order to process the interaction detected on the displayed image.
  • the bracelet is then used as an alternative display of equipment such as the Smartphone. If for example a button displayed on the forearm is activated, the query generated to the equipment can return an action or an image to the bracelet. The latter will then be able to project the result of the interaction.
  • the bracelet 100 or the projection device 1 comprises a network component for connecting to a network.
  • the connection can be made, for example, by a wireless link such as a link Wifi or Bluetooth or any other protocol for establishing a wireless link.
  • the bracelet can be configured to connect to an internet box via a Wi-Fi network or a 3G or 4G mobile network. The bracelet is then able to generate requests via the box through the network to interface with a server.
  • the bracelet is a connected bracelet that can display for example a digital content from the Internet.
  • a video of a video platform can be displayed on the forearm thanks to the reception of video frames and their processing by the computer of the bracelet and the image projector.
  • FIG. 6 represents another embodiment of the invention in which:
  • the strap 100 may be affixed to and/or attached to a support 5 forming a retaining base; or
  • the projection frame 2, 3, 5 forms a tower comprising a projector, an IR transmitter and a detector.
  • When the bracelet is affixed to the base, it performs the functions of a projection tower, an object of the invention, and corresponds to the parts 2 and 3 of the projection frame.
  • when the bracelet is affixed to the base, a first display mode is engaged.
  • a second display mode may be engaged by a user.
  • the image obtained can be larger, in particular by obtaining a larger interception surface between the cone 21 'and the display surface.
  • the base can include its own power source to power the bracelet 100 or recharge the battery 4 of the bracelet 100.
  • the base comprises an emitter 61 of an infrared beam and a detector 60 for acquiring an image 62 and detecting the interaction points of a finger in an interactivity zone.
  • the interactivity zone is projected on the other side of the strap 100 with respect to the image projection 22, as shown in FIG. 6.
  • the user can therefore use his finger as a mouse to animate a cursor displayed on the image or to activate an area of the image 22.
  • the calculator K makes it possible to take into account the movements or actions carried out by an interacting finger on the interactivity zone 62 to generate actions on the image 22 to be displayed.
  • Among the actions that the computer can engage, according to the invention, there are in particular: a modification of the image, a validation of a choice to generate a request to a server for example, the generation of a new image, the activation of a menu or button, etc.
  • An object of the invention relates to a projection device as shown in FIG. 6.
  • the frame 2, 3 is not necessarily an electronic bracelet.
  • In a first configuration, shown in FIG. 6, the infrared transmitter 61 and the detector 60 are arranged on a second side C2 opposite the first side C1 on which the projector is arranged.
  • the projection zone Z1 forming a light sheet that is preferentially invisible to the user makes it possible to produce a pad.
  • a pad can be understood as an interactivity area on which a user can interact to activate functions, move an image, control a cursor that is displayed on the image.
  • the detector 60 is capable of detecting the moving points of a body such as a person's finger.
  • the detector 60 can be arranged just above the vertical axis of the transmitter so as to benefit from the perspective effects of the emitted beam and to acquire an image whose dimensions make it possible to detect an interaction point on the original image. It is recalled that the original image is the image before it is deformed by the projection: the image generated by the processor or calculator before the image deformation factors are applied.
  • the detector may be located in the upper part of the projection frame 2, 3.
  • the interaction point being calculated on the image 62, it is possible to generate a cursor at a position of the displayed image 22.
  • the movements of a finger on the image 62 can be reflected by the generation of a movement of a cursor displayed on the image 22.
  • the computer transposes the coordinates of the interaction point calculated in the zone Z1 of the image 62 acquired by the detector 60 in the image projected by the projector 20.
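The transposition of coordinates from the acquired pad image to the projected image can be sketched as a proportional mapping; a full homography could replace it for higher accuracy. The resolutions below are illustrative assumptions.

```python
def transpose_to_display(point, pad_w, pad_h, disp_w, disp_h):
    """Map an interaction point from the acquired pad image (zone Z1) to
    cursor coordinates in the projected image by proportional scaling."""
    x, y = point
    return x * disp_w / pad_w, y * disp_h / pad_h

# Hypothetical resolutions for the pad image and the projected image:
cursor = transpose_to_display((160, 120), pad_w=320, pad_h=240,
                              disp_w=1280, disp_h=720)
```

Successive finger positions on the pad, mapped this way, produce the cursor movement on the displayed image described above.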
  • FIG. 7 represents an embodiment in which the projection of images 22 is carried out in the same direction as the emission of the light sheet 31.
  • the light sheet 31 in this case is superimposed on the displayed image 22 thus giving the illusion to the user that he interacts directly on the image 22.
  • the transmitter 61 is preferably arranged on the lower part of the projection frame, for example on the base 5, so that the light beam 31 is emitted parallel to the projection surface and as close as possible to this surface.
  • the detector 60 is positioned a little higher than the transmitter along the vertical axis defining the axis in which the frame extends vertically.
  • the height shift makes it possible to benefit from the formation of an image acquired by perspective effect.
  • the detector 60 can be arranged on the base 5 or on the frame 2, 3 forming the projection tower.
  • an advantage is to avoid detecting the hand of a user superimposed on the image. Given the capture angle of the detector 60, as shown by the dotted lines of the capture area of the detector 60 in Figure 7, it is understood that a finger intercepting the beam 31 will be well detected without taking the hand of a user into account.
  • Figure 7 shows a detector 10 disposed on the part 3 of the frame of the projection device of the invention.
  • the acquisition of the image from which the interaction point is deduced makes it possible to obtain a large image and good accuracy in locating the interaction point.
  • the projection device comprises a removable part.
  • the upper part of the frame can turn around an axis of rotation AAR.
  • the part 2, 3 rotates about this axis, the pivoting being made vis-à-vis the base 5.
  • the base 5 does not move during this rotation; this embodiment allows passage from a first display and interaction mode to a second display and interaction mode. It is therefore possible to switch from the embodiment of Figure 6 to the embodiment of Figure 7 by turning the parts 2, 3 of the projection frame.
  • the rotation of the lower part, such as the base, with respect to the upper part defines an angle between the projection axis of the light beam 31 and the projection axis of the image 22.
  • this angle can be 180 ° as shown in FIG. 6.
  • this angle can be 0°, which corresponds to the embodiment of FIG. 7.
  • intermediate angles of 90 °, 270 ° or other angles may be considered.
  • a holding member may be used to secure the lower portion to the top to lock the rotation.
  • The embodiment of FIGS. 6 and 7 inherits all the embodiments described above, including in particular:
  • pairing to third-party equipment such as a smartphone using a wireless component;
  • a network component for an internet connection for example
  • the transmitter 30 on the upper part 3 of the projection frame can be used when the projection device 1 is used to project on a surface such as a user's forearm.
  • the projection device 1 may comprise a removable upper part 3 which makes it possible to have a miniaturized projector.
  • the projection frame 1 is thus formed of the single part 3.
  • This embodiment is shown in FIG. 3, in which the projection is carried out on a plane Ps. This figure has previously been described by considering the surface Ps as the surface of a forearm 101. The operation of such a projection device 1 is then similar to that of the bracelet 100, the projection plane being any surface Ps.
  • parts 2 and 5 of the projection frame are optional. Part 2 which forms a tower may be optionally arranged between the upper part of the projection frame 3 and the lower part 5 of the projection frame.
  • the projection device can be miniaturized so as to make it transportable.
  • the height of the device is less than 10 cm.
  • the height of the device is less than 5 cm.
  • the projection device of the invention may include a power source allowing it to be autonomous.
  • the source is rechargeable for example.
  • a battery can be made removable to be recharged directly by a standalone charger.
  • the projection device of the invention comprises an interface for connecting an external power source.
  • the power source is not necessarily integrated in the part 2 and 3 of the projection support.
  • Parts 2 and 3 may comprise connectors to be connected to a power source included in the base 5 or possibly in the tower 2 of the projection device 1.
  • the frame material may be metal, plastic, wood or any other material for making a frame.

EP16717309.5A 2015-04-10 2016-04-08 Interaktive elektronische projektionsvorrichtung Withdrawn EP3281097A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1553162A FR3034890B1 (fr) 2015-04-10 2015-04-10 Dispositif electronique de projection interactif
PCT/EP2016/057849 WO2016162539A1 (fr) 2015-04-10 2016-04-08 Dispositif electronique de projection interactif

Publications (1)

Publication Number Publication Date
EP3281097A1 true EP3281097A1 (de) 2018-02-14

Family

ID=53541763

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16717309.5A Withdrawn EP3281097A1 (de) 2015-04-10 2016-04-08 Interaktive elektronische projektionsvorrichtung

Country Status (4)

Country Link
US (1) US20180074653A1 (de)
EP (1) EP3281097A1 (de)
FR (1) FR3034890B1 (de)
WO (1) WO2016162539A1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3055158B1 (fr) * 2016-08-19 2019-11-08 Ontracks Systeme d'aide a la navigation visuelle et lateralisee
CN110161793A (zh) * 2019-04-16 2019-08-23 苏州佳世达光电有限公司 一种投影调整系统、投影机及支撑机构
CN113709434A (zh) * 2021-08-31 2021-11-26 维沃移动通信有限公司 投影手环及其投影控制方法和装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1316055A4 (de) * 2000-05-29 2006-10-04 Vkb Inc Einrichtung zur eingabe virtueller daten und verfahren zum eingeben alphanumerischer und anderer daten
GB201205303D0 (en) * 2012-03-26 2012-05-09 Light Blue Optics Ltd Touch sensing systems
JP2015041052A (ja) * 2013-08-23 2015-03-02 ソニー株式会社 リストバンド型情報処理装置および記憶媒体

Also Published As

Publication number Publication date
WO2016162539A1 (fr) 2016-10-13
US20180074653A1 (en) 2018-03-15
FR3034890B1 (fr) 2018-09-07
FR3034890A1 (fr) 2016-10-14


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20171108

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20191101