WO2018109570A1 - Smart contact lens and multimedia system comprising the smart contact lens - Google Patents

Smart contact lens and multimedia system comprising the smart contact lens

Info

Publication number
WO2018109570A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sight
line
smart contact
contact lens
Prior art date
Application number
PCT/IB2017/052336
Other languages
English (en)
Inventor
FuTao HE
Xiaobo Wang
Jerry Tan
Original Assignee
Sony Mobile Communications Inc.
Sony Mobile Communications (Usa) Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Mobile Communications Inc., Sony Mobile Communications (Usa) Inc. filed Critical Sony Mobile Communications Inc.
Publication of WO2018109570A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G02C 11/10 Electronic devices other than hearing aids
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/125 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for looking at the eye fundus, with contact lenses
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 7/00 Optical parts
    • G02C 7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C 7/04 Contact lenses for the eyes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements

Definitions

  • This disclosure relates to the field of information technologies, and in particular to a smart contact lens and a multimedia system including the smart contact lens.
  • As one type of smart device, smart contact lenses have drawn attention due to their features of small volume and being flexible and convenient to wear.
  • Existing smart contact lenses have various forms. For example, a smart contact lens capable of displaying contents of a mobile phone of a user, a smart contact lens capable of detecting a blood glucose level of a user, and a smart contact lens capable of capturing images, have appeared.
  • The above existing method for detecting the lines of sight of the user needs to provide an extra camera, and to use the camera to determine positions of the eyeballs to track the eyeballs. That method is relatively low in detection precision and relatively large in calculation amount, so a detection result cannot be obtained in real time. Furthermore, that technique is limited by the detection range of the camera.
  • Embodiments of this disclosure provide a smart contact lens and a multimedia system including a smart contact lens, in which multiple detecting elements that generate detection signals when contacting eyelids of a user are provided in edge areas of the smart contact lens. In this way, lines of sight of the user may be simply and conveniently detected with relatively high detection precision and a small calculation amount; a detection result may be obtained in real time, and detection is not limited by a detection range. Furthermore, a function of detecting the lines of sight of the user may be simply and conveniently achieved, and the expandability and flexibility of applications of the function may be improved.
  • a smart contact lens including: a substrate; and multiple detecting elements provided in multiple edge areas of the substrate, the detecting elements generating detection signals when contacting eyelids of a user wearing the smart contact lens, to detect lines of sight of the user.
  • the multiple detecting elements are provided in all or part of edge areas of the substrate.
  • the multiple detecting elements are capacitance contact elements, and result in a change of an output voltage and/or current when contacting the eyelids of the user, to generate the detection signals.
  • the smart contact lens further includes: a wireless communication unit configured to perform signal transmission with external electronic equipment.
  • the smart contact lenses are respectively worn on the left eye and the right eye of the user; and the image capturing unit determines a focus position for capturing an image according to three-dimensional coordinates of an intersection of the detected line of sight of the left eye and line of sight of the right eye of the user.
  • the smart contact lenses further include: a power supplying unit configured to accumulate electric power by at least one of wireless charging, solar energy charging and biological energy charging, so as to supply power to the smart contact lens(es).
  • the multimedia system includes: at least one of the smart contact lens(es) as described in the first aspect, worn on the left eye and/or the right eye of a user; and electronic equipment capable of performing signal transmission with the smart contact lens(es); the smart contact lens(es) or the electronic equipment include(s): a processing unit configured to process the detection signals generated by the detecting elements of the smart contact lens(es).
  • the processing unit includes: a determining unit configured to determine moving directions and moving angles of lines of sight of the user according to the detection signals.
  • the determining unit includes: a first determining unit configured to determine a location distribution of detecting elements in the multiple detecting elements contacting the eyelids of the user according to the detection signals generated by the detecting elements; and a second determining unit configured to determine the moving directions and moving angles of the lines of sight of the user according to a comparison result of the determined location distribution of detecting elements contacting the eyelids of the user and a reference location distribution.
  • when the comparison result is that the number of detecting elements generating detection signals at a side of a first horizontal direction is increased, the second determining unit is configured to determine that the eyeballs of the user move to the first horizontal direction; when the comparison result is that the number of detecting elements generating detection signals at a side of a second horizontal direction is increased, the second determining unit is configured to determine that the eyeballs of the user move to the second horizontal direction; when the comparison result is that the number of detecting elements generating detection signals at a side of a first vertical direction is increased and the number of detecting elements generating detection signals at a side of a second vertical direction is decreased, the second determining unit is configured to determine that the eyeballs of the user move to the first vertical direction; and when the comparison result is that the number of detecting elements generating detection signals at the side of the second vertical direction is increased, the second determining unit is configured to determine that the eyeballs of the user move to the second vertical direction.
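The comparison logic described above can be sketched as follows (a hypothetical Python sketch; the function name, the per-side count representation, and the example values are illustrative assumptions, not part of the disclosure):

```python
# Hypothetical sketch of the second determining unit's direction logic.
# Per-side counts of triggered detecting elements are compared against the
# reference distribution captured while the user looks straight ahead.

def gaze_direction(counts, reference):
    """Return a (horizontal, vertical) pair of direction strings or None.

    counts / reference: dicts with keys 'left', 'right', 'upper', 'lower'
    giving the number of edge elements currently generating detection
    signals on each side of the lens.
    """
    delta = {side: counts[side] - reference[side] for side in counts}

    horizontal = None
    if delta['left'] > 0:
        horizontal = 'left'    # more elements touched by the eyelids on the left side
    elif delta['right'] > 0:
        horizontal = 'right'

    vertical = None
    # Downward gaze: the lower eyelid barely moves, so an increase on the
    # lower side indicates a downward movement even if the upper side also
    # shows an increase.
    if delta['lower'] > 0:
        vertical = 'down'
    elif delta['upper'] > 0 and delta['lower'] < 0:
        vertical = 'up'

    return horizontal, vertical
```

The downward case is checked first, mirroring the rule above that an increase on the lower side decides a downward movement even when the upper side also increases.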
  • the second determining unit is configured to determine the moving angles of the lines of sight of the user according to a difference between the determined location distribution of detecting elements contacting the eyelids of the user and the reference location distribution.
  • the determining unit further includes: a third determining unit configured to determine the reference location distribution according to a location distribution of detecting elements generating detection signals when the user is looking straight ahead.
  • the processing unit further includes: a movement controlling unit configured to control movement of an operating cursor displayed on a display of the electronic equipment according to the moving directions and moving angles of the lines of sight of the user.
  • the multimedia system includes: two smart contact lenses respectively worn on the left eye and right eye of the user; and the determining unit is configured to respectively determine moving directions and moving angles of the line of sight of the left eye and the line of sight of the right eye of the user according to detection signals generated by the detecting elements of the two smart contact lenses respectively worn on the left eye and right eye of the user.
  • the processing unit further includes: a first display controlling unit configured to control three-dimensional display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user.
  • the processing unit further includes: a second display controlling unit configured to control augmented reality display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user.
  • the processing unit further includes: a second calculating unit configured to, according to three-dimensional coordinates of the intersections of the line of sight of the left eye and line of sight of the right eye of the user at a first moment and a second moment respectively calculated by the first calculating unit, calculate a distance between the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the first moment and the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the second moment.
  • the first calculating unit calculates the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the first moment based on blink of one of the left eye and the right eye of the user at the first moment; and the first calculating unit calculates the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the second moment based on blink of the other one of the left eye and the right eye of the user at the second moment.
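Given the two blink-latched fixation points described above, the second calculating unit's computation reduces to a Euclidean distance between two three-dimensional coordinates (a minimal illustrative sketch; the function name and tuple representation are assumptions):

```python
import math

# Illustrative sketch: the intersection of the lines of sight is latched on
# a blink of one eye (first moment) and again on a blink of the other eye
# (second moment); the measured size is the distance between the two points.

def measured_size(point_first_blink, point_second_blink):
    """Distance between the two 3-D fixation points latched at the two blinks."""
    return math.dist(point_first_blink, point_second_blink)
```

For example, fixating on the two ends of an object yields the object's size as the distance between the two latched points.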
  • An advantage of the embodiments of this disclosure exists in that, by providing multiple detecting elements generating detection signals when contacting eyelids of a user in edge areas of the smart contact lens, lines of sight of the user may be simply and conveniently detected with relatively high detection precision and a small calculation amount; a detection result may be obtained in real time, and detection is not limited by a detection range. Furthermore, a function of detecting the lines of sight of the user may be simply and conveniently achieved, and the expandability and flexibility of applications of the function may be improved.
  • FIG. 1 is a schematic diagram of the smart contact lens of Embodiment 1 of this disclosure.
  • FIG. 2 is another schematic diagram of the smart contact lens of Embodiment 1 of this disclosure.
  • FIG. 3 is a schematic diagram of the processing unit 104 of Embodiment 1 of this disclosure.
  • FIG. 4 is a schematic diagram of the determining unit 301 of Embodiment 1 of this disclosure.
  • FIG. 5 is a schematic diagram of a case where the user is looking straight ahead of Embodiment 1 of this disclosure.
  • FIG. 6 is a schematic diagram of a case where the eyeballs of the user are moving in a horizontal direction of Embodiment 1 of this disclosure.
  • FIG. 7 is a schematic diagram of a case where the eyeballs of the user are moving to a first vertical direction of Embodiment 1 of this disclosure.
  • FIG. 8 is a schematic diagram of a case where the eyeballs of the user are moving to a second vertical direction of Embodiment 1 of this disclosure.
  • FIGs. 9-12 are schematic diagrams of controlling movement of an operational cursor of the electronic equipment by the lines of sight of the user of Embodiment 1 of this disclosure.
  • FIG. 13 is a schematic diagram of an intersection of the line of sight of the left eye and the line of sight of the right eye of Embodiment 1 of this disclosure.
  • FIG. 14 is a schematic diagram of measuring a size of an object of Embodiment 1 of this disclosure.
  • FIG. 15 is a schematic diagram of the multimedia system of Embodiment 2 of this disclosure.
  • FIG. 16 is a block diagram of the electronic equipment of Embodiment 2 of this disclosure.
  • “Electronic equipment” and “electronic apparatus”, in this disclosure, may relate to any type of appropriate electronic apparatus; examples of such electronic apparatus include a computer, a mobile phone, a smart mobile phone, a photo camera, a video camera, a tablet PC, a telephone, a media player, a game device, etc.
  • “Unit” may have its conventional meaning in the field of electronics, electrical devices and/or electronic devices, and may include, for example, electrical and/or electronic circuitry, devices, modules, processors, memories, logic, solid state and/or discrete devices, and computer programs or instructions for carrying out respective tasks, procedures, computations, outputs, and/or displaying functions, such as those described herein.
  • FIG. 1 is a schematic diagram of the smart contact lens of Embodiment 1 of this disclosure. As shown in FIG. 1, the smart contact lens 100 includes: a substrate 101; and multiple detecting elements 102 provided in multiple edge areas of the substrate 101.
  • “Edge areas” may include, for example, areas at the edge, near the edge, or adjacent to the edge, or other similar locations, so as to provide for generating detection signals as described herein.
  • In this way, the lines of sight of the user may be simply and conveniently detected with relatively high detection precision and a small calculation amount; a detection result may be obtained in real time, and detection is not limited by a detection range. Furthermore, a function of detecting the lines of sight of the user may be simply and conveniently achieved, and the expandability and flexibility of applications of the function may be improved.
  • the substrate 101 may be made from a transparent material, and an existing method may be used for making the substrate.
  • the substrate may be a material that is the same or similar to that of which conventional contact lenses may be made or may be such other material as may be suitable for use as is disclosed herein.
  • a shape of the substrate 101 may be designed as actually demanded.
  • The term “as demanded” may also mean “as required,” “as desired,” “as preferred,” and so on, e.g., as demanded by a user, as demanded by circumstances or a desired function, etc.
  • the substrate 101 may be of a circular shape, or an elliptical shape, etc.
  • the multiple detecting elements 102 may be provided in multiple edge areas of the substrate 101.
  • the multiple detecting elements 102 may be implanted in the substrate 101 by using an existing method.
  • the multiple detecting elements 102 may be provided in all or part of edge areas of the substrate 101.
  • The multiple detecting elements 102 are provided continuously in, at, near, or adjacent to (as mentioned above) the whole edge of the substrate 101, with a distribution density that may be set according to the required detection precision and actual demand.
  • FIG. 2 is another schematic diagram of the smart contact lens of Embodiment 1 of this disclosure. As shown in FIG. 2, edge areas of the substrate 101 in four directions or locations, e.g., the upper, lower, left and right directions, are provided with multiple detecting elements 102; for brevity of description, other elements are not shown.
  • the detecting elements 102 may be capacitance contact elements provided on a surface of the substrate 101, and result in changes of an output voltage and/or current when contacting eyelids of the user, thereby generating the detection signals.
  • When capacitance contact elements are used as the detecting elements 102, their structures are simple and easy to manufacture; and as the detection signals may be obtained from simple changes of the voltage and/or current, the detection speed is relatively fast.
  • the capacitance contact elements may be of existing or known type structures.
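A minimal sketch of how a capacitance contact element's output change might be turned into a binary detection signal (the baseline and threshold values are illustrative assumptions; the disclosure only states that contact changes the output voltage and/or current):

```python
# Illustrative thresholding of one capacitance contact element's output.
# BASELINE_V and THRESHOLD_V are assumed calibration values, not from the
# disclosure.

BASELINE_V = 1.80   # assumed idle output voltage of one element (volts)
THRESHOLD_V = 0.05  # assumed minimum change that counts as eyelid contact

def detection_signal(measured_v, baseline_v=BASELINE_V, threshold_v=THRESHOLD_V):
    """Return True when the element's output voltage deviates from its
    baseline by more than the threshold, i.e. the eyelid is in contact."""
    return abs(measured_v - baseline_v) > threshold_v
```

Because only a threshold comparison per element is needed, this illustrates why the calculation amount stays small compared with camera-based eye tracking.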
  • the smart contact lens 100 may further include a wireless communication unit 103 configured to perform signal transmission with external electronic equipment.
  • the communication unit 103 is shown in FIG. 1, but also similarly may be included in the smart contact lens 100 of FIG. 2.
  • the external electronic equipment may be various types of electronic devices, such as a computer, a mobile phone, a smart mobile phone, a photo camera, an image camera, a tablet PC, a telephone, a media player, and a game device, etc.
  • the wireless communication unit 103 may perform signal transmission with the external electronic equipment by using an existing wireless communication method, such as Bluetooth and WiFi.
  • the smart contact lens 100 may further include a processing unit 104 (shown in FIG. 1, but similarly may be included in the contact lens 100 of FIG. 2) configured to process the detection signals generated by the detecting elements 102.
  • the processing unit 104 is, for example, an IC (integrated circuit) chip, such as a CPU chip, and, if desired, other electronic component(s), e.g., memory, input/output circuitry and so on; and is implanted in the smart contact lenses.
  • the processing unit 104 may be provided in the electronic equipment performing signal transmission with the smart contact lens, rather than provided in the smart contact lenses 100, to process the detection signals.
  • a structure and function of the processing unit 104 of this embodiment shall be illustrated and described below.
  • FIG. 3 is a schematic diagram of the processing unit 104 of Embodiment 1 of this disclosure. As shown in FIG. 3, the processing unit 104 includes:
  • a determining unit 301 configured to determine moving directions and moving angles of lines of sight of the user according to the detection signals
  • a movement controlling unit 302 configured to control movement of an operating cursor displayed on a display of the electronic equipment capable of performing signal transmission with the smart contact lenses according to the moving directions and moving angles of the lines of sight of the user;
  • a first calculating unit 303 configured to calculate the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user according to the moving directions and moving angles of the line of sight of the left eye and the line of sight of the right eye of the user;
  • a first display controlling unit 304 configured to control three-dimensional display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user;
  • a second display controlling unit 305 configured to control augmented reality display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user;
  • a second calculating unit 306 configured to, according to three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user at a first moment and a second moment respectively calculated by the first calculating unit, calculate a distance between the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the first moment and the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the second moment.
  • the movement controlling unit 302, the first calculating unit 303, the first display controlling unit 304, the second display controlling unit 305 and the second calculating unit 306 are optional, and may be provided as demanded by actual application.
  • the determining unit 301 determines the moving directions and moving angles of lines of sight of the user according to the detection signals.
  • a structure of the determining unit 301 and a method of determination shall be illustrated below.
  • FIG. 4 is a schematic diagram of the determining unit 301 of Embodiment 1 of this disclosure. As shown in FIG. 4, the determining unit 301 includes:
  • a first determining unit 401 configured to determine a location distribution of detecting elements in the multiple detecting elements contacting the eyelids of the user according to the detection signals generated by the detecting elements;
  • a second determining unit 402 configured to determine the moving directions and moving angles of the lines of sight of the user according to a comparison result of the determined location distribution of detecting elements contacting the eyelids of the user and a reference location distribution.
  • the determining unit 301 may further include:
  • a third determining unit 403 configured to determine the reference location distribution according to a location distribution of detecting elements generating detection signals when the user is looking straight ahead.
  • a direction the user is facing after wearing the smart contact lenses is taken as a reference direction; one of a left direction and a right direction is defined as a first horizontal direction, and the other one of the left direction and the right direction is defined as a second horizontal direction, an upward direction is defined as a first vertical direction, and a downward direction is defined as a second vertical direction.
  • FIG. 5 is a schematic diagram of a case where the user is looking straight ahead of Embodiment 1 of this disclosure.
  • When the user is looking straight ahead, the location distribution of the detecting elements generating the detection signals is taken as the reference location distribution.
  • the detecting elements at the left half are taken as the detecting elements at a side of the first horizontal direction
  • the detecting elements at the right half are taken as the detecting elements at a side of the second horizontal direction
  • the detecting elements at the upper half are taken as the detecting elements at a side of the first vertical direction
  • the detecting elements at the lower half are taken as the detecting elements at a side of the second vertical direction.
  • The third determining unit 403 is first needed to determine the reference location distribution, so as to be adapted to the eye structures of different users.
  • The second determining unit 402 determines the moving directions and moving angles of the lines of sight of the user according to the comparison result. For example:
  • when the number of detecting elements generating detection signals at the side of the first horizontal direction is increased, the second determining unit determines that the eyeballs of the user move to the first horizontal direction;
  • when the number of detecting elements generating detection signals at the side of the second horizontal direction is increased, the second determining unit determines that the eyeballs of the user move to the second horizontal direction;
  • when the number of detecting elements generating detection signals at the side of the first vertical direction is increased and the number at the side of the second vertical direction is decreased, the second determining unit determines that the eyeballs of the user move to the first vertical direction; and
  • when the number of detecting elements generating detection signals at the side of the second vertical direction is increased, the second determining unit determines that the eyeballs of the user move to the second vertical direction.
  • FIG. 6 is a schematic diagram of a case where the eyeballs of the user are moving in a horizontal direction of Embodiment 1 of this disclosure.
  • When the number of the detecting elements contacting the eyelids of the user at the side of the first horizontal direction (the left side) or the side of the second horizontal direction (the right side) is increased, that is, when the number of the detecting elements generating detection signals at the side of the first horizontal direction or the side of the second horizontal direction is increased, it is determined that the eyeballs of the user move to the first or the second horizontal direction.
  • FIG. 7 is a schematic diagram of a case where the eyeballs of the user are moving in the first vertical direction of Embodiment 1 of this disclosure.
  • the number of detecting elements contacting the eyelids of the user at the side of the first vertical direction (the upper side) is increased and the number of detecting elements contacting the eyelids of the user at the side of the second vertical direction (the lower side) is decreased, that is, when the number of detecting elements generating detection signals at the side of the first vertical direction is increased and the number of detecting elements generating detection signals at the side of the second vertical direction is decreased, it is determined that the eyeballs of the user move to the first vertical direction (the upper side).
  • FIG. 8 is a schematic diagram of a case where the eyeballs of the user are moving in the second vertical direction of Embodiment 1 of this disclosure.
  • When the number of detecting elements contacting the eyelids of the user at the side of the second vertical direction (the lower side) is increased, that is, when the number of the detecting elements generating detection signals at the side of the second vertical direction is increased, it is determined that the eyeballs of the user move to the second vertical direction (the lower side), even though the number of detecting elements generating detection signals at the side of the first vertical direction (the upper side) is also increased; this is because the lower eyelid of the user is substantially kept unmoved.
  • The second determining unit 402 determines the moving angles of the lines of sight of the user according to a difference between the determined location distribution of detecting elements contacting the eyelids of the user and the reference location distribution.
  • The third determining unit 403 may obtain a correspondence relationship between the moving angles of the lines of sight of the user and the difference between location distributions, and the second determining unit 402 determines the moving angles of the lines of sight of the user according to the correspondence relationship.
  • the structure of the determining unit 301 of the processing unit 104 and a method for determining the moving directions and moving angles of lines of sight of the user are illustrated above.
  • the movement controlling unit 302 of the processing unit 104 controls the movement of the operating cursor displayed on the display of the electronic equipment capable of performing signal transmission with the smart contact lenses according to the moving directions and moving angles of the lines of sight of the user. In this way, the movement of the operating cursor of the electronic equipment may be simply and conveniently controlled by wearing the smart contact lenses.
  • The moving direction of the operating cursor displayed on the display of the electronic equipment is controlled to be consistent with the moving directions of the lines of sight of the user, and the distance of movement of the operating cursor is proportional to the moving angles of the lines of sight of the user; the distance may be calculated according to a predetermined correspondence relationship, for example a correspondence table established from data obtained by user tests.
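The proportional mapping described above can be sketched as follows (a hypothetical Python sketch; the gain constant and screen resolution are assumed calibration values, not from the disclosure):

```python
# Hypothetical mapping from gaze moving angles to a cursor position, using
# the proportional relationship between cursor displacement and moving angle.

PIXELS_PER_DEGREE = 40.0          # assumed gain obtained from user tests
SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution

def cursor_position(angle_h_deg, angle_v_deg):
    """Map horizontal/vertical moving angles of the line of sight
    (degrees; positive = right/up) to a clamped screen coordinate,
    with the straight-ahead gaze placed at the screen centre."""
    x = SCREEN_W / 2 + angle_h_deg * PIXELS_PER_DEGREE
    y = SCREEN_H / 2 - angle_v_deg * PIXELS_PER_DEGREE  # screen y grows downwards
    x = min(max(x, 0), SCREEN_W - 1)
    y = min(max(y, 0), SCREEN_H - 1)
    return int(x), int(y)
```

In practice the linear gain could be replaced by the correspondence table mentioned above, interpolated between calibrated angle/distance pairs.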
  • the operating cursor is a mouse cursor
  • the processing unit 104 transmits a control signal of the mouse cursor to a mouse control module in the electronic equipment, and performs corresponding control on the mouse cursor based on a mouse protocol.
  • FIGs. 9-12 are schematic diagrams of controlling the movement of the operating cursor 901 of the exemplary electronic equipment 902 by the lines of sight of the user of Embodiment 1 of this disclosure.
  • As shown in FIG. 9, when the eyeballs of the user are substantially looking straight ahead, the operating cursor is placed at the center of the display of the electronic equipment.
  • As shown in FIG. 10, when the eyeballs of the user move towards a side in the horizontal direction, the operating cursor is controlled to move in the same horizontal direction.
  • As shown in FIG. 11, when the eyeballs of the user move upwards, the operating cursor is controlled to move upwards.
  • As shown in FIG. 12, when the eyeballs of the user move downwards, the operating cursor is controlled to move downwards.
  • the user may select to wear a smart contact lens on the left eye or on the right eye, or wear smart contact lenses on both the left eye and the right eye, as actually demanded.
  • the determining unit 301 may respectively determine the moving directions and the moving angles of the line of sight of the left eye and the line of sight of the right eye of the user according to the detection signals generated by the detecting elements of the smart contact lenses worn respectively on the left eye and the right eye of the user.
  • the first calculating unit 303 calculates the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user according to the moving directions and moving angles of the line of sight of the left eye and the line of sight of the right eye of the user. In this way, more function applications may be expanded by using the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user.
  • how the first calculating unit 303 calculates the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user shall be illustrated below.
  • FIG. 13 is a schematic diagram of the intersection of the line of sight of the left eye and the line of sight of the right eye of Embodiment 1 of this disclosure.
  • A and B respectively denote centers of the eyeballs of the left eye and the right eye of the user
  • F′ denotes an intersection of the line of sight of the left eye and the line of sight of the right eye before movement of the eyeballs of the user (a previous moment)
  • F denotes an intersection of the line of sight of the left eye and the line of sight of the right eye after movement of the eyeballs of the user (a current moment).
  • a central point of a segment AB is taken as an origin, an axis on which the segment AB is located is taken as the X axis, an axis parallel to a vertical direction of the human body and passing through the origin is taken as the Z axis, and the Y axis passes through the origin and is perpendicular to the XZ plane.
  • F′(x,y,0) is a projection of a space point F(x,y,z) on the XY plane.
  • ∠BAF′ and ∠ABF′ may be obtained by detecting the movement of the eyeballs at the previous moment, an angle between F′ and F may be obtained with reference to the movement of the eyeballs at the current moment, and
  • the three-dimensional coordinates F(x,y,z) of F may then be obtained with reference to the two-dimensional coordinates F′(x,y) of F′ in the XY plane.
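The triangulation implied by FIG. 13 can be sketched as follows. The function signature, the baseline parameter, and the use of a single vertical deflection angle measured at the left eye A are illustrative assumptions; the disclosure only specifies that F(x,y,z) is recovered from the detected angles and the projection F′.

```python
import math

def gaze_intersection(alpha_deg, beta_deg, gamma_deg, baseline):
    """Triangulate F(x, y, z) from the interior angles of triangle ABF'
    (alpha = angle BAF' at A, beta = angle ABF' at B, both in the XY plane)
    and a vertical deflection angle gamma measured at the left eye A.
    Axes follow FIG. 13: origin at the midpoint of AB, X along AB, Z vertical."""
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    d = baseline  # |AB|, the inter-eye distance
    # Intersection F'(x, y) of the two sight lines projected onto the XY plane:
    # tan(alpha) = y / (x + d/2) and tan(beta) = y / (d/2 - x).
    x = d / 2 * (tb - ta) / (ta + tb)
    y = d * ta * tb / (ta + tb)
    # Height of F above the XY plane, from the vertical angle at A.
    af = math.hypot(x + d / 2, y)  # |AF'|
    z = af * math.tan(math.radians(gamma_deg))
    return x, y, z
```

In the symmetric case (both eyes deflected by the same angle, no vertical deflection) the intersection lies on the Y axis at z = 0, as expected from the geometry.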
  • the smart contact lenses or the electronic equipment capable of performing signal transmission with the smart contact lenses have a function of three-dimensional display
  • the first display controlling unit 304 controls the three-dimensional display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user. In this way, a visual effect of the three-dimensional display may be effectively improved.
  • For example, with the control by the first display controlling unit 304, more details may be displayed at the intersection of the line of sight of the left eye and line of sight of the right eye of the user, thereby improving the visual effect of the three-dimensional display.
  • the smart contact lenses or the electronic equipment capable of performing signal transmission with the smart contact lenses have a function of augmented reality (AR) display
  • the second display controlling unit 305 controls the augmented reality display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user.
  • rendering of contrast and details is reinforced at the intersection of the line of sight of the left eye and line of sight of the right eye of the user, consistency between the focus of rendering of a picture or scenario and the focus of vision may be achieved, and a stronger sense of realism may be brought to the user, while a sense of dizziness brought about by inconsistency between the focus of rendering of a picture or scenario and the focus of vision may be eliminated.
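The gaze-dependent rendering described above (more detail at the gaze point, coarser detail elsewhere) can be sketched as a per-pixel level-of-detail rule. The function name, the foveal radius, and the number of detail levels are illustrative assumptions, not part of the disclosure.

```python
import math

def lod_for_pixel(px, py, gaze_x, gaze_y, fovea_radius=64.0):
    """Pick a rendering level of detail (0 = full detail) for a pixel
    based on its distance from the user's gaze point, so that contrast
    and detail are concentrated where the user is actually looking."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    # Full detail inside the foveal radius, progressively coarser rings outside.
    return min(3, int(dist // fovea_radius))
```

Concentrating rendering effort this way keeps the rendered focus aligned with the visual focus, which is the consistency the passage above attributes the improved sense of realism to.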
  • the second calculating unit 306 calculates the distance between the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the first moment and the intersection of the line of sight of the left eye and line of sight of the right eye of the user at the second moment. In this way, a size of an object may simply and accurately be measured.
  • the user looks towards an end of an object to be measured, and at this moment, by a blink of the left eye of the user, such a moment may be locked and the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user at this moment may be calculated.
  • the lines of sight of the user then move and the user looks towards the other end of the object to be measured, and at this moment, by a blink of the right eye of the user, such a moment may be locked and the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user at this moment may be calculated. A distance between the intersections of the line of sight of the left eye and line of sight of the right eye of the user at the two moments may then be calculated, thereby obtaining the size of the object to be measured.
  • FIG. 14 is a schematic diagram of measuring a size of an object of Embodiment 1 of this disclosure.
  • the left eye blinks when the user looks towards an end P1 of the object to be measured, and the first calculating unit 303 calculates three-dimensional coordinates P1(x1,y1,z1) of P1;
  • the right eye blinks when the user looks towards the other end P2 of the object to be measured, and the first calculating unit 303 calculates three-dimensional coordinates P2(x2,y2,z2) of P2;
  • the second calculating unit 306 calculates a distance between P1 and P2 according to formula (1) below:
  • D = √[(x1 − x2)² + (y1 − y2)² + (z1 − z2)²] (1); where D denotes the distance between P1 and P2, (x1,y1,z1) denotes the three-dimensional coordinates of P1, and (x2,y2,z2) denotes the three-dimensional coordinates of P2.
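Formula (1) is a plain Euclidean distance between the two locked gaze intersections; a minimal sketch (the function name is an illustrative assumption):

```python
import math

def gaze_distance(p1, p2):
    """Distance between the two locked gaze intersections P1 and P2,
    i.e. formula (1): D = sqrt((x1-x2)^2 + (y1-y2)^2 + (z1-z2)^2)."""
    return math.dist(p1, p2)  # math.dist requires Python 3.8+
```

For example, locking P1 = (0, 0, 0) on one end of the object and P2 = (3, 4, 0) on the other yields a measured size of 5 units.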
  • the smart contact lens 100 may further include an image capturing unit 105 configured to capture an image according to the detected lines of sight of the user. For example, it determines a focus position for capturing an image according to the three-dimensional coordinates of the intersection of the detected line of sight of the left eye and line of sight of the right eye of the user. In this way, an object focused by the user may be quickly captured without needing to convert a coordinate system of the camera, and a captured image of relatively high quality may be obtained.
  • the intersection of the detected line of sight of the left eye and line of sight of the right eye of the user may be obtained through calculation by the first calculating unit 303.
  • the smart contact lens 100 may further include a power supplying unit 106 configured to accumulate electric power by at least one of wireless charging, solar energy charging and biological energy charging, to supply power to the smart contact lens. For example, it supplies power to the detecting elements 102, the processing unit 104, and the image capturing unit 105, etc.
  • wireless charging may be performed by capacitance
  • charging may be performed by providing a solar cell
  • charging may be performed by converting biological energy into electrical energy by using fluids secreted in the eyes.
  • the wireless communication unit 103, the processing unit 104, the image capturing unit 105, and the power supplying unit 106, etc. may be implanted in the smart contact lenses by using an existing method, and these units may be integrated in an integrated circuit or a functional module, or may be provided separately. Furthermore, positions of these units on the substrate 101 may be determined according to an actual situation or as demanded or desired.
  • the lines of sight of the user may be simply and conveniently detected, with relatively high detection precision and a small amount of calculation; a detection result may be obtained in real time, and the detection is not limited by a detection range. Furthermore, a function of detecting the lines of sight of the user may be simply and conveniently achieved, and expandability and flexibility of application of the function may be improved.
  • FIG. 15 is a schematic diagram of the multimedia system of Embodiment 2 of this disclosure.
  • the multimedia system 1500 includes: at least one smart contact lens 1501, worn on the left eye and/or the right eye of a user; and electronic equipment 1502 capable of performing signal transmission with the smart contact lens(es).
  • the smart contact lens(es) 1501 or the electronic equipment 1502 include(s) a processing unit (e.g., as shown at 104 in FIG. 1, but not shown here) configured to process detection signals generated by detecting elements of the smart contact lens(es).
  • a structure and functions of the smart contact lens 1501 are completely the same as those of the smart contact lens 100 in Embodiment 1, and shall not be described herein any further.
  • the electronic equipment may be various types of electronic devices, such as a computer, a mobile phone, a smart mobile phone, a photo camera, a video camera, a tablet PC, a telephone, a media player, and a game device, etc.
  • the processing unit may be provided in the smart contact lens 1501, and may also be provided in the electronic equipment 1502, and its structure and functions are completely the same as those of the processing unit 104 in Embodiment 1, which shall not be described herein any further.
  • FIG. 16 is a block diagram of the electronic equipment of Embodiment 2 of this disclosure.
  • the electronic equipment 1600 (in FIG. 15 also designated 1502) may include a central processing unit 1601 and a memory 1602, the memory 1602 being coupled to the central processing unit 1601.
  • this figure is illustrative only, and other types of structures may also be used to supplement or replace this structure, so as to achieve a telecommunications function or other functions.
  • the central processing unit 1601 executes the functions of the processing unit, and may be configured to process the detection signals generated by the detecting elements of the smart contact lenses.
  • the processing of the detection signals generated by the detecting elements of the smart contact lenses may include: determining moving directions and moving angles of lines of sight of the user according to the detection signals; controlling movement of an operating cursor displayed on a display of the electronic equipment according to the moving directions and moving angles of the lines of sight of the user; calculating the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user according to the moving directions and moving angles of the line of sight of the left eye and the line of sight of the right eye of the user; controlling three-dimensional display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user; controlling augmented reality display of the smart contact lenses or the electronic equipment according to the three-dimensional coordinates of the intersection of the line of sight of the left eye and line of sight of the right eye of the user; and calculating, according to respectively calculated three-dimensional coordinates of the intersections of the line of sight of the left eye and line of sight of the right eye of the user at a first moment and at a second moment, a distance between the two intersections.
  • the electronic equipment 1600 may further include a communication module 1603, an input unit 1604, an audio processor 1605, a loudspeaker 1605-1, a microphone 1605-2, a display unit 1606, a power supply 1607, and an antenna 1608. It should be noted that the electronic equipment 1600 does not necessarily include all the parts shown in FIG. 16, and furthermore, the electronic equipment 1600 may include parts not shown in FIG. 16, and the related art may be referred to.
  • the central processing unit 1601 is sometimes referred to as a controller or control, and may include a microprocessor or other processor devices and/or logic devices.
  • the central processing unit 1601 receives input and controls operations of every component of the electronic equipment 1600.
  • the memory 1602 may be, for example, one or more of a buffer memory, a flash memory, a hard drive, a mobile medium, a volatile memory, a nonvolatile memory, or other suitable devices, which may store predefined or preconfigured information, and may further store a program executing related information.
  • the central processing unit 1601 may execute the program stored in the memory 1602, so as to realize information storage or processing, etc. Functions of other parts are similar to those of the prior art, which shall not be described herein any further.
  • the parts of the electronic equipment 1600 may be realized by specific hardware, firmware, software, or any combination thereof, without departing from the scope of the present disclosure.
  • the lines of sight of the user may be simply and conveniently detected, with relatively high detection precision and a small amount of calculation; a detection result may be obtained in real time, and the detection is not limited by a detection range. Furthermore, a function of detecting the lines of sight of the user may be simply and conveniently achieved, and expandability and flexibility of application of the function may be improved.
  • the above apparatuses and methods of the present disclosure may be implemented by hardware, or by hardware in combination with software.
  • the present disclosure relates to such a computer-readable program that, when the program is executed by a logic device, the logic device is enabled to realize the apparatus or components described above, or to carry out the methods or steps described above.
  • the present disclosure also relates to a storage medium for storing the above program, such as a hard disk, a floppy disk, a CD, a DVD, and a flash memory, etc.

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • Otolaryngology (AREA)
  • Acoustics & Sound (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a smart contact lens and a multimedia system including the smart contact lens. The smart contact lens includes: a substrate; and multiple detecting elements in multiple edge areas of the substrate, the detecting elements generating detection signals when they come into contact with the eyelids of a user wearing the smart contact lens, so as to detect lines of sight of the user. By means of the multiple detecting elements generating detection signals upon contact with the user's eyelids in the edge areas of the smart contact lenses, the lines of sight of the user may be detected in a simple and convenient manner, with relatively high detection precision, only a small amount of calculation, a real-time detection result, and no limitation to a detection range. A function of detecting the lines of sight of the user may be achieved simply and conveniently, and expandability and flexibility of application of the function may be improved.
PCT/IB2017/052336 2016-12-15 2017-04-24 Smart contact lens and multimedia system including the smart contact lens WO2018109570A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611159967.XA CN108227239A (zh) 2016-12-15 2016-12-15 智能隐形眼镜以及包括该智能隐形眼镜的多媒体系统
CN201611159967.X 2016-12-15

Publications (1)

Publication Number Publication Date
WO2018109570A1 true WO2018109570A1 (fr) 2018-06-21

Family

ID=58692539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/052336 WO2018109570A1 (fr) 2017-04-24 Smart contact lens and multimedia system including the smart contact lens

Country Status (2)

Country Link
CN (1) CN108227239A (fr)
WO (1) WO2018109570A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021080926A1 (fr) * 2019-10-24 2021-04-29 Tectus Corporation Systèmes et procédés de sélection d'outils et d'activation basée sur l'œil
US11592899B1 (en) 2021-10-28 2023-02-28 Tectus Corporation Button activation within an eye-controlled user interface
US11619994B1 (en) 2022-01-14 2023-04-04 Tectus Corporation Control of an electronic contact lens using pitch-based eye gestures
US11662807B2 (en) 2020-01-06 2023-05-30 Tectus Corporation Eye-tracking user interface for virtual tool control
US11874961B2 (en) 2022-05-09 2024-01-16 Tectus Corporation Managing display of an icon in an eye tracking augmented reality device
US11907417B2 (en) 2019-07-25 2024-02-20 Tectus Corporation Glance and reveal within a virtual environment

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110726532A (zh) * 2018-07-17 2020-01-24 亨泰光学股份有限公司 隐形眼镜的聚焦点检测方法
CN109471527B (zh) * 2018-10-15 2022-07-22 上海理工大学 基于视觉跟踪技术的特殊病人信息交互系统及使用方法
CN109633910B (zh) * 2019-01-14 2021-11-05 京东方科技集团股份有限公司 Ar/vr隐形眼镜及其制作方法和电子设备
CN114545634A (zh) * 2022-02-24 2022-05-27 北京京东方技术开发有限公司 一种智能眼镜
CN114787755A (zh) * 2022-02-25 2022-07-22 曹庆恒 一种眼球追踪系统及其使用方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2281838A (en) * 1993-08-04 1995-03-15 Pioneer Electronic Corp Input for a virtual reality system
US9111473B1 (en) * 2012-08-24 2015-08-18 Google Inc. Input system
US20150362749A1 (en) * 2014-06-13 2015-12-17 Google Inc. Capacitive gaze tracking for auto-accommodation in a contact lens
US20160091737A1 (en) * 2014-09-26 2016-03-31 Samsung Electronics Co., Ltd. Smart contact lenses for augmented reality and methods of manufacturing and operating the same
US20160097940A1 (en) * 2013-05-02 2016-04-07 Sony Corporation Contact lens and storage medium



Also Published As

Publication number Publication date
CN108227239A (zh) 2018-06-29

Similar Documents

Publication Publication Date Title
WO2018109570A1 (fr) Smart contact lens and multimedia system including the smart contact lens
CN108450058B (zh) 实时自动车载相机校准
US20200327694A1 (en) Relocalization method and apparatus in camera pose tracking process and storage medium
JP6622395B2 (ja) バーチャルリアリティ画像を調整する方法及び装置
EP3063602B1 (fr) Entrées d'écran tactile assistées par le regard
EP3742743A1 (fr) Procédé et dispositif pour afficher un objet supplémentaire, dispositif informatique et support de stockage
CN116348836A (zh) 增强现实中用于交互式游戏控制的手势跟踪
CN110427110B (zh) 一种直播方法、装置以及直播服务器
US10595001B2 (en) Apparatus for replaying content using gaze recognition and method thereof
KR20120068253A (ko) 사용자 인터페이스의 반응 제공 방법 및 장치
US10983661B2 (en) Interface for positioning an object in three-dimensional graphical space
Mohr et al. Adaptive user perspective rendering for handheld augmented reality
US9600938B1 (en) 3D augmented reality with comfortable 3D viewing
CN110895433B (zh) 用于增强现实中用户交互的方法和装置
CN109445598B (zh) 一种基于视觉的增强现实系统装置
US10296098B2 (en) Input/output device, input/output program, and input/output method
CN112101261A (zh) 人脸识别方法、装置、设备及存储介质
US10345595B2 (en) Head mounted device with eye tracking and control method thereof
JP6479835B2 (ja) 入出力装置、入出力プログラム、および入出力方法
EP4222550A1 (fr) Jeu de réalité augmentée utilisant des faisceaux de lunettes virtuels
Raees et al. THE-3DI: Tracing head and eyes for 3D interactions: An interaction technique for virtual environments
KR101741149B1 (ko) 가상 카메라의 시점 제어 방법 및 장치
Lee et al. A new eye tracking method as a smartphone interface
US11354011B2 (en) Snapping range for augmented reality
Wang et al. Immersive 3D Human-Computer Interaction System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17722511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17722511

Country of ref document: EP

Kind code of ref document: A1