WO2015159851A1 - Detection unit, eyewear, and ocular potential detection system - Google Patents



Publication number
WO2015159851A1
Authority
WO
WIPO (PCT)
Prior art keywords
electrode
unit
user
electrooculogram
detection unit
Application number
PCT/JP2015/061369
Other languages
French (fr)
Japanese (ja)
Inventor
Kazutaka INOUE (一鷹 井上)
Shin ICHINOHE (晋 一戸)
Junko NAKAJIMA (淳子 中嶋)
Original Assignee
JIN Co., Ltd. (株式会社ジェイアイエヌ)
Application filed by JIN Co., Ltd. (株式会社ジェイアイエヌ)
Publication of WO2015159851A1 publication Critical patent/WO2015159851A1/en

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Definitions

  • the present invention relates to a detection unit, eyewear, and an electrooculogram detection system.
  • An apparatus for measuring an electrooculogram or the like is known (see, for example, Patent Documents 1-3).
  • Patent Document 1 International Publication No. 2010/82496
  • Patent Document 2 Japanese Patent Publication No. Sho 61-166515
  • Patent Document 3 Japanese Patent Publication No. 2006-525829
  • With such conventional apparatuses, however, the electrooculogram may not be measured accurately.
  • The detection unit can be attached to and detached from the eyewear body, and includes: a pair of nose pads; a first electrode, provided on the surface of a first nose pad of the pair of nose pads, for detecting the electrooculogram of the wearer; a second electrode, provided at a position different from the first nose pad, for detecting the electrooculogram of the wearer; and a connector unit that, when the detection unit is attached to the eyewear body, outputs the electrooculogram detected by the first electrode and the second electrode to the eyewear body.
  • the second electrode may be provided on the surface of the second nose pad of the pair of nose pads to detect the electrooculogram of the wearer.
  • a third electrode for detecting the electrooculogram of the wearer in contact with the wearer's eyebrow may be further provided.
  • the pair of nose pads, the third electrode, and the connector part may be provided so as to be positioned with respect to the face of the wearer according to the shape of the wearer's face.
  • the second electrode may abut the wearer's eyebrow.
  • a nonvolatile storage unit that stores identification information for identifying a user of the detection unit may be further included, and the connector unit may further output the identification information stored in the storage unit to the eyewear body.
  • the eyewear includes the detection unit described above and the eyewear body.
  • The eyewear includes the detection unit described above and an eyewear body, and the eyewear body includes a control unit that acquires the identification information stored in the storage unit from the detection unit and causes the electrooculogram to be detected by the first electrode and the second electrode when the identification information acquired from the detection unit matches predetermined identification information.
  • The electrooculogram detection system includes the detection unit, the eyewear body, and an electrooculogram information processing device that receives the identification information via the eyewear body and identifies the user of the detection unit based on the identification information received via the eyewear body.
  • FIG. 1 schematically shows an example of a usage pattern of the electrooculogram information processing system 10 according to an embodiment.
  • FIG. 2 schematically shows the glasses 100 and the smartphone 40.
  • FIG. 3 schematically shows the detection unit 102.
  • FIG. 4 is a cross-sectional view schematically showing the connection structure between the detection unit 102 and the bridge 124.
  • FIG. 5 schematically shows the functional block configuration of the smartphone 40 and the functional block configuration of the processing unit 180.
  • FIG. 6 schematically shows the positional relationship between the electrooculogram detection electrodes of the glasses 100 and the user 20.
  • FIG. 7 is a flowchart illustrating processing executed in the glasses main body 101.
  • FIG. 1 schematically shows an example of a usage pattern of an electrooculogram information processing system 10 according to an embodiment.
  • the electrooculogram information processing system 10 includes glasses 100 and a smartphone 40.
  • User 20 is a user of each of glasses 100 and smartphone 40.
  • the user 20 is a wearer who wears the glasses 100.
  • the smartphone 40 is an example of an information processing apparatus that processes information such as electrooculogram.
  • the glasses 100 are worn on the face of the user 20.
  • the glasses 100 have a function of communicating with the smartphone 40.
  • the glasses 100 detect the electrooculogram of the user 20 via the electrode that contacts the user 20, and transmit the detected electrooculogram information to the smartphone 40. Further, the glasses 100 detect the acceleration of the glasses 100 and transmit the detected acceleration information to the smartphone 40. Further, the glasses 100 detect the angular velocity of the glasses 100 and transmit information on the detected angular velocity to the smartphone 40.
  • the smartphone 40 analyzes the electrooculogram received from the glasses 100.
  • the smartphone 40 analyzes the electrooculogram received from the glasses 100 together with at least one of acceleration and angular velocity received from the glasses 100.
  • the smartphone 40 provides information to the user 20 based on analysis results such as electrooculogram, acceleration, and angular velocity.
  • The smartphone 40 identifies the state of the user 20 by analyzing the electrooculogram, the acceleration, the angular velocity, and the like. Specifically, the smartphone 40 analyzes the electrooculogram to identify the line-of-sight direction and blink state of the user 20, and determines the state of the user 20 based on them, for example whether the user 20 is fatigued or in pain. When the smartphone 40 determines that the user 20 is fatigued, in pain, or the like, it issues a warning to the user 20, for example by generating a warning sound.
  • the z-axis plus direction is defined as a direction along the front of the user 20.
  • the z-axis plus direction is the direction from the face of the user 20 toward the front portion of the glasses 100 worn on the face of the user 20.
  • the negative y-axis direction is defined as the vertical direction, that is, vertically downward.
  • the x-axis, y-axis, and z-axis are right-handed orthogonal coordinate systems.
  • the z-axis plus direction may be referred to as the front direction.
  • the y-axis plus direction may be referred to as upward or the like.
  • the y-axis minus direction may be referred to as downward or the like.
  • the x-axis plus direction may be referred to as the left side or the like.
  • the x-axis minus direction may be referred to as the right side or the like.
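The right-handed orthogonal convention described above can be checked with a small sketch; the helper function and unit vectors below are illustrative only, not part of the disclosure.

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Unit vectors of the glasses coordinate frame described above:
# x plus = wearer's left, y plus = up, z plus = front.
x_axis = (1.0, 0.0, 0.0)
y_axis = (0.0, 1.0, 0.0)
z_axis = (0.0, 0.0, 1.0)

# In a right-handed orthogonal frame, x cross y equals z.
assert cross(x_axis, y_axis) == z_axis
```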
  • FIG. 2 schematically shows the glasses 100 and the smartphone 40.
  • the glasses 100 include a glasses main body 101 and a detection unit 102.
  • the detection unit 102 is attached to the glasses main body 101.
  • the detection unit 102 can be attached to and detached from the glasses main body 101.
  • the glasses main body 101 includes a lens 110 and a frame 120.
  • Glasses 100 is an example of eyewear.
  • the frame 120 together with the detection unit 102 is also an example of eyewear.
  • the frame 120 supports a pair of lenses 110.
  • the frame 120 includes a rim 122, a bridge 124, an armor 126 (end piece), a temple 130, a modern 132 (temple tip), a ground electrode 154, a processing unit 180, and a power supply unit 190.
  • the lens 110, the rim 122, the armor 126, the temple 130, and the modern 132 are each provided in a pair of left and right.
  • the portions of the rim 122, the bridge 124, and the armor 126 are referred to as a front portion of the glasses 100.
  • the rim 122 holds the lens 110.
  • the armor 126 is provided outside the rim 122 and holds the temple 130 movably.
  • the temple 130 rests on the upper part of the ear of the user 20 and lightly clamps the head to hold the glasses in place.
  • the modern 132 is provided at the tip of the temple 130. The modern 132 contacts the upper part of the ear of the user 20.
  • FIG. 3 schematically shows the detection unit 102.
  • the detection unit 102 includes a holding unit 150, a right nose pad 141, a left nose pad 142, a first electrode 151, a second electrode 152, a third electrode 153, a first support member 155, and a second support member 156.
  • the right nose pad 141 and the left nose pad 142 form a pair of nose pads.
  • the right nose pad 141 is supported by the holding unit 150 by the first support member 155.
  • the right nose pad 141 is thereby substantially fixed in position with respect to the holding unit 150.
  • the left nose pad 142 is supported by the holding unit 150 by the second support member 156.
  • the left nose pad 142 is thereby substantially fixed in position with respect to the holding unit 150.
  • the holding unit 150 is mounted and fixed to the bridge 124 of the glasses main body 101.
  • the detection unit 102 is fixed to the glasses main body 101 and used as the glasses 100.
  • FIG. 4 is a cross-sectional view schematically showing the connection structure of the detection unit 102 and the bridge 124.
  • the detection unit 102 is attached to the bridge 124.
  • the detection unit 102 can be attached to and detached from the bridge 124.
  • the holding part 150 is provided with a connector part 170.
  • the bridge 124 is provided with a connector portion 125.
  • a memory 157 is provided inside the holding unit 150.
  • the memory 157 stores identification information for identifying the user 20.
  • the memory 157 is an example of a nonvolatile storage unit that stores identification information for identifying the user of the detection unit 102.
  • the connector part 170 has a plurality of electrode parts 171. Each of the plurality of electrode portions 171 is electrically connected to any one of the first electrode 151, the second electrode 152, the third electrode 153, and the memory 157.
  • the connector part 125 has a plurality of electrode parts 172.
  • the plurality of electrode portions 172 are provided at positions corresponding to the plurality of electrode portions 171.
  • each of the plurality of electrode parts 172 contacts and is electrically connected to a corresponding one of the plurality of electrode parts 171.
  • the detection unit 102 and the bridge 124 are electrically connected via the connector part 170 and the connector part 125.
  • Information stored in the memory 157 is transmitted to the processing unit 180 via the electrode unit 171, the electrode unit 172, and the electric wire unit 160 described later.
  • the connector unit 170 outputs the identification information stored in the memory 157 to the glasses main body 101.
  • the first electrode 151 is an example of an electrooculogram detection unit that detects electrooculogram.
  • the first electrode 151 is provided on the surface of the right nose pad 141.
  • the first electrode 151 is provided on the surface of the right nose pad 141 that faces the face of the user 20 when the user 20 wears the glasses 100.
  • the first electrode 151 contacts the skin of the user 20.
  • the first electrode 151 contacts the right side of the nose of the user 20.
  • the first electrode 151 mainly detects the electrooculogram of the right eye of the user 20.
  • the potential detected by the first electrode 151 is transmitted to the processing unit 180 through the electrode unit 171, the electrode unit 172, and the electric wire unit 160 described later.
  • the second electrode 152 is an example of an electrooculogram detection unit that detects electrooculogram.
  • the second electrode 152 is provided on the surface of the left nose pad 142.
  • the second electrode 152 is provided on the surface of the left nose pad 142 that faces the face of the user 20 when the user 20 wears the glasses 100.
  • the second electrode 152 contacts the skin of the user 20.
  • the second electrode 152 contacts the left side of the nose of the user 20.
  • the second electrode 152 mainly detects the electrooculogram of the left eye of the user 20.
  • the potential detected by the second electrode 152 is transmitted to the processing unit 180 through the electrode unit 171, the electrode unit 172, and the electric wire unit 160 described later.
  • the third electrode 153 is an example of an electrooculogram detection unit that detects electrooculogram.
  • the third electrode 153 is provided on the surface of the holding unit 150.
  • the third electrode 153 is provided on the surface of the holding unit 150 opposite to the surface on which the connector unit 170 is provided.
  • the third electrode 153 is provided on the surface of the holding unit 150 that faces the face of the user 20 when the user 20 wears the glasses 100.
  • the third electrode 153 contacts the skin of the user 20.
  • the third electrode 153 contacts the upper part of the eyebrow of the user 20.
  • the electrooculogram detected by the third electrode 153 is used as a measurement reference for measuring the electrooculogram of the right eye and the electrooculogram of the left eye of the user 20.
  • the potential detected by the third electrode 153 is transmitted to the processing unit 180 through the electrode unit 171, the electrode unit 172, and the electric wire unit 160 described later.
  • the ground electrode 154 is provided on the surface of the modern 132.
  • the ground electrode 154 is provided on the surface of the modern 132 on the right side, for example.
  • the ground electrode 154 is provided on the surface of the modern 132 that faces the face of the user 20 when the user 20 wears the glasses 100.
  • the ground electrode 154 contacts the user 20's skin.
  • the ground electrode 154 contacts the upper part of the right ear of the user 20.
  • the potential of the ground electrode 154 provides the ground potential of the electric circuit included in the glasses 100.
  • the processing unit 180 is provided inside the left temple 130.
  • the electrooculogram of the user 20 detected by the first electrode 151, the second electrode 152, and the third electrode 153 is input to the processing unit 180.
  • the processing unit 180 processes the input electrooculogram and transmits the processed potential to the smartphone 40.
  • the power supply unit 190 is provided inside the temple 130 on the left side.
  • the power supply unit 190 includes a battery such as a secondary battery.
  • the power supply unit 190 supplies the electrical energy stored in the battery included in the power supply unit 190 to the processing unit 180.
  • the power supply unit 190 generates DC power based on the potential of the ground electrode 154 from the electrical energy stored in the battery.
  • the power supply unit 190 supplies DC power generated from the electrical energy stored in the battery to the processing unit 180.
  • the power supply unit 190 is provided inside the temple 130 on the side where the ground electrode 154 is provided.
  • the potential of the ground electrode 154 provides a negative potential of DC power supplied from the power supply unit 190 to the processing unit 180.
  • the right modern 132 has a charging port for charging the power supply unit 190.
  • the battery included in the power supply unit 190 is charged through a charging port provided in the modern 132 on the right side.
  • When the detection unit 102 is attached to the glasses body 101, the connector unit 170 outputs the electrooculogram detected by the first electrode 151, the second electrode 152, and the third electrode 153 to the glasses body 101.
  • the detection unit 102 may have two electrodes for detecting the electrooculogram.
  • the detection unit 102 may include a first electrode 151 and a second electrode 152.
  • the detection unit 102 may include any one of the first electrode 151 and the second electrode 152 and the third electrode 153.
  • the second electrode 152 is an example of an electrode that is provided at a position different from the right nose pad 141 and detects the electrooculogram of the user 20.
  • the electrode that is provided at a position different from the right nose pad 141 and detects the electrooculogram of the user 20 may be the third electrode 153.
  • the right nose pad 141, the left nose pad 142, and the third electrode 153 are provided so as to be positioned with respect to the face of the user 20 according to the shape of the face of the user 20.
  • the detection unit 102 can be designed to fit the user 20.
  • the detection unit 102 may be adjusted such that the first electrode 151, the second electrode 152, and the third electrode 153 are in sufficient contact with the user 20.
  • For example, a plurality of parameters may be adjusted, such as the length of the third electrode 153 in the z-axis direction, the surface shape of the third electrode 153, the lengths and shapes of the first support member 155 and the second support member 156 included in the detection unit 102, the mounting angle of the right nose pad 141, and the mounting angle of the left nose pad 142.
  • The user 20 may be able to select and purchase a detection unit 102 whose parameters suit the shape of his or her eyes, eyebrows, and nose. These parameters may be individually adjusted for each user who uses the detection unit 102; that is, a so-called tailor-made detection unit 102 may be provided to the user 20.
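The adjustable fit parameters described above could be collected in a simple record when cataloguing tailor-made units; the field names and values below are hypothetical, chosen only to illustrate the idea.

```python
from dataclasses import dataclass

@dataclass
class DetectionUnitFit:
    """Hypothetical fit parameters for a tailored detection unit 102."""
    third_electrode_length_mm: float   # length of third electrode 153 along z
    support_member_length_mm: float    # first/second support members 155, 156
    right_pad_angle_deg: float         # mounting angle of right nose pad 141
    left_pad_angle_deg: float          # mounting angle of left nose pad 142

# A unit could be selected off the shelf or custom-built per user:
fit = DetectionUnitFit(12.0, 8.5, 25.0, 25.0)
```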
  • the third electrode 153 has a surface shape that follows the surface of the eyebrow area of the user 20.
  • the third electrode 153 may have a curved surface shape along the surface of the portion between the eyebrows of the user 20.
  • The detection unit 102 can be attached to and detached from the glasses body 101. A detection unit 102 in which the first electrode 151, the second electrode 152, and the third electrode 153 properly contact the user 20 can therefore be provided to the user 20, so the electrooculogram can be detected more accurately.
  • the user 20 can select a favorite glasses body from among a plurality of glasses bodies that he / she owns, and can attach and use the detection unit 102 to the selected glasses body.
  • FIG. 5 schematically shows a functional block configuration of the smartphone 40 and a functional block configuration of the processing unit 180.
  • the processing unit 180 includes a processing unit 200, an angular velocity detection unit 260, an acceleration detection unit 270, a transmission / reception unit 280, and a substrate unit 290.
  • the processing unit 200 includes a detection processing unit 210 and a control unit 220.
  • the smartphone 40 includes a processing unit 300, a storage unit 360, a UI unit 370, a transmission / reception unit 380, and a power supply unit 390.
  • the processing unit 200 is realized by a processor such as an MPU.
  • Each unit of the smartphone 40 is mainly controlled by the processing unit 300.
  • the transmission / reception unit 280 has a function of performing wireless communication with the smartphone 40.
  • the transmission / reception unit 280 is realized by a communication processor.
  • the transmission / reception unit 280 is realized by a communication processor having a short-range wireless communication function such as Bluetooth (registered trademark).
  • an electric wire part 160 is provided inside the frame 120.
  • the electric wire part 160 electrically connects the first electrode 151, the second electrode 152, the third electrode 153, the ground electrode 154, the memory 157, the power supply unit 190, and the processing unit 180.
  • the electric wire part 160 electrically connects the first electrode 151 and the processing unit 180 via the electrode part 171 and the electrode part 172, and outputs an electrooculogram detected by the first electrode 151 to the processing unit 180.
  • The processing unit 200 acquires the electrooculogram of the user 20 and processes the acquired electrooculogram. Specifically, the detection processing unit 210 acquires the first electrooculogram detected by the first electrode 151, the second electrooculogram detected by the second electrode 152, and the third electrooculogram detected by the third electrode 153, and processes each of them.
  • the detection processing unit 210 processes the first electrooculogram with reference to the third electrooculogram.
  • the first electrooculogram referenced to the third electrooculogram is referred to as V1.
  • the detection processing unit 210 samples V1 at a predetermined cycle and generates time series data of V1.
  • the detection processing unit 210 outputs the generated time series data of V1 to the transmission / reception unit 280.
  • the detection processing unit 210 processes the second electrooculogram with reference to the third electrooculogram.
  • the second electrooculogram referenced to the third electrooculogram is referred to as V2.
  • the detection processing unit 210 samples V2 at a predetermined cycle, and generates time-series data of V2.
  • the detection processing unit 210 outputs the generated V2 time-series data to the transmission / reception unit 280.
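The referencing described above, where V1 and V2 are the first and second electrooculograms measured against the third, can be sketched as follows; the function name and the sample values (in microvolts) are illustrative, not part of the disclosure.

```python
def reference_and_sample(e1, e2, e3):
    """Given simultaneously sampled raw potentials from the first, second,
    and third electrodes, return the referenced series V1 and V2."""
    v1 = [a - ref for a, ref in zip(e1, e3)]  # right-eye potential vs. reference
    v2 = [b - ref for b, ref in zip(e2, e3)]  # left-eye potential vs. reference
    return v1, v2

# Illustrative raw samples, already taken at a fixed cycle:
e1 = [120.0, 125.0, 119.0]
e2 = [118.0, 121.0, 117.0]
e3 = [100.0, 100.0, 100.0]
v1, v2 = reference_and_sample(e1, e2, e3)
```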
  • the acceleration detection unit 270 detects the acceleration of the glasses 100.
  • the acceleration detection unit 270 is, for example, a triaxial acceleration sensor.
  • the acceleration detection unit 270 detects the acceleration of the center of gravity of the glasses 100.
  • the acceleration of the center of gravity of the glasses 100 corresponds to the acceleration of the head of the user 20.
  • the acceleration detected by the acceleration detection unit 270 is input to the detection processing unit 210.
  • the angular velocity detection unit 260 detects the angular velocity of the glasses 100.
  • the angular velocity detection unit 260 is, for example, a triaxial angular velocity sensor.
  • the angular velocity detected by the angular velocity detector 260 is input to the detection processing unit 210.
  • the detection processing unit 210 acquires the acceleration detected by the acceleration detection unit 270 and processes the acquired acceleration.
  • the detection processing unit 210 samples acceleration at a predetermined cycle, and generates time-series acceleration data.
  • the detection processing unit 210 outputs the time series data of the generated acceleration to the transmission / reception unit 280.
  • the acceleration data output to the transmission / reception unit 280 includes time-series data of acceleration in each direction of the three axes.
  • the detection processing unit 210 acquires the angular velocity detected by the angular velocity detection unit 260 and processes the acquired angular velocity.
  • the detection processing unit 210 samples the angular velocity at a predetermined cycle, and generates time-series angular velocity data.
  • the detection processing unit 210 outputs the generated time-series data of the angular velocity to the transmission / reception unit 280.
  • the angular velocity data output to the transmission / reception unit 280 includes time-series data of angular velocities in the directions of the three axes.
  • the transmission / reception unit 280 transmits the V1 time series data, the V2 time series data, the acceleration time series data, and the angular velocity time series data acquired from the detection processing unit 210 to the transmission / reception unit 380 by radio signals. As described above, the transmission / reception unit 280 transmits the electrooculogram information, the acceleration information, and the angular velocity information that are continuously detected to the smartphone 40.
  • the processing in the detection processing unit 210 includes amplification processing for amplifying the input electrooculogram, acceleration, and angular velocity signals, and digitization processing for digitizing those signals.
  • the detection processing unit 210 may include an amplifier circuit that amplifies the analog signals of the input electrooculogram, acceleration, and angular velocity.
  • the detection processing unit 210 may include an analog-to-digital conversion circuit that digitizes the analog signals of the input electrooculogram, acceleration, and angular velocity, or the analog signals amplified by the amplifier circuit.
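The amplify-then-digitize chain can be sketched as a gain stage followed by an ideal analog-to-digital converter; the gain, full-scale voltage, and bit depth below are assumptions for illustration, not values from the disclosure.

```python
def amplify(samples, gain):
    """Amplifier stage: scale the analog signal by a fixed gain."""
    return [s * gain for s in samples]

def digitize(samples, full_scale, bits):
    """Ideal ADC: clamp to +/- full_scale and quantize to signed codes."""
    max_code = 2 ** (bits - 1) - 1
    codes = []
    for s in samples:
        s = max(-full_scale, min(full_scale, s))   # clamp to the input range
        codes.append(round(s / full_scale * max_code))
    return codes

# Example: 1000x gain, 12-bit ADC with a 1.0 V full scale.
analog_v = [20e-6, -15e-6, 250e-6]   # electrooculogram samples in volts
codes = digitize(amplify(analog_v, 1000), 1.0, 12)
```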
  • The control unit 220 controls the detection of the electrooculogram, acceleration, and angular velocity by the detection processing unit 210, and is responsible for connection processing between the detection unit 102 and the processing unit 180. For example, the control unit 220 acquires the identification information stored in the memory 157 from the detection unit 102, and causes the electrooculogram to be detected by the first electrode 151, the second electrode 152, and the third electrode 153 when the identification information acquired from the detection unit 102 matches predetermined identification information. Specifically, the control unit 220 acquires the identification information of the user 20 from the smartphone 40 with which the transmission / reception unit 280 has established a connection, and causes the electrooculogram to be detected by the first electrode 151, the second electrode 152, and the third electrode 153 when the identification information acquired from the memory 157 matches the identification information acquired from the smartphone 40.
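The gating described here, where detection starts only when the identification read from the memory 157 matches the identification received from the smartphone, can be sketched as follows; the class, method names, and identifier strings are hypothetical.

```python
class ControlUnit:
    """Sketch of control unit 220 (hypothetical API): electrode detection
    is enabled only when the ID from memory 157 matches the smartphone's."""

    def __init__(self, memory_id):
        self.memory_id = memory_id   # identification read from memory 157
        self.detecting = False

    def on_smartphone_id(self, phone_id):
        """Handle the identification received over the established connection,
        enabling or disabling detection according to the match."""
        self.detecting = (phone_id == self.memory_id)
        return self.detecting

control = ControlUnit("user-20")
```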
  • the power supply unit 390 includes a battery such as a secondary battery.
  • the power supply unit 390 supplies power to each unit of the smartphone 40 including the processing unit 300, the transmission / reception unit 380, and the UI unit 370.
  • the UI unit 370 provides a user interface (UI) with the user 20.
  • the UI unit 370 includes a touch panel, operation keys, a sound generation device, and the like.
  • the storage unit 360 is realized by a storage medium, for example a volatile storage medium or a nonvolatile storage medium.
  • the storage unit 360 stores various parameters necessary for the operation of the processing unit 300.
  • the storage unit 360 stores various types of information generated by the processing unit 300.
  • the transmission / reception unit 380 has a function of performing wireless communication with the glasses 100.
  • the transmission / reception unit 380 is realized by a communication processor having a short-range wireless communication function such as Bluetooth (registered trademark).
  • the transmission / reception unit 380 and the transmission / reception unit 280 perform wireless communication in accordance with the Bluetooth (registered trademark) standard.
  • communication between the transmission / reception unit 280 and the transmission / reception unit 380 is not limited to Bluetooth (registered trademark) communication.
  • Communication between the transmission / reception unit 280 and the transmission / reception unit 380 can be realized by various types of wireless communication including, for example, a wireless LAN.
  • Communication between the transmission / reception unit 280 and the transmission / reception unit 380 can be realized by various types of wired communication including USB.
  • The transmission / reception unit 380 receives the information indicating the electrooculogram, the acceleration, and the angular velocity transmitted from the glasses 100. Specifically, the transmission / reception unit 380 receives the radio signal transmitted from the transmission / reception unit 280, demodulates the received radio signal, and generates reception data including the time-series data of V1, the time-series data of V2, the time-series data of acceleration, and the time-series data of angular velocity. The transmission / reception unit 380 outputs the generated reception data to the processing unit 300.
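The reception data carrying the four time series could be represented as a simple mapping from which the acquisition units extract their respective streams; the payload layout and keys below are an assumption, not the disclosed format.

```python
def split_received(payload):
    """Split demodulated reception data into the four time series the
    acquisition units extract: V1, V2, acceleration, angular velocity."""
    return (payload["v1"], payload["v2"],
            payload["accel"], payload["gyro"])

# Hypothetical reception data after demodulation:
payload = {
    "v1": [20.0, 25.0],
    "v2": [18.0, 21.0],
    "accel": [(0.0, -9.8, 0.1), (0.0, -9.8, 0.2)],  # three-axis samples
    "gyro": [(0.1, 0.0, 0.0), (0.2, 0.0, 0.0)],     # three-axis samples
}
v1_series, v2_series, accel, gyro = split_received(payload)
```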
  • the processing unit 300 includes an electrooculogram acquisition unit 310, an acceleration acquisition unit 320, an angular velocity acquisition unit 322, and an analysis unit 350.
  • The electrooculogram acquisition unit 310 acquires the electrooculogram of the user 20 detected by the electrooculogram detection unit attached to the user 20. Specifically, the electrooculogram acquisition unit 310 acquires the electrooculogram of the user 20 detected by the first electrode 151, the second electrode 152, and the third electrode 153 provided in the glasses 100 worn by the user 20. More specifically, the electrooculogram acquisition unit 310 acquires the electrooculogram detected by the glasses 100 by extracting the time-series data of V1 and the time-series data of V2 from the reception data output from the transmission / reception unit 380.
  • the acceleration acquisition unit 320 acquires the acceleration of the glasses 100 detected by the glasses 100. Specifically, the acceleration acquisition unit 320 acquires the acceleration detected by the glasses 100 based on the information received by the transmission / reception unit 380. More specifically, the acceleration acquisition unit 320 acquires the acceleration detected by the glasses 100 by extracting the acceleration data from the reception data output from the transmission / reception unit 380.
  • the angular velocity acquisition unit 322 acquires the angular velocity of the glasses 100 detected by the glasses 100. Specifically, the angular velocity acquisition unit 322 acquires the angular velocity detected by the glasses 100 based on the information received by the transmission / reception unit 380. More specifically, the angular velocity acquisition unit 322 acquires the angular velocity detected by the glasses 100 by extracting the angular velocity data from the reception data output from the transmission / reception unit 380.
  • the analysis unit 350 analyzes the electrooculogram acquired by the electrooculogram acquisition unit 310. Specifically, the analysis unit 350 uses the continuous electrooculogram information acquired by the electrooculogram acquisition unit 310 to specify the state of the user 20 at a plurality of timings. For example, the analysis unit 350 specifies the state of the user 20 at a plurality of timings using information on the continuous first and second electrooculograms acquired by the electrooculogram acquisition unit 310. Examples of the state of the user 20 include the user's 20 line-of-sight direction and the blink state.
  • The analysis unit 350 analyzes the identified line-of-sight direction of the user 20 to identify the state of the user 20. The analysis unit 350 also identifies the presence or absence of a blink based on the first ocular potential or the second ocular potential, and analyzes the line-of-sight direction and the blink state of the user 20 together to identify the state of the user 20. For example, the analysis unit 350 determines, as the state of the user 20, whether the user 20 is tired or in pain, and issues a warning to the user 20 through the UI unit 370 when it determines that the user 20 is tired or in pain.
  • The analysis unit 350 analyzes at least one of the acceleration acquired by the acceleration acquisition unit 320 and the angular velocity acquired by the angular velocity acquisition unit 322 to identify the movement of the user 20. For example, the analysis unit 350 identifies the running form of the body of the user 20 based on at least one of the acceleration and the angular velocity.
  • The analysis unit 350 causes the storage unit 360 to store the above-described analysis result of the electrooculogram information and the information indicating the identified movement of the user 20 in association with each other.
  • The UI unit 370 may present, to the user 20, information representing the overall running posture based on the running form and the line-of-sight direction stored in the storage unit 360, using video or the like. For example, the UI unit 370 may present to the user 20 an icon such as an arrow indicating the line-of-sight direction of the user 20 together with an icon representing the running form.
  • the storage unit 360 may store identification information for identifying the user 20 who is the owner of the smartphone 40.
  • the processing unit 300 may transmit the identification information stored in the storage unit 360 to the glasses main body 101 by controlling the transmission / reception unit 380. For example, the processing unit 300 transmits the identification information stored in the storage unit 360 in response to a request from the glasses main body 101.
  • The processing unit 300 may receive the identification information stored in the memory 157 of the detection unit 102 through the glasses main body 101, the transmission/reception unit 280, and the transmission/reception unit 380.
  • the processing unit 300 identifies the user 20 of the detection unit 102 based on the identification information received through the glasses main body 101.
  • the processing unit 300 may authenticate the user 20 when the identification information stored in the storage unit 360 matches the identification information acquired from the glasses main body 101.
  • the processing unit 300 may process the time series data of the electrooculogram, acceleration, and angular velocity received from the transmission / reception unit 380 on condition that the user 20 is authenticated.
  • The processing unit 300 may output the time series data of the electrooculogram, acceleration, and angular velocity received from the transmission/reception unit 380, and the analysis result by the analysis unit 350, to an external server or the like.
  • FIG. 6 schematically shows the positional relationship between the electrooculogram detection electrode of the glasses 100 and the user 20.
  • FIG. 6 shows the contact positions at which the electrooculogram detection electrodes contact the user 20 when the user 20 is wearing the glasses 100.
  • the first contact position 451 represents a position where the first electrode 151 contacts the user 20.
  • the second contact position 452 represents a position where the second electrode 152 contacts the user 20.
  • the third contact position 453 represents a position where the third electrode 153 contacts the user 20.
  • the first contact position 451 and the second contact position 452 are located below the center of the cornea 411 of the right eyeball 401 and the center of the cornea 412 of the left eyeball 402.
  • It is desirable that the first contact position 451 and the second contact position 452 be at positions where the distance between the first contact position 451 and the eyeball 401 and the distance between the second contact position 452 and the eyeball 402 are substantially equal. It is also desirable that the first contact position 451 and the second contact position 452 be separated from each other by at least a certain distance.
  • the third contact position 453 is located above the center of the cornea 411 of the right eyeball 401 and the center of the cornea 412 of the left eyeball 402.
  • The third contact position 453 may be at a position where the distance between the third contact position 453 and the first contact position 451 and the distance between the third contact position 453 and the second contact position 452 are substantially equal.
  • The third contact position 453 may be at a position where the distance between the third contact position 453 and the eyeball 401 is greater than the distance between the eyeball 401 and the first contact position 451, and the distance between the third contact position 453 and the eyeball 402 is greater than the distance between the eyeball 402 and the second contact position 452.
  • In an eyeball, the cornea side is positively charged and the retina side is negatively charged. Therefore, when the line-of-sight direction of the user 20 changes upward, both V1, the potential of the first electrode 151 with respect to the third electrode 153, and V2, the potential of the second electrode 152 with respect to the third electrode 153, decrease.
  • When the line-of-sight direction of the user 20 changes downward, both V1 and V2 rise. When the line-of-sight direction changes in one lateral direction, V1 decreases and V2 increases; when it changes in the other lateral direction, V1 increases and V2 decreases.
  • the analysis unit 350 identifies the direction in which the line-of-sight direction has changed based on the change in V1 and the change in V2.
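The sign pattern just described (upward gaze lowers both V1 and V2, downward gaze raises both, lateral gaze drives them in opposite directions) can be sketched as a simple classifier. This is an illustrative sketch only: the threshold value is arbitrary, and the text does not state which opposite-sign pattern corresponds to left versus right gaze, so the lateral labels below are hypothetical.

```python
def classify_gaze_change(dv1: float, dv2: float, eps: float = 1e-3) -> str:
    """Classify a gaze-direction change from the changes in V1 and V2.

    Sign pattern follows the text: up lowers both potentials, down raises
    both, and lateral movement moves them in opposite directions. The
    mapping of the two lateral patterns to left/right is an assumption.
    """
    if dv1 < -eps and dv2 < -eps:
        return "up"                    # both potentials fell
    if dv1 > eps and dv2 > eps:
        return "down"                  # both potentials rose
    if dv1 < -eps and dv2 > eps:
        return "lateral-A"             # hypothetical label for one side
    if dv1 > eps and dv2 < -eps:
        return "lateral-B"             # hypothetical label for the other side
    return "no significant change"
```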
  • The detection processing unit 210 measures the electrooculograms of the right eye and the left eye by detecting V1 and V2. Therefore, the influence of noise on the electrooculogram can be reduced.
  • The line-of-sight direction of the user 20 is determined by the direction of the eyeball 401 and the direction of the eyeball 402. Since the line-of-sight direction of the user 20 also changes depending on the direction of the head of the user 20, the global line-of-sight direction of the user 20 is determined by the direction of the eyeball 401, the direction of the eyeball 402, and the direction of the head of the user 20. In the description of the present embodiment, the line-of-sight direction determined by the direction of the eyeball 401 and the direction of the eyeball 402 may be simply referred to as the line-of-sight direction.
  • V1 changes according to the position of the cornea 411.
  • the analysis unit 350 specifies an angle change amount that is a change amount of the angle of the eyeball 401 in the rotation direction based on the change amount of V1.
  • the analysis unit 350 identifies the angle of the eyeball 401 after the rotation based on the angle of the eyeball 401 before the change and the specified angle change amount.
  • the analysis unit 350 specifies the orientation of the eyeball 401 based on the temporal change of V1.
  • The analysis unit 350 identifies the rotation angle not only in the xy plane but also in the other planes. For example, the analysis unit 350 identifies the angle of the eyeball 402 by a process similar to the process for the angle of the eyeball 401. Specifically, the analysis unit 350 identifies the direction of the eyeball 402 based on the temporal change of V2.
  • the analysis unit 350 identifies the line-of-sight direction of the user 20 based on the orientation of the eyeball 401 and the orientation of the eyeball 402. For example, the analysis unit 350 may specify a direction in which a vector obtained by combining a vector of the direction of the eyeball 401 and a vector of the direction of the eyeball 402 faces as the line-of-sight direction of the user 20.
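One way to realize the vector combination described above is to sum the two eye-direction vectors and normalize the result. A minimal sketch, assuming each eyeball orientation has already been converted to a 3-D unit vector (a representation the text does not prescribe):

```python
import math

def combined_gaze_direction(right_eye: tuple, left_eye: tuple) -> tuple:
    """Combine the direction vectors of eyeball 401 and eyeball 402 into a
    single line-of-sight vector by summing and normalizing them. The 3-D
    unit-vector representation is an assumption for illustration."""
    s = tuple(a + b for a, b in zip(right_eye, left_eye))
    norm = math.sqrt(sum(c * c for c in s))
    if norm == 0.0:
        raise ValueError("exactly opposite eye vectors have no combined direction")
    return tuple(c / norm for c in s)
```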
  • the orientation of the eyeball 401 and the orientation of the eyeball 402 are an example of an index that represents the visual line direction of the user 20.
  • The line-of-sight direction of the user 20 may be a single line-of-sight direction determined from the direction vector of the eyeball 401, the direction vector of the eyeball 402, and the like. That is, the analysis unit 350 may identify the single line-of-sight direction from V1 and V2. In this case, the analysis unit 350 may identify the single line-of-sight direction by performing a predetermined calculation based on the change amount of V1 and the change amount of V2, without identifying the direction of the eyeball 401 and the direction of the eyeball 402. For example, the analysis unit 350 may identify the single line-of-sight direction by performing a predetermined calculation that associates the change amount of V1 and the change amount of V2 with the change amount of the single line-of-sight direction.
  • the direction of the eyeball 401 can be rephrased as the position of the cornea 411.
  • the direction of the eyeball 402 can be rephrased as the position of the cornea 412.
  • the change in the direction of the eyeball 401 can be rephrased as the eyeball movement of the eyeball 401.
  • a change in the direction of the eyeball 402 can be rephrased as eyeball movement. That is, the analysis unit 350 may specify the eye movement of the user 20 based on the electrooculogram detected by the glasses 100.
  • FIG. 7 is a flowchart showing processing executed in the glasses main body 101.
  • The processing shown in this flowchart is started, for example, when a task responsible for connection processing with the detection unit 102 is started in the processing unit 200, or when the detection unit 102 is attached to the glasses main body 101.
  • In step S702, the control unit 220 controls the transmission/reception unit 280 to establish a connection with the smartphone 40. Subsequently, in step S704, the control unit 220 acquires the identification information of the user 20 from the smartphone 40.
  • In step S706, the control unit 220 accesses the memory 157 of the detection unit 102 and reads the identification information from the memory 157.
  • In step S710, the control unit 220 determines whether the identification information read from the memory 157 matches the identification information acquired from the smartphone 40. When they do not match, the processing of this flowchart ends. When they match, the processing proceeds to step S712.
  • In step S712, the control unit 220 controls the detection processing unit 210 to start detecting the electrooculogram, acceleration, and angular velocity. The control unit 220 also controls the transmission/reception unit 280 to start transmitting the detected time series data of the electrooculogram, acceleration, and angular velocity to the smartphone 40.
  • In step S720, the connection state with the smartphone 40 is determined. While the connection established in step S702 with the smartphone 40 is maintained, the determination in step S720 is repeated. If the connection with the smartphone 40 established in step S702 is disconnected, the processing proceeds to step S730. When a connection is made to another smartphone different from the smartphone 40 of step S702, the processing proceeds to step S722.
  • In step S730, the control unit 220 controls the detection processing unit 210 to stop detecting the electrooculogram, acceleration, and angular velocity. The control unit 220 also controls the transmission/reception unit 280 to stop transmitting each set of time series data to the smartphone 40. The processing of this flowchart then ends.
  • In step S722, the control unit 220 controls the detection processing unit 210 to stop detecting the electrooculogram, acceleration, and angular velocity. The control unit 220 also controls the transmission/reception unit 280 to stop transmitting each set of time series data to the smartphone 40. Subsequently, in step S724, the control unit 220 acquires the identification information of the user 20 from the other smartphone, and the processing returns to step S710.
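The control flow of steps S702 through S730 can be summarized as a small state loop. The sketch below is an illustration only: the `connection` and `detector` objects are hypothetical stand-ins for the transmission/reception unit 280 and the detection processing unit 210, and `wait_for_state_change` abstracts the repeated connection-state determination of step S720 into a single blocking call.

```python
def run_glasses_main(connection, detector, memory_id: str) -> None:
    """State loop mirroring FIG. 7: authenticate, detect, re-authenticate."""
    connection.connect()                      # S702: establish connection
    phone_id = connection.request_user_id()   # S704: get ID from smartphone
    while True:
        # S706/S710: compare the ID in memory 157 with the smartphone's ID
        if memory_id != phone_id:
            return                            # IDs differ: end processing
        detector.start()                      # S712: start detection and transmission
        status = connection.wait_for_state_change()  # S720 (repeats internally)
        detector.stop()                       # S722/S730: stop detection/transmission
        if status == "disconnected":
            return                            # S730: flowchart ends
        phone_id = connection.request_user_id()      # S724: other smartphone's ID
```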
  • Data such as the detected electrooculogram is transmitted to the smartphone 40 only when the identification information acquired from the detection unit 102 matches that of the smartphone 40. Therefore, data security can be increased.
  • In the glasses 100, the processing unit 200 performs the processing for calculating V1 and V2. Alternatively, the analysis unit 350 of the smartphone 40 may perform the processing for calculating V1 and V2.
  • In that case, the processing unit 200 generates time series data of the first ocular potential, the second ocular potential, and the third ocular potential, and the transmission/reception unit 280 may transmit each set of time series data of the first, second, and third ocular potentials to the smartphone 40.
  • the analysis unit 350 uses V1 and V2 to specify the state of the user 20 such as the line-of-sight direction of the user 20.
  • the analysis unit 350 may specify the state of the user 20 using an arbitrary linear combination of V1 and V2.
  • the analysis unit 350 may specify the state of the user 20 using V1 + V2 and V1-V2.
  • The analysis unit 350 may identify the state of the user 20 using the time derivative of V1 and the time derivative of V2.
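The quantities mentioned above (V1 + V2, V1 - V2, and the time derivatives of V1 and V2) are straightforward to derive from sampled series. A minimal sketch, assuming the electrooculograms arrive as parallel lists with timestamps in seconds, a representation chosen for illustration and not specified in the text:

```python
def eog_features(t, v1, v2):
    """Derive V1+V2, V1-V2, and finite-difference time derivatives from
    sampled electrooculogram series. Timestamps t are in seconds; the
    derivative lists are one element shorter than the inputs."""
    s = [a + b for a, b in zip(v1, v2)]                                # V1 + V2
    d = [a - b for a, b in zip(v1, v2)]                                # V1 - V2
    dv1 = [(v1[i + 1] - v1[i]) / (t[i + 1] - t[i]) for i in range(len(t) - 1)]
    dv2 = [(v2[i + 1] - v2[i]) / (t[i + 1] - t[i]) for i in range(len(t) - 1)]
    return s, d, dv1, dv2
```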
  • As the ocular potentials for identifying the line-of-sight direction, the analysis unit 350 may apply, instead of V1 and V2 described above, a first ocular potential and a second ocular potential referenced to a predetermined reference potential such as the potential of the ground electrode 154.
  • the smartphone 40 may have a function of a detection unit that detects acceleration.
  • the smartphone 40 may have a function of a detection unit that detects the angular velocity.
  • The processing described as the operations of the processing unit 300 and the transmission/reception unit 380 in the smartphone 40 is realized by a processor controlling, according to a program, each piece of hardware included in the smartphone 40.
  • That is, the processing of the smartphone 40 described in relation to the present embodiment can be realized by hardware including a processor, a memory, and the like operating in cooperation with a program, with the processor operating according to the program and controlling each piece of hardware. In other words, the processing can be realized by a so-called computer.
  • the computer may load a program for controlling execution of the above-described processing, operate according to the read program, and execute the processing.
  • the computer can load the program from a computer-readable recording medium storing the program.
  • The processing described as the operations of the processing unit 200 and the transmission/reception unit 280 in the glasses 100 can likewise be realized by a so-called computer.
  • the electrooculogram information processing system 10 is an example of an information processing system.
  • the smartphone 40 may process various information without being limited to the electrooculogram.
  • the smartphone 40 is an example of an information processing apparatus that processes information such as electrooculogram detected by the glasses 100.
  • the information processing apparatus may be various electronic devices having a communication function.
  • the information processing apparatus may be a portable electronic device such as a mobile phone, a portable information terminal, a portable music player, etc. possessed by the user 20.
  • the eyeglasses 100 as an example of eyewear can be used for the purpose of correcting the refractive error of the eyes of the user 20, protecting the eyes of the user 20, or dressing up.
  • eyewear is not limited to glasses.
  • the eyewear may be a face wearing device such as sunglasses, goggles, a head mounted display, or a head wearing device.
  • the eyewear may be a frame of a face wearing device or a head wearing device or a part of the frame.
  • Eyewear is an example of a wearing tool that can be worn by a user.
  • the wearing tool is not limited to a wearing tool related to the eye such as eyewear.
  • Various members such as a hat, a helmet, headphones, and a hearing aid can be applied as the wearing tool.
  • Reference signs list: 10 Electrooculogram information processing system; 20 User; 40 Smartphone; 100 Glasses; 101 Glasses main body; 102 Detection unit; 110 Lens; 120 Frame; 122 Rim; 124 Bridge; 125 Connector part; 126 Yoroi; 130 Temple; 132 Modern; 141 Right nose pad; 142 Left nose pad; 150 Holding part; 151 First electrode; 152 Second electrode; 153 Third electrode; 154 Ground electrode; 155 First support member; 156 Second support member; 157 Memory; 160 Electric wire part; 170 Connector part; 171, 172 Electrode part; 180 Processing unit; 190 Power supply unit; 200 Processing part; 210 Detection processing part; 220 Control part; 260 Angular velocity detection unit; 270 Acceleration detection unit; 280 Transmission/reception unit; 290 Substrate unit; 300 Processing unit; 310 Ocular potential acquisition unit; 320 Acceleration acquisition unit; 322 Angular velocity acquisition unit; 350 Analysis unit; 360 Storage unit; 370 UI unit; 380 Transmission/reception unit; 390 Power supply unit; 401, 402 Eyeball; 411, 412 Cornea; 451 First contact position; 452 Second contact position; 453 Third contact position


Abstract

In the present invention, a detection unit can be removably attached to an eyewear main body. The detection unit comprises: a pair of nose pads; a first electrode that is disposed on the surface of a first nose pad of the pair of nose pads and that detects the ocular potential of a wearer; a second electrode that is disposed at a position different from the first nose pad and that detects the ocular potential of the wearer; and a connector unit that attaches the detection unit to the eyewear main body and that outputs the ocular potential detected by the first electrode and by the second electrode to the eyewear main body.

Description

Detection unit, eyewear, and electrooculogram detection system
 The present invention relates to a detection unit, eyewear, and an electrooculogram detection system.
 An apparatus for measuring an electrooculogram or the like is known (see, for example, Patent Documents 1 to 3).
 Patent Document 1: International Publication No. 2010/82496
 Patent Document 2: Japanese Patent Publication No. Sho 61-166515
 Patent Document 3: Japanese Patent Publication No. 2006-525829
 If the electrode for detecting the electrooculogram or the like is not in close contact with the electrooculogram detection target, the electrooculogram may not be measured accurately.
 In a first aspect, a detection unit is attachable to and detachable from an eyewear main body and comprises: a pair of nose pads; a first electrode that is provided on a surface of a first nose pad of the pair of nose pads and detects an electrooculogram of a wearer; a second electrode that is provided at a position different from the first nose pad and detects the electrooculogram of the wearer; and a connector unit that attaches the detection unit to the eyewear main body and outputs the electrooculograms detected by the first electrode and the second electrode to the eyewear main body.
 The second electrode may be provided on a surface of a second nose pad of the pair of nose pads to detect the electrooculogram of the wearer.
 A third electrode that contacts the glabella (the area between the eyebrows) of the wearer and detects the electrooculogram of the wearer may be further provided.
 The pair of nose pads, the third electrode, and the connector unit may be positioned relative to one another according to the shape of the wearer's face.
 The second electrode may contact the glabella of the wearer.
 A nonvolatile storage unit that stores identification information for identifying a user of the detection unit may be further included, and the connector unit may further output the identification information stored in the storage unit to the eyewear main body.
 In a second aspect, eyewear includes the above detection unit and an eyewear main body.
 In a third aspect, eyewear includes the above detection unit and an eyewear main body, and the eyewear main body has a control unit that acquires the identification information stored in the storage unit from the detection unit and, when the identification information acquired from the detection unit matches predetermined identification information, causes the first electrode and the second electrode to detect the electrooculogram.
 In a fourth aspect, an electrooculogram detection system includes the above detection unit and an electrooculogram information processing device that receives the identification information through the eyewear main body and identifies the user of the detection unit based on the identification information received through the eyewear main body.
 Note that the above summary of the invention does not enumerate all the features of the present invention. Sub-combinations of these feature groups can also constitute inventions.
FIG. 1 schematically shows an example of a usage pattern of an electrooculogram information processing system 10 according to an embodiment.
FIG. 2 schematically shows glasses 100 and a smartphone 40.
FIG. 3 schematically shows a detection unit 102.
FIG. 4 is a cross-sectional view schematically showing a connection structure between the detection unit 102 and a bridge 124.
FIG. 5 schematically shows the functional block configuration of the smartphone 40 and the functional block configuration of a processing unit 180.
FIG. 6 schematically shows the positional relationship between the electrooculogram detection electrodes of the glasses 100 and a user 20.
FIG. 7 is a flowchart showing processing executed in a glasses main body 101.
 Hereinafter, the present invention will be described through embodiments, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of features described in the embodiments are essential to the solving means of the invention.
 FIG. 1 schematically shows an example of a usage pattern of an electrooculogram information processing system 10 according to an embodiment. The electrooculogram information processing system 10 includes glasses 100 and a smartphone 40.
 The user 20 is the user of both the glasses 100 and the smartphone 40, and is the wearer of the glasses 100. The smartphone 40 is an example of an information processing apparatus that processes information such as the electrooculogram.
 The glasses 100 are worn on the face of the user 20 and have a function of communicating with the smartphone 40. The glasses 100 detect the electrooculogram of the user 20 via the electrodes that contact the user 20, and transmit the detected electrooculogram information to the smartphone 40. The glasses 100 also detect the acceleration of the glasses 100 and transmit the detected acceleration information to the smartphone 40. The glasses 100 further detect the angular velocity of the glasses 100 and transmit the detected angular velocity information to the smartphone 40.
 The smartphone 40 analyzes the electrooculogram received from the glasses 100, together with at least one of the acceleration and the angular velocity received from the glasses 100. The smartphone 40 provides information to the user 20 based on the analysis results of the electrooculogram, acceleration, angular velocity, and the like.
 For example, the smartphone 40 analyzes the electrooculogram, acceleration, angular velocity, and the like to identify the state of the user 20. Specifically, the smartphone 40 analyzes the electrooculogram to identify the line-of-sight direction, the blink state, and the like of the user 20, and determines the state of the user 20 based on them. For instance, the smartphone 40 determines, based on the line-of-sight direction, the blink state, and the like, whether the user 20 is tired or in pain. When the smartphone 40 determines that the user 20 is tired or in pain, it issues a warning to the user 20, for example by generating a warning sound.
 In the description of the present embodiment, various directions may be specified using the coordinate axes of the orthogonal coordinate system shown in FIG. 1 and elsewhere. The z-axis plus direction is defined as the direction along the front of the user 20, that is, the direction from the face of the user 20 toward the front portion of the glasses 100 worn on the face of the user 20. The y-axis minus direction is defined as the vertical direction, that is, vertically downward. The x-axis, y-axis, and z-axis form a right-handed orthogonal coordinate system. For convenience of explanation, the z-axis plus direction may be referred to as the front direction, the y-axis plus direction as upward, the y-axis minus direction as downward, the x-axis plus direction as leftward, and the x-axis minus direction as rightward.
 FIG. 2 schematically shows the glasses 100 and the smartphone 40. The glasses 100 include a glasses main body 101 and a detection unit 102. The detection unit 102 is attached to the glasses main body 101, and can be attached to and detached from the glasses main body 101.
 The glasses main body 101 includes lenses 110 and a frame 120. The glasses 100 are an example of eyewear. The frame 120 and the detection unit 102 are also an example of eyewear.
 The frame 120 supports the pair of lenses 110. The frame 120 includes rims 122, a bridge 124, yoroi (end pieces) 126, temples 130, moderns (temple tips) 132, a ground electrode 154, a processing unit 180, and a power supply unit 190. The lenses 110, rims 122, yoroi 126, temples 130, and moderns 132 are each provided as a left-right pair. The rims 122, the bridge 124, and the yoroi 126 of the frame 120 are referred to as the front portion of the glasses 100.
 The rims 122 hold the lenses 110. The yoroi 126 are provided on the outer sides of the rims 122 and movably hold the temples 130. The temples 130 press against the upper parts of the ears of the user 20 and grip the pressed parts. The moderns 132 are provided at the tips of the temples 130 and contact the upper parts of the ears of the user 20.
 FIG. 3 schematically shows the detection unit 102. The detection unit 102 includes a holding unit 150, a right nose pad 141, a left nose pad 142, a first electrode 151, a second electrode 152, a third electrode 153, a first support member 155, and a second support member 156. The right nose pad 141 and the left nose pad 142 form a pair of nose pads.
 The right nose pad 141 is supported by the holding unit 150 via the first support member 155, and is substantially positioned with respect to the holding unit 150. The left nose pad 142 is supported by the holding unit 150 via the second support member 156, and is substantially positioned with respect to the holding unit 150.
 保持部150は、メガネ本体101のブリッジ124に装着されて固定される。保持部150がメガネ本体101のブリッジ124に固定されることにより、検出ユニット102がメガネ本体101に固定され、メガネ100として用いられる。 The holding unit 150 is mounted and fixed to the bridge 124 of the glasses main body 101. When the holding unit 150 is fixed to the bridge 124 of the glasses main body 101, the detection unit 102 is fixed to the glasses main body 101 and used as the glasses 100.
FIG. 4 is a cross-sectional view schematically showing the connection structure between the detection unit 102 and the bridge 124. The detection unit 102 is mounted on the bridge 124 and is detachable from it.

The holding unit 150 is provided with a connector portion 170, and the bridge 124 is provided with a connector portion 125. By fitting the connector portion 170 of the holding unit 150 into the connector portion 125 of the bridge 124, the detection unit 102 is mounted on and fixed to the bridge 124. That is, the detection unit 102 is fixed with respect to the glasses body 101.
A memory 157 is provided inside the holding unit 150. The memory 157 stores identification information that identifies the user 20. The memory 157 is an example of a nonvolatile storage unit that stores identification information identifying the user of the detection unit 102.

The connector portion 170 has a plurality of electrode portions 171. Each of the electrode portions 171 is electrically connected to one of the first electrode 151, the second electrode 152, the third electrode 153, and the memory 157.

The connector portion 125 has a plurality of electrode portions 172, provided at positions corresponding to the electrode portions 171. When the connector portion 170 is mated with the connector portion 125, each of the electrode portions 172 contacts, and is electrically connected to, the corresponding one of the electrode portions 171. In this way, the detection unit 102 and the bridge 124 are electrically connected via the connector portions 170 and 125.

The information stored in the memory 157 is transmitted to the processing unit 180 via the electrode portions 171 and 172 and a wiring portion 160 described later. The connector portion 170 thus outputs the identification information stored in the memory 157 to the glasses body 101.
The first electrode 151 is an example of an electrooculogram detection unit that detects an ocular potential. The first electrode 151 is provided on the surface of the right nose pad 141, specifically on the surface that faces the face of the user 20 when the user 20 wears the glasses 100. When the user 20 wears the glasses 100, the first electrode 151 contacts the skin of the user 20, for example the right side of the nose of the user 20. In the present embodiment, the first electrode 151 mainly detects the ocular potential of the right eye of the user 20. The potential detected by the first electrode 151 is transmitted to the processing unit 180 via the electrode portions 171 and 172 and the wiring portion 160 described later.

The second electrode 152 is an example of an electrooculogram detection unit that detects an ocular potential. The second electrode 152 is provided on the surface of the left nose pad 142, specifically on the surface that faces the face of the user 20 when the user 20 wears the glasses 100. When the user 20 wears the glasses 100, the second electrode 152 contacts the skin of the user 20, for example the left side of the nose of the user 20. In the present embodiment, the second electrode 152 mainly detects the ocular potential of the left eye of the user 20. The potential detected by the second electrode 152 is transmitted to the processing unit 180 via the electrode portions 171 and 172 and the wiring portion 160.

The third electrode 153 is an example of an electrooculogram detection unit that detects an ocular potential. The third electrode 153 is provided on the surface of the holding unit 150 opposite to the surface on which the connector portion 170 is provided, that is, the surface that faces the face of the user 20 when the user 20 wears the glasses 100. When the user 20 wears the glasses 100, the third electrode 153 contacts the skin of the user 20, for example the area just above the space between the eyebrows of the user 20. In the present embodiment, the potential detected by the third electrode 153 serves as the measurement reference for measuring the ocular potentials of the right eye and the left eye of the user 20. The potential detected by the third electrode 153 is transmitted to the processing unit 180 via the electrode portions 171 and 172 and the wiring portion 160.
In the glasses body 101, the ground electrode 154 is provided on the surface of a temple tip 132, for example the right temple tip 132, on the surface that faces the user 20 when the glasses 100 are worn. When the user 20 wears the glasses 100, the ground electrode 154 contacts the skin of the user 20, for example the upper part of the right ear of the user 20. In the present embodiment, the potential of the ground electrode 154 provides the ground potential of the electric circuitry of the glasses 100.

The processing unit 180 is provided inside the left temple 130. The ocular potentials of the user 20 detected by the first electrode 151, the second electrode 152, and the third electrode 153 are input to the processing unit 180, which processes the input ocular potentials and transmits the processed potentials to the smartphone 40.

The power supply unit 190 is provided inside the left temple 130 and includes a battery such as a secondary battery. The power supply unit 190 supplies the electric energy stored in its battery to the processing unit 180. Specifically, the power supply unit 190 generates, from the stored electric energy, DC power referenced to the potential of the ground electrode 154, and supplies that DC power to the processing unit 180.

Note that the power supply unit 190 is provided inside the temple 130 on the side on which the ground electrode 154 is provided. The potential of the ground electrode 154 provides the negative-side potential of the DC power supplied from the power supply unit 190 to the processing unit 180. A charging port for charging the power supply unit 190 is formed in the right temple tip 132, and the battery of the power supply unit 190 is charged through this charging port.
As described above, the connector portion 170 attaches the detection unit 102 to the glasses body 101 and outputs the ocular potentials detected by the first electrode 151, the second electrode 152, and the third electrode 153 to the glasses body 101. Note that the detection unit 102 may instead have only two electrodes for detecting ocular potentials: for example, the first electrode 151 and the second electrode 152, or either one of the first electrode 151 and the second electrode 152 together with the third electrode 153. The second electrode 152 is an example of an electrode that is provided at a position different from the right nose pad 141 and detects an ocular potential of the user 20; the third electrode 153 may serve as such an electrode instead.
The right nose pad 141, the left nose pad 142, and the third electrode 153 are positioned relative to one another according to the shape of the face of the user 20. The detection unit 102 can thus be designed to fit the user 20. For example, the detection unit 102 may be adjusted so that the first electrode 151, the second electrode 152, and the third electrode 153 make sufficient contact with the user 20 when the user 20 wears the glasses 100. Specifically, a plurality of parameters of the detection unit 102 may be adjusted, such as the length of the third electrode 153 in the z-axis direction, the surface shape of the third electrode 153, the lengths and shapes of the first support member 155 and the second support member 156, and the mounting angles of the right nose pad 141 and the left nose pad 142.

Multiple preset combinations of these adjusted parameters may be offered, so that the user 20 can select and purchase a detection unit 102 whose parameters suit the shapes of his or her eye area, glabella, and nose. Alternatively, these parameters may be adjusted individually for each user of the detection unit 102; that is, a made-to-order detection unit 102 may be provided to the user 20.

The surface shape of the third electrode 153 preferably follows the surface shape of the glabella of the user 20. The third electrode 153 may have a curved surface that conforms to the surface of the region between the eyebrows of the user 20.

In the glasses 100, the detection unit 102 is detachable from the glasses body 101. A detection unit 102 in which the first electrode 151, the second electrode 152, and the third electrode 153 make proper contact with the user 20 can therefore be provided, enabling more accurate detection of the ocular potentials. In addition, the user 20 can choose a preferred glasses body from among a plurality of glasses bodies that the user owns and use the detection unit 102 attached to the chosen glasses body.
FIG. 5 schematically shows the functional block configurations of the smartphone 40 and the processing unit 180. The processing unit 180 includes a processing unit 200, an angular velocity detection unit 260, an acceleration detection unit 270, a transmission/reception unit 280, and a substrate unit 290. The processing unit 200 includes a detection processing unit 210 and a control unit 220. The smartphone 40 includes a processing unit 300, a storage unit 360, a UI unit 370, a transmission/reception unit 380, and a power supply unit 390.

In the processing unit 180, the processing unit 200, the angular velocity detection unit 260, the acceleration detection unit 270, and the transmission/reception unit 280 are mounted on the substrate unit 290. The processing unit 200 is realized by a processor such as an MPU. Each unit of the smartphone 40 is controlled mainly by the processing unit 300. The transmission/reception unit 280 handles wireless communication with the smartphone 40 and is realized by a communication processor, for example one having a short-range wireless communication function such as Bluetooth (registered trademark).
A wiring portion 160 is provided inside the frame 120. The wiring portion 160 electrically connects the first electrode 151, the second electrode 152, the third electrode 153, the ground electrode 154, the memory 157, and the power supply unit 190 to the processing unit 180. Specifically, the wiring portion 160 includes: a wire that electrically connects the first electrode 151 to the processing unit 180 via the electrode portions 171 and 172 and outputs the ocular potential detected by the first electrode 151 to the processing unit 180; a wire that likewise connects the second electrode 152 to the processing unit 180 and outputs the ocular potential detected by the second electrode 152; a wire that likewise connects the third electrode 153 to the processing unit 180 and outputs the ocular potential detected by the third electrode 153; a wire that connects the memory 157 to the processing unit 180 via the electrode portions 171 and 172 and carries the signals with which the processing unit 180 reads data from and writes data to the memory 157; and a wire that supplies power from the power supply unit 190 to the processing unit 180.
In the glasses 100, the processing unit 200 acquires and processes the ocular potentials of the user 20. Specifically, the detection processing unit 210 acquires and processes the first ocular potential detected by the first electrode 151, the second ocular potential detected by the second electrode 152, and the third ocular potential detected by the third electrode 153.

For example, the detection processing unit 210 processes the first ocular potential referenced to the third ocular potential. In the present embodiment, the first ocular potential referenced to the third ocular potential is referred to as V1. The detection processing unit 210 samples V1 at a predetermined period to generate time-series data of V1, and outputs the generated V1 time-series data to the transmission/reception unit 280.

Likewise, the detection processing unit 210 processes the second ocular potential referenced to the third ocular potential, which is referred to as V2. The detection processing unit 210 samples V2 at a predetermined period to generate time-series data of V2, and outputs the generated V2 time-series data to the transmission/reception unit 280.
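The referencing and sampling just described can be sketched in code. The following is an illustrative model only, not part of the disclosed embodiment: the raw electrode voltage names `e1`, `e2`, and `e3` and the class structure are assumptions, and the actual detection processing unit 210 would perform the equivalent operations in hardware or firmware.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EogSampler:
    """Builds the V1/V2 time series, each referenced to the third electrode."""
    v1: List[float] = field(default_factory=list)
    v2: List[float] = field(default_factory=list)

    def on_sample(self, e1: float, e2: float, e3: float) -> None:
        """Called once per sampling period with the raw electrode voltages."""
        # V1: first-electrode potential referenced to the third electrode
        self.v1.append(e1 - e3)
        # V2: second-electrode potential referenced to the third electrode
        self.v2.append(e2 - e3)

sampler = EogSampler()
sampler.on_sample(e1=0.42, e2=0.40, e3=0.41)
sampler.on_sample(e1=0.45, e2=0.39, e3=0.41)
```

The accumulated `v1` and `v2` lists correspond to the time-series data that the detection processing unit 210 outputs to the transmission/reception unit 280.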
The acceleration detection unit 270 detects the acceleration of the glasses 100 and is, for example, a three-axis acceleration sensor. The acceleration detection unit 270 detects the acceleration of the center of gravity of the glasses 100; when the glasses 100 are worn by the user 20, this acceleration corresponds to the acceleration of the head of the user 20. The detected acceleration is input to the detection processing unit 210.

The angular velocity detection unit 260 detects the angular velocity of the glasses 100 and is, for example, a three-axis angular velocity sensor. The detected angular velocity is input to the detection processing unit 210.

The detection processing unit 210 acquires and processes the acceleration detected by the acceleration detection unit 270. The detection processing unit 210 samples the acceleration at a predetermined period to generate time-series acceleration data and outputs the generated data to the transmission/reception unit 280. The acceleration data output to the transmission/reception unit 280 includes time-series data of the acceleration in each of the three axial directions.

The detection processing unit 210 likewise acquires and processes the angular velocity detected by the angular velocity detection unit 260: it samples the angular velocity at a predetermined period, generates time-series angular velocity data, and outputs the generated data to the transmission/reception unit 280. The angular velocity data output to the transmission/reception unit 280 includes time-series data of the angular velocity about each of the three axes.
The transmission/reception unit 280 transmits the V1 time-series data, the V2 time-series data, the acceleration time-series data, and the angular velocity time-series data acquired from the detection processing unit 210 to the transmission/reception unit 380 as radio signals. In this way, the transmission/reception unit 280 transmits the continuously detected ocular potential, acceleration, and angular velocity information to the smartphone 40.

The processing in the detection processing unit 210 includes amplification of the input ocular potential, acceleration, and angular velocity signals, and digitization of those signals. The detection processing unit 210 may have an amplifier circuit that amplifies the analog signals of the input ocular potentials, acceleration, and angular velocity, and an AD conversion circuit that digitizes either those analog signals or the analog signals amplified by the amplifier circuit.
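The amplify-then-digitize path can be modeled numerically. In the sketch below the gain, reference voltage, and ADC resolution are assumed values chosen purely for illustration; the text specifies neither the circuit parameters nor the converter type.

```python
def digitize(v_in: float, gain: float = 1000.0, vref: float = 3.3, bits: int = 12) -> int:
    """Model of the amplifier plus AD conversion stage.

    gain, vref, and bits are illustrative assumptions (the patent gives no
    values). Returns the ADC output code for the input voltage v_in.
    """
    v_amp = v_in * gain                      # amplifier circuit
    v_clamped = min(max(v_amp, 0.0), vref)   # limit to the ADC input range
    full_scale = (1 << bits) - 1             # 4095 for a 12-bit converter
    return round(v_clamped / vref * full_scale)

code = digitize(0.001)  # a 1 mV ocular potential, amplified to 1 V
```

An input outside the converter range simply saturates at zero or at full scale, mirroring the clipping behavior of a single-supply ADC front end.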
The control unit 220 controls the detection of the ocular potentials, the acceleration, and the angular velocity by the detection processing unit 210. The control unit 220 also handles the connection processing between the detection unit 102 and the processing unit 180. For example, the control unit 220 acquires the identification information stored in the memory 157 from the detection unit 102 and, when the acquired identification information matches predetermined identification information, causes the first electrode 151, the second electrode 152, and the third electrode 153 to detect the ocular potentials. Specifically, the control unit 220 acquires the identification information of the user 20 from the smartphone 40 with which the transmission/reception unit 280 has established a connection, and causes the electrodes to detect the ocular potentials when the identification information acquired from the memory 157 matches the identification information acquired from the smartphone 40.
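The gating performed by the control unit 220, enabling electrode detection only when the identification information from the memory 157 matches the identification information obtained from the connected smartphone 40, can be sketched as follows. The class, the method names, and the string ID format are illustrative assumptions.

```python
class Controller:
    """Minimal model of the ID check performed by the control unit 220."""

    def __init__(self, memory_id: str):
        self.memory_id = memory_id   # identification info read from memory 157
        self.detecting = False       # whether electrodes 151/152/153 are active

    def on_connected(self, smartphone_id: str) -> None:
        # Start ocular potential detection only when the stored ID matches
        # the ID reported by the connected smartphone.
        self.detecting = (self.memory_id == smartphone_id)

ctrl = Controller(memory_id="user-20")
ctrl.on_connected("user-20")  # IDs match, so detection is enabled
```

A mismatch leaves `detecting` false, so a detection unit paired to one user stays inactive when connected to another user's smartphone.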
In the smartphone 40, the power supply unit 390 includes a battery such as a secondary battery and supplies power to each unit of the smartphone 40, including the processing unit 300, the transmission/reception unit 380, and the UI unit 370.

The UI unit 370 provides a user interface (UI) for the user 20 and includes, for example, a touch panel, operation keys, and a sound generation device.

The storage unit 360 is realized by a storage medium; examples include volatile and nonvolatile storage media. The storage unit 360 stores various parameters necessary for the operation of the processing unit 300, as well as various information generated by the processing unit 300.

The transmission/reception unit 380 handles wireless communication with the glasses 100 and is realized by a communication processor having a short-range wireless communication function such as Bluetooth (registered trademark). The transmission/reception unit 380 and the transmission/reception unit 280 communicate wirelessly in accordance with the Bluetooth (registered trademark) standard. Communication between the transmission/reception unit 280 and the transmission/reception unit 380 is not limited to Bluetooth (registered trademark), however; it may be realized by various wireless schemes including, for example, wireless LAN, or by various wired schemes including USB.
The transmission/reception unit 380 receives the information indicating the ocular potentials, the acceleration, and the angular velocity transmitted from the glasses 100. Specifically, the transmission/reception unit 380 receives the radio signal transmitted from the transmission/reception unit 280, demodulates it, and generates received data including the V1 time-series data, the V2 time-series data, the acceleration time-series data, and the angular velocity time-series data. The transmission/reception unit 380 outputs the generated received data to the processing unit 300.
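The received data could be carried in many over-the-air layouts, and the text does not specify one. Purely as an illustration, the sketch below assumes a little-endian float32 record of one V1 value, one V2 value, three acceleration components, and three angular velocity components per sample.

```python
import struct

def parse_payload(payload: bytes, n: int):
    """Unpack n samples from a hypothetical payload layout (an assumption;
    the actual format exchanged between units 280 and 380 is not disclosed).
    Each sample is 8 little-endian float32 values:
    V1, V2, ax, ay, az, gx, gy, gz."""
    values = struct.unpack("<" + "f" * (n * 8), payload)
    v1 = values[0::8]
    v2 = values[1::8]
    accel = [values[i + 2:i + 5] for i in range(0, n * 8, 8)]
    gyro = [values[i + 5:i + 8] for i in range(0, n * 8, 8)]
    return v1, v2, accel, gyro

# Round-trip one sample through the assumed layout
sample = (0.01, -0.02, 0.0, 0.0, 9.8, 0.1, 0.2, 0.3)
v1, v2, accel, gyro = parse_payload(struct.pack("<8f", *sample), n=1)
```

The extracted `v1`, `v2`, `accel`, and `gyro` sequences correspond to the four time series that the processing unit 300 then distributes to its acquisition units.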
The processing unit 300 includes an electrooculogram acquisition unit 310, an acceleration acquisition unit 320, an angular velocity acquisition unit 322, and an analysis unit 350.

The electrooculogram acquisition unit 310 acquires the ocular potentials of the user 20 detected by the electrooculogram detection unit worn by the user 20. Specifically, the electrooculogram acquisition unit 310 acquires the ocular potentials detected by the first electrode 151, the second electrode 152, and the third electrode 153 provided on the glasses 100 worn by the user 20. More specifically, it does so by extracting the V1 time-series data and the V2 time-series data from the received data output from the transmission/reception unit 380.

The acceleration acquisition unit 320 acquires the acceleration of the glasses 100 detected by the glasses 100. Specifically, the acceleration acquisition unit 320 acquires the detected acceleration based on the information received by the transmission/reception unit 380, by extracting the acceleration data from the received data output from the transmission/reception unit 380.

The angular velocity acquisition unit 322 acquires the angular velocity of the glasses 100 detected by the glasses 100. Specifically, the angular velocity acquisition unit 322 acquires the detected angular velocity based on the information received by the transmission/reception unit 380, by extracting the angular velocity data from the received data output from the transmission/reception unit 380.
The analysis unit 350 analyzes the ocular potentials acquired by the electrooculogram acquisition unit 310. Specifically, the analysis unit 350 uses the continuous ocular potential information acquired by the electrooculogram acquisition unit 310, for example the continuous first and second ocular potentials, to identify the state of the user 20 at a plurality of timings. Examples of the state of the user 20 include the gaze direction and the blink state of the user 20.
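One common way to map the two referenced channels to gaze components, used here only as an assumed heuristic and not as a method stated in the text, is a sum/difference decomposition: horizontal eye movement tends to drive V1 and V2 in opposite directions, while vertical movement drives them in the same direction.

```python
def gaze_components(v1: float, v2: float):
    """Decompose the referenced channels into horizontal and vertical
    eye-movement components (a standard EOG heuristic, assumed here)."""
    horizontal = v1 - v2          # opposite-sign response to left/right gaze
    vertical = (v1 + v2) / 2.0    # common-mode response to up/down gaze
    return horizontal, vertical

h, v = gaze_components(v1=0.03, v2=-0.01)
```

Applying this decomposition sample by sample to the V1 and V2 time series yields a gaze-direction trace at each of the plurality of timings mentioned above.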
 解析部350は、特定したユーザ20の視線方向を解析して、ユーザ20の状態を特定する。また、解析部350は、第1眼電位又は第2眼電位に基づいて、瞬目の有無を特定する。また、解析部350は、視線方向及びユーザ20の瞬目状態を解析して、ユーザ20の状態を特定する。例えば、解析部350は、ユーザ20の状態として、ユーザ20に疲れ、痛み等が生じているか否かを判断する。解析部350は、ユーザ20に疲れ、痛み等が生じていると判断した場合、UI部370を通じてユーザ20に警告を発する。 The analysis unit 350 analyzes the identified line-of-sight direction of the user 20 and identifies the state of the user 20. Moreover, the analysis part 350 specifies the presence or absence of a blink based on a 1st ocular potential or a 2nd ocular potential. The analysis unit 350 also analyzes the line-of-sight direction and the blink state of the user 20 to identify the state of the user 20. For example, the analysis unit 350 determines whether the user 20 is tired or painful as the state of the user 20. The analysis unit 350 issues a warning to the user 20 through the UI unit 370 when it is determined that the user 20 is tired or painful.
 The analysis unit 350 also analyzes at least one of the acceleration acquired by the acceleration acquisition unit 320 and the angular velocity acquired by the angular velocity acquisition unit 322 to identify the movement of the user 20. For example, the analysis unit 350 identifies the running form of the body of the user 20 on the basis of at least one of the acceleration and the angular velocity. The analysis result of the electrooculogram information described above and the information indicating the identified movement of the user 20 are stored in the storage unit 360 in association with each other. The UI unit 370 may present to the user 20, by video or the like, information representing the total running posture based on the running form and the line-of-sight direction stored in the storage unit 360. For example, the UI unit 370 may present to the user 20 an icon such as an arrow indicating the line-of-sight direction of the user 20 together with an icon representing the running form.
 The storage unit 360 may store identification information that identifies the user 20 who is the owner of the smartphone 40. The processing unit 300 may control the transmission/reception unit 380 to transmit the identification information stored in the storage unit 360 to the glasses main body 101. For example, the processing unit 300 transmits the identification information stored in the storage unit 360 in response to a request from the glasses main body 101.
 The processing unit 300 may receive the identification information stored in the memory 157 through the glasses main body 101, the transmission/reception unit 280, and the transmission/reception unit 380. The processing unit 300 identifies the user 20 of the detection unit 102 on the basis of the identification information received through the glasses main body 101. For example, the processing unit 300 may authenticate the user 20 when the identification information stored in the storage unit 360 matches the identification information acquired from the glasses main body 101. The processing unit 300 may process the time-series data of the electrooculogram, the acceleration, and the angular velocity received from the transmission/reception unit 380 on condition that the user 20 has been authenticated. The processing unit 300 may also output the time-series data of the electrooculogram, the acceleration, and the angular velocity received from the transmission/reception unit 380 and the analysis result of the analysis unit 350 to an external server or the like on condition that the user 20 has been authenticated.
 FIG. 6 schematically shows the positional relationship between the electrooculogram detection electrodes of the glasses 100 and the user 20. FIG. 6 shows the contact positions at which the electrooculogram detection electrodes contact the user 20 when the user 20 is wearing the glasses 100.
 The first contact position 451 represents the position at which the first electrode 151 contacts the user 20. The second contact position 452 represents the position at which the second electrode 152 contacts the user 20. The third contact position 453 represents the position at which the third electrode 153 contacts the user 20.
 The first contact position 451 and the second contact position 452 are located below the center of the cornea 411 of the right eyeball 401 and the center of the cornea 412 of the left eyeball 402.
 The first contact position 451 and the second contact position 452 are desirably located such that the distance between the first contact position 451 and the eyeball 401 is substantially equal to the distance between the second contact position 452 and the eyeball 402. It is also desirable that the first contact position 451 and the second contact position 452 be separated from each other by at least a predetermined distance.
 The third contact position 453 is located above the center of the cornea 411 of the right eyeball 401 and the center of the cornea 412 of the left eyeball 402. The third contact position 453 may be located such that the distance between the third contact position 453 and the first contact position 451 is substantially equal to the distance between the third contact position 453 and the second contact position 452. The third contact position 453 may be located such that the distance between the third contact position 453 and the eyeball 401 is greater than the distance between the eyeball 401 and the first contact position 451, and such that the distance between the third contact position 453 and the eyeball 402 is greater than the distance between the eyeball 402 and the second contact position 452.
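 The placement conditions above can be sketched as a simple geometric check. This is an illustrative sketch only: the coordinates, the minimum separation, the tolerance, and the function name are assumptions and do not appear in the embodiment.

```python
import math

def placement_satisfied(p1, p2, p3, eye_r, eye_l, min_sep, tol=0.05):
    """Check the electrode-placement conditions described above:
    the first and second contact positions are roughly equidistant
    from their respective eyeballs and sufficiently separated, the
    third contact position is roughly equidistant from the first and
    second contact positions, and the third is farther from each
    eyeball than the corresponding lower contact position.
    All coordinates are (x, y, z) points; `tol` is a relative
    tolerance for "substantially equal" (a hypothetical margin)."""
    d1 = math.dist(p1, eye_r)          # first contact to right eyeball
    d2 = math.dist(p2, eye_l)          # second contact to left eyeball
    d31 = math.dist(p3, p1)            # third contact to first contact
    d32 = math.dist(p3, p2)            # third contact to second contact
    return (abs(d1 - d2) <= tol * max(d1, d2)
            and math.dist(p1, p2) >= min_sep
            and abs(d31 - d32) <= tol * max(d31, d32)
            and math.dist(p3, eye_r) > d1
            and math.dist(p3, eye_l) > d2)
```

 With a symmetric arrangement (both nose-pad electrodes below and equidistant from the eyes, the third electrode above on the midline), the check passes; shrinking the allowed separation below the actual electrode spacing makes it fail.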
 In the eyeball, the cornea side is positively charged and the retina side is negatively charged. Accordingly, when the line-of-sight direction of the user 20 changes upward, V1, which is the potential of the first electrode 151 with reference to the third electrode 153, and V2, which is the potential of the second electrode 152 with reference to the third electrode 153, both fall. When the line-of-sight direction of the user 20 changes downward, V1 and V2 both rise. When the line-of-sight direction of the user 20 changes rightward, V1 falls and V2 rises. When the line-of-sight direction of the user 20 changes leftward, V1 rises and V2 falls. The analysis unit 350 identifies the direction in which the line-of-sight direction has changed on the basis of the change in V1 and the change in V2.
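 The sign relationships above can be sketched as a minimal classifier over the changes in V1 and V2. The function name and the noise threshold are illustrative assumptions, not part of the embodiment.

```python
def classify_gaze_change(dv1, dv2, threshold=0.0):
    """Classify a gaze change from dv1 and dv2, the changes in V1 and
    V2 (each measured with the third electrode 153 as reference),
    following the sign relationships described above.  `threshold` is
    a hypothetical noise margin."""
    if dv1 < -threshold and dv2 < -threshold:
        return "up"      # both potentials fall
    if dv1 > threshold and dv2 > threshold:
        return "down"    # both potentials rise
    if dv1 < -threshold and dv2 > threshold:
        return "right"   # V1 falls, V2 rises
    if dv1 > threshold and dv2 < -threshold:
        return "left"    # V1 rises, V2 falls
    return "unchanged"
```

 For example, a simultaneous fall in both potentials is classified as an upward gaze change, matching the first case described above.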
 The detection processing unit 210 measures the electrooculogram of the right eye and the electrooculogram of the left eye by detecting V1 and V2. The influence of noise superimposed on the electrooculograms can thereby be reduced.
 If the head of the user 20 is assumed to be fixed, the line-of-sight direction of the user 20 is determined by the direction of the eyeball 401 and the direction of the eyeball 402. Since the line-of-sight direction of the user 20 also changes with the direction of the head of the user 20, the global line-of-sight direction of the user 20 is determined by the direction of the eyeball 401, the direction of the eyeball 402, and the direction of the head of the user 20. In the description of the present embodiment, the line-of-sight direction determined by the direction of the eyeball 401 and the direction of the eyeball 402 may be referred to simply as the line-of-sight direction.
 The line-of-sight direction of the user 20 will now be described. When the position of the cornea 411 changes as the eyeball 401 rotates in the xz plane, V1 changes in accordance with the position of the cornea 411. The analysis unit 350 identifies, on the basis of the amount of change in V1, an angle change amount, which is the amount of change in the angle of the eyeball 401 in the rotation direction. The analysis unit 350 identifies the angle of the eyeball 401 after the rotation on the basis of the angle of the eyeball 401 before the change and the identified angle change amount. In this manner, the analysis unit 350 identifies the direction of the eyeball 401 on the basis of the temporal change in V1.
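 The angle-update step above can be sketched as follows. The embodiment does not specify the mapping from potential change to angle change, so the linear sensitivity constant here is a hypothetical calibration value.

```python
def updated_eye_angle(prev_angle_deg, dv1, sensitivity=16.0):
    """Return the eye rotation angle after a change dv1 in V1.

    Assumes a hypothetical linear EOG calibration of `sensitivity`
    potential units per degree of rotation; the angle after rotation
    is the angle before the change plus the angle change amount, as
    described above."""
    angle_change_deg = dv1 / sensitivity  # identified angle change amount
    return prev_angle_deg + angle_change_deg
```

 Integrating successive changes in V1 in this way tracks the direction of the eyeball 401 over time.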
 In the same manner as for the angle in the xz plane, the analysis unit 350 identifies rotation angles in other planes. For example, the analysis unit 350 identifies the angle of the eyeball 402 by processing similar to the processing for the angle of the eyeball 401. Specifically, the analysis unit 350 identifies the direction of the eyeball 402 on the basis of the temporal change in V2.
 The analysis unit 350 identifies the line-of-sight direction of the user 20 on the basis of the direction of the eyeball 401 and the direction of the eyeball 402. For example, the analysis unit 350 may identify, as the line-of-sight direction of the user 20, the direction of a vector obtained by combining a vector of the direction of the eyeball 401 and a vector of the direction of the eyeball 402.
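 One plausible reading of "combining" the two direction vectors is vector addition followed by normalisation; the embodiment does not fix the exact operation, so the following is a sketch under that assumption.

```python
import math

def combined_gaze_vector(v_right, v_left):
    """Combine the direction vectors of the right eyeball (401) and
    left eyeball (402) into a single line-of-sight vector by summing
    the components and normalising the result."""
    summed = [a + b for a, b in zip(v_right, v_left)]
    norm = math.sqrt(sum(c * c for c in summed))
    if norm == 0.0:
        raise ValueError("eye vectors cancel; direction undefined")
    return [c / norm for c in summed]
```

 For converging eyes (each rotated slightly toward the nose), the combined vector points straight ahead, which matches the intuitive line-of-sight direction.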
 The direction of the eyeball 401 and the direction of the eyeball 402 are an example of indices representing the line-of-sight direction of the user 20. The line-of-sight direction of the user 20 may be a single line-of-sight direction determined from the vector of the direction of the eyeball 401, the vector of the direction of the eyeball 402, and the like. That is, the analysis unit 350 may identify the single line-of-sight direction from V1 and V2. In this case, the analysis unit 350 may identify the single line-of-sight direction by performing a predetermined operation based on the amount of change in V1 and the amount of change in V2, without identifying the direction of the eyeball 401 or the direction of the eyeball 402. For example, the analysis unit 350 may identify the single line-of-sight direction by performing a predetermined operation that associates the amount of change in V1 and the amount of change in V2 with the amount of change in the single line-of-sight direction.
 Although the case of identifying the line-of-sight direction has been described here, the direction of the eyeball 401 can be restated as the position of the cornea 411, and the direction of the eyeball 402 can be restated as the position of the cornea 412. A change in the direction of the eyeball 401 can be restated as eyeball movement of the eyeball 401, and a change in the direction of the eyeball 402 can be restated as eyeball movement of the eyeball 402. That is, the analysis unit 350 may identify the eyeball movement of the user 20 on the basis of the electrooculograms detected by the glasses 100.
 FIG. 7 is a flowchart showing processing executed in the glasses main body 101. The processing shown in this flowchart is started when a task responsible for connection processing with the detection unit 102 is launched in the processing unit 200. For example, the processing shown in this flowchart is started when the detection unit 102 is attached to the glasses main body 101.
 In step S702, the control unit 220 controls the transmission/reception unit 280 to establish a connection with the smartphone 40. Subsequently, in step S704, the control unit 220 acquires the identification information of the user 20 from the smartphone 40.
 Subsequently, in step S706, the control unit 220 accesses the memory 157 of the detection unit 102 and reads the identification information from the memory 157.
 Subsequently, in step S710, the control unit 220 determines whether the identification information read from the memory 157 matches the identification information acquired from the smartphone 40. When the identification information read from the memory 157 does not match the identification information acquired from the smartphone 40, the processing of this flowchart ends. When the identification information read from the memory 157 matches the identification information acquired from the smartphone 40, the processing proceeds to step S712.
 In step S712, the control unit 220 controls the detection processing unit 210 to start detection of the electrooculogram, the acceleration, and the angular velocity. The control unit 220 also controls the transmission/reception unit 280 to start transmission of the detected time-series data of the electrooculogram, the acceleration, and the angular velocity to the smartphone 40.
 Subsequently, in step S720, the control unit 220 determines the connection state with the smartphone 40. When the connection with the smartphone 40 established in step S702 is maintained, the determination in step S720 is repeated. When the connection with the smartphone 40 established in step S702 is disconnected, the processing proceeds to step S730. When a connection is made with another smartphone different from the smartphone 40 with which the connection was established in step S702, the processing proceeds to step S722.
 In step S730, the control unit 220 controls the detection processing unit 210 to stop detection of the electrooculogram, the acceleration, and the angular velocity. The control unit 220 also controls the transmission/reception unit 280 to stop transmission of the time-series data to the smartphone 40. The processing of this flowchart then ends.
 In step S722, the control unit 220 controls the detection processing unit 210 to stop detection of the electrooculogram, the acceleration, and the angular velocity. The control unit 220 also controls the transmission/reception unit 280 to stop transmission of the time-series data to the smartphone 40. Subsequently, in step S724, the control unit 220 acquires the identification information of the user 20 from the other smartphone, and the processing proceeds to step S710.
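 The control flow of FIG. 7 can be sketched as a loop over connected smartphones. The `phone` iterator and the `detector` object with `start()`/`stop()` methods are hypothetical stand-ins for the connection handling and the detection processing unit 210; they are not interfaces defined by the embodiment.

```python
def connection_loop(unit_id, phone, detector):
    """Sketch of steps S702 to S730: detection runs only while the
    identification information from the detection unit (`unit_id`)
    matches that acquired from the currently connected smartphone.

    `phone` yields one identification string per connected smartphone
    and None on disconnect; `detector` has start()/stop() methods."""
    for phone_id in phone:
        if phone_id is None:          # S720 -> S730: disconnected
            detector.stop()
            return
        if phone_id != unit_id:       # S710: mismatch ends the flow
            detector.stop()
            return
        detector.start()              # S712: begin detection/transmission
    detector.stop()                   # iterator exhausted: stop detection
```

 A mismatching identification string stops the flow without ever starting detection, mirroring the branch from step S710 to the end of the flowchart.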
 According to the control of the processing unit 200 described above, data such as the detected electrooculogram is transmitted to the smartphone 40 only when the identification information acquired from the detection unit 102 matches the identification information acquired from the smartphone 40. Data security can therefore be increased.
 In the electrooculogram information processing system 10 described above, the processing unit 200 performs the processing of calculating V1 and V2. Alternatively, the analysis unit 350 of the smartphone 40 may perform the processing of calculating V1 and V2. In this case, the processing unit 200 may generate time-series data of each of the first electrooculogram, the second electrooculogram, and the third electrooculogram, and the transmission/reception unit 280 may transmit the time-series data of each of the first electrooculogram, the second electrooculogram, and the third electrooculogram to the smartphone 40.
 In the electrooculogram information processing system 10 described above, the analysis unit 350 identifies the state of the user 20, such as the line-of-sight direction of the user 20, using V1 and V2. Alternatively, the analysis unit 350 may identify the state of the user 20 using any linear combination of V1 and V2. For example, the analysis unit 350 may identify the state of the user 20 using V1+V2 and V1-V2. The analysis unit 350 may also identify the state of the user 20 using the time derivative of V1 and the time derivative of V2. Further, as the electrooculograms for identifying the line-of-sight direction, the analysis unit 350 may apply, instead of V1 and V2 described above, a first electrooculogram with reference to a predetermined reference potential, such as the potential of the ground electrode 154, and a second electrooculogram with reference to the potential of the ground electrode 154.
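 The linear combinations mentioned above can be sketched as a simple decomposition. The component names are descriptive assumptions: given the sign relationships described earlier, V1+V2 responds mainly to vertical eye movement (both potentials fall for an upward gaze) and V1-V2 to horizontal movement.

```python
def eog_components(v1, v2):
    """Decompose V1 and V2 into the linear combinations V1+V2 and
    V1-V2 mentioned above.  The labels "vertical" and "horizontal"
    are interpretive, based on the sign relationships of V1 and V2."""
    vertical = v1 + v2      # both fall on upward gaze, both rise on downward
    horizontal = v1 - v2    # V1 falls / V2 rises on rightward gaze
    return vertical, horizontal
```

 Any other linear combination could be substituted here; this pair is simply the common-mode and differential decomposition of the two potentials.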
 The smartphone 40 may have the function of the detection unit that detects the acceleration. The smartphone 40 may also have the function of the detection unit that detects the angular velocity.
 In the above description, the processing described as the operations of the processing unit 300 and the transmission/reception unit 380 in the smartphone 40 is realized by a processor controlling the hardware of the smartphone 40 in accordance with a program. In this manner, at least part of the processing of the smartphone 40 described in relation to the present embodiment can be realized by the processor operating in accordance with the program to control the hardware, so that the hardware, including the processor and a memory, operates in cooperation with the program. That is, the processing can be realized by a so-called computer. The computer may load a program that controls execution of the processing described above, operate in accordance with the loaded program, and execute the processing. The computer can load the program from a computer-readable recording medium storing the program. Similarly, the processing described as the operations of the processing unit 200 and the transmission/reception unit 280 in the glasses 100 can be realized by a so-called computer.
 The electrooculogram information processing system 10 is an example of an information processing system. The smartphone 40 may process various information, not limited to the electrooculogram. The smartphone 40 is an example of an information processing apparatus that processes information, such as the electrooculogram, detected by the glasses 100. The information processing apparatus may be any of various electronic devices having a communication function. The information processing apparatus may be a portable electronic device possessed by the user 20, such as a mobile phone, a portable information terminal, or a portable music player.
 The glasses 100, as an example of eyewear, can be used for purposes such as correcting the refractive error of the eyes of the user 20, protecting the eyes of the user 20, and dressing up. However, eyewear is not limited to glasses. The eyewear may be a face-worn or head-worn device such as sunglasses, goggles, or a head-mounted display. The eyewear may be the frame of a face-worn or head-worn device, or a part of such a frame. Eyewear is an example of a wearing tool that can be worn by a user. The wearing tool is not limited to eye-related wearing tools such as eyewear. Various members such as a hat, a helmet, headphones, and a hearing aid can be applied as the wearing tool.
 While the present invention has been described using the embodiment, the technical scope of the present invention is not limited to the scope described in the above embodiment. It will be apparent to those skilled in the art that various changes or improvements can be added to the above embodiment. It is apparent from the claims that embodiments to which such changes or improvements are added can also be included in the technical scope of the present invention.
 The execution order of the processes, such as the operations, procedures, steps, and stages in the apparatus, system, program, and method shown in the claims, the specification, and the drawings, can be realized in any order unless the order is explicitly indicated by terms such as "before" or "prior to", and unless the output of a preceding process is used in a subsequent process. Even if the operation flows in the claims, the specification, and the drawings are described using terms such as "first" and "next" for convenience, this does not mean that implementation in this order is essential.
DESCRIPTION OF SYMBOLS
10 Electrooculogram information processing system
100 Glasses
20 User
40 Smartphone
101 Glasses main body
102 Detection unit
110 Lens
120 Frame
122 Rim
124 Bridge
125 Connector portion
126 End piece
130 Temple
132 Temple tip
141 Right nose pad
142 Left nose pad
150 Holding portion
151 First electrode
152 Second electrode
153 Third electrode
155 First support member
156 Second support member
157 Memory
154 Ground electrode
170 Connector portion
171, 172 Electrode portions
160 Electric wire portion
180 Processing unit
190 Power supply unit
200 Processing unit
210 Detection processing unit
220 Control unit
260 Angular velocity detection unit
270 Acceleration detection unit
280 Transmission/reception unit
290 Substrate portion
300 Processing unit
310 Electrooculogram acquisition unit
320 Acceleration acquisition unit
322 Angular velocity acquisition unit
350 Analysis unit
360 Storage unit
370 UI unit
380 Transmission/reception unit
390 Power supply unit
401, 402 Eyeball
411, 412 Cornea
451 First contact position
452 Second contact position
453 Third contact position

Claims (9)

  1.  A detection unit attachable to and detachable from an eyewear body, the detection unit comprising:
     a pair of nose pads;
     a first electrode provided on a surface of a first nose pad of the pair of nose pads, the first electrode detecting an electrooculogram of a wearer;
     a second electrode provided at a position different from the first nose pad, the second electrode detecting an electrooculogram of the wearer; and
     a connector portion that attaches the detection unit to the eyewear body and outputs the electrooculograms detected by the first electrode and the second electrode to the eyewear body.
  2.  The detection unit according to claim 1, wherein the second electrode is provided on a surface of a second nose pad of the pair of nose pads and detects the electrooculogram of the wearer.
  3.  The detection unit according to claim 2, further comprising a third electrode that contacts a portion between the eyebrows of the wearer and detects the electrooculogram of the wearer.
  4.  The detection unit according to claim 3, wherein the pair of nose pads, the third electrode, and the connector portion are positioned relative to one another in accordance with the shape of the face of the wearer.
  5.  The detection unit according to claim 1, wherein the second electrode contacts a portion between the eyebrows of the wearer.
  6.  The detection unit according to any one of claims 1 to 5, further comprising a non-volatile storage unit that stores identification information identifying a user of the detection unit, wherein the connector portion further outputs the identification information stored in the storage unit to the eyewear body.
  7.  Eyewear comprising:
     the detection unit according to any one of claims 1 to 6; and
     the eyewear body.
  8.  Eyewear comprising:
     the detection unit according to claim 6; and
     the eyewear body,
     wherein the eyewear body has a control unit that acquires the identification information stored in the storage unit from the detection unit and, when the identification information acquired from the detection unit matches predetermined identification information, causes the electrooculogram to be detected by the first electrode and the second electrode.
  9.  An electrooculogram detection system comprising:
     the detection unit according to claim 6; and
     an electrooculogram information processing apparatus that receives the identification information through the eyewear body and identifies the user of the detection unit on the basis of the identification information received through the eyewear body.
PCT/JP2015/061369 2014-04-14 2015-04-13 Detection unit, eyewear, and ocular potential detection system WO2015159851A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-082755 2014-04-14
JP2014082755A JP2015202185A (en) 2014-04-14 2014-04-14 Detection unit, eyewear, and eye potential detection system

Publications (1)

Publication Number Publication Date
WO2015159851A1 true WO2015159851A1 (en) 2015-10-22

Family

ID=54324057

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/061369 WO2015159851A1 (en) 2014-04-14 2015-04-13 Detection unit, eyewear, and ocular potential detection system

Country Status (2)

Country Link
JP (1) JP2015202185A (en)
WO (1) WO2015159851A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110441912A (en) * 2019-08-27 2019-11-12 高维度(深圳)生物信息智能应用有限公司 A kind of Portable glasses intelligent wearable device and its control method
GB2574580A (en) * 2018-05-25 2019-12-18 Uea Enterprises Ltd Portable wearable eye movement monitoring system, device and monitoring method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3061312B1 (en) * 2016-12-22 2021-04-30 Microoled GLASSES WITH OPTICAL DISPLAY SYSTEM
JP7236852B2 (en) 2018-12-03 2023-03-10 株式会社ジンズホールディングス nose pads and eyewear

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008264551A (en) * 2007-04-18 2008-11-06 National Yang Ming Univ Sunglass type sleep detective and preventive device
JP2011125693A (en) * 2009-11-18 2011-06-30 Panasonic Corp Device for estimation of eye potential, method for calculation of eye potential, visual axis detector, wearable camera, head-mounted display, and electronic glasses
US20130172759A1 (en) * 2011-08-08 2013-07-04 Richard J. Melker Systems And Methods For Using Photoplethysmography In The Administration Of Narcotic Reversal Agents

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2574580A (en) * 2018-05-25 2019-12-18 Uea Enterprises Ltd Portable wearable eye movement monitoring system, device and monitoring method
GB2574580B (en) * 2018-05-25 2023-03-29 Uea Enterprises Ltd Portable wearable eye movement monitoring system, device and monitoring method
CN110441912A (en) * 2019-08-27 2019-11-12 高维度(深圳)生物信息智能应用有限公司 A kind of Portable glasses intelligent wearable device and its control method
CN110441912B (en) * 2019-08-27 2024-04-19 高维度(深圳)生物信息智能应用有限公司 Portable glasses intelligent wearing equipment and control method thereof

Also Published As

Publication number Publication date
JP2015202185A (en) 2015-11-16

Similar Documents

Publication Publication Date Title
US20160070122A1 (en) Computerized replacement temple for standard eyewear
WO2015159862A1 (en) Eyewear
US20170255029A1 (en) Systems and methods for charging eyewear
WO2015159851A1 (en) Detection unit, eyewear, and ocular potential detection system
US11029754B2 (en) Calibration method, portable device, and computer-readable storage medium
WO2015159861A1 (en) Detection control device, mounting fixture, ocular potential information processing system and program
WO2015159853A1 (en) Ocular potential information processing device, ocular potential information processing system, mounting fixture and program
US20240134449A1 (en) Eye detection methods and devices
CN110366388B (en) Information processing method, information processing apparatus, and computer-readable storage medium
WO2015159850A1 (en) Ocular potential information processing device, ocular potential information processing system, mounting fixture and program
JP2019195591A (en) Biological information detection device, eyewear, and eyewear system
JP6687639B2 (en) Information processing method, information processing device, program, and eyewear
JP2015202198A (en) Eyewear
EP3112927A1 (en) A vision monitoring module fixed on a spectacle frame
JP6266417B2 (en) Information processing apparatus, information processing system, and program
WO2016072395A1 (en) Program, information processing device, and eyewear
JP7170258B2 (en) Electro-oculogram data processing device compatible with wearable device, glasses-type wearable device provided with the same, and electro-oculography data processing method compatible with wearable device
WO2016076268A1 (en) Program, information processing device, and eyewear
WO2015159858A1 (en) Eyewear
JP5919323B2 (en) Blink detector and glasses-type electronic device
US20180344196A1 (en) Information processing method, information processing device, program, and eyewear
JP6557582B2 (en) Human body potential detection apparatus, glasses-type electronic device, human body potential detection method, and program
JP2015202188A (en) Eyewear

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15780461

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15780461

Country of ref document: EP

Kind code of ref document: A1