GB2548885A - Collision detection - Google Patents
- Publication number
- GB2548885A (application GB1605486.8A / GB201605486A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- smart glasses
- communication
- signals
- wearer
- accelerometer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7465—Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/0167—Emergency system, e.g. to prevent injuries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
Abstract
A pair of smart glasses 2 is disclosed comprising an accelerometer 6 and a processor 10. The processor is configured to process signals from the accelerometer to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision. The communication may be by a mobile telephone (70, figure 2) or similar notification apparatus (40, figure 2) and may communicate to the emergency services that a head collision has occurred. Gyroscope 8, humidity sensor 16 and temperature sensor 14 may also be included to provide further information about the collision and camera 20 may send relevant image data. Position determining unit 18 may provide GPS data to help locate the user.
Description
COLLISION DETECTION
Field of the Invention
The present invention relates to the detection of a collision involving a person’s head.
Background
Head injuries can be particularly dangerous. The time between a person incurring a head injury and receiving treatment can affect both the recovery time and the severity of any long-term effects. However, a head injury can cause unconsciousness or shock, leading to an inability of the person suffering the head injury to call emergency services.
The present invention has been made with this in mind.
Summary
According to the present invention, there is provided a pair of smart glasses, comprising an accelerometer and a processor. The processor is configured to process signals from the accelerometer to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
The present invention also provides a system, comprising a pair of smart glasses having an accelerometer located therein and a mobile telephone. The mobile telephone is configured to receive signals from the accelerometer transmitted by the smart glasses, to process the signals to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
The present invention also provides a system, comprising a pair of smart glasses having an accelerometer located therein and a notification apparatus. The notification apparatus is configured to receive signals from the accelerometer transmitted by the smart glasses, to process the signals to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
The present invention further provides a method comprising receiving signals from an accelerometer located within a pair of smart glasses, and processing the received signals from the accelerometer to detect a collision involving the wearer of the smart glasses.
Brief Description of the Drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which like reference numbers designate like parts, and in which:
FIG. 1 illustrates a pair of smart glasses in accordance with an embodiment of the invention and schematically shows the functional components of the smart glasses;
FIG. 2 schematically illustrates communication in the event that a collision involving the head of the wearer of the smart glasses is detected in an embodiment;
FIG. 3 shows processing operations performed in an embodiment to detect a collision involving the head of the wearer of the smart glasses;
FIG. 4 shows processing operations performed in an embodiment to detect a sudden temperature change in the environment of the wearer of the smart glasses;
FIG. 5 shows processing operations performed in an embodiment to detect a sudden humidity change in the environment of the wearer of the smart glasses;
FIG. 6 shows processing operations performed in an embodiment to determine if the wearer of the smart glasses is outside an allowed area;
FIG. 7 schematically illustrates the components, and the communication therebetween, of a system according to a second embodiment;
FIG. 8 shows processing operations performed in the second embodiment;
FIG. 9 shows processing operations performed in a third embodiment.
First Embodiment
Figure 1 illustrates a pair of smart glasses 2 in accordance with an embodiment of the present invention. As is well known, smart glasses are glasses that are configured to add information to what the wearer sees for augmented or mixed reality. In the present embodiment, this is achieved by displaying images on head-mounted display 4 in the wearer’s field of view, although it will be appreciated that other types of display can be used instead, such as a heads-up display through which the wearer can see.
In this embodiment, the smart glasses 2 include an accelerometer 6, a processor 10 and a communication module 12. Optionally, the smart glasses 2 may further comprise one or more of a gyroscope 8, a temperature sensor 14, a humidity sensor 16, a position-determination unit 18 and a camera 20. The accelerometer 6, gyroscope 8, processor 10, communication module 12, temperature sensor 14, humidity sensor 16, position-determination unit 18 and camera 20 are contained within one or more housings forming part of, or carried by, the frame of the smart glasses 2.
The accelerometer 6, gyroscope 8, communication module 12, temperature sensor 14, humidity sensor 16, position-determination unit 18 and camera 20 are operatively connected to processor 10 for communication therewith. In Figure 1, this connection is shown in the form of a bus 22, although other forms of connection can be used instead; for example, each component may be connected to processor 10 via an individual connection such as a wire or cable.
Accelerometer 6 is configured to generate signals defining the acceleration and deceleration of the smart glasses 2, and hence the acceleration and deceleration of the head of the wearer of the smart glasses.
Gyroscope 8 is configured to generate signals defining the orientation of the smart glasses 2 and hence the orientation of the head of the wearer of the smart glasses.
Communication module 12 is configured to transmit communication signals to, and optionally receive communication signals from, an external network or external device, as explained in more detail below.
Temperature sensor 14 is configured to generate signals defining the temperature of the environment in which the smart glasses 2 are located.
Humidity sensor 16 is configured to generate signals defining the humidity of the environment in which the smart glasses 2 are located.
Position-determination unit 18 is configured to determine the position of the smart glasses 2, with the determined position being defined, for example, using latitude and longitude coordinates or in any other suitable way. By way of non-limiting example, position-determination unit 18 may be a Global Positioning System (GPS) unit or a unit for determining position using assisted GPS. The position-determination unit 18 is illustrated in Figure 1 as a separate unit, but it may rely on processor 10 to perform some, or all, of the position calculations.
Camera 20 is configured and positioned to capture image data showing at least part of the view of the wearer of the smart glasses 2.
As will be described in more detail below, processor 10 is configured to process signals from the accelerometer 6 and, optionally, the gyroscope 8 to detect a collision involving the head of the wearer of the smart glasses 2. More particularly, although processor 10 can process signals from only accelerometer 6 to detect a collision, processing signals from gyroscope 8 as well may improve the accuracy of detection. For example, processor 10 may be configured to process signals from gyroscope 8 to detect if the smart glasses 2 were being worn by the wearer prior to the collision or if the smart glasses 2 had fallen off the wearer prior to the collision. This may help to reduce false alarms. Upon detection of a collision involving the head of the wearer, processor 10 is configured to initiate a communication via communication module 12. Such a communication can raise an alert that the wearer of the smart glasses 2 has undergone a head collision. Accordingly, an automatic safety mechanism can be provided, as prompt assistance can be summoned for the wearer of the smart glasses 2, for example from emergency services. The communication initiated by the processor 10 upon detection of a collision may include image data captured by camera 20 and/or position data defining the position of the smart glasses 2 (and hence their wearer) determined by position-determination unit 18 to assist in the detection of false alarms and in guiding assistance to the wearer, respectively.
More particularly, referring to Figure 2, when a wearer 30 of the smart glasses 2 suffers a head collision, the signals from the accelerometer 6 and, optionally, the gyroscope 8 are processed by processor 10 to detect the collision, and the processor 10 initiates a communication to at least one remote apparatus 40, 50. The communication may be effected by communication module 12 communicating with a mobile telephone network 60. Alternatively, the communication could be effected by communication module 12 communicating with a mobile telephone 70 carried by the wearer 30, for example via Bluetooth, and the mobile telephone 70 communicating with the mobile telephone network 60. It will, of course, be appreciated that mobile telephone network 60 may comprise one or more sub-networks, routers, switching components, etc. but is referred to herein merely as a mobile telephone network for convenience.
The remote apparatus 40, 50 to which smart glasses 2 send the communication upon detection of a head collision may comprise a notification apparatus 40 and/or an emergency services communication apparatus 50. Notification apparatus 40 may be located within a monitoring centre (not shown) arranged to receive communications from a plurality of different pairs of smart glasses 2. The notification apparatus 40 or monitoring centre may, upon receipt of a communication notifying a head collision of wearer 30, attempt to communicate with the wearer 30 via the smart glasses 2 or the mobile telephone 70 to ascertain whether the wearer 30 requires assistance. If no reply is received from the wearer 30 within a predetermined time, then the notification apparatus 40 or the monitoring centre may contact the emergency services to summon assistance for the wearer 30. In this regard, if position data was provided as part of the communication notifying the head collision, then the notification apparatus 40 or the monitoring centre may provide the emergency services with the location of the wearer 30.
If the initial communication notifying the head collision included image data from camera 20, the notification apparatus 40 or the monitoring centre may process the image data to determine if the head collision notification is a false alarm. For example, the image data may be processed to determine whether the view shown in the image data is a view expected if the wearer 30 is going about his or her business normally or whether the image data suggests that the wearer is unconscious or in shock and requires immediate assistance, for example if the image data shows a static view and/or a view of the ground.
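One simple way such false-alarm screening might be realised is to compare successive camera frames: an almost unchanged view suggests the wearer is motionless. The following sketch is illustrative only; the frame representation and threshold are assumptions, not details given in the patent.

```python
def is_static_view(frame_a, frame_b, threshold=2.0):
    """Return True if two greyscale frames (flat lists of pixel
    intensities, 0-255) are nearly identical, suggesting that the
    camera, and hence the wearer, is not moving.

    The threshold is an illustrative assumption; a deployed system
    would tune it against real footage.
    """
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have the same size")
    mean_abs_diff = sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)
    return mean_abs_diff < threshold
```

A monitoring centre could run this over the last few seconds of received frames and escalate to the emergency services only when the view remains static.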
It should be noted that the notification apparatus 40 could simply be a mobile telephone, or land-line telephone, of a friend or colleague. On receipt of a head collision notification, the friend or colleague could then go to the assistance of the wearer 30 and/or summon assistance from the emergency services.
In addition to initiating a communication upon detection of a collision, the smart glasses 2 may optionally also initiate a communication upon detection of one or more other non-standard, potentially dangerous events. By way of a first example, processor 10 may optionally be configured to process signals from temperature sensor 14 to detect a sudden temperature change in the environment of the wearer 30 (that may, for example, be indicative of a fire and/or explosion in the vicinity of the wearer 30) and to initiate a communication upon detection of a sudden temperature change. The communication may be transmitted to notification apparatus 40, emergency services communication apparatus 50, a different communication apparatus (not shown) and/or a different emergency services communication apparatus (not shown). As before, the communication initiated by the processor 10 may be sent to the mobile telephone network 60 directly by communication module 12 or via the wearer’s mobile telephone 70. Furthermore, as before, the communication that is sent upon detection of a sudden temperature change may include position data defining the position of the smart glasses 2 determined by the position-determination unit 18 and/or image data from camera 20. In addition, the communication may include temperature information.
By way of second example, the smart glasses 2 may optionally be configured to initiate a communication upon detection of a sudden change in humidity (that may, for example, be indicative of the wearer 30 falling into water). More particularly, processor 10 may be configured to process signals from humidity sensor 16 to detect a sudden humidity change in the environment of the wearer 30 and to initiate a communication upon detection of a sudden humidity change. The communication may be transmitted to notification apparatus 40, emergency services communication apparatus 50, a different communication apparatus (not shown) and/or a different emergency services communication apparatus (not shown). As before, the communication initiated by the processor 10 may be sent to the mobile telephone network 60 directly by communication module 12 or via the wearer’s mobile telephone 70. Furthermore, as before, the communication that is sent upon detection of a sudden humidity change may include position data defining the position of the smart glasses determined by the position-determination unit 18 and/or image data from camera 20. In addition, the communication may include humidity information.
By way of a third example, the smart glasses 2 may optionally be configured to initiate a communication upon determining that the wearer 30 of the smart glasses 2 is outside an allowed area. More particularly, processor 10 may be configured to process the position of the smart glasses 2 determined by position-determination unit 18 to determine whether the wearer 30 is within an allowed area and to initiate a communication upon determining that the wearer is outside the allowed area. Processor 10 may be configured to determine whether the wearer is within a single allowed area or within one of a plurality of allowed areas. Each allowed area may represent, for example, an area in which it has been determined that the wearer 30 will be safe or an area to which the wearer 30 is permitted access. The communication may be transmitted to notification apparatus 40, emergency services communication apparatus 50, a different communication apparatus (not shown) and/or a different emergency services communication apparatus (not shown). As before, the communication initiated by the processor 10 may be sent to the telephone network 60 directly by communication module 12 or via the wearer's mobile telephone 70. Furthermore, as before, the communication that is sent upon determining that the wearer is not within an allowed area may include position data defining the position of the smart glasses determined by the position-determination unit 18 and/or image data from camera 20.
The processing operations performed by the smart glasses 2 in the present embodiment will now be described.
Referring to Figure 3, at step S3-2, processor 10 receives signals from accelerometer 6. Optionally, processor 10 may also receive signals from gyroscope 8 in this step.
At step S3-4, processor 10 processes the received signals to determine if a collision involving the smart glasses 2 (and hence the head of the wearer 30) has occurred. As will be understood by the skilled person, the processing at step S3-4 can be performed in a number of different ways. For example, collision detection processing is performed in some vehicles, and the same or similar processing could be performed at step S3-4. By way of further example, collision detection processing is performed in some mobile telephones for senior citizens, and the same or similar processing could be performed at step S3-4. If signals are received from gyroscope 8 in step S3-2, processor 10 may process the signals at step S3-4 to detect if the smart glasses 2 are being worn by the wearer 30 or if the smart glasses have fallen off the wearer, thereby helping to reduce false alarms.
If it is determined at step S3-6 that no collision involving the head of the wearer 30 has been detected, then processing returns to step S3-2. The processing at steps S3-2 to S3-6 is repeated until it is determined at step S3-6 that a collision has been detected.
Upon detection of a collision, processing proceeds to step S3-8, at which processor 10 initiates a communication. As explained above, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with mobile telephone network 60 or, alternatively, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with a mobile telephone 70 carried by the wearer 30. As also noted above, processor 10 may initiate the communication to include data defining the position of the smart glasses 2 determined by position-determination unit 18 and/or image data captured by camera 20.
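The detection loop of Figure 3 could, for example, flag a collision when the magnitude of the measured acceleration exceeds an impact threshold. The patent leaves step S3-4 open, so the threshold and units below are illustrative assumptions rather than the claimed method.

```python
import math

# Illustrative values -- the patent does not specify thresholds.
GRAVITY = 9.81          # m/s^2
IMPACT_THRESHOLD = 6.0  # flag a collision above roughly 6 g

def detect_collision(samples):
    """Scan accelerometer samples (ax, ay, az) in m/s^2 and return
    True if any sample's magnitude exceeds the impact threshold,
    as one simple realisation of step S3-4."""
    for ax, ay, az in samples:
        g_force = math.sqrt(ax * ax + ay * ay + az * az) / GRAVITY
        if g_force > IMPACT_THRESHOLD:
            return True
    return False
```

A real implementation would likely also examine the gyroscope signals, as the description suggests, to check that the glasses were being worn at the moment of impact.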
Figure 4 shows processing operations that may be performed by processor 10 if a temperature sensor 14 is included in the smart glasses 2.
Referring to Figure 4, at step S4-2, processor 10 receives signals from temperature sensor 14.
At step S4-4, processor 10 processes the received signals to determine if there has been a sudden temperature change in the environment of the wearer 30. More particularly, processor 10 processes the received signals to determine if the temperature has changed by more than a predetermined amount within a predetermined time.
If it is determined at step S4-6 that there has not been a sudden temperature change, then processing returns to step S4-2. The processing at steps S4-2 to S4-6 is repeated until it is determined at step S4-6 that there has been a sudden temperature change (that is, that the temperature has changed by more than a predetermined amount in a predetermined time).
Upon detection of a sudden temperature change, the processing proceeds to step S4-8, at which processor 10 initiates a communication. As explained above, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with mobile telephone network 60 or, alternatively, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with a mobile telephone 70 carried by the wearer 30. As also noted above, processor 10 may initiate the communication to include data defining the position of the smart glasses 2 determined by position-determination unit 18 and/or image data captured by camera 20. In addition, the communication may include temperature information.
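The test at step S4-4, a change of more than a predetermined amount within a predetermined time, can be sketched as a sliding-window detector over timestamped readings. The specific thresholds are illustrative assumptions; the same detector applies unchanged to the humidity signals of Figure 5.

```python
from collections import deque

class SuddenChangeDetector:
    """Flag when a reading differs by more than `max_delta` from any
    reading seen within the last `window_s` seconds, as one possible
    realisation of steps S4-4 and S5-4."""

    def __init__(self, max_delta, window_s):
        self.max_delta = max_delta
        self.window_s = window_s
        self._history = deque()  # (timestamp, value) pairs

    def update(self, timestamp, value):
        # Drop readings that have aged out of the window.
        while self._history and timestamp - self._history[0][0] > self.window_s:
            self._history.popleft()
        triggered = any(abs(value - v) > self.max_delta
                        for _, v in self._history)
        self._history.append((timestamp, value))
        return triggered
```

For temperature, `max_delta` might be a few tens of degrees over a few seconds (suggestive of fire); for humidity, a jump to near saturation (suggestive of immersion). Both values would need tuning in practice.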
Figure 5 shows processing operations that may be performed by processor 10 if a humidity sensor 16 is included in the smart glasses 2.
Referring to Figure 5, at step S5-2, processor 10 receives signals from humidity sensor 16.
At step S5-4, processor 10 processes the received signals to determine if there has been a sudden humidity change in the environment of the wearer 30. More particularly, processor 10 processes the received signals to determine if the humidity has changed by more than a predetermined amount within a predetermined time.
If it is determined at step S5-6 that there has not been a sudden humidity change, then processing returns to step S5-2. The processing at steps S5-2 to S5-6 is repeated until it is determined at step S5-6 that there has been a sudden humidity change (that is, that the humidity has changed by more than a predetermined amount in a predetermined time).
Upon detection of a sudden humidity change, the processing proceeds to step S5-8, at which processor 10 initiates a communication. As explained above, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with mobile telephone network 60 or, alternatively, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with a mobile telephone 70 carried by the wearer 30. As also noted above, processor 10 may initiate the communication to include data defining the position of the smart glasses 2 determined by position-determination unit 18 and/or image data captured by camera 20. In addition, the communication may include humidity information.
Figure 6 shows further optional processing operations performed by processor 10 if a position-determination unit 18 is present in smart glasses 2.
Referring to Figure 6, at step S6-2, processor 10 receives position data from position-determination unit 18 defining the current position of the smart glasses 2 (and hence the current position of the wearer 30).
At step S6-4, processor 10 processes the received signals to determine if the current position is within an allowed area. The processing at step S6-4 may determine whether the current position is within a single allowed area or within one of a plurality of allowed areas. Each allowed area may represent, for example, an area in which it has been determined that the wearer 30 will be safe or an area to which the wearer 30 is permitted access.
If it is determined at step S6-6 that the current position is within an allowed area, then processing returns to step S6-2. The processing at steps S6-2 to S6-6 is repeated until it is determined at step S6-6 that the current position is not within an allowed area.
Upon determination that the current position is not within an allowed area, processing proceeds to step S6-8 at which processor 10 initiates a communication. As explained before, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with mobile telephone network 60 or, alternatively, processor 10 may initiate the communication so that it is effected by communication module 12 communicating with a mobile telephone 70 carried by the wearer 30. As also noted above, processor 10 may initiate the communication to include data defining the position of the smart glasses 2 determined by position-determination unit 18 and/or image data captured by camera 20.
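The check at steps S6-4 and S6-6 amounts to a geofence test. As a minimal sketch, each allowed area can be modelled as an axis-aligned latitude/longitude rectangle; the patent does not prescribe a shape, so the rectangle (and the coordinates in the usage note) are illustrative simplifications of what could equally be polygons.

```python
def outside_allowed_areas(position, allowed_areas):
    """Return True if `position` (lat, lon) lies outside every
    allowed area, where each area is given as a bounding box
    (min_lat, min_lon, max_lat, max_lon). One simple realisation
    of steps S6-4 and S6-6."""
    lat, lon = position
    for min_lat, min_lon, max_lat, max_lon in allowed_areas:
        if min_lat <= lat <= max_lat and min_lon <= lon <= max_lon:
            return False  # inside at least one allowed area
    return True
```

Processor 10 (or, in the later embodiments, the mobile telephone or notification apparatus) would call this with each fresh fix from position-determination unit 18 and initiate the step S6-8 communication when it returns True.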
Second Embodiment
A second embodiment of the present invention will now be described.
As explained above, in the first embodiment, processor 10 in smart glasses 2 is configured to process signals from the accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 as described above with reference to Figures 3 - 6. In the second embodiment, the components of the smart glasses 2 are the same as those in the first embodiment, except that the processor 10 is configured to instruct communication module 12 to communicate the signals from accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 to a mobile telephone 100. Mobile telephone 100 is configured, for example by providing software in the form of an app, to process the signals from accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 in the same way as shown in Figures 3-6.
Accordingly, in the second embodiment, a system comprises the smart glasses 2 and mobile telephone 100, with mobile telephone 100 being operable to perform the processing operations that were previously performed by the processor 10 in the first embodiment.
In the second embodiment, as mobile telephone 100 already has capability to communicate with mobile telephone network 60, steps S3-8, S4-8, S5-8 and S6-8 comprise mobile telephone 100 communicating with the mobile telephone network 60.
In a modification to the second embodiment, the components of the smart glasses 2 are configured to transmit signals to mobile telephone 100 via communication module 12 without the assistance of processor 10. In this way, the processor 10 can be omitted from the smart glasses 2.
Figure 8 shows the processing operations performed in the second embodiment and in the modification of the second embodiment.
Referring to Figure 8, at step S8-2, signals from accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16 are transmitted from the smart glasses 2 to the mobile telephone 100, together with position data from position-determination unit 18 and image data from camera 20.
At step S8-4, the mobile telephone 100 receives the signals from accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16, along with the position data from position-determination unit 18 and the image data from camera 20.
At step S8-6, mobile telephone 100 processes the signals from accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16. Mobile telephone 100 also processes the position data from position-determination unit 18. The processing performed at step S8-6 is the same as that shown in, and described above with reference to, Figures 3-6, with mobile telephone 100 communicating with mobile telephone network 60 in steps S3-8, S4-8, S5-8 and S6-8. Accordingly, as this processing has already been described, it will not be described again here.
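The processing of Figures 3-6 is not reproduced in this passage, but it covers detecting a collision from the accelerometer signals and detecting sudden temperature and humidity changes. A minimal sketch under assumed simple thresholds; the peak-magnitude approach and the threshold values are illustrative assumptions, not values disclosed in the patent.

```python
import math

G = 9.81                       # standard gravity, m/s^2
COLLISION_THRESHOLD_G = 4.0    # illustrative; the patent does not fix a value

def detect_collision(accel_samples):
    """Return True if any accelerometer sample's magnitude exceeds the
    collision threshold (a simple peak-magnitude sketch)."""
    for (x, y, z) in accel_samples:
        magnitude_g = math.sqrt(x * x + y * y + z * z) / G
        if magnitude_g > COLLISION_THRESHOLD_G:
            return True
    return False

def sudden_change(samples, max_delta):
    """Return True if consecutive sensor samples (e.g. temperature in deg C or
    relative humidity in %) differ by more than max_delta -- a simple sketch
    of the sudden-change detection for sensors 14 and 16."""
    return any(abs(b - a) > max_delta for a, b in zip(samples, samples[1:]))
```

Either detector returning True would trigger the corresponding communication steps (S3-8 etc.), performed in this embodiment by mobile telephone 100.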
Third Embodiment

A third embodiment of the present invention will now be described.
As explained above, in the first embodiment, processor 10 in smart glasses 2 is configured to process signals from accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 as described above with reference to Figures 3-6. In the third embodiment, the components of the smart glasses 2 are the same as those in the first embodiment, except that the processor 10 is configured to instruct communication module 12 to communicate the signals from accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 to notification apparatus 40 either directly or via mobile telephone 70. Notification apparatus 40 is configured to process the signals from accelerometer 6, gyroscope 8, temperature sensor 14, humidity sensor 16 and position-determination unit 18 in the same way as shown in Figures 3-6.
Accordingly, in the third embodiment, a system comprises smart glasses 2 and notification apparatus 40, with notification apparatus 40 being operable to perform the processing operations that were previously performed by the processor 10 in the first embodiment.
In a modification to the third embodiment, the components of the smart glasses 2 are configured to transmit signals to notification apparatus 40 via communication module 12 (either directly using mobile telephone network 60 or via mobile telephone 70 using mobile telephone network 60) without the assistance of processor 10. In this way, the processor 10 can be omitted from the smart glasses 2.
Figure 9 shows the processing operations performed in the third embodiment and in the modification of the third embodiment.
Referring to Figure 9, at step S9-2, signals from the accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16 are transmitted from the smart glasses 2 to the notification apparatus 40, together with position data from position-determination unit 18 and image data from camera 20.
At step S9-4, notification apparatus 40 receives the signals from accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16, along with the position data from position-determination unit 18 and the image data from camera 20.
At step S9-6, notification apparatus 40 processes the signals from accelerometer 6, gyroscope 8, temperature sensor 14 and humidity sensor 16. Notification apparatus 40 also processes the position data from position-determination unit 18. The processing performed at step S9-6 is the same as that shown in, and described above with reference to, Figures 3-6, except that the communication in steps S3-8, S4-8, S5-8 and S6-8 is sent by notification apparatus 40 to the wearer 30 of the smart glasses 2, a different notification apparatus (not shown) and/or emergency services communication apparatus 50. Accordingly, as this processing has already been described, it will not be described again here.
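Part of the position processing determines whether the wearer is within an allowed area, with a communication initiated when the wearer strays outside it. A minimal sketch assuming a rectangular latitude/longitude bounding box; the patent does not specify the shape of the allowed area, and a real deployment might use a polygon test instead.

```python
def within_allowed_area(position, area):
    """Bounding-box geofence check. position is (latitude, longitude);
    area is (lat_min, lat_max, lon_min, lon_max). Returns True when the
    wearer's reported position lies inside the allowed area."""
    lat, lon = position
    lat_min, lat_max, lon_min, lon_max = area
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

# Illustrative allowed area roughly covering central London.
ALLOWED = (51.0, 52.0, -1.0, 0.0)
```

When `within_allowed_area` returns False, the processing apparatus (processor 10, mobile telephone 100 or notification apparatus 40, depending on the embodiment) would initiate the out-of-area communication.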
Claims (31)
1. A pair of smart glasses, comprising: an accelerometer; and a processor configured to process signals from the accelerometer to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
2. A pair of smart glasses according to Claim 1, wherein: the smart glasses further comprise a gyroscope; and the processor is configured to process signals from the accelerometer and the gyroscope to detect a collision involving the wearer of the smart glasses.
3. A pair of smart glasses according to Claim 1 or Claim 2, wherein: the smart glasses further comprise a temperature sensor; and the processor is further configured to process signals from the temperature sensor to detect a sudden temperature change, and to initiate a communication upon detection of a sudden temperature change.
4. A pair of smart glasses according to any preceding claim, wherein: the smart glasses further comprise a humidity sensor; and the processor is further configured to process signals from the humidity sensor to detect a sudden humidity change, and to initiate a communication upon detection of a sudden humidity change.
5. A pair of smart glasses according to any preceding claim, wherein: the smart glasses further comprise a position-determination unit configured to determine the position of the smart glasses; and the processor is further configured to include in each said communication the determined position of the smart glasses.
6. A pair of smart glasses according to Claim 5, wherein: the processor is further configured to process the determined position of the smart glasses to determine whether the wearer is within an allowed area and to initiate a communication upon determining that the wearer is outside the allowed area.
7. A pair of smart glasses according to any preceding claim, wherein: the smart glasses further comprise a camera; and the processor is further configured to include in each said communication image data from the camera.
8. A system, comprising: a pair of smart glasses having an accelerometer located therein; and a mobile telephone configured to receive signals from the accelerometer transmitted by the smart glasses, to process the signals to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
9. A system according to Claim 8, wherein: the smart glasses further comprise a gyroscope; and the mobile telephone is configured to receive signals from the gyroscope transmitted by the smart glasses, and to process the signals from the accelerometer and the gyroscope to detect a collision involving the wearer of the smart glasses.
10. A system according to Claim 8 or Claim 9, wherein: the smart glasses further comprise a temperature sensor; and the mobile telephone is further configured to receive signals from the temperature sensor transmitted by the smart glasses, to process the signals from the temperature sensor to detect a sudden temperature change, and to initiate a communication upon detection of a sudden temperature change.
11. A system according to any of Claims 8 to 10, wherein: the smart glasses further comprise a humidity sensor; and the mobile telephone is further configured to receive signals from the humidity sensor transmitted by the smart glasses, to process the signals from the humidity sensor to detect a sudden humidity change, and to initiate a communication upon detection of a sudden humidity change.
12. A system according to any of Claims 8 to 11, wherein: the smart glasses further comprise a position-determination unit configured to determine the position of the smart glasses; and the mobile telephone is further configured to receive data defining the determined position transmitted by the smart glasses, and to include in each said communication the determined position.
13. A system according to Claim 12, wherein: the mobile telephone is further configured to process the determined position of the smart glasses to determine whether the wearer is within an allowed area and to initiate a communication upon determining that the wearer is outside the allowed area.
14. A system according to any of Claims 8 to 13, wherein: the smart glasses further comprise a camera; and the mobile telephone is further configured to receive image data from the camera transmitted by the smart glasses, and to include in each said communication image data from the camera.
15. A system, comprising: a pair of smart glasses having an accelerometer located therein; and a notification apparatus configured to receive signals from the accelerometer transmitted by the smart glasses, to process the signals to detect a collision involving the wearer of the smart glasses, and to initiate a communication upon detection of a collision.
16. A system according to Claim 15, wherein: the smart glasses further comprise a gyroscope; and the notification apparatus is configured to receive signals from the gyroscope transmitted by the smart glasses, and to process the signals from the accelerometer and the gyroscope to detect a collision involving the wearer of the smart glasses.
17. A system according to Claim 15 or Claim 16, wherein: the smart glasses further comprise a temperature sensor; and the notification apparatus is further configured to receive signals from the temperature sensor transmitted by the smart glasses, to process the signals from the temperature sensor to detect a sudden temperature change, and to initiate a communication upon detection of a sudden temperature change.
18. A system according to any of Claims 15 to 17, wherein: the smart glasses further comprise a humidity sensor; and the notification apparatus is further configured to receive signals from the humidity sensor transmitted by the smart glasses, to process the signals from the humidity sensor to detect a sudden humidity change, and to initiate a communication upon detection of a sudden humidity change.
19. A system according to any of Claims 15 to 18, wherein: the smart glasses further comprise a position-determination unit configured to determine the position of the smart glasses; and the notification apparatus is further configured to receive data defining the determined position transmitted by the smart glasses, and to include in each said communication the determined position.
20. A system according to Claim 19, wherein: the notification apparatus is further configured to process the determined position of the smart glasses to determine whether the wearer is within an allowed area and to initiate a communication upon determining that the wearer is outside the allowed area.
21. A system according to any of Claims 15 to 20, wherein: the smart glasses further comprise a camera; and the notification apparatus is further configured to receive image data from the camera transmitted by the smart glasses, and to include in each said communication image data from the camera.
22. A method comprising: receiving signals from an accelerometer located within a pair of smart glasses; and processing the received signals from the accelerometer to detect a collision involving the wearer of the smart glasses.
23. A method according to Claim 22, wherein: signals are also received from a gyroscope located within the smart glasses; and the signals from the accelerometer and the gyroscope are processed to detect a collision involving the wearer of the smart glasses.
24. A method according to Claim 22 or Claim 23, further comprising: initiating a communication upon detection of the collision.
25. A method according to any of Claims 22 to 24, further comprising: receiving signals from a temperature sensor located within the smart glasses; and processing the received signals from the temperature sensor to detect a sudden temperature change.
26. A method according to Claim 25, further comprising: initiating a communication upon detection of the sudden temperature change.
27. A method according to any of Claims 22 to 26, further comprising: receiving signals from a humidity sensor located within the smart glasses; and processing the received signals from the humidity sensor to detect a sudden humidity change.
28. A method according to Claim 27, further comprising: initiating a communication upon detection of the sudden humidity change.
29. A method according to any of Claims 24, 26 and 28, wherein: the method further comprises determining the position of the smart glasses; and each said communication includes the determined position of the smart glasses.
30. A method according to Claim 29, further comprising: processing the determined position of the smart glasses to determine whether the wearer is within an allowed area; and initiating a communication upon determining that the wearer is not within the allowed area.
31. A method according to any of Claims 24, 26, 28, 29 and 30, wherein: the method further comprises receiving image data from a camera located within the smart glasses; and each said communication includes the received image data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1605486.8A GB2548885A (en) | 2016-03-31 | 2016-03-31 | Collision detection |
Publications (1)
Publication Number | Publication Date |
---|---|
GB2548885A true GB2548885A (en) | 2017-10-04 |
Family
ID=59773322
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1605486.8A Withdrawn GB2548885A (en) | 2016-03-31 | 2016-03-31 | Collision detection |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2548885A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110426852A (en) * | 2019-07-25 | 2019-11-08 | 南昌如鱼电子商务有限公司 | A kind of VR glasses and application method with anticollision effect |
WO2022236994A1 (en) * | 2021-05-11 | 2022-11-17 | 惠州Tcl云创科技有限公司 | Data processing method, data processing apparatus, and smart glasses |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012037290A2 (en) * | 2010-09-14 | 2012-03-22 | Osterhout Group, Inc. | Eyepiece with uniformly illuminated reflective display |
US20150305426A1 (en) * | 2014-04-25 | 2015-10-29 | Ford Global Technologies, Llc | Bicycle helmet with integrated electronics |
US9247779B1 (en) * | 2012-11-08 | 2016-02-02 | Peter Aloumanis | Enhanced global positioning system (GPS) based functionality for helmets |
US20160044276A1 (en) * | 2014-08-08 | 2016-02-11 | Fusar Technologies, Inc. | Helmet system and methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |