WO2015116126A1 - Notifying users of mobile devices - Google Patents

Notifying users of mobile devices

Info

Publication number
WO2015116126A1
WO2015116126A1 (PCT/US2014/013991)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
sensor
user
movement
orientation
Prior art date
Application number
PCT/US2014/013991
Other languages
French (fr)
Inventor
Akihiko Ikeda
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2014/013991 priority Critical patent/WO2015116126A1/en
Publication of WO2015116126A1 publication Critical patent/WO2015116126A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3848Data obtained from both position sensors and additional sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72418User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting emergency services
    • H04M1/72421User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting emergency services with automatic activation of emergency service functions, e.g. upon sensing an alarm
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The present disclosure describes a system for notifying a user of a mobile device. The system comprises a distance sensor to measure a distance from the system to an object and an orientation sensor associated with the distance sensor to measure an orientation of the distance sensor and to measure a direction of movement of the system. The system further comprises a processor to determine based on the distance, based on the orientation and based on the direction of movement that the system is about to cross a threshold distance from the object. The system also comprises a notification system to generate a notification based on the determination of the processor.

Description

NOTIFYING USERS OF MOBILE DEVICES
[01] Many users of mobile devices, such as phones, e-readers and tablet computers, operate their devices while walking. They read e-books, browse the internet, read their emails and write SMS messages.
Brief Description of the Drawings
[02] By way of non-limiting examples, systems, mobile communication devices and methods for notifying users according to the present disclosure will be described with reference to the following drawings in which
[03] Figure 1 is a block diagram of a system for notifying a user according to an example implementation.
[04] Figure 2 illustrates a method for notifying a user according to an example implementation.
[05] Figure 3 is a plan view of an example scenario.
[06] Figure 4 is a three-dimensional depth map according to an example implementation.
[07] Figure 5 is a cutaway plan view of a mobile communication device according to an example implementation.
Detailed Description
[08] The present disclosure describes a system for notifying a user of a mobile device. The system comprises a distance sensor which measures a distance from the system to an object. A direction sensor is associated with the distance sensor and measures an orientation of the distance sensor. The direction sensor also measures a direction of movement of the system. The system further comprises a processor. The processor determines based on the distance, based on the orientation and based on the direction of movement that the system is about to cross a threshold distance from the object. The system also comprises a notification system. The notification system generates a notification based on the determination of the processor.
[09] Users of mobile communication devices operate their devices while they are unaware of the dangers around them. As a result, some of these users fall down stairs, walk in front of cars or bump into other people.
[10] Since the system measures the orientation of the distance sensor, such as towards the ground, in front or towards the sky, the processor can compensate for different orientations and the determination that the system is about to cross a threshold distance from the object is more accurate than without considering the orientation.
[11] For example, initially the orientation of the system is such that the distance sensor is facing in front of the user and there are no obstacles within 5 meters. Then the user changes the orientation of the system such that the distance sensor faces the ground. As a result, the system may detect the ground itself as an object because the distance to the ground is within 1 meter. However, since the system measures the orientation of the distance sensor, the ground can be disregarded as a relevant object.
[12] Users like to change the orientation of their devices often while operating them and therefore, restraining the orientation of the device would reduce the enjoyment for the user. However, the disclosed device measures the orientation of the distance sensor and can therefore disregard the ground as a relevant object. As a result, the user of the mobile communication device can change the orientation often while the device can still warn the user effectively and prevent the user from causing the above mentioned accidents.
[13] Figure 1 illustrates a computer system 100 for notifying a user of a mobile device. The computer system 100 comprises a processor 102 connected to a program memory 104, a data memory 106, a communication port 108 and a user port 110. The program memory 104 is a non-transitory computer readable medium, such as a hard drive, a solid state disk, flash, read only memory (ROM), or compact disk (CD) ROM. Software, that is, an executable program stored on program memory 104, causes the processor 102 to execute the program.
[14] Data memory 106 may comprise volatile or nonvolatile memory, such as a hard drive, a solid state disk, CD-ROM, random access memory (RAM), dynamic random access memory (DRAM) or flash ROM. In order to execute machine readable instructions stored on program memory 104, processor 102 may load program code into volatile memory, such as RAM, and execute the program code from there.
[15] The user port 110 is connected to a display 112, which displays a user interface 114 to a user 116. User 116 interacts with the user interface 114 while walking. For example, user 116 browses the Internet using user interface 114. Processor 102 sends and receives data from a web server over communication port 108 using a third generation (3G) cellular communication network, Long Term Evolution (LTE), Wifi or other network protocols. In another example, user 116 uses mobile communication device 100 as a telephone, such that the transmitted data is voice data.
[16] Processor 102 is further connected to a distance sensor 120 and a direction sensor 122. Distance sensor 120 senses a distance 124 of an object 126 from the device 100.
[17] The object 126 may be anything that the user could perceive or see if the user was not distracted by using the mobile device and includes a step, an obstacle on the road, a moving object, such as a car, a hole in the ground, a light pole or any other object.
[18] In one example, the distance sensor 120 is an optical sensor and comprises a video graphics array (VGA) camera with three channels for red, green and blue (RGB), respectively, with a resolution of 640x480 pixels and a frame rate of 30 frames per second. The distance sensor 120 may further comprise a depth sensor having an infrared projector and a monochrome complementary metal-oxide semiconductor (CMOS) infrared sensor. Firmware integrated into the distance sensor 120 may fuse the data from the camera and the infrared sensor to identify objects. The distance sensor 120 may be interfaced using a proprietary or open source software development kit (SDK), which allows direct computation of a depth map. The SDK also provides an application programming interface (API) to the firmware of the distance sensor 120.
[19] The direction sensor 122 may be an inertial sensor, such as an accelerometer or gyroscope, that is integrated together with processor 102 into a mobile phone, such as a smart phone, a tablet computer or other mobile device. The direction sensor 122 may also comprise an inertial component to measure the orientation of the distance sensor 120 and a Global Positioning System (GPS) component to measure the direction and speed of movement of the device 100. The direction sensor 122 may also comprise a compass to determine the orientation of the distance sensor 120 with reference to the earth's magnetic field.
[20] In one example, an operating system is installed on program memory 104 and executed by processor 102. Processor 102 accesses the SensorEvent.values[0] attribute of the API to receive the sensor data for a particular sensor.
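The SensorEvent.values[0] attribute named here matches the Android sensor framework, so the following is a minimal sketch of how such a readout could be obtained, assuming Android is the operating system in question; the choice of TYPE_ROTATION_VECTOR and the conversion to an azimuth in degrees are illustrative assumptions, not part of the disclosure.
```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch of reading orientation data through the Android sensor framework,
// as suggested by the SensorEvent.values[0] reference above. Sensor type and
// post-processing are example choices only.
public class OrientationReader implements SensorEventListener {
    private final SensorManager sensorManager;
    private volatile float azimuthDegrees; // latest heading derived from the event values

    public OrientationReader(SensorManager sensorManager) {
        this.sensorManager = sensorManager;
    }

    public void start() {
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values[0] is the first component of the raw reading; here the
        // full rotation vector is converted into an azimuth angle in degrees.
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        azimuthDegrees = (float) Math.toDegrees(orientation[0]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }

    public float latestAzimuthDegrees() {
        return azimuthDegrees;
    }
}
```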
[21] The direction sensor 122 is associated with the distance sensor 120 such that movement of the direction sensor 122 also causes movement of the distance sensor 120. For example, the distance sensor 120 may be externally attached to the device 100 by inserting a plug into the appropriate socket of device 100, and the distance sensor 120 is directly attached to the plug without the use of a cable. As a result, the distance sensor 120 is rigidly fastened to device 100. The direction sensor 122 is integrated into the device 100 and therefore, device 100 and direction sensor 122 move together, which allows direction sensor 122 to sense the orientation of the distance sensor 120.
[22] In one example, the processor 102 receives and processes the sensing data in real time, such as every 100 ms. The processor 102 determines that the device 100 is about to cross a threshold distance from the object every time sensing data is received from the sensors and completes this calculation before the sensors send the next sensing data update.
[23] Figure 2 illustrates a method 200 as performed by processor 102 for notifying a user using a mobile device, such as device 100. The method commences by the processor 102 receiving 202 sensing data. In one example, processor 102 receives the sensing data directly from the sensors 120 and 122. In another example, the sensors 120 and 122 write the sensing data to a buffer data structure on data memory 106 and the processor 102 requests the sensor data from data memory 106.
[24] Processor 102 may also temporarily store the sensing data by defining an onSensorChanged event of the API and pre-process the sensing data, such as by noise filtering or normalising, and store the pre-processed sensing data on data store 106, such as RAM or on an internal processor register. After that the processor 102 receives the sensing data from the data store 106.
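As an illustration of the noise filtering and buffering mentioned here, the sketch below applies a simple exponential moving average to incoming distance samples before buffering them; the smoothing factor is an arbitrary example value, and the buffer size merely echoes the 50-sample example given later in the description.
```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical pre-processing stage: smooth raw distance readings with an
// exponential moving average and keep the most recent samples in a buffer,
// mirroring the noise filtering and buffering described in the text.
public class DistancePreprocessor {
    private static final double ALPHA = 0.3;   // smoothing factor (example value)
    private static final int BUFFER_SIZE = 50; // matches the 50-sample example later on

    private final Deque<Double> buffer = new ArrayDeque<>();
    private Double smoothed; // null until the first sample arrives

    // Called whenever a new raw distance sample is received, e.g. from an
    // onSensorChanged-style callback.
    public double addSample(double rawDistanceMetres) {
        smoothed = (smoothed == null)
                ? rawDistanceMetres
                : ALPHA * rawDistanceMetres + (1 - ALPHA) * smoothed;
        if (buffer.size() == BUFFER_SIZE) {
            buffer.removeFirst();
        }
        buffer.addLast(smoothed);
        return smoothed;
    }

    public Deque<Double> bufferedSamples() {
        return buffer;
    }
}
```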
[25] The sensor data is indicative of a distance of an object 126 from the device 100, an orientation in which the distance is sensed, that is, the orientation of the distance sensor 120, and a direction of movement of the device 100. In one example, the sensing data explicitly includes these values, such as a distance of 10 m.
[26] In a different example, the sensing data comprises raw data, such as the reflection time of a laser range finder. The processor 102 may compute the explicit distance value based on the raw data or may simply use the reflection time as a distance value. It is to be understood that, throughout this specification, 'distance', 'direction' and 'orientation' are not limited to the explicit distance value, direction value or orientation value but may also be an implicit value from which the distance, direction or orientation can be derived, as in the example above of the raw range finder reflection data.
[27] Figure 3 is a plan view of an example scenario 300 where the user 116 with device 100 is walking towards first object 126, second object 302 and third object 304 while the user is operating the device 100.
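For the raw-data example in paragraph [26], converting a laser range finder's reflection time into an explicit distance is a one-line calculation; the sketch below is a minimal, self-contained illustration and assumes the reflection time is a round-trip time in seconds.
```java
// Minimal sketch: convert a round-trip reflection time into a distance.
// The factor of 2 accounts for the pulse travelling to the object and back.
public final class RangeFinder {
    private static final double SPEED_OF_LIGHT_M_PER_S = 299_792_458.0;

    private RangeFinder() {}

    public static double distanceMetres(double roundTripTimeSeconds) {
        return SPEED_OF_LIGHT_M_PER_S * roundTripTimeSeconds / 2.0;
    }

    public static void main(String[] args) {
        // A reflection time of about 66.7 nanoseconds corresponds to roughly 10 m.
        System.out.println(distanceMetres(66.7e-9)); // ~10.0
    }
}
```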
[28] Optical sensor 120 senses a distance 124 of the first object 126 from the device 100, which is also the radius of a semi-circle 305 centred at the location of device 100. Inertial sensor 122 senses an orientation 306 of the distance sensor 120 in which the distance 124 is sensed. In this example, the orientation of the distance sensor 120 defines a sensing sector limited by two radii 308 and 310.
[29] Figure 3 shows that the orientation 306 of the distance sensor 120 may be different to the direction from the device 100 to the object 126. Orientation 306 of the distance sensor 120 is to be understood as the direction the sensor is measuring or facing, or as the orientation of the sensor 120 in which the sensor 120 senses the distance 124 to the object 126. In one example, the distance sensor 120 has an angular offset from the inertial sensors, such as 90 degrees. As a result, processor 102 measures the orientation of the distance sensor by retrieving the raw readout from the inertial sensors and applying the constant angular offset.
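A minimal sketch of this constant-offset correction follows; the 90 degree value comes from the example in the text, while wrapping the result into the 0 to 360 degree range is an added assumption.
```java
// Hypothetical helper: derive the distance sensor's orientation from the raw
// inertial heading by applying a fixed mounting offset, as described above.
public final class OrientationOffset {
    private static final double SENSOR_OFFSET_DEGREES = 90.0; // example value from the text

    private OrientationOffset() {}

    public static double distanceSensorHeading(double inertialHeadingDegrees) {
        double heading = inertialHeadingDegrees + SENSOR_OFFSET_DEGREES;
        // Keep the result in [0, 360) so headings stay comparable.
        return ((heading % 360.0) + 360.0) % 360.0;
    }
}
```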
[30] Inertial sensors further sense a direction of movement 312 of the device 100. The example of Figure 3 shows that user 116 is walking towards first object 126 and if the communication device 100 does not notify the user 116, the user 116 will cross a threshold distance from the object 126. As a result, the user 116 is at risk of encountering the object 126, which may result in injury.
[31] In one example, the system 100 is configured such that the threshold distance is 1 m. For example, coming closer than 1 m to another person may be undesirable since the personal space of that person is entered.
[32] In one example, distance sensor 120 is a three-dimensional depth sensor and generates a three-dimensional depth map.
[33] Figure 4 is a three-dimensional depth map 400 comprising first, second and third objects 126, 302 and 304, respectively. The first two dimensions are represented by the horizontal and the vertical axis while the third dimension is the distance of the measured objects represented by different colours, generally shown as different shading in Figure 4. First object 126 is at about the same distance away from device 100 as third object 304 and therefore, first object 126 and third object 304 are depicted with equal shading. In contrast, second object 302 is further away, which is represented by different shading.
[34] Depth map 400 is received by processor 102 from distance sensor 120 and stored on a data store, such as RAM, hard disk drive or solid state disk. As mentioned before, in one example, processor 102 executes function calls of the API to retrieve the depth map 400. Depth map 400 also shows the orientation 306 of the distance sensor 120 and the direction of movement 312 of the device 100, which are not stored within the depth map but separately in this example.
[35] In this example, depth map 400 is a projection of the sensing sector onto a rectangular image plane, such that radii 308 and 310 form the sides of the image plane, the semi-circle 305 forms the top edge and user location 116 forms a bottom edge 402. As a result, the bottom edge 402 can be labelled with a scale that indicates the angle of the orientation 306 of the distance sensor 120. For example, depth map 400 shows that the direction of movement 312 of the system 100 is -10° from the orientation 306 of the distance sensor 120.
[36] First radius 308 of depth map 400 is annotated with a further scale that indicates the vertical angle of the distance sensor 120. The natural horizon is located at 0°, while the feet of the user would be located at -90°. In this example, the orientation 306 of the distance sensor 120 also includes the vertical angle of the distance sensor 120 measured by direction sensor 122. Based on the vertical angle, processor 102 can transform the received depth map.
[37] For example, processor 102 can determine a depth gradient that would be expected if the measured object is the ground. Processor 102 can then exclude pixels of the depth map that have depth values according to this gradient. Further, processor 102 can vertically translate the received depth map such that it is located at the correct location with respect to the vertical and horizontal angles shown in Figure 4.
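As an illustration of this ground-exclusion step, the sketch below computes, for each pixel row, the depth that flat ground would produce given the sensor's height and the row's vertical angle, and masks out pixels whose measured depth matches that expectation; the sensor height, angle mapping and tolerance are assumed example values, not taken from the disclosure.
```java
// Hypothetical ground filter for a depth map stored as depth[row][col] in
// metres. Rows are mapped linearly to vertical angles (0 degrees at the
// horizon, negative towards the user's feet), following the vertical scale
// described for Figure 4.
public final class GroundFilter {
    private static final double SENSOR_HEIGHT_M = 1.3; // device held at chest height (assumed)
    private static final double TOLERANCE_M = 0.2;     // allowed deviation from the expected ground depth

    private GroundFilter() {}

    // Marks pixels that look like flat ground as Float.NaN so that later
    // processing steps ignore them.
    public static void excludeGround(float[][] depth,
                                     double topRowAngleDeg,
                                     double bottomRowAngleDeg) {
        int rows = depth.length;
        if (rows < 2) {
            return;
        }
        for (int r = 0; r < rows; r++) {
            // Vertical angle of this row, interpolated between top and bottom edge.
            double angleDeg = topRowAngleDeg
                    + (bottomRowAngleDeg - topRowAngleDeg) * r / (double) (rows - 1);
            if (angleDeg >= 0) {
                continue; // at or above the horizon, flat ground cannot appear
            }
            // Slant range at which a ray at this angle would hit flat ground.
            double expected = SENSOR_HEIGHT_M / Math.sin(Math.toRadians(-angleDeg));
            for (int c = 0; c < depth[r].length; c++) {
                if (Math.abs(depth[r][c] - expected) < TOLERANCE_M) {
                    depth[r][c] = Float.NaN;
                }
            }
        }
    }
}
```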
[38] In one example, the length of the arc of semi-circle 305 between radii 308 and 310 is 1 m. It is further noted that an object may also be defined by its distance being greater than the distance to the area surrounding the object, for example, when approaching stairs or a hole.
[39] Referring back to Figure 2, the processor 102 then determines 204 that the system 100 is about to cross a threshold distance from the object 126 based on the sensing data indicative of the distance 124, the orientation 306, and the direction of movement 312. In this example, processor 102 operates on the depth map 400 and a vector representing the direction of movement 312. If the vector representing the direction of movement 312 intersects with the object 126, the system is about to cross a threshold distance from the object 126. Therefore, processor 102 determines whether there is an intersection and if there is, processor 102 notifies 206 the user that the system 100 is about to cross a threshold distance from the object 126.
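This determination can be pictured as scanning the (ground-filtered) depth map in the columns that correspond to the direction of movement and comparing the nearest remaining depth against the threshold; the sketch below is one self-contained way to do this and is written for illustration, not taken verbatim from the disclosure (the angular window size is an assumed value, and a rectangular depth map is assumed).
```java
// Hypothetical implementation of step 204: given a depth map whose columns
// span the sensing sector, find the column matching the direction of
// movement and test whether the nearest non-ground depth in a small window
// around it is already inside the threshold distance.
public final class ThresholdCheck {
    private ThresholdCheck() {}

    public static boolean aboutToCrossThreshold(float[][] depth,
                                                double leftAngleDeg,   // angle of the left image edge
                                                double rightAngleDeg,  // angle of the right image edge
                                                double movementAngleDeg,
                                                double thresholdMetres) {
        if (depth.length == 0 || depth[0].length == 0) {
            return false;
        }
        int cols = depth[0].length;
        // Map the movement direction onto a column index of the depth map.
        double fraction = (movementAngleDeg - leftAngleDeg) / (rightAngleDeg - leftAngleDeg);
        if (fraction < 0 || fraction > 1) {
            return false; // the user is not moving into the sensed sector
        }
        int centre = (int) Math.round(fraction * (cols - 1));
        int window = Math.max(1, cols / 20); // small angular window around the heading (assumed)

        double nearest = Double.POSITIVE_INFINITY;
        for (float[] row : depth) {
            int from = Math.max(0, centre - window);
            int to = Math.min(row.length - 1, centre + window);
            for (int c = from; c <= to; c++) {
                float d = row[c];
                if (!Float.isNaN(d) && d < nearest) {
                    nearest = d;
                }
            }
        }
        return nearest < thresholdMetres;
    }
}
```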
[40] In one example, a library of functions implemented as executable program code is installed on program memory 104, which provides functions for image segmentation. Executing these functions, processor 102 performs segmentation of depth map 400 and edge detection to obtain the shape of object 126. For example, processor 102 executes a function named intersectLinePolygon to determine whether direction of movement 312 intersects with the shape of object 126.
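A function with the behaviour of intersectLinePolygon can be sketched as a standard segment-against-polygon-edges test; the version below is a self-contained illustration written for this description rather than the actual library routine, which is not identified in the text.
```java
// Hypothetical stand-in for the intersectLinePolygon call: returns true if
// the segment from 'start' to 'end' (e.g. the scaled movement vector)
// crosses any edge of the polygon describing the object's outline.
// Collinear and touching cases are ignored for brevity.
public final class Intersections {
    private Intersections() {}

    public static boolean intersectLinePolygon(double[] start, double[] end, double[][] polygon) {
        int n = polygon.length;
        for (int i = 0; i < n; i++) {
            double[] a = polygon[i];
            double[] b = polygon[(i + 1) % n];
            if (segmentsIntersect(start, end, a, b)) {
                return true;
            }
        }
        return false;
    }

    private static boolean segmentsIntersect(double[] p1, double[] p2, double[] q1, double[] q2) {
        double d1 = cross(q1, q2, p1);
        double d2 = cross(q1, q2, p2);
        double d3 = cross(p1, p2, q1);
        double d4 = cross(p1, p2, q2);
        // The segments properly intersect when each straddles the other's line.
        return ((d1 > 0) != (d2 > 0)) && ((d3 > 0) != (d4 > 0));
    }

    // z-component of the cross product (b - a) x (c - a).
    private static double cross(double[] a, double[] b, double[] c) {
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }
}
```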
[41] Notifying the user may be performed by various different means, such as by activation of a vibration of the device 100 or by an audio system, such as a beeping sound. The beeping sound may become louder or higher pitched as the device 100 approaches the object 126. Another option is to use the display 112 as an optical notification system and display a warning message to the user 116, which can be particularly useful if the user is reading emails, writing SMS messages or browsing the Internet while walking. The warning message may be a popup message similar to a message that alerts the user 116 of running low on battery.
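One way to realise the "louder and higher pitched as the device approaches" behaviour is a linear mapping from the remaining distance to tone frequency and volume, as in the hedged sketch below; the specific frequency range, cut-off distance and volume scale are invented for illustration only.
```java
// Hypothetical mapping from remaining distance to an audio notification:
// the closer the object, the higher and louder the beep.
public final class BeepProfile {
    private static final double MIN_FREQ_HZ = 440.0;
    private static final double MAX_FREQ_HZ = 1760.0;
    private static final double MAX_DISTANCE_M = 5.0; // beyond this, no beep (example value)

    private BeepProfile() {}

    // Returns {frequencyHz, volume in [0, 1]}, or null when no beep is needed.
    public static double[] forDistance(double distanceMetres) {
        if (distanceMetres >= MAX_DISTANCE_M) {
            return null;
        }
        double closeness = 1.0 - Math.max(0.0, distanceMetres) / MAX_DISTANCE_M; // 0 far, 1 touching
        double frequency = MIN_FREQ_HZ + closeness * (MAX_FREQ_HZ - MIN_FREQ_HZ);
        return new double[] { frequency, closeness };
    }
}
```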
[42] For clarification, Figures 3 and 4 further comprise second object 302 and third object 304. Second object 302 is further away than first object 126 and the processor therefore does not identify object 302 as relevant. Third object 304 is in a different direction to the direction of movement of the system 100 and therefore, third object 304 is also not relevant.
[43] In one example, the inertial sensor 122 further senses the speed of the system 100. The vector representing the direction of movement 312 may be scaled by the speed of movement, such that the vector 312 intersects the object 126 in cases where the speed is sufficiently high. If the speed is low, the user 116 can be notified later, that is, when the system 100 is closer to the object, without significantly increasing the risk for the user 116. Speed may be measured by detecting the steps of the user 116 and deriving a speed measure from the rate of steps, assuming a fixed length of each step of 1 m, for example.
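The speed estimate and the scaling of the movement vector can be sketched as below; the 1 m step length comes from the text, while the one-second look-ahead used for scaling is an assumption added for the example.
```java
// Hypothetical helpers: estimate walking speed from the detected step rate
// (1 m per step, as in the example above) and scale the unit movement
// direction by that speed to obtain a short-term look-ahead vector.
public final class MovementEstimate {
    private static final double STEP_LENGTH_M = 1.0;      // from the text
    private static final double LOOK_AHEAD_SECONDS = 1.0; // assumed horizon for the scaled vector

    private MovementEstimate() {}

    public static double speedMetresPerSecond(double stepsPerSecond) {
        return stepsPerSecond * STEP_LENGTH_M;
    }

    // directionUnit is a 2D unit vector {x, y}; the result is how far the
    // device is expected to travel within the look-ahead horizon.
    public static double[] scaledMovementVector(double[] directionUnit, double stepsPerSecond) {
        double distance = speedMetresPerSecond(stepsPerSecond) * LOOK_AHEAD_SECONDS;
        return new double[] { directionUnit[0] * distance, directionUnit[1] * distance };
    }
}
```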
[44] In examples where the notification is found to be reliable for walking speeds but not reliable for higher speeds, such as when driving a car, the notification may be limited and provided in cases where the speed of movement of the system 100 is less than a predetermined threshold, such as 30 km/h, which also includes riding a bicycle at a moderate speed.
[45] As mentioned before, processor 102 may repeat the processing of sensing data in real time. When multiple samples of sensing data are received within a certain time frame, the processor 102 can calculate the difference between the distance to the object 126 in the first sample and the distance to the object 126 in the second sample. This difference divided by the time between the two samples yields the rate of change in the distance 124. In one example, processor 102 stores 50 samples of sensing data on data store 106 and determines the rate of change based on these 50 samples, such as by computing an average rate of change. The number of samples is adjustable and another possible value is 10 samples.
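A minimal sketch of this rate-of-change estimate over buffered samples follows; samples are assumed to carry a timestamp in milliseconds, and the averaging follows the 50-sample example above.
```java
import java.util.List;

// Hypothetical rate-of-change estimate over buffered (timestamp, distance)
// samples: average the pairwise rate between consecutive samples.
public final class RateOfChange {

    public record Sample(long timestampMillis, double distanceMetres) {}

    private RateOfChange() {}

    // A positive result means the distance is shrinking (the object is getting closer).
    public static double metresPerSecond(List<Sample> samples) {
        if (samples.size() < 2) {
            return 0.0;
        }
        double sum = 0.0;
        int count = 0;
        for (int i = 1; i < samples.size(); i++) {
            Sample prev = samples.get(i - 1);
            Sample curr = samples.get(i);
            double dt = (curr.timestampMillis() - prev.timestampMillis()) / 1000.0;
            if (dt > 0) {
                sum += (prev.distanceMetres() - curr.distanceMetres()) / dt;
                count++;
            }
        }
        return count == 0 ? 0.0 : sum / count;
    }
}
```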
[46] The threshold distance may be based on the rate of change of the distance 124. For example, if the rate of change is faster than 2 m/s, the threshold distance is 5 m. If the rate of change is less than 1 m/s, the threshold distance is 2 m and if the rate of change is between 2 m/s and 1 m/s, the threshold distance is 3 m.
[47] Subtracting the speed of the movement of the system 100 from the rate of change, the processor 102 may determine the speed of the object itself. In some situations, faster objects are more dangerous and therefore, processor 102 employs a greater distance threshold for objects with a determined speed that is above a predetermined speed threshold. In one example, the speed threshold is 5 km/h to distinguish between other walking persons and faster objects, such as cars.
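The threshold selection in the two preceding paragraphs can be written as a small pure function; the values 1 m/s, 2 m/s, 2 m, 3 m and 5 m and the 5 km/h object-speed test come from the text, while the extra margin added for fast objects is an assumed example value.
```java
// Hypothetical threshold selection combining the rate-of-change bands and
// the fast-object adjustment described above.
public final class ThresholdPolicy {
    private static final double FAST_OBJECT_KMH = 5.0;       // from the text
    private static final double FAST_OBJECT_MARGIN_M = 2.0;  // assumed extra margin

    private ThresholdPolicy() {}

    public static double thresholdMetres(double rateOfChangeMps, double objectSpeedMps) {
        double threshold;
        if (rateOfChangeMps > 2.0) {
            threshold = 5.0; // closing faster than 2 m/s
        } else if (rateOfChangeMps < 1.0) {
            threshold = 2.0; // closing slower than 1 m/s
        } else {
            threshold = 3.0; // between 1 m/s and 2 m/s
        }
        if (objectSpeedMps * 3.6 > FAST_OBJECT_KMH) {
            threshold += FAST_OBJECT_MARGIN_M; // faster objects get a greater threshold
        }
        return threshold;
    }
}
```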
[48] In one example, processor 102 also determines a type of the object 126 based on the distance information and the threshold distance is based on the determined type of the object 126. For example, the distance sensor 120 and API provide the feature of detecting humans using a skeleton model. Processor 102 then queries a database for a distance associated with that particular object type, such as 'person', 'non-person object' or 'car'. As a result, a low distance threshold can be applied for less dangerous encounters, such as encountering other people, while a high distance threshold can be applied for more dangerous situations, such as falling off a railway platform.
[49] Figure 5 illustrates a mobile communication device 500, such as a tablet computer, for notifying a user that the mobile communication device 500 is about to cross a threshold distance from the object 126. The communication device 500 comprises a processing module 502, such as a mobile communication signal processor, connected to program memory 504 and data memory 506. Processing module 502 is further connected to communications port 508 to send and receive data and voice signals over a 3G, LTE, Wifi or other physical network. Processing module 502 is also connected via user port 510 with touch sensitive display 512. A user (not shown) operates display 512 to create or access information stored on the device or retrieved via data port 508.
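The per-type threshold lookup described in paragraph [48] amounts to a keyed table; the sketch below uses an in-memory map in place of the unspecified database, and the numeric thresholds are invented example values.
```java
import java.util.Map;

// Hypothetical per-type threshold table standing in for the database lookup
// described in paragraph [48]. The object types come from the text; the
// numeric thresholds are example values only.
public final class TypeThresholds {
    private static final Map<String, Double> THRESHOLD_M = Map.of(
            "person", 1.0,             // less dangerous encounter, small threshold
            "non-person object", 2.0,
            "car", 5.0                 // more dangerous, larger threshold
    );
    private static final double DEFAULT_THRESHOLD_M = 2.0;

    private TypeThresholds() {}

    public static double forType(String objectType) {
        return THRESHOLD_M.getOrDefault(objectType, DEFAULT_THRESHOLD_M);
    }
}
```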
[50] Communication device 500 further comprises a distance sensing module 520 to sense a distance 124 of an object from the mobile communication device 500 and a direction sensing module 522 associated with the distance sensing module 520 to sense an orientation 306 of the distance sensor 120 in which the distance 124 is sensed, and to sense a direction of movement of the mobile communication device 500. The direction sensing module 522 may be a microelectromechanical system (MEMS).
[51] The processing module 502 determines that the mobile communication device 500 is about to cross a threshold distance from the object 126 based on the distance, the orientation and the direction of movement as explained above with reference to Figures 3 and 4.
[52] A notification system 524 notifies the user in response to determining that the mobile communication device 500 is about to cross a threshold distance from the object 126. The notification system 524 may be a buzzer, loudspeaker, vibrating element, or integrated into the display 512, such that a notification message 530 is shown to the user 116 on display 512.
[53] Throughout this specification, the word "have", or variations such as "has" or "having", will be understood to have the same meaning as the word "comprise" and to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
[54] It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.

Claims

CLAIMS:
1. A system for notifying a user of a mobile device, the system comprising:
a first sensor to measure a distance from the system to an object;
a second sensor associated with the first sensor to measure an orientation of the first sensor and to measure a direction of movement of the system;
a processor to determine based on the distance, based on the orientation and based on the direction of movement that the system is about to cross a threshold distance from the object; and
a notification system to generate a notification based on the determination of the processor.
2. The system of claim 1, wherein the first sensor is a 3-dimensional depth sensor.
3. The system of claim 1, wherein the second sensor is an inertial sensor.
4. The system of claim 1, wherein the second sensor is an inertial sensor of the mobile device and the processor is the processor of the mobile device.
5. A method for notifying a user using a mobile device, the method comprising:
receiving sensing data indicative of a distance of an object from the mobile device, an orientation in which the distance is sensed, and a direction of movement of the mobile device;
determining that the mobile device is about to cross a threshold distance from the object based on the sensing data indicative of the distance, the orientation, and the direction of movement; and
notifying the user in response to determining that the mobile device is about to cross a threshold distance from the object.
6. The method of claim 5, further comprising sensing speed of movement of the mobile device, wherein determining that the mobile device is about to cross a threshold distance from the object is based on the speed of movement of the mobile device.
7. The method of claim 6, wherein notifying the user comprises selectively notifying the user where the speed of the mobile device is less than 30 kilometres per hour.
8. The method of claim 5, further comprising determining a rate of change in the distance based on the distance, wherein determining that the mobile device is about to cross a threshold distance from the object is based on the rate of change of the distance.
9. The method of claim 8, further comprising determining speed of the object based on the rate of change and based on the speed of movement of the mobile device, wherein determining that the mobile device is about to cross a threshold distance from the object is based on the speed of the object.
10. The method of claim 5, further comprising determining a type of the object, wherein the threshold distance is based on the type of the object.
11. The method of claim 10, further comprising determining the threshold distance based on the speed of the object.
12. A mobile communication device for notifying a user of the mobile communication device, the mobile communication device comprising:
a first sensing module to sense a distance of an object from the mobile communication device;
a second sensing module associated with the distance sensing module to sense an orientation of the distance sensing module;
a third sensing module to sense a direction of movement of the mobile communication device;
a processing module to determine that the mobile communication device is about to cross a threshold distance from the object based on the distance, the orientation and the direction of movement; and
a notification system to notify the user in response to determining that the mobile communication device is about to cross a threshold distance from the object.
13. The mobile communication device of claim 12, wherein the notification system is a display to display to the user a notification message.
14. The mobile communication device of claim 12, wherein the notification system is a vibration system to cause vibration of the mobile communication device.
15. The mobile communication device of claim 12, wherein the notification system is an audio system to generate an audio signal.
PCT/US2014/013991 2014-01-31 2014-01-31 Notifying users of mobile devices WO2015116126A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2014/013991 WO2015116126A1 (en) 2014-01-31 2014-01-31 Notifying users of mobile devices

Publications (1)

Publication Number Publication Date
WO2015116126A1 true WO2015116126A1 (en) 2015-08-06

Family

ID=53757527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/013991 WO2015116126A1 (en) 2014-01-31 2014-01-31 Notifying users of mobile devices

Country Status (1)

Country Link
WO (1) WO2015116126A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6502032B1 (en) * 2001-06-25 2002-12-31 The United States Of America As Represented By The Secretary Of The Air Force GPS urban navigation system for the blind
US20040183674A1 (en) * 2003-01-31 2004-09-23 Ruvarac Thomas C. Apparatus, system and method for monitoring a location of a portable device
US20080134102A1 (en) * 2006-12-05 2008-06-05 Sony Ericsson Mobile Communications Ab Method and system for detecting movement of an object
JP4646697B2 (en) * 2005-05-18 2011-03-09 富士通株式会社 Portable device
US20110143816A1 (en) * 2008-06-10 2011-06-16 Frank Fischer Portable device including warning system and method

Similar Documents

Publication Publication Date Title
US11113544B2 (en) Method and apparatus providing information for driving vehicle
JP6763448B2 (en) Visually enhanced navigation
KR102456248B1 (en) Curve guidance method, curve guidance apparatus, electronic apparatus and program stored in the computer-readable recording meduim
WO2011053335A1 (en) System and method of detecting, populating and/or verifying condition, attributions, and/or objects along a navigable street network
US20160364621A1 (en) Navigation device with integrated camera
EP2990936A1 (en) Communication of spatial information based on driver attention assessment
WO2020119567A1 (en) Data processing method, apparatus, device and machine readable medium
KR20170106963A (en) Object detection using location data and scale space representations of image data
US20120194554A1 (en) Information processing device, alarm method, and program
US20230331485A1 (en) A method for locating a warehousing robot, a method of constructing a map, robot and storage medium
US10769420B2 (en) Detection device, detection method, computer program product, and information processing system
KR102406491B1 (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
KR102337352B1 (en) Method and device for providing advanced pedestrian assistance system to protect pedestrian preoccupied with smartphone
US11668573B2 (en) Map selection for vehicle pose system
US10853667B2 (en) Method and apparatus with linearity detection
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
US20160290799A1 (en) Device Orientation Detection
CN113140132A (en) Pedestrian anti-collision early warning system and method based on 5G V2X mobile intelligent terminal
KR20220142424A (en) Curve guidance method, curve guidance apparatus, electronic apparatus and program stored in the computer-readable recording meduim
US20210056308A1 (en) Navigation method for blind person and navigation device using the navigation method
KR102406489B1 (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
WO2015116126A1 (en) Notifying users of mobile devices
KR102274544B1 (en) Electronic device and image processing method thereof
JP2021107828A (en) Electronic device, map matching method, and program
KR102573410B1 (en) Electronic device providing putting guide

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14880941

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14880941

Country of ref document: EP

Kind code of ref document: A1