WO2017007643A1 - Systems and methods for providing non-intrusive indications of obstacles - Google Patents

Systems and methods for providing non-intrusive indications of obstacles

Info

Publication number
WO2017007643A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile
display
obstacle
user
indicator
Prior art date
Application number
PCT/US2016/039879
Other languages
French (fr)
Inventor
James ROBARTS
Original Assignee
Pcms Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201562189034P
Priority to US62/189,034
Application filed by Pcms Holdings, Inc. filed Critical Pcms Holdings, Inc.
Publication of WO2017007643A1 publication Critical patent/WO2017007643A1/en


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51 - Display arrangements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 - Sonar systems specially adapted for specific applications
    • G01S15/93 - Sonar systems specially adapted for specific applications for anti-collision purposes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/523 - Details of pulse systems
    • G01S7/526 - Receivers
    • G01S7/53 - Means for transforming coordinates or for evaluating data, e.g. using computers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 - Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00664 - Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera
    • G06K9/00671 - Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera for providing information about objects in the scene to a user, e.g. as in augmented reality applications
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/005 - Traffic control systems for road vehicles including pedestrian guidance indicator
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Abstract

The present disclosure describes systems and methods for enabling non-intrusive indications of obstacles present in the path of a user of a mobile device. In an exemplary method performed on a mobile device, the device operates a sensor to determine the location of an obstacle with respect to the user, and the mobile device displays an indicator representing the obstacle as an overlay on a display of the mobile device, wherein the location of the indicator on the display represents the location of the obstacle. The location of the indicator on the display is selected such that greater distance from a bottom of the display represents greater distance from the user, and the left-right location of the indicator on the display is selected to correspond with the left-right position of the obstacle with respect to the user.

Description

SYSTEMS AND METHODS FOR PROVIDING NON-INTRUSIVE INDICATIONS OF OBSTACLES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a non-provisional filing of, and claims benefit under 35 U.S.C. §119(e) from, U.S. Provisional Patent Application Serial No. 62/189,034, entitled "Systems and Methods for Providing Non-Intrusive Indications of Obstacles", filed July 6, 2015, the entire disclosure of which is incorporated herein by reference.
BACKGROUND
[0002] Mobile devices such as smartphones are frequently used by people while on the move, including while walking. Viewing a screen of a mobile device while walking diverts a user's attention away from his surroundings and can lead to the user tripping or otherwise injuring himself or others. Various efforts have been made to provide smartphone users with warnings of impending obstacles. For example, in Foerster, Klaus-Tycho, et al., "SpareEye: enhancing the safety of inattentionally blind smartphone users," Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia, ACM, 2014, a smartphone application issues a siren and a vibration if the user is at risk of colliding with an obstacle. In Hincapie-Ramos, Juan David, and Pourang Irani, "CrashAlert: enhancing peripheral alertness for eyes-busy mobile interaction while walking," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, 2013, a mobile device provides visual alerts of obstacles in a band at the top of the display. These systems, however, do not make the most effective use of a mobile device's visual display capabilities for conveying indications of potential obstacles.
SUMMARY
[0003] The present disclosure describes systems and methods for enabling non-intrusive indications of obstacles present in the path of a user of a mobile device. In an exemplary method performed on a mobile device, the device operates a sensor to determine the location of an obstacle with respect to the user, and the mobile device displays an indicator representing the obstacle on a display of the mobile device, wherein the location of the indicator on the display represents the location of the obstacle. The location of the indicator on the display is selected such that greater distance from a bottom of the display represents greater distance from the user, and the left-right location of the indicator on the display is selected to correspond with the left-right position of the obstacle with respect to the user. The indicator may be displayed as an overlay over an application program, such as an email or texting application. In some embodiments, the mobile device detects a potential collision between the user and the obstacle and provides an alert of the potential collision to the user.
[0004] In an exemplary method, at least one sensor of a mobile device (such as a camera or ultrasonic sensor of a smartphone) is operated to determine a position of one or more obstacles in a two-dimensional horizontal plane relative to a user carrying the mobile device. For each obstacle, an indicator of the respective obstacle is displayed as an overlay on a display of the mobile device. The horizontal and vertical position of each indicator on the display represents the position of the respective obstacle in the horizontal plane. The indicator may be displayed in a region of the display controlled by an application program on the mobile device, such as a text messaging application or an email application. Each indicator may be at least partially transparent so as not entirely to obstruct the display of the application program. In different embodiments, the total of all displayed indicators may cover less than 10%, less than 5%, or less than 1% of the display.
[0005] In some embodiments, the vertical position of the indicator on the display may be selected such that greater distance from a bottom of the display represents greater distance of the respective obstacle from the user.
[0006] In some embodiments, the sensors are further operated to detect a direction of motion of the obstacles relative to the user. The indicators of the objects may be oriented to indicate the detected relative direction of motion. In some embodiments, the sensors are operated to detect a velocity of the obstacles relative to the user, wherein the indicators of the object are sized to indicate a magnitude of the detected velocities.
[0007] In another exemplary method performed on a mobile device, the device determines the location of an obstacle with respect to the user. The mobile device further determines whether that device is in a relatively horizontal orientation or in a relatively vertical orientation. The device displays an indicator representing the obstacle on a display of the mobile device, wherein, if the mobile device is in a relatively horizontal orientation, the location of the indicator on the display represents the location of the obstacle from a bird's-eye view; and if the mobile device is in a relatively vertical orientation, the location of the indicator on the display represents the location of the obstacle from an augmented-reality view.
[0008] In some embodiments, a mobile device is provided, where the mobile device includes at least one sensor, a display, a processor, and a non-transitory computer-readable storage medium. The storage medium stores instructions that are operative, when executed on the processor, to perform functions including (i) operating the sensor to determine a position of at least one obstacle in a two-dimensional horizontal plane relative to a user carrying the mobile device; and (ii) for each obstacle, displaying an indicator of the respective obstacle as an overlay on a display of the mobile device, wherein horizontal and vertical position of the indicator on the display represent the position of the respective obstacle in the horizontal plane.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1 is a flow diagram depicting an example method, in accordance with an embodiment.
[0010] FIG. 2 depicts a use case of a mobile device, in accordance with an embodiment.
[0011] FIG. 3A is a plan view of a use case of a mobile device, in accordance with an embodiment.
[0012] FIG. 3B is a schematic illustration of a display of a mobile device, in accordance with an embodiment.
[0013] FIG. 4 is a schematic illustration of object indications on a display of a mobile device, in accordance with an embodiment.
[0014] FIG. 5 is a functional block diagram of a mobile device, in accordance with an embodiment.
[0015] FIG. 6 is an example message flow diagram, in accordance with an embodiment.
[0016] FIGs. 7A-7C are a partial front view (FIG. 7A), partial rear view (FIG. 7B), and bottom view (FIG. 7C) of a mobile device, in accordance with an embodiment.
[0017] FIG. 8 is a schematic illustration of a mobile device display, in accordance with an embodiment.
[0018] FIG. 9 is a schematic illustration of a mobile device display, in accordance with an embodiment.
[0019] FIG. 10 is an example message flow diagram of a process, in accordance with an embodiment.
[0020] FIG. 11 is a schematic functional block diagram of an example mobile device, in accordance with some embodiments.
DETAILED DESCRIPTION
[0021] FIG. 1 depicts an example method in accordance with an embodiment. In particular, FIG. 1 depicts the example method 100, which may be performed with the use of a mobile device, such as a smartphone or other wireless transmit-receive unit (WTRU) as described below. The terms 'mobile device' and 'User Equipment' (UE) may be used interchangeably throughout the specification. In accordance with an embodiment, a mobile device operates one or more sensors to detect an obstacle in step 102. The obstacle's distance and bearing from the mobile device are determined at step 104. An indication of the position of the obstacle is displayed on a display of the mobile device in step 106. In various embodiments, additional steps may be included. For example, in some embodiments, an obstacle type is determined in step 108. In step 110, an analysis may be performed to determine a risk of collision. If the risk of collision is determined to be sufficiently high, an alert (e.g. an audible and/or visible alert) may be provided to alert a user to the risk of collision. The determination of the risk of collision may take into consideration the obstacle type determined in step 108. The appearance of the indication on the mobile device display may depend on the risk of collision determined in step 110. For example, the indicator of an object may be displayed more prominently (e.g. larger or with brighter colors) if a collision is determined to be likely.
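For illustration only, the loop of method 100 can be sketched in Python. The device object, the helper callables (locate, classify, collision_risk), and the risk threshold are assumptions for this sketch, not elements of the disclosure:

```python
def monitoring_loop(device, locate, classify, collision_risk, risk_threshold=0.8):
    """One pass over the steps of method 100 (helpers are hypothetical stubs)."""
    for reading in device.read_sensors():            # step 102: detect obstacles
        distance, bearing = locate(reading)          # step 104: distance and bearing
        device.display_indicator(distance, bearing)  # step 106: on-screen indication
        obstacle_type = classify(reading)            # step 108: optional obstacle type
        risk = collision_risk(distance, bearing, obstacle_type)  # step 110
        if risk > risk_threshold:                    # step 112: alert if risk is high
            device.alert()                           # audible and/or visible alert
```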
[0022] In step 102, the mobile device receives sensor readings and analyzes the readings for objects. Analyzing the readings for objects may include associating the sensor readings with previously detected objects. At step 104, the mobile device determines i) a distance from the UE, and ii) a bearing relative to the UE's typical forward direction, based on current sensor data. Step 104 may be completed multiple times if multiple objects are detected.
[0023] In accordance with some embodiments, the method 100 may also include determining the obstacle's type, or classification, at step 108. The obstacle's type or classification may include the type of object the obstacle is or the size of the object, for example. Example types include a person, a car, a bicycle, a lamp post, and the like. A style associated with the object indication may change or be altered based on the determined obstacle type.
[0024] The method 100 may also include analyzing for a collision risk between the detected object and the mobile device at step 110. In step 110, the detected object's position, velocity, path, and acceleration may all be determined using techniques known by those with skill in the art. Example methods of determining the object's status include comparing multiple detections of an object over time and utilizing sensors capable of detecting object speed, such as via the Doppler effect, among others known by those with skill in the relevant art. The collision risk analysis may also incorporate data extrinsic to the motion and position characteristics of the object, for example, the fact that the object is a large truck moving quickly.
[0025] If the collision risk analysis indicates that there is a sufficiently high collision risk, a collision risk notification may be displayed at step 112. Factors analyzed during the collision risk analysis may include the amount of time before a predicted collision, whether the predicted closest point of approach is within a predetermined threshold distance, the velocity at the predicted collision, and the like. The collision risk notification may be a separate notification from the object indication displayed during step 106, or it may modify the appearance of the object indication displayed during step 106. Example modifications include changes of color, size, and the like.
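As one concrete illustration of such an analysis, a standard closest-point-of-approach (CPA) test can be run on the object's relative position and velocity. This is a conventional calculation, not a method prescribed by the disclosure, and the thresholds below are assumed values:

```python
import math

def closest_point_of_approach(rel_pos, rel_vel):
    """Return (time_to_cpa_s, distance_at_cpa_m) for 2D relative motion.

    rel_pos: (x, y) of the object relative to the user, in meters.
    rel_vel: (vx, vy) of the object relative to the user, in m/s.
    """
    rx, ry = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:                        # no relative motion
        return float("inf"), math.hypot(rx, ry)
    t_cpa = max(0.0, -(rx * vx + ry * vy) / speed_sq)
    return t_cpa, math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)

def is_collision_risk(rel_pos, rel_vel, max_time_s=5.0, min_distance_m=0.5):
    """Flag a risk if the CPA is both soon and close (assumed thresholds)."""
    t, d = closest_point_of_approach(rel_pos, rel_vel)
    return t <= max_time_s and d <= min_distance_m
```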
[0026] FIG. 2 depicts use of a mobile device in accordance with an embodiment. In particular, FIG. 2 depicts the example scenario 200. In the example scenario 200, a mobile device comprises a forward-scanning sensor and a display. The forward-scanning sensor is integrated with the computer-controlled display. In the scenario 200, a user, or pedestrian, is holding the mobile device and viewing the display of the mobile device while the forward-scanning sensor scans for obstacles in the path of the pedestrian.
[0027] FIGs. 3A and 3B depict the example scenarios 302 and 304, respectively. In the example scenario 302, a user is holding a mobile device 306, similar to the mobile device depicted in scenario 200 of FIG. 2. The mobile device 306 is equipped with a forward-scanning sensor configured to sense obstacles in a sensor field 308. The sensor field 308 is the area within the scope of the mobile device's forward-scanning sensor. As depicted in FIG. 3A, the sensor field 308 extends in front of the user. In the example scenario 302, there are two objects within the sensor field 308, a first object 310 and a second object 312. The first object 310 is directly in front of the user and the second object 312 is forward and to the right of the user. Additionally, the second object 312 is closer to the user than the first object 310.
[0028] In the example scenario 304 of FIG. 3B, a mobile device 306 is configured to depict obstacles in front of the user of scenario 302 of FIG. 3A. In particular, the example scenario 304 depicts a view of the top portion of the mobile device 306, which includes the mobile device display surface 314 and indications of detected objects 316 and 318. The indication of detected object 316 is located in the center of the display and correlates with the first object 310. The indication of detected object 318 is located on the right portion of the display and correlates with the second object 312. Additionally, because the first object 310 is located farther away from the mobile device 306 than the second object 312, the indication of the detected object 316 is located closer to the top of the display than the indication of the detected object 318.
[0029] In accordance with an embodiment, an indication of a detected object appears on the periphery of the mobile device display surface. The indication of the detected object conveys the direction and the proximity of the detected object. As shown in FIG. 3B, the direction of the detected objects is shown by the left-to-right position of the indication, with objects straight ahead of the user indicated in the center. Also shown in FIG. 3B, the proximity of each respective detected object is shown by depicting objects farther from the user closer to the top of the display and nearer objects lower down the display.

[0030] In accordance with an embodiment, additional aspects of a detected object may also be indicated on the display of the mobile device. For example, the size of the indication of the detected object may increase for larger detected objects. A determined velocity of the object may be displayed by including a vector, the size of the vector being proportional to the velocity of the detected object, by changing the color of the indication, by making the indication flash, or by other similar methods. The type of detected object may be displayed, for example, by an icon.
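A minimal Python sketch of the screen mapping of paragraphs [0028]-[0029], assuming a linear scale; the field-of-view and range limits are placeholder values, not from the disclosure:

```python
def indicator_position(bearing_deg, distance_m, screen_w_px, band_h_px,
                       max_bearing_deg=45.0, max_range_m=10.0):
    """Map an obstacle's (bearing, distance) to an (x, y) pixel position.

    bearing_deg: 0 is straight ahead of the user; negative left, positive right.
    band_h_px:   height of the indicator band at the top of the display.
    Returns x matching the obstacle's left-right direction and y measured
    from the top of the display, with farther obstacles drawn higher.
    """
    half = max_bearing_deg
    b = min(max(bearing_deg, -half), half)        # clamp to the sensor field
    x = (b / half + 1.0) / 2.0 * screen_w_px      # straight ahead maps to center
    nearness = 1.0 - min(distance_m, max_range_m) / max_range_m
    y = nearness * band_h_px                      # y = 0 (top) for farthest objects
    return x, y
```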
[0031] In accordance with an embodiment, the notifications provide for monitoring for potential collisions between a user and nearby objects. To provide for monitoring, the indications of detected objects are overlaid on images provided by the mobile device's display software. The object indicators may completely obscure the image directly behind the indicator or may be partially transparent. The object indicators are restricted to a portion of the display of the mobile device, along the periphery of the upper portion of the device display. The upper portion of the device display is defined as the upper portion of the display as seen by the user, whether the device is in a vertical or horizontal orientation.
[0032] In accordance with an embodiment, the indications are small enough such that, together, they do not cover more than 10% of the total device display. In some embodiments, the indications are small enough such that, together, they do not cover more than 5% of the total device display. In some embodiments, the indications are small enough such that, together, they do not cover more than 1% of the total device display.
[0033] In accordance with an embodiment, the indications are displayed from a point of view above the user, for example, a bird's-eye view. An object with collision potential detected in front of the user is displayed on the top-center portion of the device display. An object with collision potential detected to the side of the user is displayed on the corresponding side of the display.
[0034] In some embodiments, indications are displayed in a bird's-eye view when the mobile device is held in a relatively horizontal configuration (with the display facing generally upward), but the positions of the indications change when the device is held in a relatively vertical configuration (with the display facing generally backward toward the user). In such an embodiment, in the relatively vertical configuration, the object notifications are displayed from an augmented-reality point of view. In the augmented-reality point of view, object indications appear on the periphery of the display, in a portion of the display that corresponds to the object's real-world angle of approach. For example, when a user holds a smartphone vertically, with the display of the device perpendicular to the ground, an object approaching from the right would appear on the right side, an object approaching from the left would appear on the left side, and an object falling from the sky (such as the archetypal falling piano) would appear on the top portion of the display.
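The horizontal/vertical decision can be derived from the accelerometer's gravity vector. The sketch below assumes the z axis points out of the screen (axis conventions vary by platform) and uses an illustrative 45-degree threshold:

```python
import math

def display_mode(ax, ay, az, tilt_threshold_deg=45.0):
    """Choose bird's-eye or augmented-reality view from gravity (ax, ay, az).

    Held flat on its back, the device sees gravity mostly along the screen
    normal (z); held upright, gravity lies mostly in the screen plane.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return "birds_eye"   # free fall: in practice, keep the previous mode
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    return "birds_eye" if tilt < tilt_threshold_deg else "augmented_reality"
```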
[0035] In accordance with an embodiment, the indications may have graphic attributes corresponding to the detected object characteristics. A detected object may be indicated with a single dot or line on the periphery of the device display, or may use shape, size, or motion to indicate other detected object characteristics, such as size or velocity.
[0036] FIG. 4 depicts object indications, in accordance with an embodiment. In particular, FIG. 4 depicts an upper portion of a mobile device. The upper portion of the mobile device includes an upper portion of a display of the mobile device. The periphery of the device display contains indications of detected objects. The indications of detected objects may be, for example, vector arrows, with the size of the vector arrow being proportional to the velocity of the detected object. The velocity may be either a relative velocity (relative to the user) or an absolute velocity (relative to the ground). In some embodiments, an obstacle is indicated by a dot, and optionally a line or arrow extending from the dot indicates a velocity of the object (relative either to the ground or to the user).
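A brief sketch of one way to size such a vector arrow from a measured velocity; the scaling constants are assumptions for illustration:

```python
import math

def velocity_arrow(x, y, vx, vy, pixels_per_mps=8.0, max_len_px=40.0):
    """Return the end point of a velocity arrow drawn from indicator (x, y).

    (vx, vy) is the obstacle's velocity in m/s (relative or absolute),
    already expressed in screen coordinates; the length is clamped so
    fast objects do not overrun the indicator band.
    """
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return x, y                       # stationary: draw a dot, no arrow
    length = min(max_len_px, pixels_per_mps * speed)
    return x + length * vx / speed, y + length * vy / speed
```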
[0037] In accordance with an embodiment, the mobile device may further be configured to provide a collision alarm. The collision alarm may be a visual alarm, a haptic alarm, an auditory alarm, or any other alarm as known by those with skill in the art.
[0038] In accordance with an embodiment, a forward-scanning sensor may be a camera. The camera is configured to capture light within its field of view and convey the detected light information to a collision processing module. The camera may detect visible or infrared light. In some embodiments, the camera comprises a 'fish-eye' camera with a wider field of view located on the upper rear-facing portion of a mobile device. In some embodiments, the camera is a depth camera (e.g., an RGBD camera).
[0039] The forward-scanning sensor may be an acoustic sensor, such as an ultrasonic sensor, comprising a microphone and a speaker. The acoustic sensor is configured to emit sounds from the speaker and detect reflected sounds with the microphone. The transmitted and detected reflected sounds are conveyed to a collision processing module. Acoustic obstacle detection offers one or more benefits, such as the use of currently available device configurations, operability in dark conditions, a wide detection field, and manageable requirements for memory and processing power. Acoustic sensors can also be used to detect and indicate other types of walking hazards, such as stairways, curbs, or other uneven walkways.
[0040] The forward-scanning sensor may include a depth camera, such as a time-of-flight sensor or an RGBD-type depth camera.

[0041] In some embodiments, the sensor may be a radio receiver configured to receive wireless transmissions from a nearby mobile device. The transmission from the nearby mobile device may include position information relating to the position of the nearby mobile device. The receiver conveys the position information relating to the nearby mobile device to the collision module.
[0042] The forward-scanning sensor may comprise any combination of a light sensor, an auditory sensor, a receiver, or other sensor known by those with skill in the relevant art.
[0043] In accordance with an embodiment, a mobile device further comprises side-scanning sensors. The side-scanning sensors comprise a light-based sensor, a sound-based sensor, or any other sensor as known by one with skill in the relevant art. The side-scanning sensor is configured to detect objects to the sides of the mobile device and convey detected data to a collision detection module.
[0044] FIG. 5 is a block diagram depicting a functional architecture of a mobile device, in accordance with an embodiment. In particular, FIG. 5 depicts a mobile device 500. The mobile device 500 comprises a velocity sensor 502, an object sensor 504, a collision module 506, an operating system application 508, a device display manager 510, a display surface 512, and a communication bus 514. The mobile device 500 may comprise other elements, as known by those with skill in the art, to accomplish other functions of a mobile device. A non-exhaustive list of other elements includes a transmitter, a receiver, a user interface, a battery, a computer processor, memory, and the like.
[0045] In an exemplary embodiment, the velocity sensor 502 comprises an accelerometer, an RF triangulation system (such as GPS or multi-base WiFi), or any other source of a user's velocity. The velocity sensor 502 is optional and need not be present in a mobile device.
[0046] The object sensor 504 comprises the forward-scanning sensors, side-scanning sensors, or any other sensor as known by those with skill in the relevant art. The object sensor 504 may take the form of the forward-scanning sensors and side-scanning sensors described in this disclosure.
[0047] The collision module 506 may be implemented using software configured to receive data from the velocity sensor 502 and the object sensor 504, to determine the locations of detected objects relative to the user, and to determine a collision threat associated with each detected object. The software of the collision module is further configured to convey data regarding the likelihood of a collision and data regarding a detected object to the device display manager 510.
[0048] The collision module 506 may be configured by a user. The configuration options may allow a user to set the time length of prediction paths, or to show only objects whose determined closest point of approach is less than a predetermined distance, occurs within a certain amount of time, or satisfies a combination of location and time criteria.
[0049] The operating system application 508 may include any software used by the mobile device that is configured to convey image data to the device display manager 510.
[0050] The device display manager 510 is configured to receive image data from the operating system application 508 and the collision module 506. The device display manager 510 is further configured to convey display instructions and additional display instructions to the display surface 512. The additional display instructions are conveyed to the display surface 512 when the collision module detects an object that is a collision risk to the user. The additional display instructions may include the object's relative position, specifically both the distance and direction from the mobile device to the object. Additional information may also be conveyed in the display, including detected object size, type, velocity, current direction, and predicted path. Information about the object and its position may be conveyed on the display by attributes such as the shape of an indication (e.g. a circle), the size of the indication (e.g. the diameter of a circle), and the position of an indication (e.g., an x-y offset from a corner of the display area).
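As a hypothetical sketch, such an additional display instruction might be a small record handed from the collision module to the display manager; the field names below are illustrative, not from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class IndicatorInstruction:
    """Hypothetical record conveyed from collision module to display manager."""
    shape: str         # e.g. "circle" or "arrow"
    diameter_px: int   # size may encode object size or collision risk
    x_offset_px: int   # x-y offset from a corner of the display area
    y_offset_px: int
    color: str         # may change when a collision is judged likely

example = IndicatorInstruction(shape="circle", diameter_px=12,
                               x_offset_px=40, y_offset_px=8, color="amber")
```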
[0051] FIG. 6 is a call flow diagram of a method 600, in accordance with an embodiment. In this example, the components of the mobile device 500 are used in the method 600. At 602, the velocity sensor 502 conveys user velocity data to the collision module 506. At 604, the object sensor 504 conveys object detection data to the collision module 506. At 606, the collision module 506 processes the received information and conveys collision risk indication data to the device display manager 510. At 608, the operating system application 508 conveys image data to the device display manager 510. At 610, the device display manager 510 conveys display instructions to the display surface 512. At 612, the device display manager 510 conveys additional display instructions to the display surface 512.
[0052] FIGs. 7A-7C are schematic illustrations of three views of a smartphone 700 that may be used as a mobile device in some embodiments. FIG. 7A illustrates the top portion of the front side of the smartphone 700 (the side on which the display is positioned). As seen in FIG. 7A, the smartphone 700 includes a front-facing microphone 701. FIG. 7B depicts the top portion of the back side of the mobile device (opposite the side on which the display is positioned). As seen in FIG. 7B, the smartphone 700 further includes a rear-facing microphone 702. FIG. 7C depicts the bottom edge of the smartphone 700. As seen in FIG. 7C, the smartphone 700 includes a bottom-facing microphone 704. In exemplary embodiments, the rear-facing microphone 702 and the bottom-facing microphone 704 operate to detect sound that is reflected back to the smartphone 700 in order to detect potential obstacles. The microphones may be used as a forward-scanning sensor that is configured to convey the detected sound to a collision module.

[0053] FIG. 8 depicts a view of a display 800 of a mobile device, in accordance with an embodiment. The display 800 includes a collision monitoring status indicator 802, detected object indications 804 and 806, and an operating system image 808. The collision monitoring status indicator 802 may be displayed when the collision module is operating and detected object indications are able to be displayed on a display of a mobile device. Of note, a detected object indication does not need to be displayed in order for the collision monitoring status indicator 802 to be displayed, particularly if the collision module has not determined that any possible collision threats are present. The detected object indication 804 is displayed at the top of the display and slightly toward the left, which corresponds with a possible collision threat ahead of and slightly to the left of the user. The detected object indication 804 may be displayed in the form of an arrow indicating that the object is in relative motion toward the user of the mobile device. The detected object indication 806 is displayed lower than the indication 804, indicating that the obstacle corresponding to indication 806 is closer to the user than the obstacle corresponding to indication 804. The indication 806 is positioned to the right, indicating that the corresponding obstacle is positioned to the right of the user carrying the mobile device. The detected object indications 804 and 806 are configured to be displayed as overlays on top of the operating system image 808.
[0054] FIG. 9 depicts a view of a mobile device display 900, in accordance with an embodiment. A portion 908 of the display is an operating system (OS) display area. The OS display area 908 provides device status information. Another portion 912 of the display is a portion controlled by the operating system and/or an application program. The application or OS display area 912 is controlled by the OS, an active application, or a combination of both. In the embodiment of FIG. 9, the application program is a text messaging application. A portion 910 of the mobile device display may be used for the display of detected object indicators. In the embodiment of FIG. 9, the detected object indication display area 910 is the area above the arcuate dotted line (which may or may not be visible on the actual display 900, in different embodiments).
[0055] The detected object indication display area 910 is overlaid on top of the OS display area 908 and the application and OS display area 912. This area displays the object indicators, such as indicators 904 and 906. While the detected object indication display area 910 is shown on the upper portion of the display device, in some embodiments the object indication display area 910 may be relocated. For example, the mobile device may be configured to detect eye movement and determine where on the mobile device a user is gazing. The indication display area 910 would then be located at the periphery of the region where the mobile device determines the user is gazing. In an example, when the mobile device determines the user is looking at the lower portion of the device display, the indication display area 910 may be located in the middle portion of the display. In embodiments with side-scanning sensors, the detected object indication display area 910 may extend down the left and right sides of the display device.
[0056] In accordance with an embodiment, the collision module may be further configured to display additional information in the detected object indication display area. Additional information may include information indicating direction of a desired destination, the direction of a next turn along a navigational path, the direction determined by data provided wirelessly by other devices or systems, the direction of an acoustic beacon, indications of walking surface hazards like uneven pavement and stairs, and the like.
[0057] In accordance with an embodiment, a smartphone such as the Apple iPhone may be configured to provide a user with collision indications. The smartphone may utilize built-in components, such as a control panel, a rear camera, a rear microphone, a bottom microphone, a bottom speaker, a computer processing system, and a display to provide the user with collision indications.
[0058] FIG. 10 depicts a process, in accordance with an embodiment. In particular, FIG. 10 depicts process 1000 which enables a smartphone to provide a user with collision indications. The process 1000 may be performed using a control panel 1050, an object detection module 1052, a smartphone microphone 1054, a smartphone speaker 1056, an application program 1058, mobile OS core services 1060, and a smartphone display 1062. The example process 1000 could be implemented on any iOS, Android, Windows, or other smartphone device.
[0059] At step 1001, an application program 1058 sends instructions to the OS core services 1060 on what to display, such as Cocoa or Media Services calls. At step 1002, the OS core services take the higher-level drawing commands received at step 1001 and convert them to the hardware-specific commands needed by the smartphone display. Steps 1001 and 1002 repeat continuously.
[0060] At step 1003, a user enables the object monitoring function through a user interface in the smartphone control panel 1050. At step 1004, the object detection module 1052, similar to the collision module described herein, begins monitoring the smartphone's forward-scanning sensors, such as the rear-facing microphone and the bottom microphone. At step 1005, the object detection module directs the bottom speaker to emit a brief tone, for example a 25 kHz tone. At step 1006, the microphone detects the initial audio pulse and returning echoes.
[0061] At step 1007, the object detection module analyzes the audio signal from the microphone to generate position information for nearby objects. The object detection module may also combine object data from other forward-scanning sensors, such as a camera or a receiver. The object detection module provides instructions for generating visual indicators in an OS graphic framework, such as Carbon, Cocoa, or Quartz2d. The instructions define each indicator in terms of its shape, size, and location. Steps 1004-1008 may repeat until step 1009, or until the smartphone turns off or goes into standby mode.
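The range estimate of step 1007 follows from standard pulse-echo timing: the round-trip delay between the emitted pulse and its returning echo, multiplied by the speed of sound and halved. A sketch under an assumed sample rate (the 25 kHz tone of step 1005 is from the disclosure; the rest is illustrative):

```python
SPEED_OF_SOUND_MPS = 343.0   # dry air at about 20 degrees C

def echo_distance_m(pulse_sample, echo_sample, sample_rate_hz=96000):
    """Estimate obstacle distance from pulse-echo delay in a captured signal.

    pulse_sample: sample index at which the emitted pulse is first heard.
    echo_sample:  sample index at which its reflection returns.
    """
    delay_s = (echo_sample - pulse_sample) / sample_rate_hz
    return SPEED_OF_SOUND_MPS * delay_s / 2.0   # halve the round trip

# Example: an echo 2000 samples after the pulse at 96 kHz is ~3.6 m away.
```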
[0062] At step 1008, the OS core services interpret the graphic instructions and provide them to the smartphone display 1062. At step 1009, the user indicates that he or she no longer wishes to monitor incoming objects and disables the object detection module. Alternatively, the object detection module continues to run, repeating steps 1004-1008 when the phone's motion sensors indicate the user is walking.
[0063] In some embodiments, the mobile device operates the sensor to determine a position of an obstacle in a coordinate system anchored by the mobile device. The mobile device further operates to determine the position and orientation of the mobile device in a coordinate system anchored by the user. The determination of the position and orientation of the mobile device may be made based on information from sensors such as accelerometers, gyroscopes, and/or front- and rear-facing cameras, together with known information and/or predetermined assumptions about the position (e.g. height) at which a user is likely to be holding the mobile device. The mobile device then operates to transform the coordinates of the obstacle from the mobile-device-anchored coordinate system to the user-anchored coordinate system. The coordinates of the obstacle in the user-anchored coordinate system (or, in some embodiments, only the horizontal coordinates, or a projection of the position onto a horizontal plane) are then mapped to a position on the screen of the mobile device, and an indication representing the obstacle is displayed on the screen at the determined position.
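For the two-dimensional horizontal case, this transform amounts to a rotation and a translation. A sketch assuming the device's yaw relative to the user's forward direction and its offset from the user's body have already been estimated as described above:

```python
import math

def device_to_user_coords(obstacle_xy, device_offset_xy, device_yaw_rad):
    """Transform an obstacle from device-anchored to user-anchored coordinates.

    obstacle_xy:      (x, y) of the obstacle in the device frame, meters.
    device_offset_xy: (x, y) of the device in the user frame (e.g. held
                      roughly 0.3 m in front of the chest), meters.
    device_yaw_rad:   rotation of the device frame relative to the user's
                      forward direction, radians.
    """
    ox, oy = obstacle_xy
    c, s = math.cos(device_yaw_rad), math.sin(device_yaw_rad)
    # Rotate into the user's orientation, then translate by the device offset.
    ux = c * ox - s * oy + device_offset_xy[0]
    uy = s * ox + c * oy + device_offset_xy[1]
    return ux, uy
```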
[0064] In some embodiments, an indication of an obstacle (e.g. a dot) is displayed as an overlay at the appropriate location of the screen regardless of the use of that portion of the screen by an application program. In such an embodiment, the indication of one or more obstacles appears as an overlay over other programs on the mobile device, thus allowing use of the programs with minimal obstruction. This allows a user who is walking while using the mobile device to operate other application programs (e.g. email or texting applications) while still being alerted to the presence of obstacles.
[0065] In some embodiments, the mapping from the user-anchored coordinate system to a position on the screen is a non-linear mapping, such as a logarithmic mapping with respect to distance from the user.
[0066] In some embodiments, the mapping from the user-anchored coordinate system to a position on the screen is a mapping to only a portion of the screen, such as an upper portion of the screen.
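Combining the two preceding paragraphs, a sketch of a logarithmic mapping confined to an upper indicator band, with assumed distance limits:

```python
import math

def log_vertical_position(distance_m, band_h_px, min_d_m=0.5, max_d_m=20.0):
    """Map obstacle distance to a y position inside the top indicator band.

    Logarithmic in distance, so display resolution favors nearby obstacles.
    Returns y measured from the top of the display (0 = farthest obstacle).
    """
    d = min(max(distance_m, min_d_m), max_d_m)
    frac = math.log(d / min_d_m) / math.log(max_d_m / min_d_m)  # 0 near, 1 far
    return (1.0 - frac) * band_h_px
```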
[0067] Exemplary embodiments disclosed herein may be implemented in a mobile device such as a wireless transmit/receive unit (WTRU) as described below.

[0068] FIG. 11 is a system diagram of an exemplary WTRU 1102, which may be employed as a user device in embodiments described herein. As shown in FIG. 11, the WTRU 1102 may include a processor 1118, a communication interface 1119 including a transceiver 1120, a transmit/receive element 1122, a speaker/microphone 1124, a keypad 1126, a display/touchpad 1128, a non-removable memory 1130, a removable memory 1132, a power source 1134, a global positioning system (GPS) chipset 1136, and sensors 1138. It will be appreciated that the WTRU 1102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0069] The processor 1118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 1118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 1102 to operate in a wireless environment. The processor 1118 may be coupled to the transceiver 1120, which may be coupled to the transmit/receive element 1122. While FIG. 11 depicts the processor 1118 and the transceiver 1120 as separate components, it will be appreciated that the processor 1118 and the transceiver 1120 may be integrated together in an electronic package or chip.
[0070] The transmit/receive element 1122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 1115/1116/1117. For example, in one embodiment, the transmit/receive element 1122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 1122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples. In yet another embodiment, the transmit/receive element 1122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 1122 may be configured to transmit and/or receive any combination of wireless signals.
[0071] In addition, although the transmit/receive element 1122 is depicted in FIG. 11 as a single element, the WTRU 1102 may include any number of transmit/receive elements 1122. More specifically, the WTRU 1102 may employ MIMO technology. Thus, in one embodiment, the WTRU 1102 may include two or more transmit/receive elements 1122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 1115/1116/1117.
[0072] The transceiver 1120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 1122 and to demodulate the signals that are received by the transmit/receive element 1122. As noted above, the WTRU 1102 may have multi-mode capabilities. Thus, the transceiver 1120 may include multiple transceivers for enabling the WTRU 1102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
[0073] The processor 1118 of the WTRU 1102 may be coupled to, and may receive user input data from, the speaker/microphone 1124, the keypad 1126, and/or the display/touchpad 1128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 1118 may also output user data to the speaker/microphone 1124, the keypad 1126, and/or the display/touchpad 1128. In addition, the processor 1118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 1130 and/or the removable memory 1132. The non-removable memory 1130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 1132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 1118 may access information from, and store data in, memory that is not physically located on the WTRU 1102, such as on a server or a home computer (not shown).
[0074] The processor 1118 may receive power from the power source 1134, and may be configured to distribute and/or control the power to the other components in the WTRU 1102. The power source 1134 may be any suitable device for powering the WTRU 1102. As examples, the power source 1134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
[0075] The processor 1118 may also be coupled to the GPS chipset 1136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 1102. In addition to, or in lieu of, the information from the GPS chipset 1136, the WTRU 1102 may receive location information over the air interface 1115/1116/1117 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 1102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
[0076] The processor 1118 may further be coupled to other peripherals 1138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 1138 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
[0077] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer- readable medium or media, such as commonly referred to as RAM, ROM, etc.
[0078] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer- readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a Wireless Transmit-Receive Unit (WTRU), User Equipment (UE), terminal, base station, RNC, or any host computer.

Claims

1. A method comprising:
operating at least one sensor on a mobile device to determine a position of at least one obstacle in a two-dimensional horizontal plane relative to a user carrying the mobile device;
for each obstacle, displaying an indicator of the respective obstacle as an overlay on a display of the mobile device, wherein horizontal and vertical position of the indicator on the display represent the position of the respective obstacle in the horizontal plane.
2. The method of claim 1, wherein determining a position of the obstacle relative to the user comprises:
determining a position of an obstacle in a coordinate system anchored by the mobile device;
determining the position and orientation of the mobile device in a coordinate system anchored by the user; and
transforming the coordinates of the obstacle from the mobile-device-anchored coordinate system to the user-anchored coordinate system.
3. The method of any of claims 1 and 2, wherein the indicator is displayed in a region of the display controlled by an application program on the mobile device.
4. The method of claim 3, wherein the application program is a text messaging application.
5. The method of claim 3, wherein the application program is an email application.
6. The method of any of claims 1-5, wherein all displayed indicators cover less than 10% of the display.
7. The method of any of claims 1-6, wherein the vertical position of the indicator on the display is selected such that greater distance from a bottom of the display represents greater distance of the respective obstacle from the user.
8. The method of any of claims 1-7, wherein the mobile device is a smartphone.
9. The method of any of claims 1-8, further comprising operating the sensor to detect a direction of motion of the obstacle relative to the user, wherein the indicator of the object is oriented to indicate the detected relative direction of motion.
10. The method of any of claims 1-9, further comprising operating the sensor to detect a velocity of the obstacle relative to the user, wherein the indicator of the object is sized to indicate a magnitude of the detected velocity.
11. The method of any of claims 1-10, wherein the position of a plurality of obstacles is determined, and wherein a plurality of respective indicators are displayed on the display of the mobile device.
12. The method of any of claims 1-11, wherein the sensor comprises a camera.
13. The method of any of claims 1-12, wherein the sensor comprises an acoustic sensor.
14. The method of any of claims 1-13, further comprising:
detecting a potential collision between the user and the obstacle; and
providing an alert of the potential collision.
15. The method of any of claims 1-14, wherein the position of the indicator is restricted to an upper periphery of the mobile device display.
16. A method performed on a mobile device, the method comprising:
determining the location of an obstacle with respect to the user;
determining whether the mobile device is in a relatively horizontal orientation or a relatively vertical orientation; and
displaying an indicator representing the obstacle on a display of the mobile device, wherein:
if the mobile device is in a relatively horizontal orientation, the location of the indicator on the display represents the location of the obstacle from a bird's-eye view; and
if the mobile device is in a relatively vertical orientation, the location of the indicator on the display represents the location of the obstacle from an augmented-reality view.
17. A mobile device comprising at least one sensor, a display, a processor, and a non-transitory computer-readable storage medium storing instructions operative, when executed on the processor, to perform functions comprising:
operating the at least one sensor to determine a position of at least one obstacle in a two- dimensional horizontal plane relative to a user carrying the mobile device; and
for each obstacle, displaying an indicator of the respective obstacle as an overlay on a display of the mobile device, wherein horizontal and vertical position of the indicator on the display represent the position of the respective obstacle in the horizontal plane.
18. The mobile device of claim 17, wherein the mobile device is a smartphone.
19. The mobile device of any of claims 17-18, wherein the sensor comprises a camera.
20. The mobile device of any of claims 17-19, wherein the sensor comprises an acoustic sensor.
PCT/US2016/039879 2015-07-06 2016-06-28 Systems and methods for providing non-intrusive indications of obstacles WO2017007643A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201562189034P 2015-07-06 2015-07-06
US62/189,034 2015-07-06

Publications (1)

Publication Number Publication Date
WO2017007643A1 true WO2017007643A1 (en) 2017-01-12

Family

ID=56561429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/039879 WO2017007643A1 (en) 2015-07-06 2016-06-28 Systems and methods for providing non-intrusive indications of obstacles

Country Status (1)

Country Link
WO (1) WO2017007643A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107566659A (en) * 2017-10-20 2018-01-09 维沃移动通信有限公司 User security based reminding method and mobile terminal
CN109001741A (en) * 2018-05-24 2018-12-14 深圳市沃特沃德股份有限公司 The alarm method and system of intelligent terminal chance barrier
WO2019081699A1 (en) * 2017-10-27 2019-05-02 Osram Opto Semiconductors Gmbh Monitoring system for a mobile device and method for monitoring surroundings of a mobile device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130029730A1 (en) * 2011-07-25 2013-01-31 Fujitsu Limited Mobile electronic apparatus, danger notifying method, and medium for storing program
US20140300466A1 (en) * 2013-04-04 2014-10-09 Samsung Electronics Co., Ltd. Apparatus and method for preventing accident in portable terminal
US8953841B1 (en) * 2012-09-07 2015-02-10 Amazon Technologies, Inc. User transportable device with hazard monitoring

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130029730A1 (en) * 2011-07-25 2013-01-31 Fujitsu Limited Mobile electronic apparatus, danger notifying method, and medium for storing program
US8953841B1 (en) * 2012-09-07 2015-02-10 Amazon Technologies, Inc. User transportable device with hazard monitoring
US20140300466A1 (en) * 2013-04-04 2014-10-09 Samsung Electronics Co., Ltd. Apparatus and method for preventing accident in portable terminal

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FOERSTER, KLAUS-TYCHO ET AL.: "Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia", 2014, ACM, article "SpareEye: enhancing the safety of inattentionally blind smartphone users"
HINCAPIE-RAMOS, JUAN DAVID; POURANG IRANI: "Proceedings of the SIGCHI Conference on Human Factors in Computing Systems", 2013, ACM, article "CrashAlert: enhancing peripheral alertness for eyes-busy mobile interaction while walking"
JUAN DAVID HINCAPIÉ-RAMOS ET AL: "CrashAlert", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 27 April 2013 (2013-04-27), pages 3385 - 3388, XP058043183, ISBN: 978-1-4503-1899-0, DOI: 10.1145/2470654.2466463 *
KLAUS-TYCHO FOERSTER ET AL: "SpareEye", MOBILE AND UBIQUITOUS MULTIMEDIA, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 25 November 2014 (2014-11-25), pages 68 - 72, XP058061886, ISBN: 978-1-4503-3304-7, DOI: 10.1145/2677972.2677973 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107566659A (en) * 2017-10-20 2018-01-09 维沃移动通信有限公司 User security based reminding method and mobile terminal
WO2019081699A1 (en) * 2017-10-27 2019-05-02 Osram Opto Semiconductors Gmbh Monitoring system for a mobile device and method for monitoring surroundings of a mobile device
CN109001741A (en) * 2018-05-24 2018-12-14 深圳市沃特沃德股份有限公司 The alarm method and system of intelligent terminal chance barrier

Similar Documents

Publication Publication Date Title
US9906406B2 (en) Alerting method and mobile terminal
EP3163498A2 (en) Alarming method and device
KR20160107054A (en) Vehicle control apparatus and method thereof, vehicle driving assistance apparatus and method thereof, mobile terminal and method thereof
US20170013116A1 (en) Method, apparatus and computer-readable medium for travel path condition prompt
KR20170021188A (en) Virtual reality headset for notifying object and method thereof
WO2017007643A1 (en) Systems and methods for providing non-intrusive indications of obstacles
US20200074740A1 (en) System and method for placement of augmented reality information for users based on their activity
US20200033127A1 (en) Information processing apparatus, information processing method, and recording medium
US10930147B2 (en) Electronic apparatus, roadside unit, and transport system
CN109173258B (en) Virtual object display and positioning information sending method, equipment and storage medium
CN109581358B (en) Obstacle recognition method, obstacle recognition device and storage medium
CN111681455A (en) Control method for electronic device, and recording medium
US10848606B2 (en) Divided display of multiple cameras
CN111126182A (en) Lane line detection method, lane line detection device, electronic device, and storage medium
US20170341579A1 (en) Proximity Warning Device
WO2017204332A1 (en) Portable electronic device, control system, control method, and control program
CN108318706A (en) The speed-measuring method and device of mobile object
US10609510B2 (en) Mobile electronic apparatus, mobile electronic apparatus control method, a non-transitory computer readable recording medium, for providing warnings to a user of the apparatus based on the location of the electronic apparatus
CN111127937A (en) Traffic information transmission method, device and system and storage medium
JP6605566B2 (en) Portable electronic device, portable electronic device control method, and portable electronic device control program
CN112734346B (en) Method, device and equipment for determining lane coverage and readable storage medium
KR20150082855A (en) Method for warning and detecting obstacle forward for when walking using smart phone, and smart phone of loading application of function the same
JP6560770B2 (en) Electronic device, control system, and control method of electronic device
WO2018038236A1 (en) Electronic device, control program, and method for operating electronic device
KR102299501B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16745895

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16745895

Country of ref document: EP

Kind code of ref document: A1