WO2016048237A1 - Method and system for accident avoidance - Google Patents

Method and system for accident avoidance

Info

Publication number
WO2016048237A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
interest
accident
distance
portable device
Prior art date
Application number
PCT/SG2015/050329
Other languages
French (fr)
Inventor
Zhou WEI
Robert Deng
Chengfang Fang
Original Assignee
Huawei International Pte. Ltd.
Singapore Management University
Priority date
Filing date
Publication date
Application filed by Huawei International Pte. Ltd. and Singapore Management University
Publication of WO2016048237A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/52 Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the invention generally relates to a method and system for predicting a potential accident and notifying a user of the predicted potential accident, thereby reducing or avoiding accidents between a user of a portable device and other objects in the environment.
  • Embodiments of the invention provide an accident avoidance method implementable in a portable device.
  • the accident avoidance method comprises the following steps: using a camera on a portable device utilised by a user, capturing a street view image at every predetermined time interval; identifying an object of interest based on the street view image; based on the street view image, ascertaining at least one relative position parameter of the object of interest with respect to the user; predicting a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and notifying the user of the predicted potential accident.
  • the at least one relative position parameter includes at least one of a relative distance between the object of interest and the user, and a relative angle between the object of interest and a direction of a lens of the camera, i.e. an angle formed between a straight line connecting the object of interest to the user and a line of direction of the camera lens.
  • the method further comprises: ascertaining at least one of a motion speed, a motion direction, and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device, wherein predicting a potential accident is further based on a breach of the danger threshold by the at least one of the motion speed, the motion direction and the motion acceleration of the user.
  • the user of the portable device is moving while holding the portable device
  • the method further comprises: ascertaining at least one of motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; wherein predicting a potential accident is further based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest.
  • the method may further comprise: ascertaining a suggestion for avoiding the potential accident based on at least one parameter used for prediction of the potential accident; and then notifying the user of the suggestion. Thereby, the user of the portable device may be able to avoid the potential accident in a timely and effective manner.
  • the potential accident may be notified to the user of the portable device by at least one of a visual notification, an audio notification and a kinaesthetic notification to ensure the user can notice the potential accident.
  • Embodiments of the invention also provide an accident avoidance system implementable in a portable device, the system comprises: a camera configured to capture a street view image at every predetermined time interval; an object recognition module configured to identify an object of interest based on the street view image; an accident prediction module configured to determine at least one relative position parameter of the object of interest with respect to the user based on the street view image; and predict a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and a notification module configured to notify the user of the predicted potential accident.
  • the accident prediction module is configured to ascertain at least one of a relative distance between the object of interest and the user, and a relative angle between the object of interest and a direction of a lens of the camera; and predict a potential accident based on the ascertained at least one of the relative distance and the relative angle.
  • the system further comprises: a user status analysing module configured to ascertain at least one of a motion speed, a motion direction and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device; wherein the accident prediction module is further configured to predict a potential accident based on the ascertained at least one of the motion speed, the motion direction and the motion acceleration of the user.
  • the accident prediction module is further configured to ascertain at least one of a motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; and predict a potential accident based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest.
  • the accident prediction module is further configured to ascertain a suggestion for avoiding the predicted potential accident based on at least one parameter used for prediction of the potential accident; and the notification module is further configured to notify the user of the suggestion.
  • the portable device may be a mobile computing and/or communication device, or a mobility aid device.
  • a potential accident with object(s) in the environment can be automatically predicted, and a notification shall be generated to warn the portable device user of the predicted potential accident.
  • the user is therefore alerted to potential danger(s), and he can take appropriate action to avert the danger(s) thereby effectively reducing or avoiding the predicted potential accident.
  • Figure 1 is a schematic diagram illustrating the functional modules of an accident avoidance system according to a first embodiment of the invention.
  • Figure 2 is a flow chart illustrating a method for predicting a potential accident according to the first embodiment of the invention.
  • Figure 3 (a) illustrates the values in a rectangular coordinate system for describing the position of a portable device user obtained from an accelerometer when the user is static.
  • Figure 3 (b) illustrates the values in a rectangular coordinate system for describing the position of a portable device user obtained from an accelerometer when the user is moving according to the first embodiment of the invention.
  • Figure 4 illustrates the yaw, pitch and roll motion of a portable device.
  • Figure 5 shows an object distance between the portable device user and the object of interest according to a first example of the first embodiment.
  • Figure 6 shows an object distance and object angle between the portable device user and the object of interest according to a second example of the first embodiment.
  • Figure 7 shows a pixel angle in an image according to the first embodiment of the invention.
  • Figure 8 is a flow chart illustrating a method for ascertaining a vertical distance between the object of interest and the direction of the camera lens according to the first embodiment of the invention.
  • the invention provides a method and system which are adapted for use in a portable device for predicting potential accidents and notifying a user of the predicted potential accident.
  • the portable device may be, but is not limited to, a mobile computing and/or communication device (e.g. smart phone and tablet) or a mobility aid device (e.g. wheelchair for the physically impaired, crutch or cane for the visually impaired).
  • the portable device may be hand-held or wearable by a user or utilised to assist or transport a user.
  • a user is walking while holding a portable device (e.g. phone).
  • the portable device is provided with a height sensor for ascertaining the height of the portable device relative to the ground; an accelerometer sensor for ascertaining motion speed of the user of the portable device; and a camera for capturing street view images from a predetermined perspective with respect to the user, e.g. front facing, side facing or rear facing.
  • the portable device may be further provided with a rotation sensor for ascertaining yaw value of the portable device to ascertain the motion direction of the user.
  • if the portable device is already equipped with the above components, the accident avoidance system does not additionally require them. However, if the portable device is not equipped with some or all of the above components, the accident avoidance system requires the missing components to be installed.
  • the accident avoidance system 100 is implemented in a portable device 10, which comprises the following modules as shown in Figure 1: a user status analysing module 110, an object recognition module 120, an accident prediction module 130 and a notification module 140.
  • the user status analysing module 110 is configured to ascertain a motion speed of the user of the portable device 10 based on data obtained through the accelerometer sensor 111.
  • the user status analysing module 110 may be further configured to ascertain a motion direction of the user based on data obtained through the rotation sensor 112. It is to be noted that this user status analysing module 110 is an optional module.
  • the object recognition module 120 is configured to identify objects of interest from street view images captured by the camera 121.
  • the camera may be pre-set to capture an image at predetermined time intervals after the accident avoidance system is initiated either manually or by a pre-determined trigger event, e.g. detection of a motion of the portable device 10.
  • Street view images refer to images of the environment around the user and include outdoor and indoor street views.
  • the accident prediction module 130 is configured to predict a potential accident based on various data. Particularly, the accident prediction module is configured to ascertain at least one relative position parameter of the object of interest with respect to the user based on each captured street view image from which the object of interest is identified. Based on the ascertained relative position parameter(s), the accident prediction module is configured to predict a potential accident by comparing the ascertained relative position parameter(s) against a predetermined danger threshold. Optionally, the accident prediction module 130 is configured to predict a potential accident by comparing the relative position parameter(s), and at least one of motion speed, motion direction and motion acceleration of the user against a predetermined danger threshold.
  • relative position parameter(s) may include at least one of a relative distance between the object of interest and the portable device user, and a relative angle between the object of interest and a direction of a lens of the camera, i.e. an angle formed between a straight line connecting the object of interest to the user and a line of direction of the camera lens.
  • the direction of the camera lens substantially coincides with the motion direction of the user as the camera is front facing.
  • the notification module 140 is configured to notify the user of the predicted potential accident. Notification may be realised by predetermined visual, audio or kinaesthetic means, e.g. an alert displayed on a screen of the portable device 10, a sound notification or a vibration of the portable device 10.
  • an accident avoidance method implementable in a portable device 10 comprises the following as shown in Figure 2:
  • a motion speed of the user is ascertained based on data obtained from an accelerometer sensor 111.
  • the accelerometer sensor 111 can provide information for estimating the number of steps the user has taken over a time period.
  • when the portable device user is static, the values in the x, y, z axes as obtained from the accelerometer sensor 111 are stable as shown in Figure 3(a).
  • when the user is moving, the values in the three axes fluctuate (move up and down) as shown in Figure 3(b).
  • the number of steps N within a specific time period can be obtained based on the peak and trough frequency along the x axis as shown in Figure 3(b).
  • the user's step length L may be obtained by the accelerometer sensor 111, user input or machine learning.
  • the number of steps may be obtained directly from a specific sensor, e.g. a step counter sensor.
  • the motion speed may be obtained directly from a specific sensor, e.g. a walking speed sensor.
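The step-based speed estimate described above can be sketched as follows. A minimal threshold-crossing peak counter stands in for the peak/trough analysis of Figure 3(b); the threshold value is an illustrative assumption, not taken from the patent:

```python
def count_steps(accel_x, threshold=1.2):
    """Count steps as rising threshold crossings of the accelerometer
    x-axis signal, per the peak/trough idea above.
    `threshold` is an illustrative value, not from the patent."""
    steps = 0
    above = False
    for v in accel_x:
        if not above and v > threshold:
            steps += 1
            above = True
        elif above and v < threshold:
            above = False
    return steps

def motion_speed(accel_x, step_length_m, period_s):
    """Speed = (number of steps N x step length L) / time period."""
    n = count_steps(accel_x)
    return n * step_length_m / period_s
```

In practice the raw signal would first be low-pass filtered; the sketch assumes a clean signal.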
  • a motion direction of the user is ascertained based on data obtained from a rotation sensor 112.
  • the rotation sensor 112 provided in the user's portable device 10 may obtain rotation data of the portable device 10, e.g. yaw, pitch, roll values as shown in Figure 4.
  • the change of the yaw values of the portable device 10 is used to ascertain the motion direction of the user.
  • the motion direction of the user at time t may be ascertained based on Equation (2):
  • Δ = yaw(t) - yaw(t-1) (2), wherein yaw(t) refers to the yaw value at time t, yaw(t-1) refers to the yaw value at time t-1, the numerical value of Δ refers to the size of the rotation angle, and the sign of Δ is used to ascertain the direction of rotation.
  • the current motion direction of the user coincides with the direction of the lens of the camera 121 of the portable device 10 since the camera lens is front facing.
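A minimal sketch of the yaw-difference computation of Equation (2); the wrap-around handling at the 0/360 degree boundary is an added convenience, not stated in the patent:

```python
def rotation_delta(yaw_t, yaw_t_minus_1):
    """Equation (2): delta = yaw(t) - yaw(t-1).
    The magnitude gives the size of the rotation; the sign gives its
    direction. Wrapping keeps the result in (-180, 180] so that a turn
    across the 0/360 boundary is reported as a small rotation (an
    assumption added here for robustness)."""
    delta = yaw_t - yaw_t_minus_1
    while delta <= -180:
        delta += 360
    while delta > 180:
        delta -= 360
    return delta
```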
  • objects of interest are identified based on a street view image captured at every predetermined time interval.
  • objects of interest may include pedestrians, obstacles, e.g. rubbish bins, traffic lights, dangerous terrains, etc.
  • Known object recognition algorithms or methods may be used to identify objects of interest from the street view images captured by the camera.
  • a method for identifying an object of interest from a street view image includes the following: the street view image in colour is optionally converted to a grey image in order to reduce computation complexity; the grey image is optionally down-sampled based on the processor and memory capability of the portable device 10; features of various objects in the image are extracted from the grey image; and object(s) of interest are distinguished from the various objects and identified by applying classifiers pre-installed in the portable device 10 or pre-trained by machine learning, based on the extracted features.
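The optional grey-conversion and down-sampling steps might be sketched as follows (a pure-Python illustration; a real implementation would use an image library, and a trained classifier for the feature-extraction and recognition steps, which are omitted here):

```python
def to_grey(rgb):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to
    greyscale using the common luminance weights, reducing the data
    the recognizer must process."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb]

def downsample(grey, factor):
    """Keep every `factor`-th pixel in each dimension, matching the
    optional down-sampling step for low-powered devices."""
    return [row[::factor] for row in grey[::factor]]
```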
  • street view images are front facing views from the user in this embodiment
  • street view images captured in other embodiments may be taken from other perspectives of the user, e.g. side facing or rear facing views.
  • the portable device 10 is a mobile phone placed near the ear of the user to facilitate call answering
  • the camera of the mobile phone will capture side facing views from the user and the accident avoidance system is to analyse potential dangers to the side of the user by recognizing and analyzing objects of interest moving towards the side of the user.
  • the relative position parameters may include a relative distance and a relative angle for each identified object of interest.
  • Relative distance refers to a distance between the object of interest and the portable device user.
  • Relative angle refers to an angle formed between a straight line connecting the object of interest to the user and a line of direction of a lens of the camera.
  • Figure 5 is a side view representation showing a relative distance between the portable device user and the object of interest according to a first example of the first embodiment.
  • the object of interest is substantially in line with the direction of the camera lens/motion direction of the user.
  • H refers to the height of the portable device 10, which may be obtained by a height sensor provided in the portable device 10.
  • a further parameter in Figure 5 refers to the vertical field of view of the camera provided in the portable device 10.
  • a camera has vertical, horizontal and diagonal fields of view, which are fixed for any particular camera.
  • D refers to the relative distance from the user ground point C to the object ground point O.
  • d0 refers to the distance between the user ground point C and the ground point B of the vertical field of view, i.e. the ground point of the boundary of the image.
  • dx refers to the distance from the ground point B to the object ground point O.
  • dx may be ascertained based on a proportional relationship between a real distance and a pixel distance in a camera-captured image. This proportional relationship may be ascertained and pre-stored in the portable device 10.
  • if the pre-determined proportional relationship is a and the pixel distance in the image is J, then dx may be ascertained as dx = a × J.
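Under stated geometric assumptions, the relative-distance computation of Figure 5 might look like the sketch below. The camera tilt parameter and the name `scale_a` for the pre-stored pixel-to-distance ratio are illustrative assumptions, since the patent leaves the exact geometry to the figure:

```python
import math

def object_distance(H, tilt_deg, vfov_deg, pixel_dist, scale_a):
    """Relative distance D = d0 + dx (Figure 5).

    Assumptions (not all spelled out in the patent text):
      - the lower edge of the vertical field of view meets the ground
        at point B, at angle (tilt + vfov/2) below the horizontal, so
        d0 = H / tan(tilt + vfov/2);
      - dx follows from the pre-stored proportional relationship:
        dx = scale_a * pixel_dist."""
    angle = math.radians(tilt_deg + vfov_deg / 2)
    d0 = H / math.tan(angle)
    dx = scale_a * pixel_dist
    return d0 + dx
```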
  • Figure 6 shows a relative distance and relative angle of an object of interest with respect to the user of the portable device 10 according to a second example of the first embodiment.
  • the object of interest is at an angle with respect to the direction of the camera lens/the motion direction of the user.
  • the object distance D may be ascertained based on the object angle θ and the vertical distance D'.
  • the vertical distance Lr between the object of interest and the motion direction of the user is required.
  • θ = arctan(Lr/(d0 + dx)) (6)
  • a second method for ascertaining the vertical distance L r comprises the following as shown in Figure 8.
  • θ1 = arctan((x - w/2)/(h - y - height)), if x > w/2
  • θ1 = arctan((w/2 - x)/(h - y - height)), if x < w/2
  • w, h refer to image resolution parameters
  • height refers to the pixel height of the object of interest in the image.
  • the pixel coordinates of the object of interest in the image are denoted by (x, y+height).
  • the angle θ1 is ascertained by using a predetermined proportional relationship between real angle and pixel angle.
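A sketch of the pixel-coordinate angle computation, assuming the expressions above reconstruct as θ1 = arctan(|x - w/2| / (h - y - height)); the left/right sign convention is an assumption added here:

```python
import math

def object_angle(x, y, height, w, h):
    """Angle of the object relative to the camera-lens axis, from its
    pixel position. (x, y) locates the object in the image, `height`
    is its pixel height and (w, h) is the image resolution. The sign
    marks whether the object lies right (+) or left (-) of the lens
    axis (a convention assumed here, not fixed by the patent)."""
    dy = h - y - height
    theta = math.degrees(math.atan(abs(x - w / 2) / dy))
    return theta if x > w / 2 else -theta
```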
  • the accident prediction module 130 predicts a potential accident based on at least one relative position parameter. Particularly, if the at least one relative position parameter breaches a predetermined danger threshold, a potential accident is predicted.
  • relative angle or relative distance or both is compared against the predetermined danger threshold to predict a potential accident in block 2005.
  • At least one of relative angle and relative distance, and at least one of motion speed and motion direction of the user are compared against the predetermined danger threshold to predict a potential accident in block 2005.
  • when the motion direction of the user is changing with respect to time, the position of the same object of interest in two successively-captured street view images may change significantly, to an extent that the object of interest identified in a previous street view image may not appear in a subsequent street view image.
  • the motion direction of the user may be used by the accident prediction module 130 to predict a potential accident.
  • At least one of relative angle and relative distance, and at least one of motion speed, motion direction and motion acceleration of the user are compared against the predetermined danger threshold to predict a potential accident in block 2005.
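The threshold comparison of block 2005 might be sketched as a simple rule; all threshold values below are illustrative stand-ins for the patent's pre-determined danger threshold:

```python
def predict_accident(rel_distance, rel_angle, user_speed,
                     dist_threshold=2.0, angle_threshold=15.0,
                     speed_threshold=0.5):
    """Rule-based sketch of block 2005: a potential accident is
    predicted when the object is closer than the distance threshold,
    roughly in the user's path (small relative angle), and the user
    is actually moving. Threshold values are illustrative."""
    return (rel_distance < dist_threshold
            and abs(rel_angle) < angle_threshold
            and user_speed > speed_threshold)
```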
  • the motion acceleration of the user may be directly obtained from an accelerometer sensor 111 provided in the portable device 10.
  • a notification is provided to the user if a potential accident has been predicted in block 2005. If no potential accident has been predicted, no notification is provided to the user.
  • the notification step may be realized by at least one of visual, audio and kinaesthetic means, e.g. an alert displayed on a screen of the portable device 10, a sound or voice notification or a vibration of the portable device 10.
  • the notification step may further include providing a suggestion on how to avoid the predicted potential accident.
  • This suggestion may be ascertained based on the ascertained relative position parameters, and the motion speed and direction of the user. For example, the suggestion may be "stop walking", "slow down", "turn to the right/left", or "pay attention to the traffic light". Further, reasons for the suggestion may also be provided, for example, "you are about to cross the street". Additionally, in other embodiments where the object of interest is moving, the suggestion may accordingly be ascertained based on the motion speed and direction of the object of interest. The method for ascertaining the motion speed and direction of the object of interest will be described below.
  • the accident avoidance system of the second embodiment differs from the first embodiment as follows.
  • at least one relative position parameter of the object of interest is considered in block 2005
  • at least one of motion direction, motion speed and motion acceleration of the object of interest may be additionally considered.
  • the motion speed and motion direction of the object of interest may be ascertained based on at least two street view images successively captured by the camera in which the same object of interest is identified.
  • a first relative distance and a first relative angle are ascertained; based on a second image captured subsequent to the first image, a second relative distance and a second relative angle are ascertained. Then, the motion direction, motion speed and motion acceleration of the object of interest may be determined based on the first and second relative angle, and the first and second relative distance.
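The two-observation motion estimate can be sketched by converting each (distance, angle) pair to Cartesian coordinates; the assumption that the user is static between frames is a simplification for illustration (otherwise the user's own displacement must be subtracted first):

```python
import math

def object_motion(d1, a1_deg, d2, a2_deg, dt):
    """From two successive (distance, angle) observations of the same
    object, recover its speed and motion direction relative to the
    user. Angles are measured from the camera-lens axis; the returned
    direction uses 0 degrees = moving away along the lens axis."""
    x1 = d1 * math.sin(math.radians(a1_deg))
    y1 = d1 * math.cos(math.radians(a1_deg))
    x2 = d2 * math.sin(math.radians(a2_deg))
    y2 = d2 * math.cos(math.radians(a2_deg))
    dx, dy = x2 - x1, y2 - y1
    speed = math.hypot(dx, dy) / dt
    direction_deg = math.degrees(math.atan2(dx, dy))
    return speed, direction_deg
```

An object observed head-on at 4 m and then 2 m one second later yields a speed of 2 m/s with a direction of 180 degrees, i.e. moving straight towards the user.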
  • the accident prediction module 130 of the accident avoidance system 100 may be further configured to ascertain at least one of the motion speed, motion direction and motion acceleration of the identified object of interest based on at least two street view images successively captured by the camera in which the same object of interest is identified, and to ascertain whether a predetermined danger threshold has been breached by at least one relative position parameter ascertained for the identified object of interest, by at least one of the motion direction, motion speed and motion acceleration of the user, and by at least one of motion speed, motion direction and motion acceleration of the identified object of interest.
  • a different scenario is considered where the portable device user is static while the object of interest is moving.
  • since the user is static, the motion speed, motion direction and motion acceleration of the user need not be considered; when predicting a potential accident in block 2005, only the following parameters may be considered: at least one relative position parameter (e.g. the relative distance and the relative angle) ascertained for the object of interest, and at least one of motion speed, motion direction and motion acceleration of the object of interest.
  • the motion speed, direction and acceleration of the identified object may be ascertained based on at least two street view images successively captured by the camera in which the same object of interest is identified.
  • in this scenario, the accident avoidance system 100 may not include the user status analysing module 110, which is configured to ascertain at least one of motion speed, motion direction and motion acceleration of the user based on information obtained from sensors provided in the portable device 10.
  • alternatively, the accident avoidance system 100 includes the user status analysing module 110, which is only initiated when the user is moving.
  • the accident prediction module 130 may be configured to predict a potential accident based on only at least one relative position parameter ascertained for the identified object of interest.
  • the portable device 10 may further include a foreground camera to capture an image of the user's face (hereinafter "facial image").
  • the accident prediction module 130 of the accident avoidance system 100 may be further configured to ascertain whether the user has been aware of the object of interest based on the user's facial image captured by the foreground camera.
  • the accident avoidance system 100 may further include a face recognition module which is configured to estimate the user's face direction relative to the portable device 10 based on the facial image captured by the foreground camera. The accident prediction module 130 is then further configured to ascertain whether the user is aware of the object of interest based on the estimated face direction relative to the portable device 10 and the at least one relative position parameter of the object of interest with respect to the user, e.g. the relative distance and the relative angle of the object of interest with respect to the user; the notification module 140 is then configured to notify the user of the predicted potential accident only if it is ascertained that the user is not aware of the object of interest.
  • the accident prediction module 130 may be configured to ascertain the user's face direction relative to the object of interest based on the user's face direction relative to the portable device 10, the motion direction of the user (i.e. the current direction of the portable device 10), and the relative distance and relative angle of the object of interest with respect to the user, and then ascertain whether the user is aware of the object of interest based on the ascertained face direction relative to the object of interest and the relative distance of the object of interest with respect to the user.
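A hedged sketch of the awareness test; the visual half-field parameter is an illustrative stand-in for the visual-acuity data the patent stores in the database:

```python
def user_aware(face_dir_deg, object_angle_deg, fov_half_deg=30.0):
    """Deem the user aware of the object if his face direction
    (measured relative to the lens axis) lies within an assumed
    visual half-field of the object's relative angle.
    `fov_half_deg` is an illustrative assumption."""
    return abs(face_dir_deg - object_angle_deg) <= fov_half_deg
```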
  • the user's visual acuity may also be used to ascertain whether the user is aware of the object of interest. Information about the user's visual acuity may be pre-stored in a database and provided in the accident avoidance system 100.
  • the accident avoidance system 100 may further include a database for storing user characteristics, such as height, weight, visual acuity, auditory acuity, pre-determined danger threshold (including distance and angle threshold).
  • the database may further include possible actions the user may take in response to a predicted potential accident, i.e. suggestions for avoiding a potential accident, and the corresponding criteria for taking such possible actions. For example, when the accident avoidance system 100 ascertains that the motion direction of the user is towards the object of interest and the object distance has breached the predetermined danger threshold, then the possible actions may include changing the motion direction, e.g. turning to his left or right.
  • the accident avoidance system 100 may also be adapted for a mobility aid device for the physically and/or visually impaired, e.g. for a wheelchair, a crutch or a cane.
  • the accident avoidance system 100 is provided in a mobility aid device.
  • the accident avoidance system 100 includes a memory storage unit for storing programmable instructions and a processor unit configured to execute the programmable instructions to perform the above-described methods according to the embodiments of the invention.
  • if not already present, a height sensor, an accelerometer, a rotation sensor and a camera have to be provided to the mobility aid device to implement the accident avoidance system.
  • the accident avoidance system 100 may be installed in an intelligent cane for a visually impaired person to alert him to pedestrians, obstacles, dangerous terrains or important signs, e.g. traffic lights.
  • the accident avoidance system 100 installed in the cane detects objects of interest for the visually impaired person. Similar to the first embodiment, at least one of motion speed, motion direction and motion acceleration of the visually impaired person is ascertained based on the information from the sensors. Objects of interest are identified based on street view images captured by the camera at predetermined time intervals.
  • a relative distance and a relative angle are ascertained with respect to the identified object of interest and the visually impaired person based on the street view image from which the object of interest is identified. Based on at least one ascertained relative position parameter, and at least one of motion speed, motion direction and motion acceleration of the visually impaired person, the accident avoidance system 100 determines whether a predetermined danger threshold is breached. If there is a breach, the system 100 notifies the visually impaired person of the predicted potential accident.
  • the methods described in the second embodiment and the third embodiment above are also applicable in a scenario where the user is visually impaired. For example, when a visually impaired person is walking with such a cane while an object of interest is moving towards him, the method described in the second embodiment can be used.
  • when the visually impaired person with the cane is static while the object of interest is moving towards him, the method described in the third embodiment can be used.
  • a suggestion on how to avoid the predicted potential accident may be further provided to the visually impaired person.
  • the suggestion may be ascertained based on at least one of the following parameters: the ascertained relative position parameters, the motion speed, motion direction and motion acceleration of the visually impaired person, and the motion speed, motion direction, and motion acceleration of the object of interest.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the invention provide an accident avoidance method and system implementable in a portable device to reduce or avoid potential accidents between a portable device user and other objects in the environment. The accident avoidance method comprises: using a camera on a portable device utilised by a user, capturing a street view image at every predetermined time interval; identifying (2003) an object of interest based on the street view image; based on the street view image, ascertaining (2004) at least one relative position parameter of the object of interest with respect to the user; predicting (2005) a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and notifying (2006) the user of the predicted potential accident. The accident avoidance system includes a user status analysing module, an object recognition module, an accident prediction module and a notification module.

Description

METHOD AND SYSTEM FOR ACCIDENT AVOIDANCE
Field of Invention
The invention generally relates to a method and system for predicting potential accidents and notifying a user of the predicted potential accident, thereby reducing or avoiding accidents between a user of a portable device and other objects in the environment.
Background
Nowadays, intelligent portable devices, e.g. smart phones and tablets, have become an indispensable tool for communication, entertainment, work and even lifestyle management. It is very common for a portable device user to multitask, e.g. using the portable device to communicate with friends, browse the internet, play a game, check email, or watch TV while walking down the street. Accordingly, multitasking while using a portable device has proved to be one of the principal causes of accidents, e.g. collisions, crashes, or falls. An analysis of hospital data conducted at Ohio State University found that injuries involving pedestrians on their cell phones more than doubled between 2005 and 2010. A Pew Research study found that 53% of all adult cell phone owners have been involved in a distracted walking encounter, and numerous incidents concerning distracted walking in recent years have made headlines. For example, in January 2011, a woman fell into a fountain at a Pennsylvania mall while texting; a tourist in Australia walked off a pier while checking Facebook on her phone.
In view of the above and other problems, it is desirable to provide a solution for predicting potential accidents which is adapted for a user device, e.g. a portable device, and notifying the user of the predicted potential accident.
Summary of Invention
Embodiments of the invention provide an accident avoidance method implementable in a portable device. The accident avoidance method comprises the following steps: using a camera on a portable device utilised by a user, capturing a street view image at every predetermined time interval; identifying an object of interest based on the street view image; based on the street view image, ascertaining at least one relative position parameter of the object of interest with respect to the user; predicting a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and notifying the user of the predicted potential accident.
Preferably, the at least one relative position parameter includes at least one of a relative distance between the object of interest and the user, and a relative angle between the object of interest and a direction of a lens of the camera, i.e. an angle formed between a straight line connecting the object of interest to the user and a line of direction of the camera lens.
In one embodiment of the invention where the user of the portable device is walking while holding a portable device, in order to more accurately predict the potential accidents, the method further comprises: ascertaining at least one of a motion speed, a motion direction, and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device, wherein predicting a potential accident is further based on a breach of the danger threshold by the at least one of the motion speed, the motion direction and the motion acceleration of the user. In this embodiment, the user of the portable device is moving while holding the portable device.
In another embodiment of the invention where the object of interest is moving, in order to more accurately predict the potential accidents, the method further comprises: ascertaining at least one of motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; wherein predicting a potential accident is further based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest. According to one embodiment of the invention, the method may further comprise: ascertaining a suggestion for avoiding the potential accident based on at least one parameter used for prediction of the potential accident; and then notifying the user of the suggestion. Thereby, the user of the portable device may be able to avoid the potential accident in a timely and effective manner.
In the embodiments of the invention, the potential accident may be notified to the user of the portable device by at least one of a visual notification, an audio notification and a kinaesthetic notification to ensure the user can notice the potential accident.
Embodiments of the invention also provide an accident avoidance system implementable in a portable device, the system comprises: a camera configured to capture a street view image at every predetermined time interval; an object recognition module configured to identify an object of interest based on the street view image; an accident prediction module configured to determine at least one relative position parameter of the object of interest with respect to the user based on the street view image; and predict a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and a notification module configured to notify the user of the predicted potential accident.
Preferably, the accident prediction module is configured to ascertain at least one of a relative distance between the object of interest and the user, and a relative angle between the object of interest and a direction of a lens of the camera; and predict a potential accident based on the ascertained at least one of the relative distance and the relative angle.

In one embodiment of the invention where the user of the portable device is walking while holding a portable device, in order to more accurately predict the potential accidents, the system further comprises: a user status analysing module configured to ascertain at least one of a motion speed, a motion direction and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device; wherein the accident prediction module is further configured to predict a potential accident based on the ascertained at least one of the motion speed, the motion direction and the motion acceleration of the user.

In another embodiment of the invention where the object of interest is moving, in order to more accurately predict the potential accidents, the accident prediction module is further configured to ascertain at least one of a motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; and predict a potential accident based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest.
Preferably, the accident prediction module is further configured to ascertain a suggestion for avoiding the predicted potential accident based on at least one parameter used for prediction of the potential accident; and the notification module is further configured to notify the user of the suggestion. According to the embodiments of the invention, the portable device may be a mobile computing and/or communication device, or a mobility aid device.
With the accident avoidance method and system implementable in a portable device, a potential accident with object(s) in the environment can be automatically predicted, and a notification shall be generated to warn the portable device user of the predicted potential accident. The user is therefore alerted to potential danger(s), and he can take appropriate action to avert the danger(s) thereby effectively reducing or avoiding the predicted potential accident.
Brief Description of the Drawings:
Figure 1 is a schematic diagram illustrating the functional modules of an accident avoidance system according to a first embodiment of the invention.
Figure 2 is a flow chart illustrating a method for predicting a potential accident according to the first embodiment of the invention.
Figure 3 (a) illustrates the values in a rectangular coordinate system for describing the position of a portable device user obtained from an accelerometer when the user is static.
Figure 3 (b) illustrates the values in a rectangular coordinate system for describing the position of a portable device user obtained from an accelerometer when the user is moving according to the first embodiment of the invention.
Figure 4 illustrates the yaw, pitch and roll motion of a portable device.
Figure 5 shows an object distance between the portable device user and the object of interest according to a first example of the first embodiment.
Figure 6 shows an object distance and object angle between the portable device user and the object of interest according to a second example of the first embodiment.
Figure 7 shows a pixel angle in an image according to the first embodiment of the invention.
Figure 8 is a flow chart illustrating a method for ascertaining a vertical distance between the object of interest and the direction of the camera lens according to the first embodiment of the invention.
Detailed Description of Embodiments of the Invention:
In the following description, numerous specific details are set forth in order to provide a thorough understanding of various illustrative embodiments of the invention. It will be understood, however, by one skilled in the art, that embodiments of the invention may be practiced without some or all of these specific details. It is understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to limit the scope of the invention. In the drawings, like reference numerals refer to same or similar functionalities or features throughout the several views.
The invention provides a method and system which are adapted for use in a portable device for predicting potential accidents and notifying a user of the predicted potential accident. The portable device may be, but is not limited to, a mobile computing and/or communication device (e.g. smart phone and tablet) or a mobility aid device (e.g. wheelchair for the physically impaired, crutch or cane for the visually impaired). The portable device may be hand-held or wearable by a user or utilised to assist or transport a user.
In a first embodiment of the invention, a user is walking while holding a portable device (e.g. phone). The portable device is provided with a height sensor for ascertaining the height of the portable device relative to the ground; an accelerometer sensor for ascertaining motion speed of the user of the portable device; and a camera for capturing street view images from a predetermined perspective with respect to the user, e.g. front facing, side facing or rear facing. In this embodiment, the portable device may be further provided with a rotation sensor for ascertaining the yaw value of the portable device to ascertain the motion direction of the user. It is to be noted that since many existing portable devices are provided with the above-mentioned components including the height sensor, accelerometer sensor, rotation sensor and camera, the accident avoidance system does not additionally require the above components. However, if the portable device is not equipped with some or all of the above components, the accident avoidance system additionally requires some or all of the above components to be installed.
According to the first embodiment of the invention, the accident avoidance system 100 is implemented in a portable device 10, which comprises the following modules as shown in Figure 1: a user status analysing module 110, an object recognition module 120, an accident prediction module 130 and a notification module 140.
The user status analysing module 110 is configured to ascertain a motion speed of the user of the portable device 10 based on data obtained through the accelerometer sensor 111. The user status analysing module 110 may be further configured to ascertain a motion direction of the user based on data obtained through the rotation sensor 112. It is to be noted that this user status analysing module 110 is an optional module.
The object recognition module 120 is configured to identify objects of interest from street view images captured by the camera 121. The camera may be pre-set to capture an image at predetermined time intervals after the accident avoidance system is initiated, either manually or by a pre-determined trigger event, e.g. detection of a motion of the portable device 10. Street view images refer to images of the environment around the user and include outdoor and indoor street views.
The accident prediction module 130 is configured to predict a potential accident based on various data. Particularly, the accident prediction module is configured to ascertain at least one relative position parameter of the object of interest with respect to the user based on each captured street view image from which the object of interest is identified. Based on the ascertained relative position parameter(s), the accident prediction module is configured to predict a potential accident by comparing the ascertained relative position parameter(s) against a predetermined danger threshold. Optionally, the accident prediction module 130 is configured to predict a potential accident by comparing the relative position parameter(s), and at least one of motion speed, motion direction and motion acceleration of the user against a predetermined danger threshold.
In this embodiment, relative position parameter(s) may include at least one of a relative distance between the object of interest and the portable device user, and a relative angle between the object of interest and a direction of a lens of the camera, i.e. an angle formed between a straight line connecting the object of interest to the user and a line of direction of the camera lens. It is to be noted that in this embodiment, the direction of the camera lens substantially coincides with the motion direction of the user as the camera is front facing.
The notification module 140 is configured to notify the user of the predicted potential accident. Notification may be realised by predetermined visual, audio or kinaesthetic means, e.g. an alert displayed on a screen of the portable device 10, a sound notification or a vibration of the portable device 10.

According to the first embodiment of the invention, an accident avoidance method implementable in a portable device 10 comprises the following as shown in Figure 2:
In block 2001 (optional), a motion speed of the user is ascertained based on data obtained from an accelerometer sensor 111. The accelerometer sensor 111 can provide information for estimating the number of steps the user has taken over a time period. When the portable device user is static, the values on the x, y, z axes as obtained from the accelerometer sensor 111 are stable, as shown in Figure 3 (a). When the user is moving, e.g. walking or running, the values on the three axes fluctuate (move up and down) as shown in Figure 3 (b). Assuming the user is walking along the x axis direction, the number of steps N within a specific time period can be obtained based on the peak and trough frequency along the x axis as shown in Figure 3 (b). The user's step length L may be obtained by the accelerometer sensor 111, user input or machine learning. The motion speed V of the user can be ascertained according to Equation (1) based on the number of steps N, the step length L and the time period Δt:

V = (N × L) / Δt (1)
It is to be noted that the number of steps may be obtained directly from a specific sensor, e.g. a step counter sensor. The motion speed may be obtained directly from a specific sensor, e.g. a walking speed sensor.
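The step counting and Equation (1) described in block 2001 can be sketched as follows. This is an illustrative simplification, not the patent's implementation: steps are counted as upward zero crossings of the bias-removed acceleration signal, and all names are hypothetical.

```python
import numpy as np

def estimate_speed(accel_x, step_length_m, duration_s):
    """Estimate walking speed from accelerometer samples along the walking axis.

    A step produces one peak-trough cycle in the fluctuating signal, so steps
    are counted here as upward zero crossings, then Equation (1) is applied:
    V = (N x L) / dt.
    """
    signal = np.asarray(accel_x, dtype=float)
    signal = signal - signal.mean()            # remove gravity / bias offset
    # Upward zero crossings: sample below zero followed by a sample at/above zero.
    crossings = np.where((signal[:-1] < 0) & (signal[1:] >= 0))[0]
    n_steps = len(crossings)
    return (n_steps * step_length_m) / duration_s
```

In practice a real pedometer would low-pass filter the signal and reject spurious crossings; as the note above observes, a dedicated step counter sensor can supply N directly.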
In block 2002 (optional), a motion direction of the user is ascertained based on data obtained from a rotation sensor 112.
The rotation sensor 112 provided in the user's portable device 10 may obtain rotation data of the portable device 10, e.g. yaw, pitch and roll values as shown in Figure 4. In this embodiment, the change in the yaw values of the portable device 10 is used to ascertain the motion direction of the user. The motion direction of the user at time t may be ascertained based on Equation (2):
Δ = yaw(t) − yaw(t−1) (2)

wherein yaw(t) refers to the yaw value at time t, yaw(t−1) refers to the yaw value at time t−1, the numerical value of Δ gives the size of the rotation angle, and the sign of Δ gives the direction of rotation.
It is to be noted that in this embodiment, the current motion direction of the user coincides with the direction of the lens of the camera 121 of the portable device 10 since the camera lens is front facing.
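Equation (2) can be sketched as below. The wrap-around handling and the sign convention (positive meaning clockwise) are assumptions added for the sketch; the patent only specifies the difference of yaw values.

```python
def rotation_delta(yaw_t, yaw_t_minus_1):
    """Equation (2): change in heading between two yaw readings, in degrees.

    The magnitude of the result gives the size of the rotation angle and its
    sign gives the direction of rotation. The result is wrapped into
    [-180, 180) so that a turn across the 0/360 boundary is still reported
    as a small rotation rather than a near-full-circle one.
    """
    delta = yaw_t - yaw_t_minus_1
    return (delta + 180.0) % 360.0 - 180.0
```

For example, a yaw change from 350° to 10° is reported as +20° rather than −340°.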
In block 2003, objects of interest are identified based on a street view image captured at every predetermined time interval.
In this embodiment, objects of interest may include pedestrians, obstacles (e.g. rubbish bins), traffic lights, dangerous terrains, etc. Known object recognition algorithms or methods may be used to identify objects of interest from the street view images captured by the camera. In one example, a method for identifying an object of interest from a street view image includes the following: the street view image in colour is optionally converted to a grey image in order to reduce computation complexity; the grey image is optionally down-sampled based on the processor and memory ability of the portable device 10; features of various objects in the image are extracted from the grey image; object(s) of interest are distinguished from the various objects and identified by applying classifiers pre-installed in the portable device 10, or pre-trained by machine learning, based on the extracted features.
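The pre-processing steps of this pipeline (colour to grey, then down-sampling) can be sketched as follows. The BT.601 luma weights and the stride-based down-sampling are common choices assumed here, and the feature extractor and classifier are left as pluggable stand-ins for the pre-installed or pre-trained components the text describes.

```python
import numpy as np

def preprocess(image_rgb, downsample=2):
    """Block 2003 pre-processing: colour image -> grey image -> down-sampled.

    Uses the common ITU-R BT.601 luma weights (an assumption; the device
    may use a different conversion) and naive stride-based down-sampling.
    """
    grey = (0.299 * image_rgb[..., 0]
            + 0.587 * image_rgb[..., 1]
            + 0.114 * image_rgb[..., 2])
    return grey[::downsample, ::downsample]

def identify_objects(image_rgb, extract_features, classify):
    """extract_features and classify are hypothetical interfaces standing in
    for the pre-installed classifiers / machine-learned models in the text.
    classify returns a label for an object of interest, or None otherwise."""
    grey = preprocess(image_rgb)
    return [label for label in map(classify, extract_features(grey))
            if label is not None]
```

A real deployment would plug in, for example, a pre-trained pedestrian detector in place of the stubbed `classify` callable.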
It is to be noted that although the street view images are front facing views from the user in this embodiment, street view images captured in other embodiments may be taken from other perspectives of the user, e.g. side facing or rear facing views. For example, when the portable device 10 is a mobile phone placed near the ear of the user to facilitate call answering, the camera of the mobile phone will capture side facing views from the user, and the accident avoidance system is to analyse potential dangers to the side of the user by recognizing and analyzing objects of interest moving towards the side of the user.
In block 2004, at least one relative position parameter of the identified object of interest with respect to the user is ascertained based on each street view image from which the object of interest is identified. In this embodiment, the relative position parameters may include a relative distance and a relative angle for each identified object of interest. Relative distance refers to a distance between the object of interest and the portable device user. Relative angle refers to an angle formed between a straight line connecting the object of interest to the user and a line of direction of a lens of the camera. A method for ascertaining the relative distance and relative angle will be described below with reference to Figures 5 to 8.
Figure 5 is a side view representation showing a relative distance between the portable device user and the object of interest according to a first example of the first embodiment. In this example, the object of interest is substantially in line with the direction of the camera lens/motion direction of the user. Referring to Figure 5, H refers to the height of the portable device 10, which may be obtained by a height sensor provided in the portable device 10. Θ refers to the vertical field of view of the camera provided in the portable device 10. Typically, a camera has vertical, horizontal and diagonal fields of view, which are fixed for any particular camera. D refers to the relative distance from the user ground point C to the object ground point O. d0 refers to the distance between the user ground point C and the ground point B of the vertical field of view, i.e. the ground point of the boundary of the image. d1 refers to the distance from the ground point B to the object ground point O. With these parameters, the relative distance D may be ascertained according to Equations (3) and (4):
D = d0 + d1 (3)

d0 = H ÷ tan(Θ/2) (4)

d1 may be ascertained based on a proportional relationship between a real distance and a pixel distance in a camera-captured image. This proportional relationship may be ascertained and pre-stored in the portable device 10. Thus, assuming the pre-determined proportional relationship is a, and the pixel distance in the image is J1, the real distance d1 can be obtained according to Equation (5):

d1 = a × J1 (5)

Since the method for ascertaining the proportional relationship between real distance and pixel distance in the image is well known to a person skilled in the art, it is not described in detail in this embodiment.
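Equations (3) to (5) can be sketched as a short computation. The function and parameter names are illustrative; the calibration ratio `a` (metres per pixel) is the pre-stored proportional relationship the text refers to.

```python
import math

def relative_distance(H, theta_v_deg, pixel_dist, a):
    """Ground distance from the user to an in-line object of interest.

    H: camera height above the ground (m), from the height sensor.
    theta_v_deg: vertical field of view of the camera (degrees).
    pixel_dist: pixel distance J1 from the image's bottom boundary
                (ground point B) to the object's ground point.
    a: pre-calibrated metres-per-pixel ratio.
    """
    d0 = H / math.tan(math.radians(theta_v_deg) / 2)   # Equation (4)
    d1 = a * pixel_dist                                # Equation (5)
    return d0 + d1                                     # Equation (3)
```

For a phone held 1.5 m above the ground with a 90° vertical field of view, the ground point of the image boundary is 1.5 m ahead, and the pixel distance term adds the remaining distance to the object.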
Figure 6 shows a relative distance and relative angle of an object of interest with respect to the user of the portable device 10 according to a second example of the first embodiment. In this example, the object of interest is at an angle with respect to the direction of the camera lens/the motion direction of the user.
In this second example, a vertical distance D' = d0 + d1 may be ascertained using the same method as mentioned in the first example. The object distance D may be ascertained based on the object angle ψ and the vertical distance D'. In order to ascertain the object angle ψ according to Equation (6), the vertical distance Lr between the object of interest and the motion direction of the user is required:

ψ = arctan(Lr / (d0 + d1)) (6)
In this example, there are two methods for ascertaining the vertical distance Lr. The first method is to translate pixel distance in the captured image to real distance based on a predetermined proportional relationship. Since the method for obtaining the proportional relationship between real distance and pixel distance is known by a person skilled in the art, it is not described in detail here. A second method for ascertaining the vertical distance Lr comprises the following as shown in Figure 8.
In block 8001 , the pixel angle 0' in the image as shown in Figure 7 is ascertained based on Equation (7):
01 = arctan((x-w/2 )/(h-y-iieiglit)) x>w/2
01 = arctan((w/2-x)/(h- -heiglit)) x<w/2 ^ wherein w, h refer to image resolution parameters, height refers to the pixel height of the object of interest in the image. The pixel coordinates of the object of interest in the image are denoted by (x, y+height).
In block 8002, the angle θ is ascertained by using a predetermined proportional relationship between real angle and pixel angle.
Since the method for obtaining the proportional relationship between real angle and pixel angle is known by a person skilled in the art, it is not described in detail here.
In block 8003, the vertical distance Lr is ascertained according to Equation (8):
Lr = d1 × tan(θ) (8)
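The angle computations above can be sketched as follows. Equations (6) and (7) are implemented directly; the `lateral_offset` helper encodes one assumed reading of Equation (8), namely Lr = d1 × tan(θ) (the equation appears only as an image in the source), and all function and parameter names are illustrative.

```python
import math

def pixel_angle(x, y, height, w, h):
    """Equation (7): pixel angle of the object's ground point in the image.

    (x, y + height) are the pixel coordinates of the object's ground point;
    w, h are the image resolution. Using abs() covers both the x > w/2 and
    x < w/2 branches of Equation (7).
    """
    return math.atan2(abs(x - w / 2), h - y - height)

def lateral_offset(d1, theta):
    """Assumed Equation (8): vertical distance Lr between the object and the
    user's motion direction, from the real angle theta (radians)."""
    return d1 * math.tan(theta)

def relative_angle(Lr, d0, d1):
    """Equation (6): object angle between the object of interest and the
    direction of the camera lens."""
    return math.atan(Lr / (d0 + d1))
```

The conversion from pixel angle θ' to real angle θ (block 8002) is left out, since the patent defers it to a pre-calibrated proportional relationship.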
With the above calculations, reference is now made to block 2005 of Figure 2, where the accident prediction module 130 predicts a potential accident based on at least one relative position parameter. Particularly, if the at least one relative position parameter breaches a predetermined danger threshold, a potential accident is predicted.
In one example, the relative angle, the relative distance, or both are compared against the predetermined danger threshold to predict a potential accident in block 2005.
In another example, in order to produce a more accurate prediction of a potential accident, at least one of the relative angle and relative distance, and at least one of the motion speed and motion direction of the user are compared against the predetermined danger threshold to predict a potential accident in block 2005. For example, when the motion direction of the user is changing with respect to time, the position of the same object of interest in two successively-captured street view images may change so significantly that the object of interest identified in a previous street view image may not appear in a subsequent street view image. In this circumstance, the motion direction of the user may be used by the accident prediction module 130 to predict a potential accident.
In a further example, at least one of the relative angle and relative distance, and at least one of the motion speed, motion direction and motion acceleration of the user are compared against the predetermined danger threshold to predict a potential accident in block 2005. The motion acceleration of the user may be directly obtained from the accelerometer sensor 111 provided in the portable device 10.

In block 2006, a notification is provided to the user if a potential accident has been predicted in block 2005. If no potential accident has been predicted, no notification is provided to the user.
The notification step may be realized by at least one of visual, audio and kinaesthetic means, e.g. an alert displayed on a screen of the portable device 10, a sound or voice notification or a vibration of the portable device 10.
The notification step may further include providing a suggestion on how to avoid the predicted potential accident. This suggestion may be ascertained based on the ascertained relative position parameters, and the motion speed and direction of the user. For example, the suggestion may be "stop walking", "slow down", "turn right/left", or "pay attention to the traffic light". Further, reasons for the suggestion may also be provided, for example, "you are about to cross the street". Additionally, in other embodiments where the object of interest is moving, the suggestion may accordingly be ascertained based on the motion speed and direction of the object of interest. The method for ascertaining the motion speed and direction of the object of interest will be described below.
As mentioned above, in the first embodiment, it is assumed that the user is moving and the object of interest is static. In a second embodiment, a different scenario is considered where both the user and the object of interest are moving. The accident avoidance system of the second embodiment differs from the first embodiment as follows. In addition to the at least one relative position parameter of the object of interest considered in block 2005, at least one of the motion direction, motion speed and motion acceleration of the object of interest may additionally be considered. The motion speed and motion direction of the object of interest may be ascertained based on at least two street view images successively captured by the camera in which the same object of interest is identified. For example, based on a first image, a first relative distance and a first relative angle are ascertained; based on a second image captured subsequent to the first image, a second relative distance and a second relative angle are ascertained. Then, the motion direction, motion speed and motion acceleration of the object of interest may be determined based on the first and second relative angles, and the first and second relative distances.
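The two-frame estimation just described can be sketched as follows. This sketch works in the user's frame of reference and ignores the user's own motion between frames, a simplifying assumption; the names are illustrative.

```python
import math

def object_motion(dist1, ang1, dist2, ang2, dt):
    """Second-embodiment sketch: object speed and heading from two frames.

    (dist1, ang1) and (dist2, ang2) are the relative distance (m) and
    relative angle (rad) of the same object of interest in two successive
    street view images taken dt seconds apart. Returns (speed in m/s,
    heading in radians relative to the camera-lens axis).
    """
    # Polar (distance, angle) -> Cartesian, with y along the lens axis.
    x1, y1 = dist1 * math.sin(ang1), dist1 * math.cos(ang1)
    x2, y2 = dist2 * math.sin(ang2), dist2 * math.cos(ang2)
    vx, vy = (x2 - x1) / dt, (y2 - y1) / dt
    speed = math.hypot(vx, vy)
    direction = math.atan2(vx, vy)
    return speed, direction
```

Acceleration would follow the same pattern, using three successive frames to difference two speed estimates.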
That is to say, according to the second embodiment, the accident prediction module 130 of the accident avoidance system 100 may be further configured to ascertain at least one of the motion speed, motion direction and motion acceleration of the identified object of interest based on at least two street view images successively captured by the camera in which the same object of interest is identified, and to ascertain whether a predetermined danger threshold has been breached by at least one relative position parameter ascertained for the identified object of interest, by at least one of the motion direction, motion speed and motion acceleration of the user, and by at least one of the motion speed, motion direction and motion acceleration of the identified object of interest.

In a third embodiment of the invention, a different scenario is considered where the portable device user is static while the object of interest is moving. In this scenario, it is not necessary to ascertain the motion speed, motion direction or motion acceleration of the user since the user is static. Therefore, in this embodiment, when predicting a potential accident in block 2005, only the following parameters may be considered: at least one relative position parameter (e.g. the relative distance and the relative angle) ascertained for the object of interest, and at least one of the motion speed, motion direction and motion acceleration of the object of interest. The motion speed, direction and acceleration of the identified object may be ascertained based on at least two street view images successively captured by the camera in which the same object of interest is identified. Accordingly, in this embodiment, the accident avoidance system 100 may not include the user status analysing module 110 which is configured to ascertain at least one of the motion speed, motion direction and motion acceleration of the user based on information obtained from sensors provided in the portable device 10.
In other embodiments of the invention, the accident avoidance system 100 includes the user status analysing module 110, which is initiated only when the user is moving.
It is to be noted that even in the second and the third embodiments, the accident prediction module 130 may be configured to predict a potential accident based only on at least one relative position parameter ascertained for the identified object of interest.

In the above embodiments, the portable device 10 may further include a foreground camera to capture an image of the user's face (hereinafter "facial image"). Before generating a notification to the user about the predicted potential accident at the notification module 140, the accident prediction module 130 of the accident avoidance system 100 may be further configured to ascertain whether the user is aware of the object of interest based on the user's facial image captured by the foreground camera. If the accident prediction module 130 ascertains that the user is aware of the object of interest, the notification will not be generated even if the danger threshold has been breached. In order to ascertain whether the user is aware of the object of interest, the accident avoidance system 100 may further include a face recognition module which is configured to estimate the user's face direction relative to the portable device 10 based on the facial image captured by the foreground camera. The accident prediction module 130 is then further configured to ascertain whether the user is aware of the object of interest based on the estimated face direction of the user relative to the portable device 10 and the at least one relative position parameter of the object of interest with respect to the user, e.g. the relative distance and the relative angle of the object of interest with respect to the user, and the notification module 140 is configured to notify the user of the predicted potential accident only if the user is ascertained to be unaware of the object of interest.
According to one embodiment of the invention, the accident prediction module 130 may be configured to ascertain the user's face direction relative to the object of interest based on the user's face direction relative to the portable device 10, the motion direction of the user (i.e. the current direction of the portable device 10), and the relative distance and relative angle of the object of interest with respect to the user, and then ascertain whether the user is aware of the object of interest based on the ascertained face direction of the user relative to the object of interest and the relative distance of the object of interest with respect to the user. Optionally, the user's visual acuity may also be used to ascertain whether the user is aware of the object of interest. Information about the user's visual acuity may be pre-stored in a database provided in the accident avoidance system 100.
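As a rough illustration of the awareness test just described, the sketch below composes the face direction relative to the device with the object's bearing and gates the result on the relative distance. The angle convention, the visual-field constant and the acuity range are illustrative assumptions and are not fixed by the specification.

```python
# Hypothetical sketch of the awareness check performed before notification.
def user_aware_of_object(face_dir_rel_device_deg, obj_rel_angle_deg,
                         rel_distance_m, visual_acuity_range_m=20.0,
                         visual_field_deg=60.0):
    """Return True when the user is deemed aware of the object of interest.

    Both angles are assumed to be measured in the horizontal plane relative
    to the device's forward (motion) direction.
    """
    # Angle between the user's gaze direction and the object's bearing.
    gaze_to_object_deg = abs(face_dir_rel_device_deg - obj_rel_angle_deg)
    # Aware only if the object lies inside the user's visual field and
    # within the range implied by the pre-stored visual acuity.
    return (gaze_to_object_deg <= visual_field_deg / 2.0 and
            rel_distance_m <= visual_acuity_range_m)
```

The notification module would then suppress the alert whenever this check succeeds, matching the behaviour described for the face recognition module above.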
According to one embodiment of the invention, the accident avoidance system 100 may further include a database for storing user characteristics, such as height, weight, visual acuity, auditory acuity and predetermined danger thresholds (including distance and angle thresholds). The database may further store possible actions the user may take in response to a predicted potential accident, i.e. suggestions for avoiding a potential accident, and the corresponding criteria for taking such actions. For example, when the accident avoidance system 100 ascertains that the motion direction of the user is towards the object of interest and the object distance has breached the predetermined danger threshold, the possible actions may include changing the motion direction, e.g. turning to the left or right. According to certain embodiments of the invention, the accident avoidance system 100 may also be adapted for a mobility aid device for the physically and/or visually impaired, e.g. a wheelchair, a crutch or a cane.
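The stored "possible actions" and their criteria could be modelled as a small rule table, as sketched below. The rule contents, the sign convention (positive angle meaning the object is to the user's left) and the helper function are invented placeholders mirroring the turn-left/turn-right example in the text; the specification leaves the concrete criteria open.

```python
# Each rule pairs a criterion (a predicate over the prediction parameters)
# with a suggestion; the first matching rule wins. These rules are
# hypothetical examples, not taken from the specification.
AVOIDANCE_RULES = [
    (lambda rel_angle_deg, breach: breach and rel_angle_deg >= 0.0,
     "object ahead to your left: turn right"),
    (lambda rel_angle_deg, breach: breach and rel_angle_deg < 0.0,
     "object ahead to your right: turn left"),
]

def suggest_action(rel_angle_deg, threshold_breached):
    """Look up a suggestion for the predicted accident, or None."""
    for criterion, suggestion in AVOIDANCE_RULES:
        if criterion(rel_angle_deg, threshold_breached):
            return suggestion
    return None  # no danger threshold breached, nothing to suggest
```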
In a fourth embodiment of the invention, the accident avoidance system 100 is provided in a mobility aid device. The accident avoidance system 100 includes a memory storage unit for storing programmable instructions and a processor unit configured to execute the programmable instructions to perform the above-described methods according to the embodiments of the invention. In this embodiment, if the mobility aid device has no pre-equipped sensors or cameras, a height sensor, an accelerometer, a rotation sensor and a camera have to be provided to the mobility aid device to implement the accident avoidance system.
In one example of the fourth embodiment, the accident avoidance system 100 may be installed in an intelligent cane for a visually impaired person to alert him to pedestrians, obstacles, dangerous terrain or important signs, e.g. traffic lights. When a visually impaired person walks with such a cane, the accident avoidance system 100 installed in the cane detects objects of interest to the visually impaired person. Similar to the first embodiment, at least one of the motion speed, motion direction and motion acceleration of the visually impaired person is ascertained based on the information from the sensors. Objects of interest are identified based on street view images captured by the camera at predetermined time intervals. For each identified object of interest, a relative distance and a relative angle between the identified object of interest and the visually impaired person are ascertained based on the street view image from which the object of interest is identified. Based on at least one ascertained relative position parameter and at least one of the motion speed, motion direction and motion acceleration of the visually impaired person, the accident avoidance system 100 determines whether a predetermined danger threshold is breached. If there is a breach, the system 100 notifies the visually impaired person of the predicted potential accident.
Accordingly, the methods described in the second embodiment and the third embodiment above are also applicable in scenarios where the user is visually impaired. When a visually impaired person is walking with such a cane while an object of interest is moving towards him, the method described in the second embodiment can be used. When the visually impaired person with the cane is static while the object of interest is moving towards him, the method described in the third embodiment can be used.
A suggestion on how to avoid the predicted potential accident may be further provided to the visually impaired person. The suggestion may be ascertained based on at least one of the following parameters: the ascertained relative position parameters, the motion speed, motion direction and motion acceleration of the visually impaired person, and the motion speed, motion direction, and motion acceleration of the object of interest.
Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the invention. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and not to limit the disclosed embodiments of the invention. The embodiments and features described above should be considered exemplary.

CLAIMS:
1. An accident avoidance method implementable in a portable device, the method comprising: using a camera on a portable device utilised by a user, capturing a street view image at every predetermined time interval; identifying an object of interest based on the street view image; based on the street view image, ascertaining at least one relative position parameter of the object of interest with respect to the user; predicting a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and notifying the user of the predicted potential accident.
2. The method of claim 1, wherein the at least one relative position parameter includes at least one of a relative distance between the object of interest and the user, and a relative angle between the object of interest and a direction of a lens of the camera.
3. The method of claim 2, further comprising: ascertaining at least one of a motion speed, a motion direction, and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device; and wherein predicting a potential accident is further based on a breach of the danger threshold by the at least one of the motion speed, the motion direction and the motion acceleration of the user.
4. The method of claim 2 or claim 3, wherein the relative distance is ascertained based on a height of the portable device relative to the ground, a vertical field of view of the camera, a pixel distance of the object of interest in the street view image and a predetermined proportional relationship between real distance and pixel distance.
5. The method of claim 2 or claim 3, wherein the relative angle is ascertained based on a first vertical distance and a second vertical distance between the object of interest and the user, wherein the first vertical distance is ascertained based on a height of the portable device relative to the ground, a vertical field of view of the camera, a pixel distance of the object of interest in the street view image and a predetermined proportional relationship between real distance and pixel distance, wherein the second vertical distance is ascertained based on a corresponding pixel distance and a predetermined proportional relationship between pixel distance and real distance, or based on the corresponding pixel distance, pixel coordinates of the object of interest and a proportional relationship between real angle and pixel angle.
6. The method of any one of claims 2 to 5, further comprising: ascertaining at least one of motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; wherein predicting a potential accident is further based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest.
7. The method of any one of the preceding claims, further comprising: ascertaining a suggestion for avoiding the potential accident based on at least one parameter used for prediction of the potential accident; and notifying the user of the suggestion.
8. The method of any one of the preceding claims, wherein notifying the user of the potential accident includes providing at least one of a visual notification, an audio notification and a kinaesthetic notification.
9. The method of any one of the preceding claims, further comprising: before notifying the user of the predicted potential accident, ascertaining whether the user is aware of the object of interest by ascertaining a face direction of the user with respect to the object of interest based on a facial image of the user as captured by a foreground camera provided in the portable device, and the at least one relative position parameter of the object of interest with respect to the user, and notifying the user of the predicted potential accident only if the user is ascertained not aware of the object of interest.
10. The method of any one of the preceding claims, wherein the portable device is a mobile computing and/or communication device, or a mobility aid device.
11. An accident avoidance system implementable in a portable device, the system comprising: a camera configured to capture a street view image at every predetermined time interval; an object recognition module configured to identify an object of interest based on the street view image; an accident prediction module configured to determine at least one relative position parameter of the object of interest with respect to a user of the portable device based on the street view image, and predict a potential accident based on a breach of a predetermined danger threshold by the at least one relative position parameter; and a notification module configured to notify the user of the predicted potential accident.
12. The system of claim 11, wherein the accident prediction module is configured to ascertain at least one of a relative distance between the object of interest and the user and a relative angle between the object of interest and a direction of a lens of the camera; and predict a potential accident based on the ascertained at least one of the relative distance and the relative angle.
13. The system of claim 12, further comprising: a user status analysing module configured to ascertain at least one of a motion speed, a motion direction and a motion acceleration of the user based on data obtained through a plurality of sensors provided in the portable device; wherein the accident prediction module is further configured to predict a potential accident based on the ascertained at least one of the motion speed, the motion direction and the motion acceleration of the user.
14. The system of claim 12 or claim 13, wherein the accident prediction module is configured to ascertain the relative distance based on a height of the portable device relative to the ground, a vertical field of view of the camera, a pixel distance of the object of interest in the street view image and a predetermined proportional relationship between real distance and pixel distance.
15. The system of claim 12 or claim 13, wherein the accident prediction module is configured to ascertain the relative angle based on a first vertical distance and a second vertical distance between the object of interest and the user, wherein the first vertical distance is ascertained based on a height of the portable device relative to the ground, a vertical field of view of the camera, a pixel distance of the object of interest in the street view image and a predetermined proportional relationship between real distance and pixel distance, wherein the second vertical distance is ascertained based on a corresponding pixel distance and a predetermined proportional relationship between pixel distance and real distance, or based on the corresponding pixel distance, pixel coordinates of the object of interest and a proportional relationship between real angle and pixel angle.
16. The system of any one of claims 12 to 15, wherein the accident prediction module is further configured to ascertain at least one of a motion speed, motion direction and motion acceleration of the object of interest based on at least two street view images successively captured by the camera from which the object of interest is identified; and predict a potential accident based on at least one of the relative distance and the relative angle, and at least one of the motion speed, motion direction and motion acceleration of the object of interest.
17. The system of any one of claims 11 to 16, wherein the accident prediction module is further configured to ascertain a suggestion for avoiding the predicted potential accident based on at least one parameter used for prediction of the potential accident; and the notification module is further configured to notify the user of the suggestion.
18. The system of any one of claims 11 to 16, wherein the notification module is configured to notify the user of the predicted potential accident by providing at least one of a visual notification, an audio notification and a kinaesthetic notification.
19. The system of any one of claims 11 to 16, further comprising: a face recognition module configured to estimate a face direction of the user relative to the portable device based on a facial image as captured by a foreground camera provided in the portable device, wherein the accident prediction module is further configured to ascertain whether the user is aware of the object of interest based on the estimated face direction of the user relative to the portable device, and the at least one relative position parameter of the object of interest with respect to the user, and wherein the notification module is further configured to notify the user of the predicted potential accident only if the user is ascertained not aware of the object of interest.
20. The system of any one of claims 11 to 16, wherein the portable device is a mobile computing and/or communication device, or a mobility aid device.
21. A portable device comprising an accident avoidance system of any one of claims 11 to 20.
PCT/SG2015/050329 2014-09-25 2015-09-18 Method and system for accident avoidance WO2016048237A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201406053Q 2014-09-25
SG10201406053QA SG10201406053QA (en) 2014-09-25 2014-09-25 Method and system for accident avoidance

Publications (1)

Publication Number Publication Date
WO2016048237A1 (en) 2016-03-31


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050146600A1 (en) * 2003-12-29 2005-07-07 Jan Chipchase Method and apparatus for improved handset multi-tasking, including pattern recognition and augmentation of camera images
US20110143816A1 (en) * 2008-06-10 2011-06-16 Frank Fischer Portable device including warning system and method
US20120264406A1 (en) * 2011-04-15 2012-10-18 Avaya Inc. Obstacle warning system and method
CN103106374A (en) * 2013-01-15 2013-05-15 广东欧珀移动通信有限公司 Safety pre-warning processing method, system and mobile terminal of reminding user of mobile terminal
US20140080540A1 (en) * 2012-09-19 2014-03-20 Elitegroup Computer Systems Co.,Ltd. Portable mobile device
US8953841B1 (en) * 2012-09-07 2015-02-10 Amazon Technologies, Inc. User transportable device with hazard monitoring

Non-Patent Citations (4)

Title
CHUANG MENG-CHE ET AL: "Estimating Gaze Direction of Vehicle Drivers Using a Smartphone Camera", 2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, IEEE, 23 June 2014 (2014-06-23), pages 165 - 170, XP032649663, DOI: 10.1109/CVPRW.2014.30 *
CHUANG-WEN YOU ET AL: "CarSafe App: Alerting Drowsy and Distracted Driversusing Dual Cameras on Smartphones", PROCEEDING OF THE 11TH ANNUAL INTERNATIONAL CONFERENCE ON MOBILE SYSTEMS, APPLICATIONS, AND SERVICES, MOBISYS '13, vol. 13, 2013, New York, New York, USA, pages 13, XP055127047, ISBN: 978-1-45-031672-9, DOI: 10.1145/2462456.2465428 *
JUAN DAVID HINCAPIÉ-RAMOS ET AL: "CrashAlert", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 27 April 2013 (2013-04-27), pages 3385 - 3388, XP058043183, ISBN: 978-1-4503-1899-0, DOI: 10.1145/2470654.2466463 *
TIANYU WANG ET AL: "WalkSafe: A Pedestrian Safety App for Mobile Phone Users Who Walk and Talk While Crossing Roads", PROCEEDINGS OF THE TWELFTH WORKSHOP ON MOBILE COMPUTING SYSTEMS & APPLICATIONS, HOTMOBILE '12, 2012, New York, New York, USA, XP055232737, ISBN: 978-1-4503-1207-3, DOI: 10.1145/2162081.2162089 *

Also Published As

Publication number Publication date
SG10201406053QA (en) 2016-04-28


Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15779041; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 20/07/2017))

122 Ep: pct application non-entry in european phase (Ref document number: 15779041; Country of ref document: EP; Kind code of ref document: A1)