WO2016033252A2 - Transportation-related mobile device context inferences - Google Patents

Transportation-related mobile device context inferences

Info

Publication number
WO2016033252A2
WO2016033252A2 (PCT application PCT/US2015/047054)
Authority
WO
WIPO (PCT)
Prior art keywords
user
operations
processing
visual capture
vehicle
Prior art date
Application number
PCT/US2015/047054
Other languages
English (en)
Other versions
WO2016033252A3 (fr)
Inventor
Dan Abramson
Sean IR
Ram BRACHA
Yuval Kashtan
Haim Grosman
Original Assignee
Cellepathy Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/540,954 external-priority patent/US9772196B2/en
Priority claimed from US14/540,951 external-priority patent/US20150168174A1/en
Application filed by Cellepathy Ltd. filed Critical Cellepathy Ltd.
Priority to US15/506,327 priority Critical patent/US20170279957A1/en
Priority to GB1704680.6A priority patent/GB2547809A/en
Publication of WO2016033252A2 publication Critical patent/WO2016033252A2/fr
Priority to US15/089,186 priority patent/US9638537B2/en
Publication of WO2016033252A3 publication Critical patent/WO2016033252A3/fr
Priority to US15/583,140 priority patent/US20170234691A1/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/36 User authentication by graphic or iconic representation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/12 Messaging; Mailboxes; Announcements
    • H04W4/14 Short messaging services, e.g. short message services [SMS] or unstructured supplementary service data [USSD]

Definitions

  • This disclosure relates generally to the field of mobile device identification, and, in particular, to computer-implemented systems and methods for transportation-related mobile device context inferences.
  • There are approximately 4.6 billion cellular phone subscriptions in the world, over which it is estimated that more than 2 trillion text (SMS) messages are sent annually. There are also over 800 million transportation vehicles in the world. The magnitude of these statistics indicates that cellular phone use in vehicles is inevitable and is likely to remain quite common, unless preventative measures are taken.
  • a first visual capture can be received from a first visual capture component of a device; a second visual capture can be received from a second visual capture component of the device; and the first visual capture and the second visual capture can be processed by a processing device to determine an in-vehicle role of a user of the device.
  • Various other technologies are also disclosed.
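  • By way of illustration only, the following sketch shows one way such dual-capture processing could be combined into a coarse role estimate. The cue values, weights, and thresholds are hypothetical, and the sketch assumes that separate image-analysis steps (not shown) have already extracted the cues from the front-facing and rear-facing captures.

```python
def infer_in_vehicle_role(hands_on_wheel_visible: bool,
                          seatbelt_right_to_left: bool,
                          gaze_angle_deg: float) -> str:
    """Combine cues extracted from two visual captures into a driver/passenger estimate.

    hands_on_wheel_visible: rear-facing capture shows another person's two hands on the wheel.
    seatbelt_right_to_left: the user's belt runs from the right shoulder to the left thigh
                            (passenger side in a left-hand-drive vehicle).
    gaze_angle_deg: angle between the user's gaze and the device in the front-facing capture.
    """
    driver_score = 0.5                 # start undecided
    if hands_on_wheel_visible:
        driver_score -= 0.3            # someone else appears to be driving
    if seatbelt_right_to_left:
        driver_score -= 0.2            # belt orientation suggests a passenger seat
    if gaze_angle_deg < 30.0:
        driver_score += 0.1            # device held roughly in the user's line of sight
    return "driver" if driver_score >= 0.5 else "passenger"

print(infer_in_vehicle_role(True, True, 45.0))   # -> "passenger"
```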
  • FIG. 1 is a high-level diagram illustrating an exemplary configuration of an in-vehicle determination system;
  • FIGs. 2A-2C are flow diagrams showing routines that illustrate broad aspects of methods for determining an in-vehicle role of a user and/or an in-vehicle location of a mobile device in accordance with various exemplary embodiments disclosed herein;
  • FIG. 3 is a flow diagram showing a routine that illustrates a broad aspect of a method for enabling, disabling and/or modifying at least a feature of a mobile device in accordance with at least one exemplary embodiment disclosed herein;
  • FIG. 4 is a flow diagram showing a routine that illustrates a broad aspect of a method for determining an in-vehicle role of a user of a mobile device and/or a handheld state of a mobile device and/or a vehicle class of a vehicle containing the first mobile device using a central machine in accordance with at least one exemplary embodiment disclosed herein;
  • FIG. 5 is a flow diagram showing a routine that illustrates a broad aspect of a method for determining a vehicle class of a vehicle using a mobile device in accordance with at least one exemplary embodiment disclosed herein;
  • FIG. 6 is a flow diagram showing a routine that illustrates a broad aspect of a method of determining a handheld state of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 7 is a flow diagram showing a routine that illustrates a broad aspect of a method of restricting operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 8 is a flow diagram showing a routine that illustrates a broad aspect of another method of restricting operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 9A is a diagram depicting an exemplary relative coordinate system of a mobile device;
  • FIG. 9B is a diagram depicting exemplary relative accelerations and gyroscopic rotations of a mobile device;
  • FIG. 9C is a diagram depicting an exemplary gyroscopic sign convention, as used herein;
  • FIG. 10 is a diagram depicting an exemplary coordinate system used in relation to a vehicle;
  • FIGs. 11A-B are diagrams depicting a mobile device and its respective exemplary coordinate system in various orientations in relation to a car and its exemplary respective coordinate system;
  • FIG. 12 is a flow diagram showing a routine that illustrates a broad aspect of another method of restricting operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 13 is a flow diagram showing a routine that illustrates a broad aspect of another method of restricting operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 14 is a flow diagram showing a routine that illustrates a broad aspect of a method for orienting a coordinate system of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 15 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively restricting an operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 15A is an exemplary lock screen, in accordance with at least one embodiment disclosed herein;
  • FIG. 15B is an exemplary visual capture that can be processed to identify a presence of a fastened seatbelt, in accordance with at least one embodiment disclosed herein;
  • FIG. 15C depicts the "required orientation" of a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 15D depicts an exemplary screenshot showing visual feedback that can be provided to a user during authentication in accordance with at least one embodiment disclosed herein;
  • FIG. 15E depicts an exemplary screenshot showing visual feedback that can be provided to a user during authentication in accordance with at least one embodiment disclosed herein;
  • FIG. 15F depicts a mobile device, and specifically the locations of the forward-facing and rear-facing cameras of the mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 15G is an illustration depicting a 90 degree angle of incidence between the user's eyes/gaze/face/smile etc. and a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIGS. 15H-T depict exemplary aspects of an authentication sequence in accordance with at least one embodiment disclosed herein;
  • FIG. 16 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively restricting operation of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 17 is a flow diagram showing a routine that illustrates a broad aspect of a method for authenticating an in-vehicle role of a user of a mobile device and/or modifying a restriction of a mobile device in accordance with at least one embodiment disclosed herein;
  • FIG. 17A is a flow diagram showing particular aspects of the validation step of FIG. 17, in accordance with at least one embodiment disclosed herein;
  • FIG. 17B is an illustration depicting an orientation of a mobile device in relation to a typical line of sight of the driver in a moving car, in accordance with at least one embodiment disclosed herein;
  • FIG. 18 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively restricting an operation of, and/or selectively modifying a restriction employed at, a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 19 depicts an exemplary determination of orientation of a device based on visual capture(s) in accordance with at least one embodiment disclosed herein;
  • FIG. 20 depicts the orientation and/or location of a mobile device that provides the requisite stability described herein, in accordance with at least one embodiment disclosed herein;
  • FIG. 21 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively restricting a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 22 is a flow diagram showing a routine that illustrates a broad aspect of a method for eliciting an authentication at a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 23 is a flow diagram showing a routine that illustrates a broad aspect of a method for eliciting an authentication at a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 24 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively modifying a restriction employed at a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 25 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively projecting outputs at a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 26 is a flow diagram showing a routine that illustrates a broad aspect of a method for selectively configuring overt operation of a mobile device, in accordance with at least one embodiment disclosed herein;
  • FIG. 27 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 28 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 29 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 30 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 31 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 32 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 33 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 34 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 35 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 36 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 37 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 38 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 39 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 40 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 41 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 42 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 43 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 44 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 45 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 46 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 47 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 48 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 49 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 50 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 51 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 52 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 53 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 54 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 55 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 56 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 57 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 58 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 59 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 60 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 61 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 62 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 63 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 64 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 65 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 66 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 67 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 68 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 69 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 70 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 71 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 72 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 73 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 74 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 75 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 76 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 77 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 78 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 79 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 80 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 81 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 82 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 83 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 84 depicts an exemplary implementation of one or more aspects described herein;
  • FIG. 85 depicts an exemplary implementation of one or more aspects described herein;
  • FIGS. 86A-C depict exemplary implementations of one or more aspects described herein;
  • FIGS. 87A-E depict exemplary implementations of one or more aspects described herein;
  • FIGS. 88A-E depict exemplary implementations of one or more aspects described herein;
  • FIGS. 89A-C depict exemplary implementations of one or more aspects described herein;
  • FIGS. 90A-E depict exemplary implementations of one or more aspects described herein;
  • FIGS. 91A-C depict exemplary implementations of one or more aspects described herein;
  • FIGS. 92A-E depict exemplary implementations of one or more aspects described herein;
  • FIGS. 93A-D depict exemplary implementations of one or more aspects described herein;
  • FIGS. 94A-D depict exemplary implementations of one or more aspects described herein;
  • FIGS. 95A-C depict exemplary implementations of one or more aspects described herein;
  • FIGS. 96A-C depict exemplary implementations of one or more aspects described herein;
  • FIG. 97 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIG. 98 is a flow diagram showing a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein;
  • FIGS. 99A-B depict exemplary implementations of one or more aspects described herein;
  • FIGS. 100A-B depict exemplary implementations of one or more aspects described herein;
  • FIG. 101 depicts exemplary implementations of one or more aspects described herein;
  • FIGS. 102A-B depict exemplary implementations of one or more aspects described herein;
  • FIGS. 103A-B depict exemplary implementations of one or more aspects described herein.
  • FIGs. 104-129 are respective flow diagrams showing routines, each of which illustrates respective aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS
  • The present disclosure details systems and methods for determining various user roles and actions as they relate to the operation of a mobile device within a vehicle, such as a car.
  • various systems and methods are provided herein which serve to identify the user of a particular mobile device (for instance, with respect to their role as a driver or passenger in the car), to identify various aspects of the usage of the device itself (for instance that the device is executing a text messaging application), and to identify instances when a mobile device deviates from its expected or regular operation.
  • the systems and methods disclosed herein can be arranged and/or deployed across a number of scenarios.
  • the systems and methods can be principally employed at a mobile device itself, such as in the form of a mobile application or 'app' executing on the mobile device.
  • a central machine such as a server in communication with a mobile device can employ the present systems and methods.
  • Such a centralized architecture can enable efficient processing and use of a larger database of user determination characteristics, eliminate power constraints, and enable third parties, such as law-enforcement agencies and/or insurance companies, to easily monitor and/or adjust the operation of various mobile devices.
  • the systems and methods described herein define a solution that renders the surreptitious use of mobile devices in school impossible, directly improving student attention and learning, and empowering educators to decide how (if at all) they may be used in classrooms. In doing so, the present systems and methods can limit distraction without interfering with legitimate device use, which until now has been a major barrier to the adoption of other proposed solutions.
  • any structural and functional details disclosed herein are not to be interpreted as limiting the systems and methods, but rather are provided as a representative embodiment and/or arrangement for teaching one skilled in the art one or more ways to implement the systems and methods. Accordingly, aspects of the present systems and methods can take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware.
  • a software process can be transformed into an equivalent hardware structure, and a hardware structure can itself be transformed into an equivalent software process.
  • the selection of a hardware implementation versus a software implementation is one of design choice and left to the implementer.
  • the terms and phrases used herein are not intended to be limiting, but rather are to provide an understandable description of the systems and methods.
  • The terms "determining" and "determination" as used herein are intended to encompass the determination, identification, computation, calculation, and/or selection, with any degree of certainty or precision, and/or any other such operation, function, or action, as it relates to the determination, identification, and/or selection of a user of a device such as a mobile device, an in-vehicle role of a user of a device such as a mobile device, a vehicle or vehicle model/type/class, a device or device model/type/class (e.g., handheld or wired), an operation and/or operation state of a device, and/or any other such similar or related operation, function, or action.
  • The terms "identifying event" and "identifying events" as used herein are intended to encompass one or more occurrences or instances of events, stimuli, or phenomena, including explicitly the perceived coordinated or correlated occurrence or instance of two or more such events, stimuli, and/or phenomena, such as those originating at one or more devices. It should be understood that the referenced occurrences or instances of events, stimuli, or phenomena include single/singular events, stimuli, or phenomena as well as a set or series of multiple events, stimuli, or phenomena over a period of time. In addition, the referenced occurrences or instances of events, stimuli, or phenomena should also be understood to include one or more coordinations or correlations of the occurrence or instance of any number of such events, stimuli, and/or phenomena over any period of time.
  • The terms "user interface" and "user interfaces" as used herein are intended to encompass one or more input devices, software modules executing in conjunction with one or more operating systems and/or input devices, or any other such similar or related device, accessory, apparatus, and/or software application or module that enable or facilitate input and/or interaction with a computing device.
  • The terms "detect" and "detection" as used herein are intended to encompass the detection, measurement, and/or receipt, with any degree of certainty or precision, of one or more occurrences or instances of events, stimuli, phenomena, or any other such similar or related inputs that are detectable through one or more devices, implements, or apparatuses.
  • The term "processing" as used herein is intended to encompass comparing, analyzing, weighing, correlating, and/or computing one or more data items, elements, or structures, individually or in conjunction with one another, using a digital processor in conjunction with one or more software modules and/or applications.
  • the term "communicatively coordinated” as used herein is intended to encompass direct or indirect communication between two or more devices, accessories, and/or apparatuses, expressly including communications between a first device and a central machine, wherein the central machine is in turn in communication at some interval with a second device.
  • the first device and the second device are not, necessarily, in direct or indirect communication with one another, it can be said that they are communicatively coordinated with one another by virtue of their mutual connection to the referenced central machine.
  • The terms "feature" and "features" as used herein are intended to encompass operations, functions, activities, or any other such similar or related actions, whether automated/automatic or user-initiated, that occur at or in conjunction with one or more devices, machines, applications, and/or apparatuses.
  • The terms "notification" and "notifications" as used herein are intended to encompass one or more messages, transmissions, and/or data packets, such as electronic messages, which contain one or more data elements (such as inputs) related or relevant to one or more of the steps, operations, and/or processes disclosed herein.
  • An illustration of one such notification can be one or more electronic messages which contain information or data reflecting a first input from an accelerometer, a gyroscope, and/or a GPS receiver at a mobile device.
  • Such inputs can be grouped together into one or more notifications, and these notifications can in turn be transmitted to and/or received by other devices (such as a central machine) where they can be further processed.
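  • As a purely illustrative sketch (field names and sample values are hypothetical), such a notification could be assembled as a single structured payload before transmission to a central machine:

```python
import json
import time

def build_notification(accel_xyz, gyro_xyz, gps_fix):
    """Group raw sensor inputs into one notification payload for a central machine."""
    return json.dumps({
        "timestamp": time.time(),
        "accelerometer": accel_xyz,   # (ax, ay, az) in m/s^2
        "gyroscope": gyro_xyz,        # (gx, gy, gz) in rad/s
        "gps": gps_fix,               # {"lat": ..., "lon": ..., "speed_mps": ...}
    })

payload = build_notification((0.1, 9.7, 0.3),
                             (0.00, 0.02, -0.01),
                             {"lat": 40.71, "lon": -74.00, "speed_mps": 13.4})
# The payload (and others like it) can then be transmitted to, and further processed by,
# other devices such as a central machine.
```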
  • The terms "vehicle class" and "vehicle classes" as used herein are intended to encompass one or more types, categories, and/or models of vehicle.
  • airplanes, trains, automobiles, motorcycles, and boats can all be said to be different vehicle classes.
  • sub-categories within a given vehicle class can also be understood to be different vehicle classes.
  • the automobile vehicle class can be further sub-divided into further vehicle classes such as sedans, vans, sport utility vehicles (SUVs), and convertibles. These sub-categories can also be said to be vehicle classes within the meaning of the term as used herein.
  • The terms "operation state" and "operation states" as used herein are intended to encompass the states of a device, including any and all operations, functions, capacities, and/or capabilities, including, explicitly, a set and/or series of any number of operations, functions, capacities, and/or capabilities, that can be achieved by and/or in conjunction with a device, such as a mobile device.
  • Examples of an operation state include, but are not limited to: an execution of an application (such as an internet browser application) at a mobile device, a transmission of a notification (such as sending a text message or email message), a capacity to receive text messages, and a capability to type text using a keyboard.
  • The terms "handheld state" and "handheld states" as used herein are intended to encompass one or more states of a mobile device with respect to whether or not a user is in direct or indirect physical contact with the device.
  • For example, in instances where a user holds the device in his/her hand, carries the device in his/her pocket, and/or balances the device on his/her knee, the handheld state of the device can be said to be "handheld."
  • Conversely, in instances where the device is positioned in a dock or cradle and/or is otherwise not in direct or indirect contact with a user, the handheld state of the device can be said to be "non-handheld."
  • The terms "operational capacity" and "operational capacities" as used herein are intended to encompass one or more operation states of a mobile device, particularly with respect to a central machine such as a server.
  • an operational capacity of a mobile device can be a voice or data connection that is provided to a mobile device through a central machine, such as that of a voice/data service provider.
  • a transformation, modification, and/or adjustment of such an operational capacity preferably entails such a transformation, modification, and/or adjustment that is initiated and/or effected by a central machine, preferably in relation to a mobile device.
  • a central machine can transmit an instruction and/or notification to a mobile device, such instruction/notification directing the transformation, modification, and/or adjustment be implemented at the mobile device.
  • a central machine can implement a transformation, modification, and/or adjustment at the central machine itself, wherein such a transformation, modification, and/or adjustment - such as the stopping of voice and/or data connections to a mobile device - ultimately effect the functionality of the device itself. In both such cases it can be said that the central machine has transformed, modified, and/or adjusted the operational capacity of the mobile device.
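  • The following sketch illustrates, in simplified and hypothetical form, the two approaches described above; the injected callables stand in for whatever messaging or provisioning interfaces a real central machine would use, and the action name and threshold are placeholders:

```python
def adjust_operational_capacity(device_id, driver_probability,
                                send_instruction, suspend_data_service,
                                threshold=0.9):
    """Central-machine-side sketch: either instruct the device to restrict itself,
    or effect the change at the central machine (e.g., pause the data connection)."""
    if driver_probability >= threshold:
        # Option 1: direct the transformation to be implemented at the mobile device.
        send_instruction(device_id, {"action": "disable_text_entry"})
        # Option 2: implement the transformation at the central machine itself.
        suspend_data_service(device_id)

# Example with stand-in callables:
adjust_operational_capacity(
    "device-123", 0.95,
    send_instruction=lambda dev, msg: print("instructing", dev, msg),
    suspend_data_service=lambda dev: print("suspending data for", dev),
)
```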
  • The terms "user" and "users" as used herein are intended to encompass one or more individuals, persons, and/or entities of whose presence a device or machine can preferably be directly or indirectly aware. It should be understood that while in certain scenarios a user can interact with a device, in other scenarios a particular individual, person, and/or entity can be said to be a "user" within the context of the present disclosure, despite not interacting with a particular device.
  • The terms "tactile sensor" and "tactile sensors" as used herein are intended to encompass one or more buttons, touchscreens, and/or components that enable a user to interact with a device in a tactile fashion.
  • Examples of such tactile sensors include, but are not limited to, buttons (such as those that comprise a keyboard), switches, as well as touch screen displays (such as capacitive and resistive displays) which both display information and allow tactile interaction with such information. It should be further understood that such tactile sensors are preferably further capable of perceiving a plurality of simultaneous tactile interactions. Examples of such functionality include multitouch technologies, as are known to those of ordinary skill in the art.
  • The terms "visual capture" and "visual captures" as used herein are intended to encompass one or more operations, functions, and/or actions that relate to the optical perception and/or documentation of one or more visual items, elements, and/or phenomena. Examples of such visual captures include, but are not limited to, photographs, images, videos, and/or any other such method of visual perception and/or documentation. Accordingly, it can be appreciated that certain visual captures correspond to a single instance (such as a photograph) while other visual captures correspond to multiple instances (such as a series of photographs and/or a video).
  • The term "in-vehicle role indicator" as used herein is intended to encompass one or more items, elements, and/or indicators that relate to one or more aspects associated with and/or corresponding to the in-vehicle role of a user in a vehicle (e.g., whether a user is or is not a driver, is or is not a passenger, etc.).
  • One such in-vehicle role indicator is the identification, in a picture, of the two hands of a driver grasping the steering wheel of a vehicle.
  • That is, one or more images and/or videos can be processed in order to identify the presence of two hands grasping a steering wheel, thus indicating that a particular vehicle is being operated by a driver using two hands, and therefore it can be reasonably concluded that the user who took such an image is not the driver.
  • Another such in-vehicle role indicator can be a captured picture that can be processed to identify that a seatbelt extends from the right shoulder to the left thigh of the wearer. Such an identification also reasonably suggests that the wearer is not a driver (as the seatbelt of a driver traditionally extends from the left shoulder to the right thigh).
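  • For illustration, assuming an upstream image-analysis step (not shown) has already located the endpoints of the belt in image coordinates, the orientation check itself can be as simple as comparing horizontal positions; the mirroring assumption noted in the comments is hypothetical:

```python
def seatbelt_suggests_passenger(shoulder_xy, thigh_xy):
    """Return True when a detected belt runs from the wearer's right shoulder to the
    left thigh, which (in a left-hand-drive vehicle) suggests the wearer is not the driver.

    Coordinates are (x, y) image positions with x increasing to the right. This sketch
    assumes a non-mirrored capture, in which the wearer's right side appears on the
    left of the image; a mirrored (selfie-style) capture would flip the comparison.
    """
    shoulder_x, _ = shoulder_xy
    thigh_x, _ = thigh_xy
    return shoulder_x < thigh_x

print(seatbelt_suggests_passenger((120, 80), (260, 400)))  # -> True
```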
  • FIG. 1 is a high-level diagram illustrating an exemplary configuration of a determination system 100.
  • mobile device 105 can be a portable computing device such as a mobile phone, smartphone, or PDA.
  • Alternatively, mobile device 105 can be a tablet computer, a laptop computer, a personal computer, or an in-vehicle computer (e.g., ECU/OBD), though it should be understood that mobile device 105 of determination system 100 can be practically any computing device capable of embodying the systems and/or methods described herein.
  • Mobile device 105 of determination system 100 includes a control circuit 140 which is operatively connected to various hardware and software components that serve to enable operation of the determination system 100.
  • the control circuit 140 is operatively connected to a processor 110 and a memory 120.
  • Processor 110 serves to execute instructions for software that can be loaded into memory 120.
  • Processor 110 can be a number of processors, a multi-processor core, or some other type of processor, depending on the particular implementation. Further, processor 110 can be implemented using a number of heterogeneous processor systems in which a main processor is present with secondary processors on a single chip.
  • Processor 110 can be a symmetric multi-processor system containing multiple processors of the same type.
  • Memory 120 and/or storage 190 are accessible by processor 110, thereby enabling processor 110 to receive and execute instructions stored on memory 120 and/or on storage 190.
  • Memory 120 can be, for example, a random access memory (RAM) or any other suitable volatile or non-volatile computer readable storage medium.
  • memory 120 can be fixed or removable.
  • Storage 190 can take various forms, depending on the particular implementation.
  • storage 190 can contain one or more components or devices.
  • storage 190 can be a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above.
  • Storage 190 also can be fixed or removable.
  • One or more software modules 130 are encoded in storage 190 and/or in memory 120.
  • the software modules 130 can comprise one or more software programs or applications having computer program code or a set of instructions executed in processor 110.
  • Such computer program code or instructions for carrying out operations for aspects of the systems and methods disclosed herein can be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code can execute entirely on the mobile device 105, partly on mobile device 105, as a stand-alone software package, partly on mobile device 105 and partly on a remote computer/device or entirely on the remote computer/device or server.
  • the remote computer can be connected to mobile device 105 through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Software modules 130, including program code/instructions, are located in a functional form on one or more computer readable storage devices (such as memory 120 and/or storage 190) that can be selectively removable.
  • The software modules 130 can be loaded onto or transferred to mobile device 105 for execution by processor 110. It can also be said that the program code of software modules 130 and one or more computer readable storage devices (such as memory 120 and/or storage 190) form a computer program product.
  • one or more of software modules 130 can be downloaded over a network to storage 190 from another device or system via communication interface 150 for use within determination system 100.
  • program code stored in a computer readable storage device in a server can be downloaded over a network from the server to determination system 100.
  • Preferably, included among the software modules 130 is a determination module 170 that is executed by processor 110.
  • During execution of determination module 170, the processor 110 configures the control circuit 140 to determine an in-vehicle role of a user of the mobile device 105, as will be described in greater detail below.
  • While software modules 130 and/or determination module 170 can be embodied in any number of computer executable formats, preferably software modules 130 and/or determination module 170 comprise one or more applications or 'apps' that are configured to be executed at mobile device 105 and/or in relation to mobile device 105. In other arrangements, software modules 130 and/or determination module 170 are incorporated and/or integrated within operating system 176.
  • software modules 130 and/or determination module 170 can be configured to execute at the request or selection of a user of mobile device 105 (or any other such user having the ability to execute a program in relation to mobile device 105, such as a network administrator), while in other arrangements mobile device 105 can be configured to automatically execute software modules 130 and/or determination module 170, without requiring an affirmative request to execute.
  • The advantages of such an automatic arrangement can be appreciated in the context of a regulatory scheme that mandates or recommends that software modules 130 and/or determination module 170 be executed by a mobile device 105 some or all of the time, in furtherance of a campaign to improve driver safety. It should also be noted that, as shown in FIG. 1, memory 120 can be operatively connected to the control circuit 140.
  • Other software modules (such as user interface 172 and operating system 176) and other information and/or data relevant to the operation of the present systems and methods (such as database 174) can also be stored on storage 190, as will be discussed in greater detail below.
  • a communication interface 150 is also operatively connected to control circuit 140.
  • Communication interface 150 can be any interface that enables communication between the mobile device 105 and external devices, machines and/or elements.
  • communication interface 150 includes, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver (e.g., Bluetooth, cellular, NFC), a satellite communication transmitter/receiver, an infrared port, a USB connection, or any other such interfaces for connecting mobile device 105 to other computing devices and/or communication networks such as the Internet.
  • Such connections can include a wired connection or a wireless connection (e.g., 802.11).
  • mobile device 105 can communicate with one or more mobile devices 160A-N (collectively mobile devices 160).
  • the mobile devices 160 transmit and/or receive data to/from the mobile device 105, thereby preferably enhancing the operation of the determination system 100, as will be described in greater detail below.
  • mobile devices 160 can be in direct communication with mobile device 105, indirect communication with mobile device 105, and/or can be communicatively coordinated with mobile device 105, as will be described in greater detail below.
  • While mobile device 160 can be practically any device capable of communication with mobile device 105, in the preferred embodiment mobile device 160 is a handheld/portable computer, smartphone, personal digital assistant (PDA), tablet computer, and/or any portable device that is capable of transmitting and receiving data to/from mobile device 105. It should also be appreciated that in many arrangements, mobile device 160 will be substantially identical, from a structural and functional perspective, to mobile device 105.
  • While FIG. 1 depicts the determination system 100 with respect to mobile device 160A and mobile device 160N, it should be understood that any number of mobile devices 160 can interact with determination system 100 in the manner described herein.
  • Sensors 145A-145N (collectively, sensors 145) are various components, devices, and/or receivers that are preferably incorporated within and/or in communication with mobile device 105. Sensors 145 preferably detect one or more stimuli, phenomena, or any other such inputs, as will be described in greater detail below.
  • Examples of sensors 145 include, but are not limited to, an accelerometer 145A, a gyroscope 145B, a GPS receiver 145C, a microphone 145D, a magnetometer 145E, a camera 145F, a light sensor 145G, a temperature sensor 145H, an altitude sensor 145I, a pressure sensor 145J, a proximity sensor 145K, a near-field communication (NFC) device 145L, a compass 145M, and a tactile sensor 145N.
  • mobile device 105 can preferably receive one or more inputs from one or more sensors 145 in order to determine an in- vehicle role of a user of mobile device 105 and/or to selectively restrict the operation of the mobile device.
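  • A minimal sketch of how such inputs might be sampled in one place is shown below; the sensor callables are placeholders for whatever platform-specific APIs actually expose accelerometer 145A, GPS receiver 145C, and the like:

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class SensorReading:
    sensor: str
    value: Any

def sample_sensors(sensors: Dict[str, Callable[[], Any]]) -> List[SensorReading]:
    """Poll each available sensor once and collect the readings for later processing."""
    return [SensorReading(name, read()) for name, read in sensors.items()]

readings = sample_sensors({
    "accelerometer": lambda: (0.2, 9.8, 0.1),          # stand-in for accelerometer 145A
    "gps": lambda: {"lat": 40.71, "speed_mps": 12.0},  # stand-in for GPS receiver 145C
})
```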
  • database/server 162 is preferably a computing and/or storage device, and/or a plurality of computing and/or storage devices, that contain(s) information, such as determination characteristics, that can be relevant to the determination of an in-vehicle role of a user of mobile device 105.
  • A vehicle data system 164, such as an on-board diagnostic (OBD) computer or computing device (e.g., OBD-I, OBD-II), an engine control unit (ECU), a roll system, an airbag system, a seat-weight sensor system, a seat-belt sensor system, and/or an anti-lock braking system (ABS), can also be in communication with mobile device 105.
  • Vehicle data system 164 preferably provides data and/or information from the vehicle itself that can also be relevant to various determinations disclosed herein, such as the determination of an in-vehicle role of a user of mobile device 105, as will be described in greater detail below.
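  • As one hypothetical example of how such vehicle-originated data could contribute to a determination, a weighted and belted front passenger seat makes it more plausible that a device user in the cabin is a passenger; the threshold below is an arbitrary placeholder, not a real calibration:

```python
def passenger_seat_occupied(seat_weight_kg, seatbelt_latched):
    """Illustrative check over seat-weight and seat-belt sensor data obtained from a
    vehicle data system such as vehicle data system 164."""
    return seat_weight_kg > 20.0 and seatbelt_latched

print(passenger_seat_occupied(62.0, True))  # -> True
```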
  • Mobile devices 160, database/server 162, and/or vehicle data system 164 can be in periodic or ongoing communication with mobile device 105 through a computer network such as the Internet 166.
  • mobile devices 160, database/server 162, and/or vehicle data system 164 can be in periodic or ongoing direct communication with mobile device 105, such as through communications interface 150, thus not requiring the presence of a network (such as the Internet 166) in order to initiate and maintain communications.
  • determination system 100 can take the form of a hardware unit that has circuits that are manufactured or configured for a particular use. This type of hardware can perform operations without needing program code to be loaded into a memory from a computer readable storage device to be configured to perform the operations.
  • mobile device 105 can take the form of a circuit system, an application specific integrated circuit (ASIC), a programmable logic device, or some other suitable type of hardware configured to perform a number of operations.
  • With a programmable logic device, the device is configured to perform any number of operations.
  • the device can be reconfigured at a later time or can be permanently configured to perform any number of operations.
  • programmable logic devices include, for example, a programmable logic array, programmable array logic, a field programmable logic array, a field programmable gate array, and other suitable hardware devices.
  • software modules 130 can be omitted because the processes for the different embodiments are implemented in a hardware unit.
  • determination system 100 and/or mobile device 105 can be implemented using a combination of processors found in computers and hardware units.
  • Processor 110 can have a number of hardware units and a number of processors that are configured to execute software modules 130. In this example, some of the processors can be implemented in the number of hardware units, while other processors can be implemented in the number of processors.
  • a bus system can be implemented and can be comprised of one or more buses, such as a system bus or an input/output bus.
  • the bus system may be implemented using any suitable type of architecture that provides for a transfer of data between different components or devices attached to the bus system.
  • communications interface 150 can include one or more devices used to transmit and receive data, such as a modem or a network adapter.
  • Embodiments and/or arrangements can be described in a general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • Turning now to FIG. 2A, a flow diagram is described showing a routine 201 that illustrates a broad aspect of a method for determining an in-vehicle role of a user of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • The implementation is a matter of choice, dependent on the requirements of the device (e.g., size, energy consumption, performance, etc.). Accordingly, the logical operations described herein are referred to variously as operations, structural devices, acts, or modules.
  • the process begins at step 210 where processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives a first input, such as from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150.
  • the first input originates from one or more identifying events that are perceptible to at least one of sensors 145, user interface 172, operating system 176, and/or communication interface 150.
  • Examples of such an input include, but are not limited to, an acceleration input that originates from an acceleration event (e.g., the speeding up or slowing down of a car) that is perceived by accelerometer 145A, a change in geographic location input that originates from a location changing event (e.g., the movement from one place to another) that is perceived by GPS receiver 145C, and/or one or more instances of user interaction (e.g., typing) that are detected by user interface 172.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, analyzes at least the first input, such as to identify one or more determination characteristics within the first input, including but not limited to user determination characteristics.
  • User determination characteristics are one or more aspects originating at and/or derived from an input that provide insight regarding the in-vehicle role and/or identity of the user that is exerting control over and/or otherwise associated with a mobile device, such as mobile device 105.
  • For example, where the first input reflects typing detected by user interface 172 (such as the typing of an SMS message), determination module 170 can analyze the typing to identify one or more user determination characteristics (that is, characteristics that contribute to a determination of the identity of the particular user that is associated with mobile device 105, as will be described below). In this case, determination module 170 can analyze the typing patterns within the first input (such as the time interval between the typing of individual letters in the SMS message, the average time interval between the typing of individual letters in the SMS message, and/or the variability among one or more time intervals between the typing of individual letters in the SMS message).
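  • The interval statistics referenced above could be computed along the following lines; this is a simplified sketch in which keystroke timestamps are assumed to have already been collected from the user interface:

```python
from statistics import mean, pstdev

def typing_characteristics(keystroke_times):
    """Derive simple user determination characteristics from keystroke timestamps (seconds)."""
    intervals = [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]
    return {
        "mean_interval": mean(intervals),
        "interval_variability": pstdev(intervals),   # low value = consistent typing
        "max_interval": max(intervals),
    }

# Example: timestamps of individual letters typed into an SMS message.
print(typing_characteristics([0.00, 0.35, 0.71, 1.02, 2.90, 3.20]))
```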
  • At step 230, the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, computes one or more determination factors (that is, factors that reflect and/or suggest one or more determinations that can be arrived at with respect to one or more of the mobile device, its location, the user, and/or the vehicle).
  • a probability can be computed, based on the user determination characteristics, that the in-vehicle role of the user of mobile device 105 is a driver and/or that the in-vehicle role of the user of the mobile device 105 is a passenger.
  • the user determination characteristics identified at step 220 can provide varying degrees of certitude as to the identity or role of a user. So, continuing the example provided with regard to step 220, while, on the one hand, significant time intervals between typed letters can indicate that the in-vehicle role of the user is a driver, on the other hand if the time intervals in between the various letters are, on average, consistent and/or substantially similar this can indicate that the user is not necessarily distracted (due to being a driver), but rather is a passenger and is simply not adept at typing.
  • In such a case, the computed probability for such user determination characteristic(s) preferably reflects a lesser degree of certainty that the user is a driver (and/or a passenger), accounting for the potentially conflicting indications from the various user determination characteristics.
  • Accordingly, where the time intervals between typed letters are, on average, consistent and/or substantially similar, processor 110 executing software modules 130 preferably computes a probability that the in-vehicle role of the user of mobile device 105 is a passenger.
  • Conversely, where the time intervals between typed letters are significant and/or highly variable, processor 110 executing software modules 130 preferably computes a probability that the in-vehicle role of the user of mobile device 105 is a driver (being that the user determination characteristics appear consistent with the activity of a driver within a vehicle). It should be appreciated that because ranges exist for a particular user determination characteristic (such as typing consistency), a probability of an in-vehicle role is preferably computed, reflecting a degree of certainty that the user of the mobile device is a driver and/or that the user of the mobile device is a passenger.
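  • A toy sketch of how the characteristics above might be mapped to a driver probability follows; the weights and thresholds are invented for illustration and are not taken from the disclosure:

```python
def driver_probability(mean_interval, interval_variability):
    """Map typing-interval characteristics to a rough probability that the user is a driver.

    Long average intervals push the estimate toward "driver" (distracted typing), while
    low variability pushes it toward "passenger" (merely a slow but steady typist).
    """
    p = 0.5
    if mean_interval > 1.0:          # long pauses between letters
        p += 0.25
    if interval_variability < 0.2:   # consistent pacing suggests an undistracted typist
        p -= 0.25
    return min(max(p, 0.0), 1.0)

print(driver_probability(mean_interval=1.4, interval_variability=0.8))  # -> 0.75
```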
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, transforms an operation state of the mobile device 105 based on the determination factors (such as the probability computed at step 230), and/or outputs at least one operation state based on the at least one determination factor, and/or outputs at least one in-vehicle role of the user based on at least one determination factor, and/or outputs at least one in-vehicle location of the mobile device 105 based on at least one determination factor, and/or outputs at least one result based on the at least one determination factor.
  • processor 110 can coordinate the disabling of one or more features of the mobile device 105, such as the disabling of any and/or all features that enable the entry of text into mobile device 105.
  • existing safety risks can be reduced by preventing a user who has been determined to be likely to be a driver of a vehicle from using various regular functions of mobile device 105 that are likely to distract the user and increase safety risks while driving and/or are restricted and/or prohibited based on the vehicle's current (or most recently known) location, as preferably determined in conjunction with GPS 145C.
  • one or more other transformations to the operation state of mobile device can be similarly applied based on the computed probability.
  • notifications can be provided at the mobile device 105
  • notifications can be transmitted to third parties (notifying a third party, such as a law enforcement agency, of the in-vehicle role of the user of mobile device 105 and/or of the particular operation of the mobile device 105, such as that typing is being performed upon mobile device 105)
  • instructions can be provided to third parties (such as a cellular service provider) to change an operation state of mobile device 105 (such as temporarily disabling the communication ability of mobile device 105), and/or one or more applications executing or executable on mobile device 105 can be disabled (such as a text messaging application).
  • the operations corresponding to transforming step 240 can be customized and/or configured in relation to various probabilities computed at step 230. That is, certain transformations of the operation state of mobile device 105 (for example, notifying law enforcement authorities) may only be appropriate when there is a high probability (such as greater than 90%) that the in-vehicle role of a user of mobile device 105 is a driver (and further that the driver is interacting with mobile device 105 in an illegal manner while driving), while other transformations may be appropriate even for lower degrees of probability (for example, it may be appropriate to provide a warning notification at mobile device 105 even for a 60% probability that the user is a driver).
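  • A minimal sketch of such a configurable threshold-to-transformation mapping follows; the specific cutoffs and action names are hypothetical and not drawn from the disclosure.

```python
# Sketch of configurable probability thresholds; actions and cutoffs are
# hypothetical, not mandated by the disclosure.
ACTIONS_BY_THRESHOLD = [
    (0.90, "notify_authorities"),   # only at very high certainty
    (0.75, "disable_text_entry"),
    (0.60, "show_warning"),
]

def transformations_for(driver_probability):
    """Return every configured transformation whose threshold is met."""
    return [action for cutoff, action in ACTIONS_BY_THRESHOLD
            if driver_probability >= cutoff]

print(transformations_for(0.65))   # ['show_warning']
print(transformations_for(0.93))   # all three actions
```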
  • transformations can be employed preemptively, wherein the transformation is applied even before a prohibited interaction (e.g., typing into an SMS program) occurs, thereby avoiding restricted or prohibited interaction with mobile device 105, even at the first instance.
  • the user can configure how (that is, the type of transformation) and when (that is, the probability threshold that must be met in order to trigger the transformation) the operation of mobile device 105 is to be transformed.
  • a third party can establish such configurations.
  • a regulatory agency can dictate that one or more transformations be employed on some or all mobile devices when a particular probability threshold that a user of the device is a driver is met.
  • a car insurance provider can provide incentives to its customers who utilize one or more transformations and/or probability thresholds suggested and/or dictated by the insurance company.
  • FIG. 2B a flow diagram is described showing a routine 202 that illustrates a further aspect of a method for determining an in-vehicle role of a user of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • more or fewer operations can be performed than shown in the figures and described herein, and that these operations can be performed in a different order than those described herein.
  • certain of the operations of FIG. 2B can be performed while others are not, and further that, in certain arrangements, certain operations can be performed in a sequence other than that depicted in FIG. 2B.
  • The process begins at step 210, where a first input of the first device 105 is received, and proceeds to step 220, where the first input is analyzed.
  • Steps 210 and 220 have already been described above with reference to FIG. 2A and thus will not be further elaborated upon here as their operation is substantially identical to steps 210 and 220 described above.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives a second input from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150.
  • examples of such an input include, but are not limited to, an input corresponding to an acceleration perceived by accelerometer 145A, and/or an input corresponding to a change in geographic location as perceived by GPS receiver 145C.
  • the second input is analyzed by processor 110 executing determination module 170, in a manner substantially similar to that described above with reference to step 220, in order to identify one or more determination characteristics such as user determination characteristics within the second input.
  • determination module 170 can analyze the accelerations to identify one or more user determination characteristics within the second input.
  • determination module 170 can analyze various patterns within the second input (such as the time and duration of acceleration and deceleration).
  • Certain patterns (such as frequent periods of sustained forward acceleration interspersed with periodic intervals of rapid and/or brief forward deceleration) can indicate that the user of mobile device 105 is likely traveling in, if not operating, a car, which often follows such an acceleration/deceleration pattern.
  • the context and significance of one or more other user determination characteristics can be better evaluated and/or quantified.
  • the typing patterns of a user determined to be traveling in a moving car are, on average, of greater significance in determining whether the user of the device is a driver/passenger.
  • the typing patterns of a user of a mobile device 105 that has been determined not to be traveling in a moving car can be understood to be, on average, of lesser significance in determining whether the user of the device is a driver/passenger.
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, can compare the determination characteristics such as user determination characteristics identified within the first input (such as those identified at step 220) with the determination characteristics such as user determination characteristics identified within the second input (such as those identified at step 222). In doing so, one or more patterns, correlations and/or relationships can be identified between the user determination characteristics of the first input and the user determination characteristics of the second input.
  • the typing patterns identified at step 220 can be compared with the acceleration/deceleration patterns identified at step 222. In doing so, patterns, correlations, and/or relationships between the typing patterns and acceleration/deceleration patterns can be identified.
  • If time intervals between typed characters and/or typing inconsistencies increase at the same time as substantial and/or sudden forward and/or lateral acceleration and/or deceleration occurs, this can further indicate that the user of mobile device 105 is a driver.
  • Being that, for a driver to engage in a maneuver with sudden acceleration and/or deceleration, the driver is expected to have temporarily stopped typing due to the increased attention a driver must pay to his driving activities, if such accelerations correlate closely with inconsistent and/or slower typing speeds, and/or such accelerations occur just prior to typing delays, this can be a strong indication that the user of mobile device 105 is a driver.
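  • The following sketch illustrates one way such a correlation could be quantified: counting how often a sudden acceleration event is followed shortly by a pause in typing. The function name, thresholds, and sample data are illustrative assumptions.

```python
# Illustrative sketch: fraction of sudden accel/decel events that are followed
# shortly by a pause in typing; a high ratio supports the "driver" hypothesis.
def accel_typing_correlation(accel_events, keystroke_times,
                             pause_threshold=1.5, window=2.0):
    """accel_events: timestamps of sudden maneuvers; keystroke_times: key timestamps."""
    # Start time of each "long" gap between consecutive keystrokes
    pauses = [a for a, b in zip(keystroke_times, keystroke_times[1:])
              if (b - a) >= pause_threshold]
    if not accel_events:
        return 0.0
    followed = sum(1 for t in accel_events
                   if any(0.0 <= p - t <= window for p in pauses))
    return followed / len(accel_events)

# Two of the three maneuvers are followed by a typing pause within 2 s
print(accel_typing_correlation(
    [5.0, 20.0, 41.0],
    [4.0, 4.5, 6.2, 9.0, 9.4, 21.0, 23.5, 40.0, 40.5, 44.0]))
```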
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, compares determination characteristics such as user determination characteristics (including, but not limited to, the user determination characteristics from the first input, as identified at step 220, and/or the user determination characteristics from the second input, as identified at step 222) with stored determination characteristics such as user determination characteristics, such as those stored at one or more databases, such as database 174 (that is local to mobile device 105) and/or database/server 162 (that is external to mobile device 105).
  • Stored user determination characteristics can be archived user determination characteristics that have been retained from previous user determinations that have been performed, can be generated based on statistical analyses of previous user determinations, and/or can be defined or established independent of any particular previous user determination.
  • the processor 110 can more accurately compute the probability that the in-vehicle role of the user of mobile device 105 is a driver or that the in-vehicle role of the user of mobile device 105 is a passenger.
  • various identified user determination characteristics can be compared to such stored determination characteristics (e.g., highly predictive typing patterns). If the identified determination characteristics closely correlate to highly reliable/predictive stored determination characteristics, the identified determination characteristics can be similarly considered highly reliable and this correlation can further enhance the reliability of the computation of a probability regarding the in-vehicle role of a particular user. Additional examples and illustrations of such comparisons are provided below at EXAMPLE 1.
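  • A minimal sketch of such a comparison against stored determination characteristics (e.g., archived profiles kept at database 174 and/or database/server 162) follows; the similarity measure and the sample profiles are hypothetical.

```python
# Hypothetical sketch of comparing identified characteristics against stored
# (archived or statistically derived) determination characteristics.
def similarity_to_stored(identified, stored):
    """Crude normalized similarity (1.0 = identical) between two feature dicts."""
    keys = set(identified) & set(stored)
    if not keys:
        return 0.0
    diffs = [abs(identified[k] - stored[k]) / (abs(stored[k]) + 1e-9) for k in keys]
    return max(0.0, 1.0 - sum(diffs) / len(diffs))

stored_driver_profile = {"mean_interval": 1.4, "interval_stdev": 0.8}
observed = {"mean_interval": 1.5, "interval_stdev": 0.75}
print(similarity_to_stored(observed, stored_driver_profile))  # close to 1.0
```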
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives an input from another device, such as one of mobile devices 160.
  • the input received from mobile device 160 is preferably from among the various types of inputs referenced above at steps 210 and 221 (for example, an acceleration input that originates from an acceleration event that is perceived by accelerometer 145A, and/or a change in geographic location input that originates from a location changing event that is perceived by GPS receiver 145C), and thus will not be described at length here.
  • this input originates at mobile device 160 (that is, a device external to mobile device 105), and thus the input from mobile device 160 is preferably received by mobile device 105 through communication interface 150.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, processes an input of mobile device 105 against an input of one or more mobile devices 160.
  • one or more determination characteristics such as user determination characteristics can be identified within the input of the first mobile device 105.
  • various typing patterns and/or tendencies (referenced above) of mobile device 160 can be processed against similar typing patterns/tendencies of mobile device 105 (or, alternatively, various typing patterns and/or tendencies of mobile device 105 can be processed against similar typing patterns/tendencies of mobile device 160).
  • processor 110 can analyze and/or identify the degree to which the input from mobile device 105 deviates from the input received from mobile device(s) 160, in a manner similar to the comparison discussed above at step 224 (except that here the input of mobile device 105 is being processed against an input received from another mobile device 160, as opposed to comparing one user determination characteristic with stored characteristics).
  • Even if the typing tendencies of mobile device 105 are relatively inconsistent, if, when processing the typing tendencies received from mobile device(s) 160 against those of mobile device 105, it is revealed that the typing across many or all of the devices 105 and 160 is similarly inconsistent, this can indicate that there is not necessarily a high probability that the user of mobile device 105 is a driver, despite the inconsistent typing inputs received at the device 105 (rather, such inconsistent typing may be the result of the various devices 105 and 160 traveling along an off-road or bumpy road, which would make consistent typing difficult, even for passengers in a vehicle).
  • Conversely, even if the typing tendencies of mobile device 105 are relatively consistent, if, when processing such input(s) against inputs from mobile device(s) 160, it is revealed that the typing tendencies of the user of mobile device 105 are actually relatively inconsistent by comparison, this can indicate a higher probability that the user of mobile device 105 is a driver of a vehicle (even though the input from mobile device 105, in and of itself, may not have generated the same conclusion).
  • various limitations and/or filters can be imposed upon the receiving at step 225 and/or the processing at step 226, to ensure the most accurate results possible. That is, while in certain arrangements it can be beneficial to receive inputs from practically any mobile device 160 that is capable of communication with mobile device 105, in other arrangements it can be preferable to limit the number of devices and/or inputs that are received by mobile device 105 on the basis of one or more factors, to ensure that the inputs being received by mobile device 105 from such external devices 160 are those that can be expected to be of greatest relevance. Examples of factors that can be considered in imposing such limitations and/or filters include proximity to mobile device 105 and/or similarity/compatibility with mobile device 105.
  • device 160 in processing the typing tendencies of device 105 against those of another device 160, it can be preferable to ensure that device 160 is in close proximity to mobile device 105 (such as through a comparison of the location coordinates obtained from their respective GPS receivers or by causing one or more of the mobile devices to emit one or more tones and/or signals (e.g., an audio tone) that can then be received on other mobile devices that are in close proximity, as described in detail in EXAMPLE 2), thereby establishing a high likelihood that mobile device 105 and mobile device 160 are operating within the same vehicle (and are thus subjected to substantially identical conditions).
  • It is preferable that the inputs from mobile device 105 and those of mobile device 160 that are to be processed against one another/compared be substantially synchronized from a chronological standpoint. That is, it is preferable that each of the various inputs be associated with a particular time (and that the source of such time be a central clock, such as a server, which can synchronize the various devices, though it should be understood that in other arrangements one or more of devices 105, 160 can broadcast timing data that enables the calibration of the various devices), thereby enabling the processing of inputs from mobile device 105 with inputs from mobile device 160 that correspond to the same point in time.
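  • The sketch below illustrates one way of chronologically aligning timestamped samples from two devices that are assumed to share a common clock (e.g., synchronized via a server); the function name and skew tolerance are assumptions.

```python
# Sketch: pair each timestamped sample from device 105 with the nearest-in-time
# sample from device 160, discarding pairs whose clocks differ by too much.
import bisect

def align_samples(samples_a, samples_b, max_skew=0.5):
    """samples_*: sorted lists of (timestamp, value). Returns paired values."""
    times_b = [t for t, _ in samples_b]
    pairs = []
    for t, value_a in samples_a:
        i = bisect.bisect_left(times_b, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(samples_b)]
        if not candidates:
            continue
        j = min(candidates, key=lambda j: abs(times_b[j] - t))
        if abs(times_b[j] - t) <= max_skew:
            pairs.append((value_a, samples_b[j][1]))
    return pairs

a = [(0.0, 0.1), (1.0, 0.9), (2.0, 0.2)]    # (time, value) from device 105
b = [(0.1, 0.1), (1.4, 0.8), (2.05, 0.3)]   # (time, value) from device 160
print(align_samples(a, b))                   # [(0.1, 0.1), (0.9, 0.8), (0.2, 0.3)]
```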
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives an input from vehicle data system 164, such as an on board diagnostic (OBD) computer or computing device (e.g., OBD-I, OBD-II, ECU, roll system, airbag system, and/or an ABS), preferably through communication interface 150.
  • vehicle data system 164 preferably provides data and/or information originating at the vehicle itself.
  • vehicle data system 164 can provide one or more inputs that reflect various actions or events, such as a car's acceleration and/or deceleration, steering, braking, and/or any other such car-related operations. Such inputs can provide further insight into determining the in-vehicle role of a user of mobile device 105, as will be described below.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, processes an input of mobile device 105 against an input of vehicle data system 164, in a manner similar to that described above with respect to step 226.
  • an input of mobile device 105 such as various typing tendencies (as illustrated above) is processed against an input from vehicle data system 164 that preferably pertains to an operation of a car (e.g., the car accelerating, braking, and/or swerving) and which is qualitatively different than the input of mobile device 105 because vehicle data system 164 cannot necessarily detect the various stimuli perceptible to mobile device 105, owing in part to the fact that mobile device 105 is preferably not fixed relative to the car's coordinate system.
  • the various inputs are compared and/or synchronized from a chronological standpoint, substantially in the manner described above with respect to step 226.
  • inputs from mobile device 105 can be processed against inputs from vehicle data system 164 (which, in turn, originate at the car itself), thereby enabling the association of various inputs from mobile device 105 with events such as the accelerating, braking, and/or swerving of the car.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, computes one or more determination factors, such as a probability, based on the various determination characteristics, that the in-vehicle role of the user of mobile device 105 is a driver and/or a probability that the in-vehicle role of the user of the mobile device 105 is a passenger, substantially in the manner described in detail above with regard to step 230.
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, transforms an operation state of the mobile device 105 and/or outputs at least one operation state based on the at least one determination factor, and/or outputs at least one in-vehicle role of the user based on at least one determination factor, and/or outputs at least one in-vehicle location of the mobile device 105 based on at least one determination factor, and/or outputs at least one result based on the at least one determination factor, as also described in detail above.
  • FIG. 2C a flow diagram is described showing a routine 203 that illustrates a further aspect of a method for determining an in-vehicle role of a user of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • the process begins at step 210 where an input is received from mobile device 105, and proceeds to step 220 where the first input is analyzed.
  • a determination factor such as a probability is computed, based on the various determination characteristics, as referenced above. Steps 210, 220, and 230 have already been described above with reference to FIG. 2A and thus will not be further elaborated upon here as their operation is substantially identical to steps 210, 220, and 230 described above.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, outputs one or more results based on the determination factor(s) computed at step 230.
  • results can include, but are not limited to, one or more files, notifications, and/or communications that contain and/or reflect operations of the mobile device 105, and/or one or more operation states of the first mobile device 105, and the outputting of such results can be dependent upon a certain probability threshold, as described in detail herein.
  • For example, if mobile device 105 is configured to output results (such as that the in-vehicle role of a user is a driver/passenger) only when the probability (that is, the reliability) of such results is greater than 75%, and mobile device 105 determines with a probability of 80% that the in-vehicle role of a user of mobile device 105 is a driver, a corresponding notification can be outputted reflecting such results.
  • the referenced results can be output based on the calculated probability that the user of mobile device 105 is a driver or that the user of mobile device 105 is a passenger. It should be understood that the outputting referenced at this step can be employed in a number of ways depending on the particular arrangement.
  • the referenced results can be transmitted to an external device or third-party, such as a law enforcement agency, insurance company, and/or other device 160 (for example, a parent receiving results from a child's device 105), via communication interface 150.
  • the outputting of such results to a law enforcement agency, insurance company, and/or another device 160 can ensure that such entities are notified of the various operations and/or operation states of a particular mobile device 105, especially when it has been determined that it is highly probable that device 105 is being operated by a driver of a car.
  • such results can be outputted to mobile device 105 itself in any number of ways, such as by logging the operations and/or operation state(s) of mobile device 105 at times/intervals when it has been determined, for instance, that there is a high probability that the user of mobile device 105 is a driver. Irrespective of whether the results are output to a third-party or to the device 105 itself, it should be appreciated that the outputting of such results can provide insight regarding the operations of the mobile device 105 at a particular moment and/or interval, which can be utilized later, such as in investigating car crashes.
  • a law-enforcement agency can review such outputted results to determine whether the driver was engaged in various distracting activities during and/or near the time of the crash (e.g., mobile device 105 was being used by driver with 93% certainty, was being used in a hand-held state with 94% certainty, and was being used for texting with 100% certainty at least 30 seconds prior to the crash).
  • the various referenced results can be outputted across any and/or all degrees of probability, thereby ensuring a comprehensive log of user results, reflecting the various operations and/or operation states throughout the course of operation of the mobile device 105.
  • a device can be determined to be operated by a driver or a passenger based on one or more applications running at the device (and/or that are installed on the device but not necessarily running).
  • FIG. 43 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • At 4305 one or more operational aspects of a user device can be identified.
  • the one or more operational aspects can include one or more applications executing at the mobile device. Moreover, in certain implementations the one or more operational aspects can include one or more resources being utilized at the mobile device.
  • the one or more operational aspects can be processed, such as in order to determine one or more characteristics of a user of the user device.
  • one or more operations can be initiated, such as based on the one or more characteristics.
  • a device determined to be within a moving vehicle that is running a 'SatNav' application can be determined to be relatively more likely to be operated by a driver than by a passenger.
  • Moreover, based on resources that the device is using (e.g., whether Bluetooth is paired on the device), it can be determined that the device is likely to be about to start a trip.
  • Such a determination can be further accounted for in determining the in-vehicle role of a user associated with such a device, for example, based upon the in-vehicle role that such a user (or a group of users) can be determined to be likely to play after using such an application.
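  • A purely illustrative sketch of how application usage could nudge a prior driver probability follows; the application names and weights are assumptions, not values taken from the disclosure.

```python
# Illustrative only: adjust the prior driver probability based on which
# applications are running; the weights are hypothetical.
APP_DRIVER_WEIGHT = {
    "satnav": +0.20,        # navigation app running -> more likely a driver
    "video_player": -0.15,  # watching video -> more likely a passenger
    "ebook_reader": -0.10,
}

def adjust_for_apps(driver_probability, running_apps):
    for app in running_apps:
        driver_probability += APP_DRIVER_WEIGHT.get(app, 0.0)
    return max(0.0, min(1.0, driver_probability))

print(adjust_for_apps(0.5, ["satnav"]))          # 0.7
print(adjust_for_apps(0.5, ["video_player"]))    # 0.35
```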
  • routine 300 that illustrates an aspect of a method for enabling, disabling and/or modifying at least a feature of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • At step 310, processor 110 executing one or more of software modules 130, including, preferably, determination module 170, monitors one or more inputs from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150.
  • inputs include, but are not limited to, an acceleration input, a geographic location input, and/or one or more instances of user interaction (e.g., typing).
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170 defines an operation signature based on the inputs monitored at step 310.
  • the defined operation signature preferably reflects a normal operation state and/or a range of normal operation states of the mobile device 105.
  • an operation signature or profile can be defined that reflects one or more values or ranges of values that have been identified as the normal or regular operation of the device 105, the normal or regular usage of the device 105 by a particular user, and/or the normal or regular usage of device 105 and/or a series or class of such devices by a particular user and/or a series or range of users. For example, after monitoring inputs from the accelerometer 145A of mobile device 105 for a period of time, a range of normal acceleration inputs of the device 105 can be determined.
  • Similarly, a range of normal typing tendencies (e.g., typing speeds, typing consistency, etc., as described herein) can be determined.
  • these various inputs can be used to define an operation signature for the mobile device 105 that reflects the normal operation and/or operating range of the device 105.
  • the referenced operation signature is not limited to a single input or type of input, but rather in certain arrangements can be made up of signatures of two or more types of inputs.
  • a normal operation signature can be made up of a range of normal accelerometer inputs together with a range of normal typing tendencies.
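  • The following sketch models such an operation signature as per-input "normal" ranges learned from monitored history, along with a simple deviation score; the representation and constants are illustrative assumptions, not the disclosed format.

```python
# Sketch of an "operation signature" as per-input ranges (mean +/- k*stdev)
# learned from monitored inputs; entirely illustrative.
from statistics import mean, pstdev

def build_signature(history, k=2.0):
    """history: dict of input name -> list of monitored values."""
    signature = {}
    for name, values in history.items():
        m, s = mean(values), pstdev(values)
        signature[name] = (m - k * s, m + k * s)     # "normal" range
    return signature

def deviation(signature, sample):
    """Fraction of sampled inputs that fall outside their normal range."""
    outside = sum(1 for name, value in sample.items()
                  if name in signature and not
                  (signature[name][0] <= value <= signature[name][1]))
    return outside / max(1, len(sample))

sig = build_signature({"typing_interval": [0.3, 0.4, 0.35, 0.45],
                       "accel_magnitude": [0.1, 0.2, 0.15, 0.12]})
print(deviation(sig, {"typing_interval": 1.8, "accel_magnitude": 0.14}))  # 0.5
```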
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, further monitors one or more second inputs from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150, substantially in the manner described above with respect to step 310.
  • processor 110 executing one or more of software modules 130 processes one or more of the second input(s) (monitored at step 330) against one or more of the operation signature(s) (defined at step 320).
  • processor 110 executing one or more of software modules 130 can identify a degree of deviation and/or a degree of correlation between the second input(s) and the operation signature(s).
  • various typing patterns and/or tendencies (referenced above) of mobile device 105 can be processed against an operation signature reflecting a range of normal typing tendencies of mobile device 105, as referenced above with respect to step 320 and described in detail herein.
  • processor 110 can analyze and/or identify the degree to which the one or more second input(s) (monitored at step 330) deviate from the operation signature of mobile device 105 (defined at step 320).
  • such a deviation from the operation signature (which reflects the normal and/or expected operation of mobile device 105) can indicate that the mobile device 105 is being operated under conditions that distract the user from interacting normally with the device 105, such as during driving.
  • Even if the monitored typing tendencies of mobile device 105 are relatively inconsistent from an objective standpoint, upon processing such inputs against an operation signature (such as an operation signature reflecting that the typing tendencies of the user of mobile device 105 are also generally inconsistent, such as in the case of a new user who is not adept at typing), it can be revealed that the monitored typing tendencies/inputs (which otherwise reflect significantly inconsistent typing tendencies) actually correlate substantially with the operation signature of mobile device 105.
  • the correlation with the operation signature (which reflects the normal and/or expected operation of mobile device 105) can indicate that the mobile device 105 is actually being operated under relatively normal/consistent conditions, and thus should not be assumed to be operated under distracting conditions, such as driving, as may have otherwise been concluded based on the inconsistent typing tendencies alone.
  • steps 310 and 320 can be repeated on a periodic and/or constant basis, in order to further refine the operation signature defined at step 320. That is, it can be appreciated that in certain scenarios a user's interaction with mobile device 105 can change and/or improve over time (such as in the case of a new user whose typing skills gradually improve with repeated use of device 105), and thus the operation signature of mobile device 105 should be adjusted, modified, and/or refined accordingly. It can be appreciated that this process can be achieved in any number of ways. In one arrangement, mobile device 105 can be configured to periodically reset its operation signature (such as every month), such that only recent operations are accounted for in defining the operation signature.
  • further inputs that are monitored can be factored into and/or averaged with previously monitored inputs, thereby updating an existing operation signature.
  • further inputs can be factored into and/or averaged with previously monitored inputs, and the more recent inputs can be weighted to place greater emphasis upon them, thereby updating an existing operation signature while accounting for the fact that more recent inputs are of greater value in defining an accurate operation signature of a mobile device 105.
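  • A minimal sketch of such recency-weighted refinement, using an exponentially weighted moving average so that newer inputs count for more, is shown below; the blending factor is an assumption.

```python
# Sketch of recency-weighted refinement of an operation-signature value:
# an exponentially weighted moving average (higher alpha = more weight on
# recent behaviour). Alpha is illustrative.
def update_signature_value(previous_estimate, new_value, alpha=0.3):
    """Blend a newly monitored value into the running estimate."""
    if previous_estimate is None:
        return new_value
    return alpha * new_value + (1 - alpha) * previous_estimate

estimate = None
for observed_interval in [0.40, 0.38, 0.36, 0.30, 0.28]:   # typing gets faster over time
    estimate = update_signature_value(estimate, observed_interval)
print(round(estimate, 3))   # estimate drifts toward the newer, faster intervals
```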
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, adjusts one or more operations of mobile device 105.
  • this adjustment corresponds to the degree of deviation and/or the degree of correlation between one or more monitored inputs (such as the input monitored at step 330) and one or more operation signature(s) of mobile device 105 (such as the operation signature defined at step 320). It should be understood that in certain arrangements, this adjustment is similar to the transformation of the operation state of mobile device 105 discussed in detail above with respect to step 240, and/or the outputting of one or more results discussed in detail above with respect to step 250.
  • processor 110 can coordinate the disabling of one or more features of the mobile device 105, such as the disabling of any and/or all features that enable the entry of text into mobile device 105, while in other arrangements notifications (such as warning notifications) can be provided at or transmitted to mobile device 105.
  • various of the adjustments employed at step 350 can be customized and/or configured in relation to various degrees of correlation and/or deviation identified at step 340.
  • certain adjustments of the operation of mobile device 105 may only be appropriate when a high degree of deviation from a normal operation state (that is, from the operation signature) is identified (and, preferably, further that such a deviation is indicative of restricted or prohibited activity on the part of the user of mobile device 105).
  • Other adjustments, such as providing a notification at mobile device 105 may be appropriate even for lower degrees of correlation/deviation, as described in detail above.
  • information regarding whether and how a device user uses his/her device can be measured and/or recorded, whether or not the device user is determined to be the driver (or irrespective of a likelihood that the device user is the driver).
  • Such technologies/techniques can be used, among other things, to log and analyze how large a "distracted driving" problem the device user has, i.e., based on how much and in what ways the device's usage deviates from a particular device usage policy (e.g., a state or federal law, a company policy, a parental policy, etc.) in conjunction with various determinations pertaining to the identity of the user (e.g., as a driver of a vehicle).
  • such technologies can utilize information obtained directly from the device (e.g., information about calls made or received, texts made or received, applications used, URLs visited, intermediate or final sources or destinations of data transmissions, device movement information, location information, network connectivity or visibility information, GPS, accelerometer, etc.) and/or information obtained from third parties (e.g., mobile server providers, telematics, etc.).
  • the referenced information can also be saved to the device and/or to a remote server.
  • the frequency or quantity (e.g., number of screen touches, number of key presses, time with screen on, CPU activity, application switches, number of human or device spoken words) of device use is used as an indicator to determine vehicle role.
  • Such frequency can be measured in various ways (e.g., (i) relative to others; (ii) relative to self (e.g., same device); (iii) relative to self as driver; and (iv) relative to self as passenger) to make such determination.
  • If a device user uses her device, on average, at least once every five (5) minutes (during waking hours) throughout the day and, for a period in which the device was in a vehicle, she does not use it for 60 minutes, there is a relatively greater likelihood that her in-vehicle role was that of a driver.
  • If a device user has, on average, 5 device interactions (e.g., key presses) per minute (during waking hours) throughout the day and, for a period in which the device was in a vehicle, the quantity of key presses was 400 over 60 minutes, there is a relatively greater likelihood that her in-vehicle role was that of a passenger.
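  • The sketch below mirrors the two examples above by comparing in-trip usage against the user's everyday baseline; the cutoff ratios and function name are assumptions.

```python
# Illustrative comparison of in-trip device use against the user's everyday
# baseline; cutoff values are hypothetical.
def role_hint_from_usage(baseline_per_min, trip_interactions, trip_minutes):
    trip_rate = trip_interactions / trip_minutes
    ratio = trip_rate / baseline_per_min if baseline_per_min else 0.0
    if ratio < 0.25:
        return "leans driver"      # far less use than usual
    if ratio > 1.0:
        return "leans passenger"   # as much or more use than usual
    return "inconclusive"

print(role_hint_from_usage(0.2, 0, 60))    # ~1 use / 5 min baseline, none in trip
print(role_hint_from_usage(5.0, 400, 60))  # 5 presses/min baseline, ~6.7/min in trip
```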
  • routine 400 that illustrates an aspect of a method of determining at least one of an in-vehicle role of a user of a first mobile device and/or a handheld state of the first mobile device and/or a vehicle class of a vehicle containing the first mobile device using a central machine in accordance with at least one embodiment disclosed herein.
  • routine 400 is primarily directed to determinations performed at central machine 168, as will be described in greater detail below.
  • While any one of the particular steps, operations, and/or functions may be described throughout the present disclosure as being performed at and/or upon a particular machine or device (such as mobile device 105, mobile device 160, and/or central machine 168), such description should be understood as being exemplary and/or illustrative and not limiting. Accordingly, it can be appreciated that any and all steps, operations, and/or functions described herein with regard to a particular device and/or machine (such as central machine 168) should be understood to be similarly capable of being employed at another device and/or machine (such as mobile device 105), substantially in the manner described herein, without departing from the scope of the present disclosure.
  • At step 410, processor 4110 of central machine 168 (depicted in FIG. 1) executing one or more of software modules 4130, including, preferably, determination module 4170, receives (preferably through communication interface 4150) a first notification from mobile device 105, the first notification preferably corresponding to an input originating from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150 of mobile device 105.
  • the first input originates from one or more identifying events that are perceptible to at least one of sensors 145, user interface 172, operating system 176, and/or communication interface 150 of mobile device 105, such as an acceleration input perceived by accelerometer 145A, a change in geographic location input perceived by GPS receiver 145C, and/or one or more instances of user interaction (e.g., typing) detected by user interface 172.
  • a notification such as a computer readable file containing information that reflects the input itself as well as information that is pertinent to the input (such as the time, date, and a unique identifier such as a MAC address of mobile device 105) is preferably generated by mobile device 105 based on the input, and is transmitted by communication interface 150 of mobile device 105 to central machine 168, preferably via communications network 166.
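  • A minimal sketch of such a notification payload (input value plus time and a device identifier) is shown below; the field names are assumptions, and the actual transmission over communications network 166 is not shown.

```python
# Sketch of a notification payload of the kind described above, serialized for
# transmission to the central machine. Field names are assumed.
import json, time, uuid

def build_notification(sensor_name, value, device_id=None):
    return json.dumps({
        "device_id": device_id or uuid.getnode(),   # stand-in for a MAC address
        "timestamp": time.time(),                   # epoch seconds
        "sensor": sensor_name,
        "value": value,
    })

payload = build_notification("accelerometer", [0.1, 9.7, 0.3])
print(payload)   # would be sent via the communication interface to central machine 168
```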
  • central machine 168 communicates with mobile device 105 directly, such as through a direct Bluetooth pairing and/or through an ad-hoc wireless network.
  • processor 4110 of central machine 168 executing one or more of software modules 4130, including, preferably, determination module 4170, analyzes at least the first notification to identify one or more determination characteristics, such as one or more user determination characteristics and/or one or more handheld state characteristics and/or one or more vehicle determination characteristics within the notification.
  • determination characteristics are one or more aspects originating at and/or derived from one or more input(s) and/or notification(s) that provide insight regarding the in-vehicle role and/or identity of the user that is exerting control over and/or otherwise associated with a mobile device, such as mobile device 105.
  • handheld state characteristics are one or more aspects originating at and/or derived from one or more input(s) and/or notification(s) that provide insight regarding the handheld state of a mobile device, such as mobile device 105, such as whether mobile device 105 is being operated by a user in a handheld or non-handheld state (for example, various angles and/or sudden changes perceived by gyroscope 145B can indicate that mobile device 105 is being operated in a handheld state by a user).
  • Performing this analysis at central machine 168, as opposed to at mobile device 105 (from which the notification analyzed at this step originates), provides several advantages in certain scenarios over having the analysis performed at mobile device 105, as described at step 220.
  • the analysis performed at the present step can be quite resource intensive, and shifting this analysis to central machine 168 ensures that the system resources of mobile device 105 remain relatively free.
  • central machine 168 can be operated by a law enforcement agency, and, as such, a centralized approach, such as the one described with respect to FIG. 4, can provide such an agency with the ability to monitor and/or adjust the operational capacity of mobile device 105 as necessary, as will be described in greater detail below.
  • this centralized approach can be easier to implement with respect to regulatory compliance and preventing tampering. It is expected that regulatory authorities interested in implementing a solution such as that described with reference to FIG. 4 are more likely to succeed in obtaining compliance from mobile device manufacturers and/or mobile communications providers when requiring a solution that, from the standpoint of mobile device 105, primarily requires only periodic notification transmissions from mobile device 105 to central machine 168. In addition, such a solution can be more difficult for users to manipulate, modify, and/or 'hack,' given that the primary analysis is performed by central machine 168, as opposed to mobile device 105.
  • processor 4110 of central machine 168 executing one or more of software modules 4130 computes one or more determination factor(s), such as probabilities, based on the determination characteristics identified at step 420.
  • various user determination characteristics and/or handheld state characteristics are generated. For example, as referenced above, in certain arrangements user determination characteristics are identified (such as typing tendencies, as referenced above), while in other arrangements handheld state characteristics (such as one or more angles detected by mobile device 105, as referenced above) can be identified, while in yet other arrangements both user determination characteristics and handheld state characteristics can be identified.
  • one or more probabilities are computed by central machine 168, reflecting a probability that the in-vehicle role of the user of mobile device 105 is a driver, a probability that the in-vehicle role of the user of the mobile device 105 is a passenger, a probability that the handheld state of the mobile device 105 is handheld, and/or a probability that the handheld state of the mobile device 105 is non-handheld, all in a manner substantially similar to that described in detail above with respect to step 230.
  • the user determination characteristics and/or handheld state characteristics identified at step 420 can provide varying degrees of certitude as to the identity/role of a user and/or the handheld state of mobile device 105.
  • a probability that an in-vehicle role of the user of mobile device 105 is a driver/passenger and/or a probability that a handheld state of mobile device 105 is handheld/non-handheld preferably reflects a degree of certainty across such a probability spectrum, as described in detail above.
  • processor 4110 of central machine 168 executing one or more of software modules 4130, including, preferably, determination module 4170, adjusts an operational capacity of mobile device 105 based on the one or more determination factor(s), such as at least one of the probabilities computed at step 430, substantially in the manner described in detail above with respect to step 240.
  • While the description pertaining to step 240 above relates to adjustments and transformations initiated by mobile device 105 upon itself, here the adjustments to the operation of mobile device 105 are initiated by central machine 168.
  • central machine 168 can transmit an operation command, such as a command in the form of one or more notifications, messages, and/or instructions that reflect various adjustments that are to be made to the operational capacity of mobile device 105, and such adjustments can then be applied to mobile device 105 upon its receipt of the transmitted operation command(s), and/or their application/execution, effecting similar and/or identical results as those described in detail above with respect to step 240 (e.g., providing notifications at mobile device 105, restricting operation of mobile device 105, and/or transmitting notifications from mobile device 105 to third parties).
  • central machine 168 can adjust the operational capacity of mobile device 105 based primarily and/or exclusively on adjustments made at and/or by central machine 168 which, in turn, preferably effect or otherwise adjust the operational capacity of mobile device 105.
  • Where central machine 168 is controlled by a mobile communications provider (such as a cellular communications provider), an adjustment can be implemented at central machine 168 whereby one or more of the services provided by the mobile communications provider to mobile device 105 (such as phone, SMS, and/or data services) can be interrupted and/or otherwise adjusted or modified, thereby affecting the operation of mobile device 105 through an adjustment occurring at central machine 168 based on the probability computed at step 430.
  • substantially similar adjustments can be implemented upon and/or through one or more service providers that provide one or more services, whether directly or indirectly, to mobile device 105.
  • various voice over IP (VoIP) providers, such as Skype, enable users to achieve voice communications (akin to telephone calls) over data connections (such as an internet connection).
  • the 'Viber' app enables similar SMS capabilities over an internet connection.
  • the methods and systems disclosed herein can be configured such that any necessary adjustment can be implemented upon and/or through the requisite service provider (for example, by limiting the calling capabilities of Skype and/or the SMS capabilities of Viber) substantially in the manner described in detail above.
  • processor 4110 of central machine 168 executing one or more of software modules 4130, including, preferably, determination module 4170, outputs one or more results and/or operation states of mobile device 105 based on the one or more determination factor(s), such as the probability or probabilities computed at step 430, substantially in the same manner as described in detail above with respect to step 250.
  • step 450 primarily pertains to operations initiated and/or performed by central machine 168.
  • the one or more operation state(s) outputted by central machine 168 reflect the operation state(s) of mobile device 105 (for example, that the device is being used in a handheld state and/or that the device is being used by a driver).
  • the outputting of the operation state(s) can be further based upon one or more determination factor(s), such as one or more probabilities computed at step 430, which reflect the likelihood or degree of certainty that a user of mobile device 105 is a driver/passenger and/or that mobile device 105 is being used in a handheld/non-handheld state.
  • the operation state of mobile device 105 can be outputted by central machine 168 to an external device or third-party, such as a law enforcement agency, insurance company, and/or other device 160, via communication interface 4150.
  • Such functionality can be advantageous in jurisdictions where administrative regulations recommend and/or require that entities such as mobile communications providers provide information to law enforcement agencies that reflects the unauthorized usage of mobile devices such as mobile device 105 while the user of the device is driving.
  • such functionality can be advantageous to insurance companies when processing an insurance claim.
  • central machine 168 (which receives and retains the various pertinent notifications/inputs provided by the various devices such as mobile device 105) can output the necessary data, such as the operation state of mobile device 105, thereby assisting the insurance company to make necessary decisions regarding the validity of a particular insurance claim.
  • routine 500 that illustrates an aspect of a method of determining a vehicle class of a vehicle using a first mobile device in accordance with at least one embodiment disclosed herein.
  • determining the vehicle class of a particular vehicle can provide further insight and accuracy in the determination of an in-vehicle role of a user of a mobile device.
  • the various steps and/or operations that make up routine 500 share substantial similarities to those described above in connection with FIGs. 2A-C and 3-4.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives a first input from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, analyzes at least the first input to identify one or more vehicle determination characteristics within the first input.
  • vehicle determination characteristics are one or more aspects originating at and/or derived from an input that provide insight regarding the class of the vehicle within, upon, or in relation to which mobile device 105 is traveling.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, computes at least one determination factor based on the vehicle determination characteristic(s), such as a probability that the vehicle corresponds to a particular vehicle class.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, outputs a vehicle class based on the one or more determination factor(s), such as the probability or probabilities computed at step 530.
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, can transform an operation state of mobile device 105 based in whole or in part on the determination factor(s), such as the probability computed at step 530.
  • This operation can be further appreciated when employed in conjunction with a determination of an in-vehicle role of a user of mobile device 105, such as that depicted in FIGs. 2A-C and described in detail above.
  • Upon determining (preferably to a certain minimum probability) that a mobile device 105 is traveling within a certain class of vehicle, there can be little need to further determine the in-vehicle role of the user of the device 105 (e.g., if the vehicle is an airplane, all device usage can be prohibited, irrespective of a particular user's in-vehicle role).
  • a transformation (substantially similar to that described in detail above with respect to step 240) can be employed based upon both the computed probability that mobile device 105 is traveling in a car, together with the computed probability (such as that described in detail above with respect to step 230) that the in- vehicle role of a user of mobile device 105 is a driver.
  • processor 110 can coordinate various transformations and/or adjustments to the operation(s) of mobile device 105, as described in detail above with respect to step 240.
  • various of the referenced transformations can be employed only when either one or both of the probabilities pertaining to the vehicle class within which mobile device 105 is traveling and/or the in-vehicle role of the user of mobile device 105 is a driver meet and/or exceed a certain minimum threshold.
  • In certain implementations, various physical properties (e.g., interior height, width, volume, varying signal absorption and reflections at different frequencies, etc.) can be used to distinguish between vehicle classes (e.g., bus vs. car, train vs. car), whether alone or in conjunction with various other vehicle class determination techniques, such as those described herein and/or known to those of ordinary skill in the art.
  • one or more of a device's speakers can be used to emit one or more sounds (e.g., having a certain frequency, length, volume, etc.) (and/or signals at other frequencies, e.g., RF) and one or more of the device's microphones can be used to hear/perceive such sounds and/or the echoes thereof as the sounds emitted reflect off the objects and structure inside the vehicle (and whose flight time can be longer than the time it takes to travel from the device speaker to the device microphone).
  • For example, the time it takes for an echoed sound emitted by a device's speakers to be perceived by the device's microphones is generally shorter for cars (which typically have a shorter distance to sides and ceiling) than for buses (which typically have a longer distance to sides and ceilings), and the total energy of the echoes returned to the device will tend to be larger for cars than for trains because of a car's smaller interior volume.
  • the sounds emitted are chosen to be ones that tend to reflect better (or worse) off car interiors than off train interiors so that, by measuring the time and/or energy of the echoes returned, the vehicle class can be more accurately identified.
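  • A purely illustrative sketch of classifying vehicle class from such echo measurements (round-trip delay and returned echo energy) follows; the cutoff values are invented for the sketch and are not taken from the disclosure.

```python
# Illustrative classification from echo measurements: round-trip delay (ms) and
# normalized returned echo energy. Cutoffs are hypothetical.
def vehicle_class_from_echo(echo_delay_ms, echo_energy):
    """Return a (vehicle_class, confidence) guess from echo characteristics."""
    if echo_delay_ms < 8 and echo_energy > 0.6:
        return ("car", 0.8)            # small, reflective cabin
    if echo_delay_ms > 15 or echo_energy < 0.3:
        return ("bus_or_train", 0.7)   # larger, more absorptive interior
    return ("unknown", 0.5)

print(vehicle_class_from_echo(5.0, 0.75))   # ('car', 0.8)
print(vehicle_class_from_echo(22.0, 0.2))   # ('bus_or_train', 0.7)
```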
  • the sound echoing technique described herein can also be used to better determine whether a device is present within a vehicle or whether a device (and/or its user) is present outside a vehicle and/or engaged in another form of activity (with or without the use of various other trip detection and/or activity recognition techniques, such as those described herein or known to those of ordinary skill in the art).
  • inputs originating from one or more motion sensors (e.g., accelerometer, gyroscope, etc.) of a device can be processed to determine the class of the vehicle within which the device is located (e.g., car, bus, train, etc.) and, based upon the determined vehicle class, one or more restrictions (and/or no restrictions) can be employed at/in relation to the device.
  • the forward acceleration of a train is an order of magnitude lower than the forward acceleration of a car, while the length of sustained acceleration of a train is longer than that of a car
  • a policy of no restrictions is applied to the device on the premise that it is a passenger device (while different policies can be applied to those few devices of users who are train conductors, bus drivers etc.).
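A minimal Python sketch of such a motion-sensor-based classification follows; the sample rate, acceleration thresholds, and duration thresholds are hypothetical values chosen only to illustrate the "weaker but longer sustained acceleration" heuristic described above.

```python
# Illustrative sketch: distinguish a train from a car using the magnitude and
# duration of sustained forward acceleration derived from accelerometer samples.
# The sample rate, thresholds, and orientation-resolved "forward" axis are
# hypothetical assumptions.
import numpy as np

FS = 50  # accelerometer sample rate (Hz)

def longest_sustained_accel(forward_accel: np.ndarray, min_g: float) -> float:
    """Longest run (in seconds) during which acceleration exceeds min_g."""
    above = np.abs(forward_accel) > min_g * 9.81
    longest = run = 0
    for flag in above:
        run = run + 1 if flag else 0
        longest = max(longest, run)
    return longest / FS

def classify(forward_accel: np.ndarray) -> str:
    # Cars: stronger (> ~0.15 g) but shorter bursts of acceleration.
    # Trains: roughly an order of magnitude weaker, but sustained far longer.
    if longest_sustained_accel(forward_accel, min_g=0.15) > 3.0:
        return "car"
    if longest_sustained_accel(forward_accel, min_g=0.02) > 20.0:
        return "train"
    return "unknown"

# Synthetic demo: a gentle 0.03 g acceleration sustained for 30 seconds.
train_like = np.full(30 * FS, 0.03 * 9.81)
print(classify(train_like))  # train
```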
•	routine 600 illustrates an aspect of a method of determining a handheld state of a mobile device in accordance with at least one embodiment disclosed herein.
  • routine 600 is primarily directed to determinations performed at mobile device 105, as will be described in greater detail below.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, receives a first input from one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150.
•	the first input originates from one or more identifying events that are perceptible to at least one of sensors 145, user interface 172, operating system 176, and/or communication interface 150 of mobile device 105, such as an acceleration input perceived by accelerometer 145A and/or a change in orientation input perceived by gyroscope 145B.
•	the first input can also comprise a series of inputs (such as a number of acceleration inputs over a certain period of time) and/or a combination of inputs (such as a number of acceleration inputs and orientation inputs over a period of time).
•	handheld state characteristics are one or more aspects originating at and/or derived from one or more input(s) that provide insight regarding the handheld state of a mobile device such as mobile device 105 (e.g., whether mobile device 105 is being operated by a user in a handheld or non-handheld state).
  • various orientations and/or sudden changes perceived by gyroscope 145B can indicate that mobile device 105 is being operated in a handheld state by a user.
•	a relatively constant pattern of inputs from accelerometer 145A and/or gyroscope 145B can indicate that mobile device 105 is positioned in a relatively stable manner, thus indicating that it is being operated in a non-handheld state.
•	processor 110 executing one or more of software modules 130, including, preferably, determination module 170, computes one or more determination factor(s), based on the handheld state determination characteristic(s), such as a probability that the handheld state of mobile device 105 is handheld, or that the handheld state of mobile device 105 is non-handheld.
•	at step 630, such probabilities are computed, reflecting a probability that the handheld state of the mobile device 105 is handheld, and/or a probability that the handheld state of the mobile device 105 is non-handheld, in a manner substantially similar to that described in detail above with respect to steps 230 and 430.
•	the handheld state characteristics identified at step 620 can provide varying degrees of certitude as to the handheld state of mobile device 105. Accordingly, because a particular handheld state characteristic (such as a device shake pattern) can span a range of values, a probability that a handheld state of mobile device 105 is handheld/non-handheld preferably reflects a degree of certainty across such a probability spectrum, as described in detail above.
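By way of illustration, a simple Python sketch of mapping sensor variability onto a handheld probability is shown below; the window, constants, and logistic mapping are hypothetical and stand in for whatever probability model an implementation might use.

```python
# Illustrative sketch: map the variability of recent gyroscope readings onto a
# probability that the device is being operated handheld. The midpoint and
# scale of the logistic mapping are hypothetical.
import math
import statistics

def handheld_probability(gyro_magnitudes: list[float]) -> float:
    """gyro_magnitudes: recent angular-rate magnitudes (rad/s)."""
    if len(gyro_magnitudes) < 2:
        return 0.5  # not enough data: no opinion either way
    # A hand-held device shakes: larger spread in rotation rate.
    spread = statistics.pstdev(gyro_magnitudes)
    # Logistic squash around a 0.05 rad/s midpoint: small spread -> low
    # probability of handheld, large spread -> high probability.
    return 1.0 / (1.0 + math.exp(-(spread - 0.05) / 0.02))

mounted = [0.001, 0.002, 0.001, 0.002, 0.001]   # cradle/dock: very stable
in_hand = [0.05, 0.30, 0.12, 0.45, 0.08, 0.22]  # typical hand tremor/shake
print(round(handheld_probability(mounted), 2))  # ~0.08 (likely non-handheld)
print(round(handheld_probability(in_hand), 2))  # ~0.99 (likely handheld)
```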
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170, outputs one or more handheld states of mobile device 105 based on the one or more determination factor(s), such as the probability or probabilities computed at step 630, substantially in the same manner as described in detail above with respect to steps 250, 450, and 550.
  • a notification can be provided at mobile device 105 indicating that it has been determined that the device is being so operated.
  • a notification can further include a suggestion/instruction that the user of the device 105 refrain from further use of the device, in deference to regulatory guidelines.
  • the handheld state of mobile device 105 can be output to a third-party, such as a law enforcement agency, under appropriate circumstances. Additionally, as noted in detail above with respect to step 250, such outputting can, in certain arrangements, be contingent upon a certain minimum probability being computed (e.g., a 90% or greater probability that a mobile device 105 is operating in a handheld state), while in other arrangements the handheld state can be outputted across any and/or all degrees of probability.
  • Described herein in various implementations are techniques (including methods, systems, etc.) that are operative to improve the accuracy of various determinations, including determinations pertaining to whether a device is in a particular context (or not) and/or reducing the power consumed/expended (such as by the device) in order to make such determination(s).
  • Such techniques can be advantageous, for example, in situations in which such determinations are to be made repeatedly over extended periods of time. It should be understood that, in certain situations, particular sources of information that may otherwise be considered/analyzed in order to make certain context-based determinations/decisions may not be available.
  • GPS information may not be available, and such inputs are thus unavailable to be used in determining movement and/or speed by/in relation to a device.
  • usage of such sensors can consume substantial amounts of power at the device.
  • FIG. 27 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more power charging aspects can be determined, such as in relation to a user device.
  • the one or more power charging aspects can be processed. In doing so, a context of the user device can be determined.
  • one or more operations can be initiated. In certain implementations, such operations can be initiated based on the context. Moreover, in certain implementations, such operations can be initiated in relation to the user device.
  • FIG. 28 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more indications can be received.
  • each of the one or more indications can correspond to a perception of one or more access points.
  • each of the one or more indications can be associated with an operational context of a device.
  • the one or more indications can be processed.
  • such indications can be processed in order to determine one or more characteristics associated with one or more respective perceptions of at least one of the one or more access points.
  • the one or more characteristics can be associated with one or more operational contexts.
  • one or more operations can be initiated. In certain implementations, such operations can be initiated based on a determination of at least one of the one or more characteristics in relation to a user device. Moreover, in certain implementations such operations can be initiated in relation to the user device based on at least one of the one or more operational contexts.
  • FIG. 29 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more operational characteristics of a device can be identified.
  • one or more contextual determination methods can be selectively employed, such as in relation to the device. In certain implementations, such determinations can be employed based on the operational characteristics.
  • an inapplicability of at least one of the one or more contextual determination methods can be determined, such as in relation to the device. In certain implementations such an inapplicability can be determined based on at least one of the operational characteristics.
  • FIG. 30 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more indications can be received.
  • each of the one or more indications can correspond to a perception of one or more access points, such as in relation to a user device.
  • At least one of the one or more indications can include one or more locations, such as those corresponding to the at least one of the one or more access points.
  • the one or more indications can be processed, such as in relation to one or more access point records. In doing so, one or more characteristics of at least one of the one or more access points can be determined. In certain implementations, the one or more characteristics can include (a) nomadic or (b) stationary.
  • the one or more locations can be processed, such as in order to determine a locational variability, such as with respect to the at least one of the one or more access points.
  • the locational variability can include a difference between (a) a location associated with a connection to the at least one of the one or more access points and (b) a location associated with a disconnection from the at least one of the one or more access points.
  • a context of the user device can be determined. In certain implementations, such a context can be determined based on the one or more characteristics. In certain implementations, the context can include at least one of: (a) within a vehicle, (b) not within a vehicle, (c) within a trip, or (d) not within a trip.
  • FIG. 31 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
•	one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • each of the one or more indications can correspond to a perception of one or more access points, such as in relation to a user device.
  • the one or more indications can be processed, such as in order to determine a variability with respect to the one or more indications.
  • the variability can include a quantity of distinct access points perceived during a chronological period.
  • a context of the user device can be determined, such as based on the variability.
  • FIG. 32 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more indications can be received.
  • each of the one or more indications can correspond to a perception of one or more access points, such as in relation to a user device.
  • the one or more indications can be processed, such as in order to determine a quantity of access points perceptible to the user device.
  • a context of the user device can be determined, such as based on the quantity.
  • the context can include (a) an urban location or (b) a rural location.
  • a trip determination threshold can be selected, such as with respect to the user device. In certain implementations, the trip determination threshold can be selected based on the context.
  • FIG. 44 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more indications can be received, each of the one or more indications corresponding to a perception of one or more access points in relation to a user device.
  • the one or more indications can be processed, such as to determine one or more characteristics of at least one of the one or more access points.
  • a context of the user device can be determined, such as based on the one or more characteristics.
  • one or more characteristics can be associated with one or more operational contexts.
  • one or more operations can be initiated, such as based on a determination of at least one of the one or more characteristics in relation to a user device. In certain implementations, the one or more operations can be initiated in relation to the user device based on at least one of the one or more operational contexts.
  • a trip determination threshold can be selected with respect to the user device, such as based on a context.
  • FIG. 45 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more inputs associated with a user device can be processed, such as to determine one or more mobility characteristics of the user device.
  • a context of the user device can be determined, such as based on a determination that the one or more mobility characteristics comprise mobile operation and do not comprise operation consistent with walking.
  • FIG. 46 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • an indication can be received, corresponding to a perception of a nomadic access point in relation to a user device.
  • a quantity of devices with respect to which the nomadic access point is perceptible can be determined.
•	a context of the user device can be determined, such as based on the quantity.
  • various techniques described herein can pertain to determinations as to whether or not a device is present/operational 'within a vehicle' (e.g., a moving vehicle) and/or 'within a trip' (e.g., a trip within a moving vehicle) (it should be noted that such terms can be used interchangeably). In doing so, various aspects of the power consumed/expended by a device (such as in order to make one or more of the referenced determinations) can be reduced.
  • one or more of the referenced techniques can also be implemented to determine or otherwise detect other contexts while consuming/expending relatively less power, e.g., determining the in- vehicle role of a user of a particular device, whether or not a device is present within a class or on school grounds, what mode/class of transportation/vehicle the device is present within (e.g., car, train, bus, bicycle, walking, etc.), whether or not the user is a vulnerable road user, and more.
  • information related to one or more aspects of the battery and/or the power consumption of a device can be used to determine whether the device is in a vehicle.
•	a determination that the charge level of a battery that is connected to power is increasing relatively slowly can indicate an increased likelihood that the device is present within a vehicle.
  • the referenced technique(s) can also incorporate aspects relating to the power being consumed by the device (e.g., in order to operate).
  • such technique(s) can further incorporate various comparisons and/or machine learning techniques with respect to characteristics pertaining to the battery of a device (e.g., in relation to the performance/operation of the specific battery and/or the specific device and/or across a population of identical and/or similar/comparable model batteries and/or devices over time).
  • a variability in the rate at which a battery charges can be used to determine a context in which a device is operating (e.g., within a vehicle). Such determinations can be made based on the variability in the rate at which the battery of a device charges, as a device that is connected to a car charger is relatively more likely to display a relatively more variable battery charge rate than a device connected to a wall outlet.
  • one or more aspects of a context within which a device is present/operating can be determined based on the voltage across a battery of the device and/or changes thereto. Such determinations can be made, for example, based on the fact that the voltage across a battery of a device that is connected to a car charger is likely to exhibit relatively more variability than that of a device connected to a wall outlet.
•	various determinations can be performed in order to identify/distinguish between various types of power charge sources, such as solar battery chargers and fixed wall chargers, based on the variability of the power that such sources are able to provide to the device battery (e.g., a power source providing a relatively smaller current, a relatively more variable current, and/or a current having a relatively more variable voltage can be determined to be relatively more likely to be present within a vehicle, and vice versa).
•	various determinations pertaining to the context within which a device is present/operating can be achieved based on the temperature of a battery of the device. Moreover, changes in such temperature(s) can be used to determine one or more contexts of the device (e.g., whether the device is within a vehicle). Such determination(s) can be made, for example, in light of the fact that the temperature of a device that is connected to a car charger is likely to exhibit different characteristics (e.g., more extreme temperatures) than those exhibited by/observed in relation to a device connected to a wall outlet.
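For illustration, the following Python sketch combines charge-current and voltage variability into a coarse in-vehicle indication; the thresholds and sample values are hypothetical.

```python
# Illustrative sketch: infer whether a charging device is likely in a vehicle
# from the variability of its charge current/voltage samples. A car charger
# tends to supply a less stable current than a wall outlet. All thresholds
# are hypothetical.
import statistics

def likely_in_vehicle(current_ma: list[float], voltage_v: list[float]) -> bool:
    current_var = statistics.pstdev(current_ma) / max(statistics.mean(current_ma), 1.0)
    voltage_var = statistics.pstdev(voltage_v)
    # High relative current variability and noticeable voltage ripple suggest
    # a vehicle power source (engine load, alternator, cigarette-lighter port).
    return current_var > 0.10 or voltage_var > 0.15

wall = ([1180, 1195, 1202, 1190, 1198], [5.02, 5.01, 5.02, 5.02, 5.01])
car  = ([540, 610, 470, 650, 505],      [4.70, 5.05, 4.62, 5.10, 4.80])
print(likely_in_vehicle(*wall))  # False
print(likely_in_vehicle(*car))   # True
```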
  • one or more determinations pertaining to a hands-free state of a device can be made based on one or more aspects of the power connection status of the device.
•	a device that is connected to a power source is relatively more likely to be operated by a driver than by a passenger (in light of the fact that, for example, a vehicle is relatively more likely to have one or more power connections that are compatible with a device operated by a driver, and the locations of such power sources, such as a cradle/dock having a power connection, are relatively more likely to be positioned in closer proximity to a driver than to a passenger, etc.).
•	various factors, aspects, metrics, etc., including but not limited to: (i) whether a device was connected to power at any point during a trip, (ii) the time into the trip at which it was so connected, (iii) the length of time for which it was connected, (iv) the battery level at the time at which it was connected, (v) the magnitude of the electric current at which it charged, and (vi) the charging port to which it was connected (e.g., as reported by the vehicle to the device or as learned by one or more devices over time), can all assist/be factored in determining if the device is a driver device or a passenger device.
•	drivers may tend to connect to power more frequently than passengers because drivers are often more familiar, tend to have chargers appropriate for their devices, and often have only one charger available. Accordingly, if a device is determined to have been connected to power during a trip, it is more likely to be a driver device than a passenger device. The earlier in a trip a device was connected to power (the driver connecting by force of habit, the driver connecting because she knew how to do so, etc.), the more likely it is to be a driver device. The more battery remaining in a device when it is connected to power, the more likely it is to be a driver device (because, for example, a passenger is less likely to impose on a driver unless there is a greater battery problem on their device). In vehicles that have multiple charging ports, the role of a device user that connects to power can be determined from the port that she connects her device to. For example, a driver is less likely to connect her device to a passenger charging port.
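A minimal Python sketch of such a factor-based driver/passenger determination follows; the feature names, weights, and scoring behavior are hypothetical and could instead be learned over time as described above.

```python
# Illustrative sketch: combine several power-connection factors into a simple
# driver-vs-passenger score. Feature names, weights, and values are
# hypothetical; a deployed system could learn them instead.
def driver_likelihood(connected_during_trip: bool,
                      seconds_into_trip_when_connected: float,
                      battery_pct_when_connected: float,
                      charging_port: str) -> float:
    score = 0.5
    if connected_during_trip:
        score += 0.15                       # drivers connect more often
        if seconds_into_trip_when_connected < 120:
            score += 0.10                   # early connection: force of habit
        if battery_pct_when_connected > 50:
            score += 0.10                   # plenty of battery left, yet charging
        if charging_port == "driver_port":
            score += 0.15                   # vehicle-reported port location
        elif charging_port == "passenger_port":
            score -= 0.25
    return min(max(score, 0.0), 1.0)

print(driver_likelihood(True, 45, 80, "driver_port"))      # high -> likely driver
print(driver_likelihood(True, 900, 15, "passenger_port"))  # lower -> likely passenger
```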
•	when a device is connected to a power source and its context (e.g., within a vehicle vs. outside of a vehicle, within school grounds vs. outside of school grounds, etc.) has been identified and/or can be determined, one or more further determinations can be made with respect to the ongoing state/status of such a device. For example, it can be determined that such a device is relatively likely to remain within or otherwise maintain the same/comparable context at least until the device is no longer connected to a power source.
•	one or more instance(s) (e.g., a "short burst") of additional determination techniques (e.g., contextual determinations, such as those described herein), such as those that may entail higher or additional power consumption, can be initiated or otherwise employed.
  • a determination of the context within which a device is present/operating can be achieved initially (i.e., upon connecting the device to a power source) such that, having identified/determined such a context, various aspects of the power consumption of the device can be reduced (e.g., over the medium and long terms).
•	one or more sensors of the device (e.g., GPS, WiFi radio, cellular radio, Bluetooth radio, accelerometer, gyroscope, etc.) can be turned on/activated, such as for a period of time, in order to obtain/receive one or more inputs based upon which it can be determined whether the device is (or is not) present/operating within a vehicle.
•	based on a determination that the device is not present/operating within a vehicle, it can be further determined that the device remains not present/operating within a vehicle as long as and/or at least until such time as the device is determined to no longer be connected to a power source (and/or to the power source to which the device was initially connected). Based on a determination that the device is present/operating within a vehicle, it can be further determined that such a state/status is likely to continue, and thus that the device will continue to be present/operating within the vehicle, at least until such time as the device is determined to no longer be connected to a power source (and/or to the power source to which the device was initially connected).
  • the manner in which one or more determinations are made can be further configured, modified, adjusted, etc., based on/with respect to whether or not the device can be determined to be (or has or has not recently been) connected to power (e.g., the various techniques that are used to compute the referenced determinations can be adjusted or changed based on whether a device is and/or isn't connected to power).
•	when a mobile device is connected to power, it may use its GPS radio more intensively (without depleting the device's battery), either directly and/or indirectly (e.g., by leveraging other OS processes or user processes that may do so), to help determine one or more contexts in which the mobile device is present (e.g., trip start, trip end, vehicle class, in-vehicle role, in-vehicle location, etc.).
•	one or more of the techniques described herein operate based on the assumption that power sources (with respect to which the various determinations are computed) are non-nomadic (i.e., that such power sources/supplies are fixed in a particular location). Accordingly, while in certain situations nomadic power sources/supplies can also be implemented (e.g., external batteries, portable power sources, etc.), the use of such nomadic power sources can be determined/detected and/or otherwise accounted for in a number of ways.
•	the usage of such nomadic power sources can be determined based on the level of current or voltage identified at the device/battery (and/or changes thereto), and/or via the magnetometer of the device (and/or changes thereto), and/or based on one or more movement patterns that can be identified as being consistent with the use of a heavier device, as can be determined based on one or more inputs originating from various motion sensors of the device (e.g., accelerometer, gyroscope).
  • such nomadic power sources are relatively less common and/or can entail various barriers to user adoption (e.g., cost, size, weight, being cumbersome, etc.).
  • one or more of the referenced determinations can be accumulated/maintained over time, such as with respect to a particular device, a user/user account associated with a device, etc.
•	with respect to a device and/or a user/user account that has been determined (such as on an ongoing basis) to utilize a nomadic power supply, the referenced techniques (which, for example, determine a context of a device based upon a power connection state/status of the device) can be utilized relatively more selectively and/or not at all.
  • a scan of WiFi access points that are perceptible to the device can be initiated or otherwise performed, such as at one or more intervals.
  • the number/quantity of and/or identities (e.g., service set identifiers or 'SSIDs') associated with WiFi access points that are subsequently perceptible at the device can be compared to an initial/previous number/quantity of and/or identities associated with WiFi access points perceptible at the device.
  • a determination can be made with respect to whether or not (and/or when) the device is moving, such as is described in greater detail herein. For example, in a scenario where access points A, B and C are perceptible at a device, and at a subsequent time only access points D, E and F are visible, it can be determined that it is relatively likely that the device has moved (on account of the changing identities of the access points perceptible at the device).
  • the referenced change(s) with respect to the perception of various access point(s) can provide relatively accurate determinations with respect to movement of the device, as described herein.
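By way of illustration, the following Python sketch compares the access-point identities perceptible in two successive scans to decide whether the device has likely moved; the overlap threshold is a hypothetical assumption.

```python
# Illustrative sketch: decide whether the device has likely moved by comparing
# the identities of access points seen in two successive scans.
def likely_moved(previous_scan: set[str], current_scan: set[str],
                 max_overlap: float = 0.2) -> bool:
    """Scans are sets of BSSIDs/SSIDs perceptible to the device."""
    if not previous_scan and not current_scan:
        return False  # nothing visible either time: no basis for a decision
    overlap = len(previous_scan & current_scan)
    union = len(previous_scan | current_scan)
    jaccard = overlap / union
    # Mostly-new access points imply the device changed location.
    return jaccard <= max_overlap

print(likely_moved({"A", "B", "C"}, {"D", "E", "F"}))  # True  (no overlap)
print(likely_moved({"A", "B", "C"}, {"A", "B", "D"}))  # False (mostly the same)
```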
  • the context within which a device is present/operating can be determined in relation to various aspects of the power connectivity/connection associated with the device, by using one or more machine learning techniques (as are known to those of ordinary skill in the art) to identify, such as over time, one or more patterns, properties, and/or characteristics associated with various power sources/chargers that the device connects to and/or associating such patterns, properties, and/or characteristics with one or more contexts. That is, it can be appreciated that many users can utilize different power sources/chargers in different locations (e.g., a USB charger at work, a cigarette lighter charger in the car and a wall charger at home).
•	Such power sources can have different electrical properties (current, voltage, etc.) that can be perceived at/in relation to the device (e.g., a USB charger in the office may charge the battery of a particular device at 240 mA, thereby charging the battery at a rate of 0.2%/minute, while a car charger may charge the battery at 600 mA, thereby charging the battery at a rate of 0.5%/minute, and an AC wall charger at home may charge the battery at 1200 mA, thereby charging the battery at a rate of 1%/minute).
•	for example, based on a determination that the battery of the device is being charged at a rate of 0.5%/minute concurrent with or otherwise chronologically proximate to a time with respect to which such a device can also be determined to be present/operating within a vehicle (such as a moving vehicle), the charger in use and/or the observed charge rate can be associated with an in-vehicle context, such that a subsequent observation of a comparable charge rate can indicate that the device is again relatively likely to be present/operating within a vehicle.
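For illustration, a Python sketch of matching an observed charge rate against previously learned charger profiles is shown below; the learned rates and associated contexts are hypothetical examples consistent with the figures quoted above.

```python
# Illustrative sketch: match an observed battery charge rate against charge
# rates previously learned for the user's chargers, and report the associated
# context. The learned values below are hypothetical examples.
LEARNED_CHARGERS = {
    "office_usb":  {"rate_pct_per_min": 0.2, "context": "at_office"},
    "car_charger": {"rate_pct_per_min": 0.5, "context": "in_vehicle"},
    "home_wall":   {"rate_pct_per_min": 1.0, "context": "at_home"},
}

def context_from_charge_rate(observed_rate: float, tolerance: float = 0.1) -> str:
    """observed_rate: battery percentage gained per minute while charging."""
    best, best_diff = "unknown", tolerance
    for profile in LEARNED_CHARGERS.values():
        diff = abs(profile["rate_pct_per_min"] - observed_rate)
        if diff <= best_diff:
            best, best_diff = profile["context"], diff
    return best

print(context_from_charge_rate(0.48))  # in_vehicle
print(context_from_charge_rate(0.75))  # unknown (no sufficiently close match)
```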
  • one or more connectivity state(s) of the device can be used to determine whether or not the device is present/operating within a vehicle. For example, based on a determination that the device is connected to a WiFi access point, the performance of various power consuming operations (that might otherwise be performed by/in relation the device, such as using the GPS of the device to determine whether the device is moving) can be avoided and/or limited (such as by performing such operations less frequently). Moreover, the device can continue to operate in such a fashion (i.e., avoiding/limiting the use of such operations) until the referenced WiFi connection is terminated (and/or shortly thereafter).
  • an ongoing connection to a WiFi access point can indicate that the device is relatively unlikely to be in a vehicle.
•	based on a determination that a device is (or is not) connected to a WiFi AP, it can be determined that such a device is not (or is) present within a vehicle (such as a moving vehicle).
  • one or more operations can be initiated, such as with respect to the device (e.g., employing, modifying, removing various restrictions that are employed with respect to the operation of the device).
  • one or more of the technologies described herein can incorporate one or more machine learning techniques, such as with respect to the WiFi access points that are perceptible to a particular device.
  • one or more determinations can be made, such as with respect to whether (a) an access point is static/stationary (e.g., within a home, office, etc.), as can be determined, for example, based on one or more perceptions of such WiFi access points by the device concurrent with and/or in chronological proximity to one or more determinations that indicate that the device is unlikely to be moving, or (b) an access point is dynamic/mobile (e.g., within or on a car, train, bus, etc.) based on one or more perceptions of such WiFi access points by the device concurrent with and/or in chronological proximity to one or more determinations that indicate that the device is likely to be moving.
•	the referenced determinations can be made, for example, by measuring the one or more aspects of the motion(s) that can be perceived by/in relation to the device (based on one or more inputs that can originate, for example, from information/sources such as GPS, cellular IDs, cellular RSSI, WiFi IDs, WiFi RSSI, accelerometer, gyroscope, etc.) concurrent with and/or in relatively close proximity to one or more determinations that the device is connected to (or is otherwise able to perceive or "hear" without necessarily being connected to) a WiFi access point, or through the use of a third-party database of WiFi access points, such as Skyhook, based upon which and/or in relation to which such determinations can be made.
•	the context within/in relation to which a device is present/operating (e.g., whether a trip, such as in a vehicle, has commenced, whether the device is within a moving vehicle, whether the device is with a user walking, the vehicle class of a vehicle, etc.) can also be determined in the following manner.
•	based on (a) a determination that a device is connected to a power source, and (b) a determination that one or more wireless (WiFi), Bluetooth, cellular, etc., signals (such as those originating from a WiFi access point) and/or one or more inputs originating at one or more sensors (e.g., accelerometer, gyroscope, magnetometer, proximity, pressure, light, camera, microphone, etc.) are changing (e.g., changing above a certain threshold/degree of change, such as different WiFi access point BSSIDs being determined to be coming into/out of view, cell tower signal strengths being determined to be changing more than a certain amount, etc.), it can be determined that the device is likely to be within a moving vehicle.
  • the device is relatively less likely to be in certain types of moving vehicles (e.g., bicycle, elevator) because devices are relatively less likely to be connected to power in such situations.
  • one or more of the various determination techniques described herein can be employed upon determining that a device is connected to a power source. In doing so, such determinations with respect to the context of the device can be utilized to conserve power (e.g., by deactivating, utilizing relatively less frequently and/or in a different way, and/or not utilizing at all, one or more sensors, functions, and/or operations of the device, such as those that consume power, based on the determined context of the device).
  • Such functionality can be advantageous even when a device is connected to a power source (e.g., an AC outlet) because a user may only have a limited amount of time to charge his/her device and utilizing relatively less power can enable the battery to charge more quickly.
•	whether or not a device is connected to power, based on (a) a determination that one or more wireless (WiFi), Bluetooth, cellular, etc., signals (such as those originating from a WiFi access point) are changing (e.g., changing above a certain threshold/degree of change, such as different SSIDs being determined to be coming into/out of view, signal strengths being determined to be changing more than a certain amount, etc.), and (b) a determination that the device is not moving in a manner consistent with certain activity types and/or vehicle classes (e.g., that the device is not determined to be moving in a manner consistent with a user walking, a user bicycling, a user riding an elevator, etc.), and/or the absence of a determination that the device is moving in a manner consistent with such one or more activity types and/or vehicle classes (as can be determined, for example, in the case of a user walking, using various pedometric determination techniques, as are known to those of ordinary skill in the art, or, in the case of a user in an elevator, using a pressure sensor on the device to determine a relatively significant change in pressure over a relatively short time and/or distance), it can be determined that the device is likely to be present in conjunction with certain activity types and/or within certain vehicle classes.
  • a determination that a device is, for example, present within a moving vehicle can be made even without the use of the device's GPS (which can otherwise entail considerable battery discharge).
  • determining that the device is not present with a user who is walking can be achieved by sampling an accelerometer and/or gyroscope for signs/indications/patterns of walking, which can be done at a relatively lower sampling rate (thereby consuming less battery power) than otherwise required to determine that the device is present within a moving vehicle.
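A minimal Python sketch of such low-rate walking detection follows; the sample rate, autocorrelation window, and thresholds are hypothetical assumptions.

```python
# Illustrative sketch: check a low-rate accelerometer window for the periodic
# pattern characteristic of walking (roughly 1.5-2.5 steps per second), using
# a simple autocorrelation.
import numpy as np

FS = 10  # Hz: a low sampling rate is enough to see step periodicity

def looks_like_walking(accel_magnitude: np.ndarray) -> bool:
    x = accel_magnitude - np.mean(accel_magnitude)
    if np.std(x) < 0.3:           # barely any motion at all
        return False
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac /= ac[0] + 1e-12
    # Look for a strong autocorrelation peak at a plausible step interval
    # (0.4 s to 0.8 s, i.e. lags of 4 to 8 samples at 10 Hz).
    return bool(np.max(ac[4:9]) > 0.5)

t = np.arange(0, 10, 1 / FS)
walking = 1.5 * np.sin(2 * np.pi * 2.0 * t)        # ~2 steps/s bounce
riding = 0.1 * np.random.default_rng(0).standard_normal(len(t))
print(looks_like_walking(walking))  # True
print(looks_like_walking(riding))   # False
```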
•	based on a determination that a user/device has moved more than a certain distance (e.g., more than a distance that could otherwise be attributed to GPS 'noise,' even if not moving), such as within a certain period of time, and further based on a determination that the device has not been operating in a manner that can be determined to be sufficiently consistent with walking (such as within the referenced period), the user/device can be determined to be present within a moving vehicle.
  • the various techniques described herein can be employed in order to make any number of determinations, including but not limited to: activity determination(s), vehicle movement determination(s), and/or vehicle class determination(s). In doing so, such determinations can be achieved even in scenarios/situations where determination-related 'blind spots' might otherwise be present, such as in scenarios where GPS signals are not available or otherwise reliable (e.g., in tunnels, underground garages, cities with tall buildings, indoors, inclement weather, etc.).
•	a vehicular trip can be determined based on inputs originating from a device's motion sensors by perceiving centripetal forces with magnitudes within certain ranges (e.g., 0.1g to 1g), and for certain lengths of time (e.g., more than 1 second). The accuracy/correctness of such a trip determination can be quickly verified by determining the device's location(s) and whether it recently traversed a turn in the direction and with a radius consistent with what was perceived by the motion sensors on the device.
  • a vehicular trip can also be determined based on forward acceleration of sufficient magnitude and length. For example, a device that perceives significant acceleration (or change in acceleration) along an axis for sufficiently long (e.g., more than 1 second) is likely to be in a trip.
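By way of illustration, the following Python sketch flags a probable vehicular trip from sustained centripetal force or forward acceleration, using the ranges quoted above (0.1g to 1g, more than 1 second); the sample rate is a hypothetical assumption.

```python
# Illustrative sketch: flag a probable vehicular trip when the motion sensors
# report either a sustained centripetal force in a plausible range or a
# sustained forward acceleration.
import numpy as np

FS = 20  # Hz
G = 9.81

def sustained(mask: np.ndarray, min_seconds: float) -> bool:
    run = longest = 0
    for flag in mask:
        run = run + 1 if flag else 0
        longest = max(longest, run)
    return longest / FS >= min_seconds

def probable_trip(centripetal: np.ndarray, forward: np.ndarray) -> bool:
    turning = (np.abs(centripetal) > 0.1 * G) & (np.abs(centripetal) < 1.0 * G)
    accelerating = np.abs(forward) > 0.15 * G
    return sustained(turning, 1.0) or sustained(accelerating, 1.0)

# Synthetic demo: a 3-second turn at ~0.3 g, no notable forward acceleration.
centripetal = np.concatenate([np.zeros(2 * FS), np.full(3 * FS, 0.3 * G), np.zeros(2 * FS)])
forward = np.zeros_like(centripetal)
print(probable_trip(centripetal, forward))  # True
```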
•	the described techniques can be applied in an orientation-variant way (e.g., where the device orientation is resolved using the device sensors, such as the accelerometer, gyroscope, magnetometer, and/or GPS) or in an orientation-invariant way (i.e., where acceleration across any/all axes is used to make such determination).
  • the various techniques described herein can also enable the identification/determination of vehicles moving at practically any speed (in contrast to otherwise waiting for the vehicle to reach a speed at which it can be determined that an unaided human cannot reasonably travel in order to affirmatively determine that a device is present within a moving vehicle). In employing such techniques, it can be determined that a device is present within a moving vehicle even in a scenario where the vehicle is moving at a relatively slow speed.
•	upon identifying/determining that a device is present within a moving vehicle, it can be further determined that such a device is no longer present within a moving vehicle upon identifying/determining that the device is present in conjunction with 'walking.' (It should be noted that the terms 'walking,' 'user walking,' etc., as used herein are intended to encompass any and/or all forms of unaided human movement, such as walking, running, skipping, etc.)
  • employing one or more of the referenced techniques can enable determination(s) of one or more activities, operations, presence of a device within a moving vehicle, vehicle class(es), etc., without the use of GPS.
  • the GPS of a device can, for example, be utilized only when other signals are not sufficiently strong, clear, accurate, etc., and/or to confirm the accuracy of such other determinations.
  • one or more determinations can be made with respect to vehicle class based on the perception/identification of one or more RF devices (e.g., WiFi APs, BT devices, etc.) that are in proximity of/present within the vehicle.
•	the identity (e.g., BSSID, SSID, MAC address, etc.) of the RF device can be used to determine a likelihood of a particular vehicle class (with respect to which the device is present). In doing so, it can be detected/determined that the device is present within a moving vehicle (such as in the manner described herein).
  • An identity of one or more related/comparable/similar RF devices can be identified/determined, such as based on/in relation to a database that can maintain identities of various RF devices and their respective associated vehicle classes (as determined, for example, based on inputs originating at one or more sensors of various devices, such as in the manner described herein). Moreover, one or more determinations (e.g., of a vehicle class) can be made based on similar, related, and/or comparable (even if not identical) RF device identity information, such as that stored/maintained in such a database.
•	a WiFi AP whose BSSID is numerically close to the BSSID of a WiFi AP already determined to be operating in relation to a bus can be determined to be likely to be operating in relation to a bus, too (in light of the fact that bus corporations tend to purchase in bulk and are relatively likely to receive successively/closely numbered devices).
•	a data value of an RF device, like a WiFi AP's SSID, can also be used. For example, with respect to an SSID identified/determined to be used in a public train (e.g., "AmtrakConnect"), it can be determined that another RF device having the same SSID or a similar/comparable SSID (e.g., "AmtrakConnect2") is also likely to be on a public train.
  • potential errors in such determinations that may arise can be corrected by using one or more techniques (such as those described herein) to validate the determined vehicle class in relation to such RF devices.
•	RF devices operating in relation to vehicles in certain vehicle classes (e.g., "mass" vehicle classes such as buses and trains) are relatively more likely to be perceived by multiple mobile devices, simultaneously and over time, than RF devices operating in relation to, for example, a car.
  • one or more determinations as to the class of a vehicle can be made based on the relative quantity of devices that perceive such an RF device simultaneously and/or over time.
•	the greater the number of devices determined to simultaneously and/or over time perceive a particular RF device, the relatively more likely it is that such devices are present within certain types of "mass" vehicle classes (e.g., train, bus, etc.).
  • such RF device(s) can be associated with a particular vehicle class on that basis, and/or using one or more techniques such as those described herein.
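For illustration, a Python sketch of inferring a "mass" vehicle class from the number of distinct devices that report perceiving the same RF device is shown below; the report format and device-count threshold are hypothetical.

```python
# Illustrative sketch: infer that an RF device (e.g., an onboard WiFi AP) is
# operating in a "mass" vehicle class when many distinct user devices report
# perceiving it around the same time. The threshold is hypothetical.
from collections import defaultdict

# reports: (bssid, reporting_device_id, minute_of_day) tuples, e.g. crowd-sourced
reports = [
    ("ab:cd", "dev1", 600), ("ab:cd", "dev2", 600), ("ab:cd", "dev3", 601),
    ("ab:cd", "dev4", 601), ("ab:cd", "dev5", 602), ("12:34", "dev6", 600),
]

def likely_mass_transit(bssid: str, window_start: int, window_end: int,
                        min_devices: int = 4) -> bool:
    seen = defaultdict(set)
    for ap, device, minute in reports:
        if window_start <= minute <= window_end:
            seen[ap].add(device)
    return len(seen[bssid]) >= min_devices

print(likely_mass_transit("ab:cd", 600, 602))  # True  (5 devices -> bus/train)
print(likely_mass_transit("12:34", 600, 602))  # False (1 device -> likely a car)
```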
  • the accuracy of such determinations can be further improved over time. For example, observing the same and/or similar changes with regard to certain access points - such as during a trip - multiple times, can further confirm that subsequent comparable observations can also be determined to be occurring during a trip. It should also be noted that the accuracy of the referenced determinations can be further improved based on a determination that the referenced signals (e.g., those originating from WiFi access points, etc.) originate from a nomadic and/or non-nomadic source, as determined, for example, using one or more of the techniques described/referenced herein.
•	the GPS can be utilized (alone or in conjunction with other signals and/or sensors) to determine whether the device is present within a moving vehicle, etc. It should also be noted that one or more additional determinations can be made (e.g., upon determining that a device is within a moving vehicle), such as with regard to the type of vehicle that the device is in (e.g., car, boat, train, bus, etc.), such as in the manner described herein.
  • one or more determinations can be made with respect to the characterization and/or classification of various WiFi access points (such as stationary/static/fixed/non-nomadic or mobile/dynamic/nomadic). Such determinations can be made, for example, based on (a) information accumulated/stored by/in relation to a device (such as over time), (b) information accumulated by/in relation to multiple devices (i.e., 'crowd-sourced') (such as over time), (c) based on information contained/identified within various WiFi broadcasts (reflecting, for example, the manufacturer, model, etc., of an access point), as can be identified, for example, based on one or more comparisons with one or more databases (as can be stored on the device and/or remotely) that contain data pertaining to which manufacturers, models, etc. are likely to be utilized in stationary/static/fixed/non-nomadic or mobile/dynamic/nomadic contexts, and/or (d) in conjunction with third party services.
•	one or more of the contextual determinations described herein can be further configured to operate based on/in relation to a premise/default/assumption that a device is not present/operating within a vehicle based on a determination that such a device is connected to an access point that can be determined to be a non-nomadic access point (i.e., an access point that can be determined to have a relatively fixed location and/or whose location is not determined to change relatively regularly).
•	a non-nomadic/stationary access point can include, for example, a WiFi access point in an office network (in contrast to an in-vehicle WiFi access point or a mobile device that can serve as a WiFi access point, as such devices can be determined to be mobile).
•	the determination that a particular access point, such as a WiFi access point, is non-nomadic/stationary can be made in a number of ways.
•	the nomadicity (that is, the nomadic/non-nomadic state/status) of an access point can be determined based on/in relation to a database (local or remote) that can maintain information pertaining to various access points, and/or that contains respective classifications of such access points, such as in relation to the nomadic/mobile and/or non-nomadic/stationary characteristics/properties of such access points (e.g., BSSIDs).
•	Such a database can be built/compiled, for example, based on the results of various nomadic/non-nomadic determinations that can be performed with respect to various access points, such as is described herein (with respect to individual devices and/or accumulated across devices, such as via 'crowd-sourcing'), and/or by utilizing a third party database of WiFi access points, such as Skyhook, that can provide the referenced nomadic/non-nomadic information/determinations, and/or based on a determination that, by virtue of the method with respect to which information pertaining to such WiFi access points was collected/gathered, the referenced WiFi access point records/entries pertain to non-nomadic access points.
•	the nomadicity of an access point can be determined by querying/looking up the referenced access point within a database (local or remote) that maintains associations of the context within which the device is present/operating with an identifier of the WiFi access point (e.g., a BSSID) that the device is/was connected to.
•	such a database can be generated based on the results of one or more determinations that pertain to the context within which a device is present/operating in relation to the access point to which the device is connected, as described herein (both with respect to individual devices and/or accumulated across devices, such as via 'crowd-sourcing'), and/or may be a third party database of access points that provides such information.
•	for example, based on a determination that on multiple instances a device was connected to a particular WiFi access point and was also determined not to be present/operating within a vehicle, upon subsequently determining that the device is again connected to the same WiFi access point, it can be determined that the device is again relatively likely not to be present/operating within a vehicle.
  • one or more of the referenced determinations can also incorporate and/or otherwise consider aspects pertaining to BSSID, Beacon, and/or WPS transmissions. For example, based on one or more of such items the identity of the manufacturer of the WiFi access point to which the device is connected can be identified. By way of illustration, a BSSID of an access point can be used to identify the manufacturer. And, having identified the manufacturer, such an identity can be used to further increase the accuracy of the referenced determination(s). For example, in a scenario where the BSSID of the WiFi access point to which a device is connected is that of a manufacturer that makes mobile phones, it can be determined that there is a relatively greater likelihood that such an access point is nomadic.
  • the context of such a device can be further analyzed/checked (i.e., rather than solely relying on the connection of the device to the referenced access point in order to determine the context of the device).
  • a WiFi access point having a BSSID prefix of 40:A6:D9 is assigned to WiFi access points made by Apple and can thus be determined to be relatively likely to be an iPhone 4.
  • a database can also be utilized in cases/scenarios where a manufacturer produces both nomadic and non-nomadic devices, in light of the fact that such devices often have multiple prefixes and different prefixes can pertain to different devices.
•	the value of the BSSID suffix (i.e., the last 6 characters of the 12-character BSSID, which are generally assigned contiguously to similar/comparable devices) can be compared to such a database in order to further determine the nomadicity of such an access point.
•	additional information (e.g., device type, model name, model number, etc.) can also be obtained from WPS (WiFi Protected Set-Up) transmissions, which can reflect the model number or model name of the access point, and such information can be used to further determine the nomadicity of such an access point.
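By way of illustration, the following Python sketch looks up the manufacturer prefix (OUI) of a BSSID against a small table of nomadicity hints; apart from the 40:A6:D9 prefix mentioned above, the table entries are hypothetical placeholders rather than an authoritative registry.

```python
# Illustrative sketch: estimate whether an access point is nomadic from the
# manufacturer prefix (OUI) of its BSSID. The table entries are hypothetical
# hints, not an authoritative OUI registry.
OUI_HINTS = {
    "40:A6:D9": "likely_nomadic",      # handset-maker prefix noted in the text above
    "00:1A:2B": "likely_stationary",   # hypothetical home-router vendor prefix
}

def nomadicity_hint(bssid: str) -> str:
    prefix = bssid.upper()[:8]         # first 3 octets = manufacturer (OUI)
    return OUI_HINTS.get(prefix, "unknown")

print(nomadicity_hint("40:a6:d9:12:34:56"))  # likely_nomadic
print(nomadicity_hint("00:1a:2b:aa:bb:cc"))  # likely_stationary
```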
  • the nomadicity of an access point can be determined based on whether the device can be determined to be moving at the time it is also determined to be connected to the access point (such movement can be determined, for example, based on one or more inputs/sources such as GPS, cellular IDs, cellular RSSI, WiFi IDs, Wifi RSSI, accelerometer, gyroscope, etc., such as are described herein).
•	an access point can be characterized/classified as being nomadic based on a determination that the connected device was moving concurrent with and/or in relatively close chronological proximity to being connected to the referenced access point, while being characterized/classified as non-nomadic based on a determination that the device was not moving concurrent with and/or in close chronological proximity to being connected to the referenced access point. That is, the context of the device (e.g., being mobile or stationary while connected to the access point) can be used to characterize/classify the access point.
•	in addition to determining that a device is not present/operating within a vehicle based on a determination that the device is connected to an access point determined to be non-nomadic, a device can be positively/affirmatively determined to be present/operating within a vehicle based on a determination that the device is connected to an access point determined to be nomadic.
•	connection of a device to access point(s) having a relatively longer transmission range can be relatively less indicative of the fact that the connected device is not present/operating within a vehicle (as opposed to access points having a relatively shorter transmission range).
  • one or more techniques such as the RSSI technique described herein can be utilized in order to make a determination regarding the nomadicity of the access point.
  • one or more of the access points that are perceptible to the device can be analyzed to determine the context of the device.
  • the access point to which the device is connected can be classified as nomadic / non-nomadic and saved (locally and/or remotely) for future use (such as by this device and/or with other devices), such as using one or more techniques, such as those described herein.
  • an access point perceived by a device can be characterized/classified as nomadic / non-nomadic based on a query/identification of a database containing a record of such an access point (and/or a comparable access point based upon which such determination can be made) and/or by making such determination dynamically, such as based on information received/derived from the BSSIDs (e.g., manufacturer, model) and/or WPS information (e.g., model) of an access point.
•	if a device can perceive one or more access points that are known or have been determined to be non-nomadic (such as over some period of time, in order to account for nomadic access points that may be perceptible to a moving device for relatively short periods of time), such a device can be further determined to be stationary/not in motion. Moreover, if the device can perceive one or more access points determined to be nomadic (such as over some defined period of time), such a device can be determined to be in motion.
  • one or more contexts associated with one or more of the access points that are perceptible to a device can be used to determine, with a certain degree of likelihood, the current context of the device. For example, in a scenario where, upon perceiving a particular access point (such as an access point located in the vehicle of another user, such as when such a user parks in a company parking lot), it can be determined that the device is not present/operating within a vehicle, upon subsequently perceiving the same access point, it can be further determined that the device is, again, not present/operating within a vehicle.
  • one or more other access points that are perceptible to the device can be analyzed, such as across successive/intermittent scans of the WiFi radio of the device, such as even in scenarios where the nomadic / non-nomadic characterization/classification of such access points may not yet be determined. Based on a determination that one or more of (e.g., the collection of) perceived access points across multiple/successive scans is substantially similar (and there are a relatively sufficient quantity of access points that are perceptible in such scans), then it can be further determined that the device is not moving.
  • multiple/successive scans of perceptible access points using the WiFi radio of the device may demonstrate a substantially similar collection of access points, but if some of the referenced RSSI values have changed substantially, it can nevertheless be determined that the device is likely in motion.
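• The scan-comparison heuristic described above can be illustrated with the following minimal sketch; the similarity threshold, minimum AP count, and RSSI-change threshold are illustrative values chosen for the example, not values specified in the disclosure.

```python
# Minimal sketch (illustrative thresholds): infer whether a device is moving by
# comparing the sets of access points perceived in two successive WiFi scans,
# refined by how much the RSSI of the common APs changed.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def likely_moving(scan1: dict, scan2: dict,
                  min_aps: int = 4,
                  similarity_threshold: float = 0.7,
                  rssi_change_threshold_db: float = 15.0) -> bool:
    """scan1/scan2 map BSSID -> RSSI (dB) for two successive scans."""
    aps1, aps2 = set(scan1), set(scan2)
    if len(aps1) < min_aps or len(aps2) < min_aps:
        return True  # too few APs to conclude the device is stationary
    if jaccard(aps1, aps2) < similarity_threshold:
        return True  # the collection of perceived APs changed substantially
    # Same APs, but large RSSI changes still suggest motion.
    common = aps1 & aps2
    avg_change = sum(abs(scan1[b] - scan2[b]) for b in common) / len(common)
    return avg_change > rssi_change_threshold_db

scan_a = {"AP1": -40, "AP2": -55, "AP3": -60, "AP4": -70}
scan_b = {"AP1": -42, "AP2": -57, "AP3": -58, "AP4": -69}
print(likely_moving(scan_a, scan_b))  # False: similar APs, small RSSI changes
```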
  • the speed at which a device is moving can be determined/estimated based on a determination of the distance between (a) the location of the device at the time that the WiFi radio of the device is used to perform a first access point scan (e.g., as determined from the known/determined locations of one or more of the BSSIDs, such as from a 3rd-party database such as Skyhook, and further using error minimization techniques to calculate the location of the device, such as based on the respective locations of each of the access points that are perceptible to the device, with or without using the RSSIs of such access points as described herein), and (b) the location of the device at the time that the WiFi radio of the device is used to perform another access point scan, relative to the amount of time that elapsed between the two scans.
  • a technique can be implemented with more than two access point scans as well.
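• The following sketch illustrates the idea using a simple RSSI-weighted centroid as a stand-in for the error-minimization step and a haversine distance between the two estimated positions; the AP coordinates are hypothetical and would in practice come from a database such as Skyhook.

```python
# Minimal sketch: estimate device speed from two WiFi scans, given (assumed)
# known AP locations. A simple RSSI-weighted centroid stands in for the
# error-minimization step described in the text.

import math

def estimate_position(scan: dict, ap_locations: dict) -> tuple:
    """scan maps BSSID -> RSSI (dB); ap_locations maps BSSID -> (lat, lon)."""
    weights, lat_sum, lon_sum = 0.0, 0.0, 0.0
    for bssid, rssi in scan.items():
        if bssid not in ap_locations:
            continue
        w = 1.0 / max(1.0, abs(rssi))  # stronger signal -> larger weight
        lat, lon = ap_locations[bssid]
        lat_sum += w * lat
        lon_sum += w * lon
        weights += w
    if weights == 0:
        raise ValueError("no APs with known locations in this scan")
    return (lat_sum / weights, lon_sum / weights)

def haversine_m(p1, p2) -> float:
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def estimate_speed(scan1, scan2, ap_locations, elapsed_s: float) -> float:
    p1 = estimate_position(scan1, ap_locations)
    p2 = estimate_position(scan2, ap_locations)
    return haversine_m(p1, p2) / elapsed_s  # meters per second

aps = {"AP1": (32.0800, 34.7800), "AP2": (32.0810, 34.7810)}  # hypothetical AP locations
s1 = {"AP1": -40, "AP2": -75}
s2 = {"AP1": -75, "AP2": -40}
print(round(estimate_speed(s1, s2, aps, elapsed_s=10.0), 1), "m/s")
```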
  • the speed at which a device is moving can still be determined/estimated even without location information regarding (non-nomadic) WiFi access points.
  • determination(s) can be achieved, for example, based on changes in the RSSI of the WiFi access points across multiple/ successive scans. For example, an access point whose RSSI on a device changes from -30 dB to -90 dB over the course of a relatively short period of time (e.g., 1 second) can be determined to be relatively likely to be present/operating within a vehicle because, given the transmission range of WiFi access points, it is relatively unlikely that the device would be capable of moving at such a rate of speed without being present within a vehicle.
  • one or more techniques similar to those described herein with respect to access points such as WiFi access points can be employed with respect to determining a context within which a device is present/operating based on the Bluetooth (BT) connectivity state/status of such a device. For example, based on a determination that the device is connected to a nomadic BT device (e.g., a hands-free car loudspeaker), it can be determined that the device is present/operating within a vehicle. Moreover, based on a determination that the device is connected to a non-nomadic BT device (e.g., a desktop computer), it can be determined that the device is not present/operating within a vehicle.
  • a BT device (to which a device is connected) can be characterized/classified as nomadic / non-nomadic in a number of ways.
  • the nomadicity of a BT device can be determined based on/in relation to a database (local or remote) that can maintain information pertaining to various BT devices, and/or that contains respective classifications and/or characterizations of the nomadic / non-nomadic properties of various BT devices (BSSIDs).
  • Such database can be built/compiled, for example, based on the results of one or more of the nomadic / non-nomadic determinations described herein, both with respect to individual devices and/or accumulated across devices ('crowd-sourcing'), and/or may be a third party database of BT devices that provides such nomadic / non-nomadic information.
  • the nomadicity of a BT device can be determined by querying/looking up the referenced BT device within a database (local or remote) that maintains associations of the context within which the device is operating in relation to an identifier (BSSID) of the BT device that the device was/is connected to.
  • Such database can be generated/built based on the results of one or more determinations that pertain to the context within which a device is present/operating in relation to the BT device to which the device is connected, as described herein (both with respect to individual devices and/or accumulated across devices, e.g., via 'crowd-sourcing'), and/or may be a third party database of BT devices that provides such information.
  • For example, based on a determination that on multiple instances a device was connected to a particular BT device and it was also determined not to be present/operating within a vehicle, then upon subsequently determining that the device is again connected to the same BT device, it can be determined that the device is again relatively likely not to be present/operating within a vehicle.
  • one or more of the referenced determinations can also incorporate and/or otherwise consider aspects pertaining to BSSIDs.
  • the identity of the manufacturer of the BT device to which the device is connected can be identified, such as is described herein.
  • the BSSID of a BT device can be used to identify its manufacturer. And, having identified the manufacturer, such an identity can be used to further increase the accuracy of the referenced determination(s).
  • based on a determination that the BSSID of the BT device to which the device is connected is that of a manufacturer that produces hands-free car loudspeakers, it can be determined that there is a relatively greater likelihood that the BT device is nomadic, whereas based on a determination that the BSSID of the BT device to which the device is connected is that of a manufacturer that produces desktop computers, it can be determined that there is a relatively greater likelihood that the BT device is non-nomadic.
  • a database can also be utilized in case(s)/scenario(s) where a manufacturer produces both nomadic and non-nomadic devices, in light of the fact that such devices often have multiple prefixes and different prefixes relate to different devices.
  • the value of the BSSID suffix (i.e., the last 6 characters of the 12-character BSSID, which are generally assigned contiguously to similar/comparable devices) can be compared to such a database in order to determine the nomadicity of such a BT device.
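• By way of a non-authoritative illustration, the sketch below looks up the 6-character OUI prefix of a BT device's BSSID in a small table in order to bias the nomadic / non-nomadic determination; the example OUIs and product categories are made up for the example only.

```python
# Minimal sketch: bias a BT device's nomadic / non-nomadic classification using
# the manufacturer OUI (first 6 hex characters of the 12-character BSSID).
# The OUI-to-category table below is hypothetical, for illustration only.

OUI_CATEGORY = {
    "001A7D": "hands_free_car_kit",   # e.g., a maker of car speakerphones (hypothetical)
    "3C5A37": "desktop_computer",     # e.g., a maker of desktop PCs (hypothetical)
}

CATEGORY_NOMADIC_LIKELIHOOD = {
    "hands_free_car_kit": 0.9,   # very likely nomadic (typically installed in vehicles)
    "desktop_computer": 0.1,     # very likely non-nomadic
}

def nomadic_likelihood(bssid: str, default: float = 0.5) -> float:
    """Return an (illustrative) prior likelihood that the BT device is nomadic."""
    oui = bssid.replace(":", "").replace("-", "").upper()[:6]
    category = OUI_CATEGORY.get(oui)
    return CATEGORY_NOMADIC_LIKELIHOOD.get(category, default)

print(nomadic_likelihood("00:1A:7D:12:34:56"))  # 0.9 -> likely nomadic
print(nomadic_likelihood("AA:BB:CC:00:00:01"))  # 0.5 -> unknown manufacturer
```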
  • one or more of the referenced determinations can also incorporate and/or otherwise consider aspects pertaining to beacon transmission(s) pertaining to various BT devices, such as by querying/looking up the device class(es) (major and minor) and/or service class(es) of a particular BT device. For example, if a device (e.g., a mobile device such as a smartphone) is connected to a hands-free BT device (having a BT major class of 'Audio Video' and a BT minor class of 'Hands-Free'), it can be determined that the device (that is, the mobile device) is relatively likely to be present/operating within a vehicle.
  • one or more power consuming operations (that may have otherwise been performed, such as in order to determine the context of the device, e.g., using the GPS to determine if the device is moving) need not be performed (or can be performed relatively less often) until the BT connection is determined to have been terminated (and/or relatively shortly thereafter).
  • the device can be determined to be relatively unlikely to be present/operating within a vehicle.
  • one or more power consuming operations (that may have otherwise been performed, such as in order to determine the context of the device, e.g., by using the GPS to determine if the device is moving) need not be performed (or can be performed relatively less often) until the BT connection is determined to have been terminated (and/or relatively shortly thereafter).
  • one or more of the referenced determinations can be made based on a determination that a device was moving at the time it is connected to the BT device (as determined, for example, based on GPS, Cell IDs, Cell RSSI, WiFi IDs, Wifi RSSI, accelerometer, gyroscope, etc., as described herein), and a characterization/classification of the BT device to which the device is connected as nomadic based on a determination that the device was moving while it was connected to the BT device (and non-nomadic if the device was determined not to be moving while it was connected to the BT device), and/or by recording or associating the determined context of the device with the BT device.
  • the BT device(s) that are perceptible to a device can be used to determine the context within which the device is present/operating (it should be understood that the referenced scans for BT devices can be made for discoverable and/or paired BT devices).
  • the BT device to which a device is connected can be characterized/classified as nomadic / non-nomadic, and such characterization/classification can be saved (locally or remotely) for future use (such as by the referenced device and/or by other devices).
  • the nomadicity of a device can be determined based on/in relation to the nomadic / non-nomadic characterization(s)/classification(s) of the BT devices it perceives (such as by identifying records/information corresponding to such BT devices within a database) and/or making such determinations dynamically based on one or more determinations with respect to one or more combinations of information corresponding to the manufacturers, major and minor classes and services of such BT devices.
  • if the device perceives, for a sufficiently/relatively long period of time, one or more BT devices that have been determined to be nomadic, the referenced device can be determined to be in motion, whereas if the device perceives, for a sufficiently/relatively long period of time, one or more BT devices that have been determined to be non-nomadic, the device can be determined not to be in motion.
  • the nomadicity of a device can be determined based on/in relation to how one or more of the BT devices that are perceptible to the device has/have been previously associated, such as with a particular context. In doing so, one or more determinations can be made with respect to the present context of the device.
  • For example, based on one or more instances with respect to which a device perceives a particular BT device (e.g., the BT earpiece of another user) and also determines that the device is not present/operating within a vehicle, upon a subsequent perception by the device of the same BT device, it can be further determined that the device is (again) not present/operating within a vehicle (i.e., even without making such a determination affirmatively).
  • the nomadicity of a device can be determined based on/in relation to an analysis of the collection of other BT devices (i.e., those BT devices other than the one that the device is connected to) that the device perceives/"hears," such as across successive scans of its BT radio, even if the nomadic / non-nomadic classification of such BT devices is not yet known. If the collection of BT devices across successive scans is determined to be substantially similar (and there are sufficiently many BT devices that are perceptible in such scans), then the device can be determined not to be moving. If, however, the collection of BT devices across successive scans is determined to be substantially dissimilar, then the device can be determined to be moving. The accuracy of such a technique can be increased/improved based on an analysis of the RSSIs of the BT devices in the collection (e.g., rather than solely/primarily considering the Boolean existence / non-existence of each BT device in successive scans).
  • multiple/successive scans of BT devices using the device's BT radio may be determined to demonstrate a substantially similar collection of BT devices, however if some of the RSSI values can be determined to have changed substantially, the device can be determined to be likely to be in motion (i.e., relative to the BT devices).
  • the speed at which the device is moving can be determined/estimated by determining the distance between (a) the location of the device at the time that its BT radio is used to perform a first BT device scan (e.g., as determined based on the known locations of each of the (presumably non-nomadic) BSSIDs, such as from a third-party database, and using an error minimization technique to calculate the location of the device based on the respective locations of each of the BT devices perceived/"heard," with or without using the RSSIs of such devices), and (b) the location of the device at the time that its BT radio is used to perform a second or subsequent BT device scan, relative to the amount of time that elapsed between the two scans.
  • this technique can be implemented with more than two BT device scans as well.
  • the speed at which the device is moving can still be determined/estimated, such as based on the changes identified in the RSSI of the BT devices in multiple/successive scans.
  • a BT device whose RSSI perceived on a device changes from -30 dB to -90 dB over the course of a short period of time (e.g., 100 milliseconds) can be determined to be likely to be in a vehicle because, given the transmission range of BT devices, a human is generally unable to move the distance required to change the signal so much in such a short period of time, whereas a BT device whose RSSI on a device is determined to have changed from -30 dB to -31 dB over the course of the same time is relatively unlikely to be in a moving vehicle because, again, given the transmission range of (most) BT devices, a moving vehicle is likely to experience a much larger change in the strength of the received signal than 1 dB, even over a short period of time.
  • if a device is determined to be connected to a cell tower that has been previously associated with a particular context, it can be determined, with a certain degree of likelihood, that the device is present within such context again. For example, it can be determined that a particular device will tend to be connected to a certain cell tower when the device is in the user's home and, generally, a different cell tower when the device is in the user's place of work. Accordingly, if a device is presently connected to either one of these cell towers, it can be determined that the device is not present within a vehicle.
  • the accuracy of this technique can be improved/enhanced by 'learning' (such as via machine learning techniques as are known to those of ordinary skill in the art) the RSSIs of cell towers for different contexts of the device (e.g., not within a vehicle, within a vehicle, etc.), so that determinations such as the following can be achieved: if a device is determined to be connected to Cell Tower 12345 in LAC 123 at an RSSI of between -90 dB and -80 dB, the device can be determined to be likely not to be in a vehicle.
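• A minimal sketch of such a learned lookup is shown below; the cell ID, LAC, and RSSI range mirror the illustrative example in the text, and the 'learning' step is reduced to a static table rather than an actual machine-learning model.

```python
# Minimal sketch: map (cell ID, LAC, RSSI) observations to a previously learned
# context. The table below stands in for a model learned over time; the entries
# mirror the illustrative example in the text (Cell Tower 12345, LAC 123).

LEARNED_CONTEXTS = [
    # (cell_id, lac, rssi_min_db, rssi_max_db, context)
    (12345, 123, -90, -80, "not_in_vehicle"),
]

def lookup_context(cell_id: int, lac: int, rssi_db: float) -> str:
    for known_id, known_lac, lo, hi, context in LEARNED_CONTEXTS:
        if cell_id == known_id and lac == known_lac and lo <= rssi_db <= hi:
            return context
    return "unknown"

print(lookup_context(12345, 123, -85))  # not_in_vehicle
print(lookup_context(12345, 123, -60))  # unknown (RSSI outside learned range)
```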
  • the cell towers that can be perceived/"heard" by the device when scanning with its cellular radio can be used to determine the context of the device. For example, based on a determination that a device can perceive/hear one or more cell towers that have been previously associated with a particular context, it can be determined, with at least a certain degree of likelihood, that the device is present in such context again. For example, it can be determined that a particular device will tend to be connected to a certain cell tower when the device is present in the user's home and, generally, a different cell tower when the device is present in the user's place of work.
  • accordingly, the context of the device can be determined not to be within a vehicle, and thus, based on a determination that a device presently perceives (or is connected to) either one of such cell towers, it can be determined that the device is not present within a vehicle.
  • a context of the device can be determined based on an analysis of the collection/set of other cell towers (i.e., those cell towers other than the one that the device is connected to) that are perceptible to the device, such as across multiple/successive scans of its cellular radio. Based on a determination that the collection/set of cell towers that are perceptible to the device across multiple/successive scans is substantially similar (and there are sufficiently many cell towers that appear in such scans), the device can be determined not to be moving. If, however, the collection/set of cell towers that are perceptible to the device across multiple/successive scans is substantially dissimilar, the device can be determined to be moving.
  • the accuracy/resolution of such a technique can be improved/increased, for example, based on an analysis of the RSSIs of the cell towers in the set/collection (rather than only looking at the Boolean existence / non-existence of each cell tower in multiple/successive scans). For example, based on a determination that the successive scans are sufficiently close in time and/or the device is moving slowly, successive scans of cell towers (such as using the device's cellular radio) may show a substantially similar collection of cell towers, but if some of the RSSI values have changed substantially the device can be determined to be likely to be in motion.
  • the speed at which the device is moving can be determined/estimated by determining the distance between (a) the location of the device at the time that its cellular radio is used to perform a first cell tower scan (e.g., as determined from the known/identified/determined locations of each of the cell tower IDs, such as from a third party database, e.g., Skyhook, and using an error minimization technique to calculate the location of the device based on the respective locations of each of the cell towers perceived with or without using the RSSIs of such cell towers), and (b) the location of the device at the time that its cellular radio is used to perform a second cell tower scan, relative to the amount of time that elapsed between the two scans.
  • a technique can be implemented with more than two cell tower scans as well.
  • the speed at which the device is moving can still be determined/estimated, such as based on the changes in the RSSI of the cell towers in successive scans. For example, a cell tower whose RSSI on a device changes from -30 dB to -90 dB over the course of a short period of time (e.g., 1 second) can be determined to be relatively likely to be present within a vehicle because, given the transmission range of cell towers, a human is not able to move the distance required to change the signal so much in such a short period of time, whereas a cell tower whose RSSI on a device changes from -30 dB to -31 dB over the course of the same time is relatively unlikely to be present within a moving vehicle because, given the transmission range of (most) cell towers, a device present within a moving vehicle is likely to perceive a relatively much larger change in the strength of the signal received than 1 dB, even over a relatively short period of time.
  • accuracy of the referenced in-vehicle / not-in-vehicle determination(s) can be further improved when performed in conjunction with other connectivity/"network visibility" information/determinations (e.g., in relation to BT devices, cellular IDs, cellular RSSIs, etc.) and/or from other sensors as described herein.
  • the conditions pursuant to which a device can be determined to be present within a moving vehicle can be dynamically determined, such as based on the device's environment, such as can be measured/determined by its sensors. For example, if many different WiFi access points (e.g., having different BSSIDs), many cell tower IDs (CIDs), and/or many Bluetooth devices are perceptible to a device, and/or if a GPS reading so indicates, it can be determined that the device is relatively likely to be present in an urban area, whereas if relatively few (or none) of the referenced wireless signals are perceptible, and/or if a GPS reading so indicates, it can be determined that the device is likely to be in a rural area. Because vehicles generally move slower in urban areas than they do in rural areas, a threshold speed (or other conditions) at which a trip can be determined to have started (as described herein) can be set/adjusted to be relatively lower in such an urban setting as compared to a rural setting, thereby enabling trips to be detected more readily in such settings.
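• A minimal sketch of such dynamic threshold selection is shown below; the density cutoff and the two threshold speeds are illustrative values only, not values taken from the disclosure.

```python
# Minimal sketch (illustrative thresholds): adjust the speed threshold at which
# a trip is considered to have started based on how "urban" the device's
# wireless environment appears (number of perceptible WiFi APs, cell IDs and
# Bluetooth devices).

def environment_density(n_wifi_aps: int, n_cell_ids: int, n_bt_devices: int) -> int:
    return n_wifi_aps + n_cell_ids + n_bt_devices

def trip_start_speed_threshold_kmh(n_wifi_aps: int, n_cell_ids: int, n_bt_devices: int) -> float:
    # Dense signal environment -> likely urban -> vehicles move slower,
    # so a lower speed threshold is used to detect the start of a trip.
    if environment_density(n_wifi_aps, n_cell_ids, n_bt_devices) >= 20:
        return 15.0   # urban setting (illustrative value)
    return 30.0       # rural setting (illustrative value)

print(trip_start_speed_threshold_kmh(n_wifi_aps=25, n_cell_ids=6, n_bt_devices=4))  # 15.0
print(trip_start_speed_threshold_kmh(n_wifi_aps=1,  n_cell_ids=2, n_bt_devices=0))  # 30.0
```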
  • if a device is connected to a non-nomadic AP (e.g., for more than a relatively short time), it can be determined not to be moving, but further determinations can also be performed. In doing so, for example, based on a determination that a device is connected to a nomadic AP, it can be further determined that it is relatively more likely that the device is moving than if it were (a) connected to a non-nomadic AP, (b) connected to an AP of unknown nomadicity, and/or (c) not connected to an AP.
  • such determinations can be premised on an assumption/default that if a device is determined to be connected to a nomadic AP, it can be determined to be moving.
  • power/resources can be invested to determine, by way of one or more other sensors (e.g., GPS, accelerometer, gyroscope, cellular radio, etc.) whether or not the device is moving.
  • whether a device remained in substantially the same location can be determined, e.g., via GPS, WiFi APs, cell tower IDs, Bluetooth devices, and/or IP address, as described below.
  • the nomadicity of such an AP can be determined, for example, via GPS, such as by comparing (i) the GPS coordinates of the referenced device's location at or about the time that the device connected to the WiFi AP with (ii) the GPS coordinates of the device's location at or about the time of the device's disconnection from the same WiFi AP.
  • the nomadicity of such an AP can be determined, for example via WiFi, such as by comparing (i) one or more of the WiFi APs perceptible at the device at or about the time of the device's connection to one of the WiFi APs with (ii) one or more of the WiFi APs perceptible at the device at or about the time of the device's disconnection from the same WiFi AP.
  • the nomadicity of such an AP can be determined, for example via Cell Tower ID, such as by comparing (i) the cell tower to which the device is connected (or one or more of the visible cell towers) at or about the time of the device's connection to the WiFi AP with (ii) the connected cell tower (or one or more sets of the perceptible cell towers) at or about the time of the device's disconnection from the same WiFi AP.
  • the nomadicity of such an AP can be determined, for example via Bluetooth, such as by comparing (i) one or more of perceptible Bluetooth devices at or about the time of the device's connection to the WiFi AP with (ii) one or more of the visible Bluetooth devices at or about the time of the device's disconnection from the same WiFi AP.
  • the nomadicity of such an AP can be determined, for example via IP address, such as by comparing (i) the IP address of the device shortly before the time of the device's connection to the WiFi AP with (ii) the IP address of the device shortly after/about the time of the device's disconnection from the same WiFi AP.
  • a WiFi AP can be determined to be non-nomadic based on a determination that the location of the connected device at the time it was connected to the AP is substantially the same as its location at the time that it disconnected from the same AP.
  • a device determined to be continuously connected to an AP for 10 hours can be determined, on average, to be relatively more likely to be connected to a non-nomadic AP than a device determined to be connected to an AP for only 10 seconds.
  • the notion/concept/aspect of a 'substantially same' location can be determined, for example, relative to the transmission range of an AP.
  • certain APs can have a relatively wide transmission range (e.g., a WiMax AP), and in such a case it is possible that, even though the location at the time of connection and the location at the time of disconnection may be determined to be relatively far apart, the AP is actually non-nomadic.
  • the techniques described herein can be further configured to account for the AP's transmission range(s), as can be identified/determined, for example, from/based on the BSSID, manufacturer, etc., of the AP, and/or as can be determined from repetitive crowd-sourced connection/disconnection locations, such as using one or more of the techniques described herein.
  • WiFi APs can be determined to be nomadic or non-nomadic using "crowd-sourcing" techniques.
  • message(s) can be sent from user devices to a server indicating (a) time, (b) conditions (pertaining to GPS, WiFi, cellular, Bluetooth, IP addresses, etc.) at or near the time of connection to the WiFi AP, and (c) conditions at or near the time of disconnection from the WiFi AP.
  • devices can also maintain and update nomadic/non-nomadic AP information locally.
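• The crowd-sourcing flow described above can be illustrated with the following sketch, in which a device-side report carries the conditions observed at connection and disconnection, and a simple server-side rule classifies the AP; the field names and distance threshold are hypothetical.

```python
# Minimal sketch: a crowd-sourcing style report in which a device records the
# conditions observed at connection to and disconnection from a WiFi AP, and a
# (server-side) rule classifies the AP. Field names and the distance threshold
# are hypothetical.

import json
import math

def haversine_m(p1, p2) -> float:
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Example report a device might send to a server (hypothetical structure).
report = {
    "bssid": "AA:BB:CC:11:22:33",
    "connect":    {"time": 1000, "gps": (32.0800, 34.7800)},
    "disconnect": {"time": 4600, "gps": (32.0805, 34.7803)},
}

def classify_report(r: dict, same_location_threshold_m: float = 100.0) -> str:
    """Non-nomadic if the device stayed in substantially the same place while connected."""
    moved_m = haversine_m(r["connect"]["gps"], r["disconnect"]["gps"])
    return "non-nomadic" if moved_m <= same_location_threshold_m else "nomadic"

print(json.dumps(report["bssid"]), "->", classify_report(report))  # non-nomadic
```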
  • the state of the device's display screen, keyboard and/or the presence of a device user can be used to determine whether and/or how intensively (e.g., based on sampling rate, duty cycle, how much power can be consumed, etc.) device resources should be used to determine the context of the device (e.g., to determine if the device is present within a moving vehicle, determine the in-vehicle role of its user, etc.), and/or to provide other functionality, with or without taking into account the device's battery level, rate of battery level depletion, etc.
  • the power consumption of the device can be reduced (in light of the fact that the device, in its current state, cannot serve as a distraction to a driver) by reducing or eliminating some or all of its energy intensive operations (e.g., GPS calls) such as those used to make context determinations (e.g., in-trip determinations, in- vehicle role determinations), as described herein.
  • for example, when the screen is off, the GPS can be queried once every 5 seconds (or not at all), while when the screen is on, the GPS can be queried once per second.
  • the device can be configured to maintain itself in a low power consumption state in which much of, or substantially all of, the device's context information gathering can be suspended for as long as the screen is off.
  • Such techniques can be advantageous in scenarios where the context of the device can be promptly determined when the screen is turned on (or a user is present). For example, one manner in which this can be done is by turning the GPS on upon determining that the screen is turned on. The latency associated with such a technique can be reduced by not turning the GPS radio fully off when the screen turns off, or by maintaining/"remembering" certain information that can reduce the TTFF (time to first fix) of the GPS radio when it is next turned on, such as is known to those of ordinary skill in the art.
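• The screen-state duty cycling described above can be sketched as follows, mirroring the illustrative intervals given in the text (one query per second when the screen is on, one every 5 seconds or none when it is off).

```python
# Minimal sketch: choose a GPS query interval based on the screen state.

def gps_query_interval_s(screen_on: bool, suspend_when_off: bool = False):
    """Return the GPS polling interval in seconds, or None to suspend polling."""
    if screen_on:
        return 1.0
    return None if suspend_when_off else 5.0  # None -> do not query at all

print(gps_query_interval_s(screen_on=True))                          # 1.0
print(gps_query_interval_s(screen_on=False))                         # 5.0
print(gps_query_interval_s(screen_on=False, suspend_when_off=True))  # None
```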
  • successive scans of available/perceptible networks/devices can be performed (e.g. periodically) and the results maintained/recorded.
  • the scan results just prior to and just after the device screen is turned on can be processed/used to determine whether or not the device is in motion (and can also be processed/used to determine/estimate the speed at which the device is moving).
  • the accuracy of the referenced technique can be improved by monitoring (e.g., with a relatively low power consumption, such as at a low duty cycle) and thereby determining in advance which of the various techniques are likely to be useful/effective/accurate, in light of the determined context of the device.
  • the device can test/determine, at various intervals while the screen is determined to be off, whether the described WiFi scan technique is likely to be applicable/accurate within the area in which the device is present (e.g., rural vs. urban). If it is determined that the referenced WiFi scan technique is not likely to be effective/available, it can be determined that another technique (e.g., one entailing higher energy expenditure) may be required in order to accurately determine the context of the device (and can be employed, for example, when the screen is turned on), or the device may choose to suspend the use of determinations based on screen state altogether, e.g., until WiFi scans can provide relevant/accurate information again.
  • a determination that a device is connected to a power source can further indicate the hands-free state and/or in-vehicle location of the device. For example, in a scenario where a passenger is present in a car, a device determined to be connected to a power source can be determined to (a) be relatively more likely to be operated by a driver than by a passenger (because a vehicle is more likely to have a power connection that is compatible with a device operated by a driver than with a device operated by a passenger), and (b) be relatively more likely to be positioned in close proximity to a driver than to a passenger (in light of the near-driver location of some power sources, e.g., a device cradle/dock having a power connection).
  • the device can be determined to be in the same location/context (e.g., within a vehicle, not within a vehicle, etc.) in which it was when it was last connected to WiFi.
  • the accuracy of such techniques can be improved so that the device's current location and/or context can be determined based on the location and/or context determined at the time that the last WiFi access point disconnected, based upon whether the WiFi access point is nomadic or non- nomadic.
  • upon determining that the last WiFi access point to which the device was connected was a non-nomadic access point, the location of the device and its context can be determined to be unchanged, whereas upon determining that the last WiFi access point to which the device was connected was a nomadic access point, then (barring other factors/information) its context (e.g., within a vehicle) can be determined to be unchanged, while its location cannot.
  • the accuracy of this technique can be further improved by providing for/requiring that the device has been determined to be continuously (or substantially continuously) connected to a power source for at least a certain amount of time before it was last disconnected from a WiFi connection (in doing so, a use case where a user left their home, got into their car in the driveway while still within range of their home WiFi, and connected the device to a power source in the car before losing connection with the home WiFi can be addressed).
  • a given wireless device (e.g., WiFi AP, Bluetooth speakerphone) whose wireless ID the mobile device connects to may "fool" the detection system at most once in most cases: once it has been observed to have moved, the wireless device will be marked as nomadic, and future connections to it will not be understood to indicate that the mobile device is stationary (and may even be used to determine that it is not stationary).
  • the notion of checking the location of a mobile device when it is connected and subsequently disconnected from a wireless device is not the only method of classifying wireless devices as nomadic or non-nomadic.
  • the mobile device's current location can be queried actively to aid in classifying the wireless device as nomadic or non-nomadic or, to save power, changes to the mobile device's "last known location" can be used to accomplish the same, i.e., if the mobile device's last known location changes sufficiently (or does not change sufficiently) while the mobile device is connected to a wireless device, then the wireless device is classified as nomadic (or non-nomadic, respectively).
  • the presence of nomadic wireless devices can be used to detect that a mobile device is in a trip more effectively, both (i) if the mobile device is connected to one or more nomadic wireless devices; and/or (ii) if the mobile device sees one or more nomadic wireless devices (or even certain types of nomadic wireless devices as further classified by their manufacturer and/or their BT major/minor class) without being connected to them.
  • such nomadic wireless devices can include, e.g., nomadic WiFi APs and nomadic Bluetooth devices.
  • based on a determination that a mobile device connects to one or more nomadic wireless devices (e.g., as determined by comparing with a database on the device or on a remote server), the mobile device may be assumed to be in a trip and whatever changes to the mobile device are appropriate for a trip can be applied, and/or the mobile device may choose to invest additional power to seek additional confirmation as to whether it is in a trip more quickly than it would ordinarily do so.
  • if a device is determined to be connected to a power source during the time that it has been determined to be in a trip, then, for as long as the device remains connected to such power source (or an alternative power source, provided, for example, that the device is not unplugged from power for more than a certain period of time, e.g., the time it might take someone to switch between a vehicle power source and an office or home power source), it can be determined that the device is highly likely to still be within a trip. In so determining, many of the power expensive/intensive operations that may otherwise be utilized by the device to determine the context of the device can be curtailed/suspended (e.g., checking GPS to see if the device is still present within a trip).
  • the determination as to whether the device is still connected to power in a vehicle can be made relatively more easily/efficiently.
  • the efficiency of such re-detection can also be improved (and can consume less power) because differentiation between moving in a vehicle and certain other states (e.g., walking) which are highly unlikely when the device is connected to power, may no longer be necessary.
  • upon determining that a device is present within a trip and connected to a power source, it can be advantageous to determine that a trip has ended so that, upon determining that a trip has ended, the device can be used by a user (i.e., without restriction) without having to disconnect the device from the power source (e.g., in a scenario where the vehicle is parked in a parking lot at the end of a trip with the device still being connected to power, and the user may want to send a text).
  • sensors and/or inputs (e.g., GPS, accelerometer, WiFi BSSIDs and/or RSSIs, cell tower IDs and/or RSSIs, BT BSSIDs and/or RSSIs) can be sampled at sampling rates relatively lower than would be used if the power connection were not available and/or used to make contextual determinations (thereby reducing power consumption).
  • the sensors/inputs can also be sampled at duty cycles relatively lower than those that would be used if the power connection were not available and/or used to make contextual determinations, thereby reducing power consumption.
  • a WiFi-enabled device (e.g., an access point) that is observed by one or more devices at locations (e.g., as determined by GPS, cell tower ID, WiFi AP fingerprint, etc.) that vary substantially can be determined to be likely to be a nomadic device.
  • a WiFi-enabled device (e.g., an access point) that is observed by one or more devices at locations (e.g., as determined by GPS, cell tower ID, WiFi AP fingerprint, etc.) that vary little (and, in certain implementations, based on how many such observations there are and how they are distributed across observing devices and time) can be determined to be likely to be a non-nomadic device.
  • FIG. 111 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • a first device with respect to which a location is to be determined can be selected/identified from among one or more devices.
  • the first device can be selected/identified based on various factors including but not limited to: a power connected state of the first device, a power connected state of the first device relative to respective power connected states of one or more of the one or more devices, a battery capacity of the first device, a battery capacity of the first device relative to respective battery capacities of one or more of the one or more devices, a power consumption rate of the first device, a power consumption rate of the first device relative to respective power consumption rates of one or more of the one or more devices, a location determination accuracy of the first device, a location determination accuracy of the first device relative to respective location determination accuracies of one or more of the one or more devices, a battery level (e.g., a current battery level) of the first device, and/or a battery level of the first device relative to respective battery levels of one or more of the one or more devices.
  • a first device with respect to which a location is to be determined during a first time interval can be selected/identified. Additionally, in certain implementations, a first device with respect to which a location is to be determined can be identified, such as from among one or more devices, based on a determination that at least one of the one or more devices is present on a mass-transit vehicle. Additionally, in certain implementations a first device with respect to which a location is to be determined can be identified, from among one or more devices, based on a determination that at least one of the one or more devices is likely to be in a context in which its location changes.
  • the location can be determined, such as with respect to a first device. Moreover, in certain implementations the location can be determined with respect to the first device during the first time interval.
  • the location, as determined with respect to the first device can be associated with each device from the one or more devices that is proximate to the first device.
  • the location as determined with respect to the first device can be associated, during the first time interval, with various devices from the one or more devices that are proximate to the first device.
  • a second device can be selected, such as a device with respect to which the location is to be determined during a second time interval.
  • a second device can be selected based on factors including but not limited to: (a) respective battery levels of one or more devices that have received location information from the first device, (b) respective battery capacities of one or more devices that have received location information from the first device, (c) respective battery times remaining with respect to one or more devices that have received location information from the first device, (d) respective lengths of time that one or more of devices have received information from the first device, (e) a quantity of updates that one or more devices have received from the first device, and/or (f) a location accuracy of one or more devices.
  • a timing associated with a selection of a second device can be determined based on at least one of: (a) a length of time that the first device provided location information to one or more other devices, (b) a quantity of location updates that the first device provided to one or more other devices, (c) a quantity of devices that receive location information from the first device, (d) a length of time that one or more other devices provided location information to other devices, and/or (e) a quantity of location updates that one or more such devices provided to other devices, such as in a manner described herein.
  • the location can be determined with respect to the second device during the second time interval.
  • the location as determined with respect to the second device can be associated with each device from the one or more devices that is proximate to the second device.
  • the 'location provider' device can acquire partial or full location information (e.g., via GPS, etc.) and distribute such location information to other nearby devices (e.g., via WiFi, Bluetooth), thereby enabling the other devices to conserve power while remaining updated with respect to their location.
  • location information e.g., via GPS, etc.
  • other nearby devices e.g., via WiFi, Bluetooth
  • different devices can be selected/designated to be the referenced 'location provider' for other proximate devices so as to distribute the power consumption burden (associated with location determination) among multiple devices.
  • the order in which, and the length of time for which, a device assumes the role of a location provider can depend on one or more factors such as its power state (e.g., connected, not connected, battery level, battery remaining, rate of battery usage (current, usual, predicted)) and/or its reception quality (TTFF, GPS accuracy, etc.) and/or the states of other nearby devices.
  • a device that is connected to a power source and has a fully charged battery may remain as the 'location provider' for a longer time (as compared to a device that is not).
  • a full protocol for sharing such information between nearby devices can be established in which the devices can dynamically request, discover, communicate and/or negotiate with one another to provide "battery friendly" location information.
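• The rotation of the 'location provider' role can be illustrated with the following sketch, which scores nearby devices by power state, battery level and expected location accuracy; the scoring weights are illustrative only, not taken from the disclosure.

```python
# Minimal sketch: pick which nearby device should act as the shared 'location
# provider' for the next interval. The scoring weights are illustrative.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    power_connected: bool
    battery_level: float          # 0.0 .. 1.0
    location_accuracy_m: float    # expected accuracy of its location fixes

def provider_score(d: Device) -> float:
    score = d.battery_level
    if d.power_connected:
        score += 1.0                         # prefer devices on external power
    score -= d.location_accuracy_m / 100.0   # prefer more accurate receivers
    return score

def select_location_provider(devices: list) -> Device:
    return max(devices, key=provider_score)

nearby = [
    Device("phone_a", power_connected=True,  battery_level=0.9, location_accuracy_m=5),
    Device("phone_b", power_connected=False, battery_level=0.4, location_accuracy_m=5),
    Device("tablet",  power_connected=False, battery_level=0.8, location_accuracy_m=20),
]
print(select_location_provider(nearby).name)  # phone_a
```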
  • FIG. 69 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more devices (e.g., mobile devices, wireless access points, etc.) can be identified.
  • the referenced devices (e.g., those that are identified) may be associated with a vehicle (e.g., determined to be and/or otherwise identified as being embedded, installed, or otherwise present within a vehicle).
  • the referenced device may be determined to be and/or otherwise associated with an operator of the vehicle (e.g., a driver).
  • a restriction can be employed.
  • such a restriction can be employed with respect to the device (e.g., the device that perceived the various other devices/access points, such as at 6910).
  • such a restriction can be employed based on a perception of the one or more devices.
  • such a restriction may be employed upon determining or otherwise identifying that the device (e.g., the device that perceived the various other devices/access points, such as at 6910) is moving in a manner that is consistent with the manner in which a vehicle moves, such as in a manner described herein.
  • upon determining that the vehicle is not moving in such a manner, the referenced restrictions can be adjusted, relaxed, removed, etc.
  • the mobile device associated with/determined to be operated by a mass transportation driver can be configured to go into 'driver mode' (such as is described herein) based on a perception of one or more signals (e.g., RF signals such as WiFi access point or Bluetooth BSSIDs, etc.) in a set of signals.
  • for example, the mobile device of a bus driver can be configured to go into 'driver mode' upon perceiving the MAC address(es) / BSSID(s) of the WiFi access points or Bluetooth devices that are associated with the bus(es) that the driver drives, and, in some cases, when additional conditions (e.g., movement consistent with driving) are also determined to be met. Doing so can provide various advantages, improvements and efficiencies, such as with respect to the accuracy, latency and power consumption of applications/modules that are directed to identifying/preventing distracted driving (whether executing on the device itself and/or externally to the device, such as on a remote server).
  • such techniques enable the restriction of the mobile devices of mass transportation drivers while they are working (e.g., driving a bus), while also not restricting the same devices in scenarios in which the bus driver is not currently driving (it should be understood that such techniques can be further enhanced as more and more vehicles have unique wireless electronic IDs) and without having to integrate with the bus company's employee time scheduling systems.
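• As a non-authoritative illustration of this trigger, the sketch below enters 'driver mode' when a device perceives a BSSID known to belong to one of the buses the driver operates, optionally requiring movement consistent with driving; the BSSIDs shown are hypothetical placeholders.

```python
# Minimal sketch: put a mass-transportation driver's device into 'driver mode'
# when it perceives a BSSID associated with one of the buses that driver
# operates, optionally requiring movement consistent with driving as well.
# The BSSIDs below are hypothetical placeholders.

KNOWN_BUS_BSSIDS = {"DE:AD:BE:EF:00:01", "DE:AD:BE:EF:00:02"}

def should_enter_driver_mode(perceived_bssids: set,
                             moving_like_vehicle: bool,
                             require_movement: bool = True) -> bool:
    sees_bus = bool(perceived_bssids & KNOWN_BUS_BSSIDS)
    if require_movement:
        return sees_bus and moving_like_vehicle
    return sees_bus

print(should_enter_driver_mode({"DE:AD:BE:EF:00:01", "AA:BB:CC:00:00:09"}, True))  # True
print(should_enter_driver_mode({"AA:BB:CC:00:00:09"}, True))                       # False
```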
  • FIG. 68 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • a positioning of a device can be determined.
  • the referenced positioning can include an orientation of the device (e.g., the angle at which the device is oriented, such as is described herein) and/or a docking or connectivity status of the device (e.g., whether or not the device is connected to a dock or charger, whether or not the device is connected or linked to an in-vehicle 'infotainment' system, etc.).
  • it can be determined that the referenced device is present within a vehicle (and/or a moving vehicle) and/or within a trip, such as using one or more of the techniques described herein.
  • one or more restricted functions can be enabled.
  • such functions can be enabled at/in relation to the device.
  • such restricted functions can be enabled based on a determination that the positioning (such as that determined at 6810) is consistent with one or more positioning parameters. Examples of such positioning parameters can be one or more parameters (and/or ranges thereof) that can be defined, such as with respect to the orientation of the device, the docking connectivity of the device, etc.
  • the referenced restricted functions can include one or more applications and/or one or more interfaces.
  • the referenced restricted functions can be restricted at/in relation to the device.
  • such restricted functions can be restricted based on a determination that the positioning of the device is not consistent with the one or more positioning parameters (such as in a manner described herein).
  • certain device functionality (e.g., the ability to make, utilize, or otherwise interact with calls, texts, applications, etc.) of a device determined to be associated with a driver of a vehicle can be restricted such that such functionality is only permitted when various conditions are determined to be met.
  • for example, certain functionality may only be allowed when the device is determined to be positioned in a cradle (as can be determined, for example, based on inputs originating from one or more sensors, such as accelerometer, compass/magnetometer, GPS, camera, orientation envelope, correlation between z-axis acceleration and GPS acceleration, etc., such as is described herein, and/or based on inputs/information received from a device cradle (e.g., power connection identifier, NFC, etc.), which may be used to confirm that the device is in it), such as in a manner described herein.
  • one or more other functionalities may also be restricted such that their use is permitted only when the device can be determined to be connected to a vehicle's infotainment system (as can be determined based on the wired or wireless connection between the device and the vehicle's infotainment system, such as in a manner described herein).
  • one or more of the interfaces that may be used to control various functionalities of a device determined to be operated by/associated with a driver of a vehicle (such as in a manner described herein) can be selectively (or fully) restricted.
  • the techniques described herein can be employed to determine if the device is still within a trip or not.
  • the techniques described herein can account for whether the device is connected to a power source, the device's current battery level and/or rate of battery level depletion, etc., in selecting which contextual determination techniques to use and/or how to use them (e.g., in relation to sampling rate, duty cycle, etc.). For example, upon determining that a device battery is fully or mostly charged, sensors can be sampled more liberally, whereas upon determining that the device battery is low, sensors can be sampled on a more limited basis/more frugally.
  • for example, when a device is determined to be connected to a power source, its GPS can be sampled more frequently, whereas upon determining that the device is not connected to a power source, its GPS can be sampled relatively less frequently (or not at all).
  • Such techniques can also be employed/extended to other sensor(s), e.g., accelerometer, WiFi scans, Cell tower scans etc.
  • the transmission power (class) of a signal can be used to determine/estimate speed more accurately. For example, based on the fact that the RSSI of a BT device that transmits at 2.5 mW (Class 2) has changed from -90 dB to -80 dB in 1 second, it can be determined that the device is moving more slowly (relative to the BT device) than if the BT device transmits at 100 mW (Class 1).
  • FIG. 113 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more signals associated with a second device can be perceived, e.g., by a first device, such as in a manner described herein.
  • one or more characteristics of the one or more signals can be identified.
  • such characteristics can include: an SSID of the second device, a BSSID of the second device, an organizationally unique identifier (OUI) of the BSSID, a determination that a BSSID of the second device is locally administered, one or more capabilities of the second device (e.g., an extended service set (ESS), an open network, a network not requiring a passcode, an authentication type, an encryption type), a quantity of unique devices that have observed/perceived the second device, a quantity of unique devices that have perceived the second device over a time period, a quantity of unique authenticators with respect to the second device, a quantity of unique devices associated with the second device, and/or a quantity of devices that perceive the second device for at least a defined period of time.
  • one or more aspects of a vehicle within which at least one of the first device or the second device is present can be determined, such as based on the one or more characteristics. For example, a determination can be made as to whether the vehicle within which the at least one of the first device or the second device is present is a mass transit vehicle or a non-mass transit vehicle. Moreover, in certain implementations a determination can be made that the vehicle is a mass transit vehicle based on a determination that a manufacturer associated with the OUI produces more devices associated with mass transit than devices associated with non-mass transit. Additionally, in certain implementations a determination can be made that the vehicle is a non-mass transit vehicle based on a determination that a manufacturer associated with the OUI produces more devices associated with non-mass transit than devices associated with mass transit.
  • a determination as to whether the vehicle within which the at least one of the first device or the second device is present is a mass transit vehicle or a non-mass transit vehicle can be based on: a corresponding determination associated with another device associated with a BSSID that is similar to the BSSID of the second device, a determination associated with another device associated with a BSSID that is similar to the BSSID of the second device, an independent determination associated with another device associated with a BSSID that is similar to the BSSID of the second device, a corresponding determination associated with another device associated with a BSSID that is sequential to the BSSID of the second device, a determination associated with another device associated with a BSSID that is sequential to the BSSID of the second device, an independent determination associated with another device associated with a BSSID that is sequential to the BSSID of the second device, and/or a ratio of mass transit utilization to non-mass transit utilization in a location at which the at least one of the first device or the second device is present.
  • one or more actions can be initiated, such as based on the one or more aspects of a vehicle within which at least one of the first device or the second device is present.
  • FIG. 114 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more signals such as one or more probe requests (which may include one or more SSIDs, for example) and/or one or more probe responses, can be perceived by a first device, such as in a manner described herein.
  • the one or more signals can be processed. In doing so, a quantity of network enabled devices perceptible to the first device can be determined.
  • one or more aspects of a vehicle within which the first device is present can be determined, such as based on the quantity of network enabled devices perceptible to the first device.
  • the vehicle in which a mobile device is present can be determined to be a mass transportation or a non-mass transportation vehicle (and in certain implementations and/or in certain situations the exact vehicle class, e.g., train, bus, can be determined) based on one or more wireless signals perceived by the device (alone or in conjunction with one or more other signals and/or techniques described or referenced herein).
  • Doing so can be advantageous, for example (a) in estimating vehicle emissions, where different mass transportation vehicles (e.g., trains, buses) have emissions more similar to one another than to non-mass transportation vehicles (e.g., cars, light trucks); (b) in distracted driving prevention, where devices on non-mass transportation can be assumed to be driver devices subject to one or more restrictions, unless determined otherwise, while devices on mass transportation vehicles might be assumed to be passenger devices and subjected to fewer or no restrictions, unless determined otherwise; (c) in usage-based insurance, where trips in mass transportation vehicles are not a reflection of, and therefore should not be used to compute, a policyholder's driving score; or (d) in navigation, where trips in mass transportation vehicles may be weighted less heavily in determining a user's navigational familiarity with a location.
  • In certain implementations, it can be determined whether a device is on a mass or non-mass transportation vehicle by analyzing one or more of the wireless signals it perceives (e.g., WiFi including WiFi Direct, Bluetooth, Cellular, RFID, NFC, WPAN) and, in certain cases, in conjunction with one or more non-wireless sensor inputs that were received by the same device or one or more other devices, prior to or at substantially the same time.
  • An exemplary collection of techniques for making such a determination with WiFi signals is described herein, though it should be understood that similar methods (with natural variation based upon the varying protocols and physical characteristics of the various types of wireless signal) may be applied to one or more other forms of wireless signals as well.
  • In certain implementations, it can be determined that the device is present in a vehicle (or a moving vehicle). Moreover, in certain implementations such a determination can be made in conjunction with one or more non-wireless sensor inputs that were received by the same device or one or more other devices, prior to or at substantially the same time. Additionally, in certain implementations it can be determined, based on WiFi signals and/or one or more signals from non-wireless sensors perceived over time by the device and/or one or more other devices, whether the device is on a mass or non-mass transportation vehicle.
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point) with an SSID that contains (or starts with, or is exactly, with or without case sensitivity) a particular string of characters indicative of its being a mass transportation vehicle, which string may vary depending on the location (e.g., country, state, city) in which the device is, as perceived (e.g., by the GPS, cell tower or WiFi signals), e.g., "Amtrak" or "Greyhound" (and, for example, also requiring that the device be in the U.S.), the device can be determined to be on a mass transportation vehicle (in the case of "Amtrak", it is likely a train, and in the case of "Greyhound", it is likely a bus).
  • Conversely, if the device perceives an SSID that contains a string of characters indicative of its being a non-mass transportation vehicle, e.g., "Sync" (which is often indicative of a Ford Sync infotainment system, which system is more likely to reside (if not exclusively reside) in a non-mass transportation vehicle (e.g., car, light truck) than in a mass transportation vehicle), the device can be determined to be on a non-mass transportation vehicle.
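  • By way of illustration, the following is a minimal sketch (in Python) of the SSID keyword heuristic described above; the keyword lists, the country gating, and the function names are illustrative assumptions rather than part of the disclosure.

```python
# Illustrative sketch of SSID-based mass/non-mass classification.
# The keyword lists and the country gating are assumptions, not an
# authoritative or exhaustive mapping.
MASS_TRANSIT_SSID_KEYWORDS = {
    # keyword -> (likely vehicle class, countries in which the keyword applies)
    "amtrak": ("train", {"US"}),
    "greyhound": ("bus", {"US"}),
}
NON_MASS_SSID_KEYWORDS = ("sync",)  # e.g., Ford Sync infotainment systems


def classify_by_ssid(ssid: str, country: str) -> str:
    """Return 'mass', 'non-mass', or 'unknown' based on SSID keywords."""
    s = ssid.lower()
    for keyword, (_vehicle_class, countries) in MASS_TRANSIT_SSID_KEYWORDS.items():
        if keyword in s and country in countries:
            return "mass"
    if any(keyword in s for keyword in NON_MASS_SSID_KEYWORDS):
        return "non-mass"
    return "unknown"


if __name__ == "__main__":
    print(classify_by_ssid("Amtrak_WiFi", "US"))  # -> mass
    print(classify_by_ssid("SYNC 1234", "US"))    # -> non-mass
```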
  • In certain implementations, if the device perceives a WiFi enabled device with a BSSID that contains an OUI indicative of a manufacturer that produces sufficiently more WiFi enabled devices that are used on mass transportation vehicles than on non-mass transportation vehicles, the device can be determined to be on a mass transportation vehicle.
  • the described techniques can be applied to more than the first 6 characters of the BSSID.
  • For example, the described techniques can be applied to the first 8 or 10 characters of the BSSID to leverage the fact that manufacturers typically manufacture WiFi enabled devices in homogenous model batches and assign them a block of consecutive BSSIDs.
  • WiFi enabled devices with BSSIDs that are close to one another will tend to be of the same model. Accordingly, if the device perceives a WiFi enabled device with a BSSID corresponding to a model that is used sufficiently more on mass transportation vehicles than on non-mass transportation vehicles, the device can be determined to be on a mass transportation vehicle.
  • Conversely, if the device perceives a WiFi enabled device with a BSSID corresponding to a model that is used sufficiently more on non-mass transportation vehicles than on mass transportation vehicles, the device can be determined to be on a non-mass transportation vehicle.
  • a manufacturer produces different model WiFi enabled devices, some models of which are used more often in mass transportation vehicles and other models of which are used more often in non-mass transportation vehicles, when a particular model of WiFi enabled device (e.g., as expressed by one or more blocks of consecutive BSSIDs) is used sufficiently more in mass transportation vehicles or sufficiently more in non-mass transportation vehicles, its 7-12 digit BSSID "prefix" (e.g., the first 7, first 8, first 9, first 10, first 11, first 12 hex characters of the BSSID), can be used in determining whether the vehicle it is in is a mass transportation vehicle or a non-mass transportation vehicle.
  • For example, if WiFi enabled devices with BSSID 34-12-56-78-90-AE and BSSID 34-12-56-78-90-B0 are known to be of the same model, with such model being used sufficiently more on mass transportation vehicles than on non-mass transportation vehicles, then a WiFi enabled device with a BSSID of 34-12-56-78-90-AF (which bears the same 10-digit prefix as the other two) can be determined to be likely to be of the same model and, if perceived by the device, the device can be determined to be on a mass transportation vehicle (or, in certain implementations, not on a non-mass transportation vehicle).
  • Conversely, if the perceived WiFi enabled device's BSSID bears a prefix corresponding to a model that is used sufficiently more on non-mass transportation vehicles, the device can be determined to be on a non-mass transportation vehicle (or, in certain implementations, not on a mass transportation vehicle).
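  • The following is a minimal sketch of the OUI / extended-BSSID-prefix heuristic described above; the prefix usage table, the dominance ratio, and the function names are hypothetical, and in practice such statistics would be accumulated (e.g., in a remote database) from reported observations.

```python
# Illustrative sketch of the OUI / extended-BSSID-prefix heuristic.
# The usage-count table is hypothetical; in practice such statistics would be
# built from observations reported to a remote database.
PREFIX_USAGE = {
    # prefix (first 6/8/10 hex digits, separators removed) -> (mass_count, non_mass_count)
    "341256": (120, 3000),        # OUI-level statistics
    "3412567890": (450, 12),      # model-level (10-hex-digit) statistics
}


def normalize(bssid: str) -> str:
    return bssid.replace("-", "").replace(":", "").lower()


def classify_by_bssid(bssid: str, ratio: float = 5.0) -> str:
    """Prefer the longest (most specific) known prefix; require one class to
    dominate the other by at least `ratio` before deciding."""
    b = normalize(bssid)
    for length in (10, 8, 6):
        counts = PREFIX_USAGE.get(b[:length])
        if counts is None:
            continue
        mass, non_mass = counts
        if mass >= ratio * max(non_mass, 1):
            return "mass"
        if non_mass >= ratio * max(mass, 1):
            return "non-mass"
    return "unknown"


if __name__ == "__main__":
    # Shares the 10-digit prefix with the "mass" model batch above.
    print(classify_by_bssid("34-12-56-78-90-AF"))  # -> mass
```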
  • the WiFi enabled device's BSSID prefix can be used in determining whether or not a device that perceives such WiFi enabled device's BSSID is on a mass or non-mass transportation vehicle, for example because companies tend to purchase WiFi enabled devices in bulk from their manufacturers and are likely to use them for a largely homogenous purpose (e.g., mass transportation, non-mass transportation, non-transportation).
  • For example, if WiFi enabled devices with BSSID 34-21-56-78-90-AE and BSSID 34-21-56-78-90-B0 are known to be of the same model and are in use on a mass transportation vehicle, then if the device perceives a WiFi enabled device with a BSSID of 34-21-56-78-90-AF (which bears the same 10-digit prefix as the other two), the device can be determined to be on a mass transportation vehicle.
  • Conversely, if WiFi enabled devices with BSSID 34-21-56-78-90-AE and BSSID 34-21-56-78-90-B0 are known to be of the same model and are used sufficiently more on non-mass transportation vehicles than on mass transportation vehicles, then if the device perceives a WiFi enabled device with a BSSID of 34-21-56-78-90-AF (which bears the same 10-digit prefix as the other two), the device can be determined to be on a non-mass transportation vehicle.
  • the BSSID "neighbors" of a particular WiFi enabled device can be used in determining whether a device that perceives such particular WiFi enabled device is likely to be on a mass transportation vehicle or a non-mass transportation vehicle. Additionally, in certain implementations, how relatively closer a neighbor of the BSSID of the particular WiFi enabled device is to the BSSID of one or more WiFi enabled devices (which is known to be on a mass transportation vehicle or a non- mass transportation vehicle), the greater the likelihood that the particular WiFi enabled device perceived by the device (and hence the device itself) is on the same type of vehicle (i.e., mass or non-mass) as its WiFi enabled neighbor.
  • In certain implementations, if the device perceives (and/or authenticates with and/or associates with) a WiFi enabled device that is an access point ('AP') and whose BSSID is locally administered (e.g., the 2nd bit of the 2nd hex character of the BSSID is on), which, in this in-vehicle setting, reflects that such WiFi enabled device is likely to be a mobile device (as distinct from an in-vehicle device), and the device is in a location (e.g., country, state, city, road) and/or at a day/time where there are more people that use mass transportation than use non-mass transportation, the device can be determined to be more likely to be on a mass transportation vehicle than on a non-mass transportation vehicle.
  • the more such WiFi enabled AP devices whose BSSID is locally administered are perceived by the device the higher the likelihood that the device is on a mass transportation vehicle.
  • Conversely, if the device perceives such a locally administered WiFi enabled AP device but is in a location and/or at a day/time where there are more people that use non-mass transportation than use mass transportation, the device can be determined to be less likely to be on a mass transportation vehicle than on a non-mass transportation vehicle.
  • the more such WiFi enabled AP devices whose BSSID is locally administered are perceived by the device the higher the likelihood that the device is on a non-mass transportation vehicle.
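  • A brief sketch of the locally-administered-BSSID check referenced above follows; the scoring approach and the assumption about which transportation mode dominates at a given location/time are illustrative only.

```python
# Sketch of the locally-administered-BSSID check: the 0x02 bit of the first
# octet (equivalently, bit value 2 of the second hex character) marks a
# locally administered address, which in an in-vehicle setting often indicates
# a mobile device rather than an in-vehicle access point. The scoring below is
# a toy illustration.
def is_locally_administered(bssid: str) -> bool:
    first_octet = int(bssid.replace("-", "").replace(":", "")[:2], 16)
    return bool(first_octet & 0x02)


def mass_transit_hint(perceived_ap_bssids, mass_transit_dominates_here: bool) -> int:
    """More locally administered AP BSSIDs -> stronger hint, in the direction of
    whichever transportation mode dominates at this location/time (assumed known)."""
    local_count = sum(is_locally_administered(b) for b in perceived_ap_bssids)
    return local_count if mass_transit_dominates_here else -local_count


if __name__ == "__main__":
    print(is_locally_administered("06-12-34-56-78-9A"))  # True: 0x06 has the 0x02 bit set
    print(mass_transit_hint(["06-12-34-56-78-9A", "00-11-22-33-44-55"], True))  # 1
```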
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point) with a particular capability (e.g., as included in an 802.11 beacon frame), (e.g., ESS only, an open network, no passcode required, weak authentication, weak or no encryption), the device can be determined to be likely in many cases (which may be dependent on the relative popularity of mass transportation vs. non-mass transportation and the relative popularity of open vs. closed WiFi enabled devices on mass transportation vs. non-mass transportation, in the device's location (e.g., country, state, city, road) and/or at a day/time) to be on a mass transportation vehicle, on which occupants are frequently given open access to WiFi services.
  • Conversely, if the device perceives a WiFi enabled device (e.g., an access point) with a different set of capabilities (e.g., a closed network, a passcode required, strong authentication, strong encryption), the device can be determined to be likely, in many cases, to be on a non-mass transportation vehicle, where occupants are less frequently given open access to WiFi services than on a mass transportation vehicle.
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point) which, over time, has been perceived by sufficiently many unique mobile devices while on a vehicle, the device can be determined to be on a mass transportation vehicle.
  • Conversely, if such a WiFi enabled device has, over time, been perceived by sufficiently few unique mobile devices while on a vehicle, the device can be determined to be on a non-mass transportation vehicle.
  • For example, if the device perceives a WiFi enabled device with BSSID 34-21-56-78-90-AB, which is an AP that has been observed (e.g., in beacon frames) by 5,000 unique mobile devices over the past 6 weeks (which unique mobile devices have, for example, passed such information to a remote database), the device can be determined to be in a mass transportation vehicle (for example, because it is not likely that an AP in a non-mass transportation vehicle would have been perceived by so many unique devices).
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point) which, over time, has been authenticated with and/or associated with (e.g., as per the IEEE 802.11 standard) sufficiently many unique mobile devices (and/or has done so for a sufficiently long period of time and/or with sufficiently high signal strength and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities) while on a vehicle, the device can be determined to be on a mass transportation vehicle.
  • Conversely, if the device perceives a WiFi enabled device (e.g., an access point) which, over time, has been authenticated with and/or associated with sufficiently few unique mobile devices (and/or has done so for a sufficiently long period of time and/or with sufficiently high signal strength and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities) while on a vehicle, the device can be determined to be on a non-mass transportation vehicle.
  • For example, if the device perceives a WiFi enabled device with BSSID 34-21-56-78-90-AB, which is an AP that has been associated with by 2,000 unique mobile devices over the past 5 weeks (which unique mobile devices have, for example, provided such information to a remote database), the device can be determined to be on a mass transportation vehicle. This is so because, for example, it is relatively unlikely that an AP on a non-mass transportation vehicle would have associated with so many unique mobile devices in this time frame (or ever).
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point) which, over time, has been perceived by sufficiently many unique mobile devices for a sufficiently long period of time (and/or with sufficiently high signal strength and/or while moving in a vehicle (e.g., as determined by GPS, WiFi APs, cell towers, inertial sensors) and/or over a sufficiently long distance (e.g., as perceived by GPS, WiFi APs, cell towers etc.) and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities), the device can be determined to be on a mass transportation vehicle.
  • Conversely, if the device perceives a WiFi enabled device (e.g., an access point) which, over time, has been perceived by sufficiently few unique mobile devices for a sufficiently long period of time (and/or with sufficiently high signal strength and/or while moving in a vehicle (e.g., as determined by GPS, WiFi APs, cell towers, inertial sensors) and/or over a sufficiently long distance (e.g., as perceived by GPS, WiFi APs, cell towers etc.) and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities), the device can be determined to be on a non-mass transportation vehicle.
  • For example, if the device perceives a WiFi enabled device with BSSID 34-21-56-78-90-AB, which is an AP that has been perceived (e.g., in beacon frames) over the course of more than five (5) consecutive minutes (or while moving above 20 km/h, or for 2 miles) by 5,000 unique mobile devices over the past 5 weeks (which unique mobile devices have, for example, provided such information to a remote database), the device can be determined to be on a mass transportation vehicle. This is so because, for example, it is not likely that an AP in a non-mass transportation vehicle would have been perceived by so many unique devices over such long periods of time.
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point), it can use the TIM field transmitted by the WiFi enabled device (e.g., in one or more closely spaced beacon frames) to determine how many of the devices (i.e., non-AP stations) associated with such WiFi enabled device currently have traffic being buffered by the WiFi enabled device (e.g., by looking at the TIM Partial Virtual Bitmap or the length of the TIM). If the TIM shows sufficiently many traffic-buffered devices (e.g., as perceived by the device), it can be determined that the device is on a mass transportation vehicle.
  • Conversely, if the TIM shows sufficiently few traffic-buffered devices (e.g., as perceived by the device), it can be determined that the device is on a non-mass transportation vehicle.
  • the number of traffic-buffered devices associated with an AP can be utilized to compute other determinations as well.
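  • The following sketch illustrates counting traffic-buffered stations from a TIM element body, assuming the raw beacon bytes have already been captured (frame capture itself is platform-specific and not shown); the byte layout follows the standard TIM structure (DTIM Count, DTIM Period, Bitmap Control, Partial Virtual Bitmap).

```python
# Sketch of counting traffic-buffered stations from a TIM element, per the
# technique above. `tim_body` is assumed to be the element body (the bytes
# following the Element ID and Length fields of the TIM in a captured beacon
# frame).
def count_buffered_stations(tim_body: bytes) -> int:
    """Count set bits in the Partial Virtual Bitmap (DTIM Count, DTIM Period and
    Bitmap Control occupy the first three bytes of the element body)."""
    partial_virtual_bitmap = tim_body[3:]
    return sum(bin(byte).count("1") for byte in partial_virtual_bitmap)


if __name__ == "__main__":
    # Hypothetical TIM body: DTIM count 0, period 1, bitmap control 0,
    # partial virtual bitmap 0b10110010 0b00000101 -> 6 buffered stations.
    body = bytes([0x00, 0x01, 0x00, 0b10110010, 0b00000101])
    print(count_buffered_stations(body))  # 6
```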
  • For example, a device (e.g., a non-AP station ('STA')) can receive beacon frames from two different APs (AP 1, AP 2) and is able to associate with both.
  • one STA may be associated with AP 1 while ten STAs are associated with AP 2 (though this may not be known/accessible to the device), with no other APs in the vicinity.
  • the two APs may be on different channels and may be the same in every other way (e.g., on the same band, support the same version protocol, transmission rates, services and are received by the STA with the same signal strength).
  • It may be more effective for the STA to connect to AP 1 rather than AP 2, because AP 1 has fewer STAs currently associated with it (even, for example, if AP 1 has worse signal strength than AP 2, in light of the larger number of devices associated with AP 2 and the commensurate reduction in resources (e.g., bandwidth) that are available to allocate to each device).
  • the device can make that determination, for example, based on the fact that, over some time period, AP 1 had fewer traffic-buffered STAs than AP 2 did and therefore can be determined to (a) be more likely to have fewer associated STAs; and, therefore (b) be more likely to have better performance.
  • This technique can be used by devices (e.g., non-AP STAs), automatically or manually (i.e., by user) to decide with which AP(s) to associate at any point in time.
  • In certain implementations, if the device perceives a WiFi enabled device (e.g., an access point), it can use the Station Count field (and/or the Channel Utilization field) in the BSS Load element, which is present in various frames (e.g., beacon frames, probe responses) sent from (or to) such WiFi enabled device, to determine how many devices (i.e., non-AP STAs) are associated with such WiFi enabled device.
  • In certain implementations, the higher the number of associated devices, the more likely it is that the device is in a mass transportation vehicle.
  • Conversely, the lower the number of associated devices, the less likely it is that the device is in a mass transportation vehicle.
  • the device may only be able to access this information if it is associated with the WiFi enabled device, which in the event that the device is on a mass transportation vehicle may be possible given that the WiFi enabled devices in many mass transportation vehicles are open.
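  • A minimal sketch of reading the Station Count and Channel Utilization fields from a BSS Load element body follows; the 5-byte layout (Station Count, Channel Utilization, Available Admission Capacity) follows the standard element format, while the mass-transit threshold is an illustrative assumption.

```python
import struct

# Sketch of reading the Station Count and Channel Utilization fields from a
# BSS Load element body (the 5 bytes following Element ID 11 and the Length
# field). How the raw frame is captured is platform-specific and not shown.
def parse_bss_load(body: bytes):
    """Return (station_count, channel_utilization_fraction)."""
    station_count, channel_utilization, _admission_capacity = struct.unpack("<HBH", body[:5])
    return station_count, channel_utilization / 255.0


def likely_mass_transit(station_count: int, threshold: int = 10) -> bool:
    # Hypothetical threshold: many associated stations -> more likely mass transit.
    return station_count >= threshold


if __name__ == "__main__":
    body = struct.pack("<HBH", 23, 120, 0)  # 23 associated STAs, ~47% channel utilization
    count, utilization = parse_bss_load(body)
    print(count, round(utilization, 2), likely_mass_transit(count))
```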
  • In certain implementations, if the device perceives sufficiently many WiFi-enabled devices, e.g., by detecting and examining probe requests and/or probe responses from such WiFi enabled devices (in certain cases for a sufficiently long period of time and/or with sufficiently high signal strength and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities), the device can be determined to be on a mass transportation vehicle. This is so, for example, because there are likely more people and, therefore, more WiFi-enabled devices on mass transportation vehicles.
  • Conversely, if the device perceives sufficiently few WiFi-enabled devices, e.g., by detecting and examining probe requests and/or probe responses from such WiFi enabled devices (in certain cases for a sufficiently long period of time and/or with sufficiently high signal strength and/or with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities), the device can be determined to be on a non-mass transportation vehicle. This is so, for example, because there are likely fewer people and, therefore, fewer WiFi-enabled devices on non-mass transportation vehicles.
  • the described techniques can also incorporate analyzing the transmissions of the WiFi enabled devices to determine how many of them are mobile devices, e.g., as distinct from in-vehicle devices (e.g., based on the number, content and/or signal strength of the probe requests and/or probe responses made by or sent to such WiFi enabled devices). This is so, for example, because such mobile WiFi-enabled devices may be effective indicators of whether the device is on a mass transportation vehicle or a non-mass transportation vehicle because, for example, mobile devices are more closely associated with people in vehicles than non-mobile devices are.
  • In certain implementations, if the device perceives a WiFi enabled device that issues a probe request (a) to sufficiently many SSIDs (e.g., directed probe requests), it is more likely to be a mobile device; (b) and such issuing device has a BSSID that contains an OUI of a known mobile device maker (e.g., Apple, Samsung), it is more likely to be a mobile device; (c) and such issuing device has an SSID that contains a string of characters indicative of its being a mobile device (e.g., "Android", "Galaxy S 4"), it is more likely to be a mobile device; (d) and such issuing device has a locally administered BSSID (i.e., the 2nd bit of the 2nd hex character of the BSSID is on), it is more likely to be a mobile device; and (e) in which the power management field is turned on, indicating power save mode, the issuing device is more likely to be a mobile device (mobile devices are more power constrained than non-mobile devices).
  • 'mobile device' as used herein may refer to a WiFi enabled mobile device (e.g., as distinct from a WiFi enabled in-vehicle device).
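  • The following is a rough scoring sketch of the probe-request heuristics above for deciding whether a perceived WiFi device is likely a mobile device; the OUI set, keyword list, weights, and threshold are illustrative assumptions.

```python
# Rough scoring sketch for "is this perceived device likely a mobile device?"
# The example OUIs, keywords, weights, and threshold are assumptions.
MOBILE_MAKER_OUIS = {"ac:bc:32", "f0:d1:a9"}        # hypothetical example OUIs
MOBILE_SSID_KEYWORDS = ("android", "galaxy")


def mobile_device_score(bssid: str, own_ssid: str, probed_ssids: set,
                        locally_administered: bool, power_save: bool) -> float:
    score = 0.0
    if len(probed_ssids) >= 5:                      # (a) many directed probe requests
        score += 1.0
    if bssid.lower()[:8] in MOBILE_MAKER_OUIS:      # (b) OUI of a known mobile maker
        score += 1.0
    if any(k in own_ssid.lower() for k in MOBILE_SSID_KEYWORDS):  # (c) SSID keywords
        score += 1.0
    if locally_administered:                        # (d) locally administered BSSID
        score += 1.0
    if power_save:                                  # (e) power management bit set
        score += 0.5
    return score


if __name__ == "__main__":
    s = mobile_device_score("ac:bc:32:11:22:33", "Galaxy S4 1234",
                            {"HomeNet", "WorkNet", "CafeWiFi", "Gym", "Airport"},
                            locally_administered=False, power_save=True)
    print(s, "likely mobile" if s >= 2.0 else "unclear")
```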
  • the described techniques can also include (e.g., to help determine if the device and a WiFi enabled device are likely to be on the same vehicle) a determination that one or more of the wireless signals is perceived by the device (a) for a sufficiently long period of time (e.g., sufficiently many management, control and/or data frames spaced sufficiently close in time); and/or (b) with a sufficiently high signal strength; and/or (c) with sufficiently low signal strength variability; and/or (d) with sufficiently low time of flight of signals travelling from the device to the AP or vice versa, as can be determined with, for example, the 802.11 TM or FTM capabilities.
  • the techniques and examples described herein are based on WiFi wireless signals. Similar techniques can also be applied to one or more other types of wireless signals perceivable by the device (e.g., Bluetooth, Cellular, NFC, V2V, V2I) either alone or combined with WiFi.
  • Some of the described techniques may be more effectively employed after the device is associated with an AP.
  • wireless devices may be secondary or dependent devices, e.g., devices that receive services from a primary device (e.g., a watch may be dependent on a smartphone and communicate with it via BLE).
  • the secondary devices may be co-located with the primary devices, but in other cases they may be remote (e.g., a secondary device communicates with its primary device over the Internet).
  • Various techniques can be used to estimate the number of people in a location based upon one or more wireless signals perceived by a device, whether people are carrying many secondary devices or multiple primary devices, for example, by analyzing the air traffic (e.g., 802.11 association frames as well as other 802.11 management, control and data frames, and analogous or similar information from other wireless protocols, e.g., Bluetooth).
  • a device can detect those devices (e.g., secondary devices) that are nearby and in communication with (e.g., association frames, control frames, data frames) and/or attempting communication with (e.g., probe requests/responses) other devices (e.g., primary devices) which are also nearby, whereupon the first devices can be determined to be secondary devices and thus not counted for the purpose of determining the number of people present.
  • a device can detect that multiple primary devices belong to the same person, for example, based upon sufficient commonalities in the SSIDs of their preferred network lists as seen in probe requests. For example, if two devices send (directed) probe requests to the same list of eight (8) SSIDs, they can be determined to be likely to belong to the same person, whereupon the multiple devices can be understood to belong to one user and so counted, e.g., for the purpose of determining the number of people present.
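  • A minimal sketch of grouping perceived devices by person based on the overlap of the SSIDs in their directed probe requests, as described above; the similarity metric (Jaccard) and the threshold are assumptions for illustration.

```python
from itertools import combinations

# Sketch of grouping perceived devices by person based on shared SSIDs in
# their directed probe requests. The Jaccard metric and 0.6 threshold are
# illustrative assumptions.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0


def group_devices_by_person(probe_lists: dict, threshold: float = 0.6):
    """probe_lists: device id -> set of probed SSIDs. Returns groups of device ids,
    one group per estimated person."""
    groups = [{device} for device in probe_lists]
    for d1, d2 in combinations(probe_lists, 2):
        if jaccard(probe_lists[d1], probe_lists[d2]) >= threshold:
            g1 = next(g for g in groups if d1 in g)
            g2 = next(g for g in groups if d2 in g)
            if g1 is not g2:
                g1 |= g2
                groups.remove(g2)
    return groups


if __name__ == "__main__":
    probes = {
        "phone":  {"Home", "Work", "Cafe", "Gym", "Hotel", "Lounge", "Club", "Airport"},
        "tablet": {"Home", "Work", "Cafe", "Gym", "Hotel", "Lounge", "Club", "Airport"},
        "other":  {"Library", "School"},
    }
    groups = group_devices_by_person(probes)
    print(len(groups), "people estimated:", groups)  # 2 people
```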
  • a device can detect the BSSIDs of those devices nearby (e.g., using probe requests) and, based upon their manufacturers and/or model types and/or BSSID range (e.g., as described herein, such as for differentiating between mass and non-mass transportation vehicles), determine which device(s) are likely to be secondary devices and adjust the count of people present accordingly.
  • a device can detect the SSIDs (and/or Bluetooth major and minor classes) of those devices nearby, recognize those likely to be secondary devices (e.g., "BT Earpiece", as described herein for differentiating between mass and non-mass transportation vehicles), and adjust the count of people present accordingly.
  • such signals can also be used, for example, to determine (alone or in conjunction with other signals and/or techniques, such as those described herein) whether the device is in a vehicle (or in a moving vehicle) and whether the device is a passenger device or a driver device.
  • wireless signals can be used (e.g., using one or more of the signals and/or techniques described herein), alone or in conjunction with other techniques (such as those described herein), to determine whether a device is a passenger device or a driver device.
  • The described techniques may be based on WiFi signals, but should also be understood (with the appropriate changes for the relevant signals and protocols, as known to those of ordinary skill in the art) to be applicable to other types of wireless signals as well (or combinations thereof).
  • a device in a vehicle can be used to (a) determine (or estimate) the number of occupants present in a vehicle (as estimated using one or more of the signals and/or techniques described herein); and/or (b) determine the composition (e.g., mobile devices, fixed in-vehicle devices) of the WiFi enabled devices present/perceived in the vehicle (e.g., using one or more of the signals and/or techniques described herein); and/or (c) determine whether the vehicle is a mass transportation vehicle or a non-mass transportation vehicle (as estimated using one or more of the signals and/or techniques described herein).
  • In certain implementations, it can be further determined (or estimated) whether or not the device user is a driver or a passenger, whereupon certain actions can be taken to, for example, (a) place one or more restrictions on the user's device until the user verifies as a passenger (e.g., using one or more of the techniques described herein); and/or (b) include the data from the current trip in the calculation of a UBI driver score.
  • a user can be required to verify as a passenger and/or the driving data collected during the trip can be used in the user's UBI driving score (and, in certain implementations, if no sufficiently strong information indicative of the user being a passenger is received).
  • WiFi enabled non-mobile devices and 27 WiFi enabled mobile devices are detected; and/or (c) that is determined to be a mass transportation vehicle, a particular user can be determined to be a passenger (e.g., without the need to verify as such) and/or the associated trip data is not used in the user's UBI driving score (and, in certain implementations, only if no sufficiently indicative information of the user being a driver is received).
  • the in-vehicle role of the user of the device can be determined.
  • Device A might be a WiFi AP at a gas station perceived by solo-commuter Joan's device on her way to work, along which route Joan rarely travels other than on her way to work.
  • Device B may be a WiFi AP in the (non-mass transportation) vehicle that only Barbara usually drives on weekdays (on weekends her husband usually drives it).
  • Device C may be a WiFi AP in a mass transportation vehicle.
  • For example, if 95% of the time that a device perceived one or more wireless signals transmitted by Device D for more than 10 seconds and with a signal strength greater than -60 dBm, the device was determined to be in a moving vehicle and to be a passenger device, then when the device perceives one or more wireless signals from Device D with the same or sufficiently similar criteria (e.g., in a vehicle / moving vehicle, signal type, signal strength, length of reception, day of week, time of day, trip start location, vehicle class etc.), it can be determined to be likely to be a passenger device. For example, the device may belong to a wife who is almost always the passenger when Device D, her husband's mobile device, is in the (non-mass transportation) vehicle with her.
  • Combinations of wireless (or other) signals from one or more devices of one or more device types are also possible. For example, if 97% of the time that a device perceives one or more wireless signals transmitted by Device B and Device D, for more than 7 seconds and with a signal strength greater than -70 dBm, the device was determined to be in a vehicle and was determined to be a passenger device, when a device then perceives one or more wireless signals from Device B and Device D with the same or sufficiently similar criteria (e.g., in a vehicle / moving vehicle, signal type, signal strength, length of reception, day of week, time of day, trip start location, vehicle class etc.), it can be determined to be likely to be a passenger device.
  • For example, Device B may be a WiFi AP in Barbara's non-mass transportation vehicle, Device D is her husband's mobile device, and the device is Barbara's mobile device, where Barbara usually commutes as a solo driver in the vehicle, but on weekends, when her husband's Device D is perceived in the vehicle, he is almost always the driver.
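  • The following sketch illustrates the historical-pattern heuristic above: if, sufficiently often in the past, perceiving a given set of transmitting devices under similar conditions coincided with the device being a passenger device, "passenger" can be inferred again; the record format, matching criteria, and 90% threshold are assumptions.

```python
from dataclasses import dataclass

# Sketch of matching a current observation against historical, labeled
# observations. Record format, matching rule, and the 0.9 threshold are
# illustrative assumptions.
@dataclass
class Observation:
    source_ids: frozenset   # e.g., frozenset({"Device B", "Device D"})
    rssi_dbm: float
    duration_s: float
    day_of_week: str
    was_passenger: bool     # label from a prior, independent determination


def passenger_likelihood(history, current_sources, rssi_dbm, duration_s, day,
                         min_rssi_dbm=-70.0, min_duration_s=7.0):
    """Fraction of sufficiently similar past observations labeled 'passenger'."""
    if rssi_dbm < min_rssi_dbm or duration_s < min_duration_s:
        return None
    matches = [o for o in history
               if o.source_ids == frozenset(current_sources)
               and o.rssi_dbm >= min_rssi_dbm
               and o.duration_s >= min_duration_s
               and o.day_of_week == day]
    if not matches:
        return None
    return sum(o.was_passenger for o in matches) / len(matches)


if __name__ == "__main__":
    history = [Observation(frozenset({"Device B", "Device D"}), -65, 12, "Sat", True)
               for _ in range(29)]
    history.append(Observation(frozenset({"Device B", "Device D"}), -65, 12, "Sat", False))
    p = passenger_likelihood(history, {"Device B", "Device D"}, -60, 15, "Sat")
    print(p, "treat as passenger device" if p is not None and p >= 0.9 else "require verification")
```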
  • wireless signals can also be utilized (e.g., using one or more of the signals and/or techniques described herein), alone or in conjunction with other techniques (such as those described herein), to determine whether a device is in a vehicle and/or in a moving vehicle.
  • the described implementations illustrate techniques that utilize WiFi signals, but should also be understood (with the appropriate changes, e.g., for the relevant signal, as known to those of ordinary skill in the art) to be applicable to other types (or combinations thereof) of wireless signal(s) as well.
  • a device in a vehicle can be used to (a) determine (or estimate) the number of occupants present nearby (as determined, for example, using one or more of the signals and/or techniques described herein); and/or (b) determine the composition (e.g., mobile devices, fixed in-vehicle devices) of WiFi enabled devices present nearby (using one or more of the signals and/or techniques, such as those described herein); and/or (c) determine whether the device is moving (as determined using one or more of the signals and/or techniques, such as those described herein).
  • In certain implementations, it can be further determined whether or not the device is in a vehicle (or in a moving vehicle), whereupon certain actions can be taken to, for example, (a) place or relax/remove one or more restrictions on the user's device, e.g., until the user verifies themselves as a passenger (e.g., using one or more of the techniques described herein); and/or (b) begin or cease collecting data from the current trip with respect to a UBI driver score.
  • In certain implementations, by comparing the wireless signals perceived by a device in a vehicle with the patterns of one or more wireless signals (e.g., wireless signal fingerprint, signal strength fingerprint, length of time perceivable, time of day and day of week, passenger or driver, mass or non-mass transportation vehicle, location) perceived over time by one or more devices of one or more device types (e.g., in-vehicle Wi-Fi enabled devices that appear in substantially each trip the vehicle takes and for substantially the entire trip, in-vehicle mobile devices which may or may not appear in substantially each trip such vehicle takes and may or may not appear for substantially the entire trip based, among other things, upon how many different people drive the vehicle, and ex-vehicle devices that may or may not appear in substantially each trip the vehicle takes and are not likely to appear for substantially the entire trip based, among other things, on the variability of the routes the vehicle takes), e.g., using techniques known to those of ordinary skill in the art, the device can be determined to be in or not in a vehicle and/or in or not in a moving vehicle.
  • For example, if, sufficiently often in the past when a device perceived one or more wireless signals transmitted by Device A, the device was determined to be in a moving vehicle, then when the device perceives one or more wireless signals from Device A with the same or sufficiently similar criteria (e.g., signal type, signal strength, length of reception, day of week, time of day, trip start location, vehicle class etc.), it can be determined to be likely to be in a moving vehicle.
  • Device A might be a WiFi AP at a gas station perceived by solo-commuter Joan's device on her way to work, along which route Joan rarely travels other than on her way to work.
  • Device B may be a WiFi AP in the (non-mass transportation) vehicle that only Barbara usually drives on weekdays (on weekends her husband usually drives it).
  • Device C may be a WiFi AP in a mass transportation vehicle.
  • the described techniques can also be employed with respect to combinations of wireless (or other) signals from one or more devices of one or more device types. For example, if 97% of the time that a device perceives one or more wireless signals transmitted by Device D and Device E, for more than 7 seconds and with a signal strength greater than -70 dBm, the device was determined to be in a vehicle, when a device then perceives one or more wireless signals from Device D and Device E with the same or sufficiently similar criteria (e.g., signal type, signal strength, length of reception, day of week, time of day, trip start location, vehicle class etc.), it can be determined to be likely to be in a vehicle.
  • For example, Device D is the mobile device of one of Barbara's colleagues with whom she carpools to work, Device E is the mobile device of another person who does not work in the same company as Barbara, but works a block away and commutes with them, and the device is Barbara's. While Barbara's device sees wireless signals from Device D many times during a day, both while in a vehicle and while not in a vehicle (as they commute and work in the same space), Barbara's device only perceives wireless signals from Device D and Device E, at or near the same time, when it is in a vehicle.
  • In certain implementations, even where the referenced devices cannot be recognized easily, e.g., from the BSSID (because such BSSID may be locally administered, i.e., randomized), the devices can still be recognized using other techniques (e.g., the fingerprint of their directed probe requests, their SSIDs).
  • a device can allow simultaneous connection to one or more networks of the same and/or different technologies.
  • the techniques described herein can be adapted to such settings (e.g., various voting techniques) using methods known to those skilled in the art.
  • a "healing" mechanism whereby periodically checks can be performed to determine/confirm that erroneous determinations have not been made (and/or that the device is operating in an incorrect state). For example, even when a device is connected to a power source continuously since a sufficiently long time before the last time it was last disconnected to WiFi, the device can be periodically checked (e.g., at a lower sampling rate / duty cycle than it might otherwise be checked at) to further determine that it is not in a trip.
  • One or more of the techniques described herein can be implemented in an API that can be used by other applications running on/in relation to the device to provide such applications with accurate context information in a manner that is power efficient for the device.
  • a navigation application that wants to prevent a driver from the distraction of inputting a destination while driving but wants to permit a passenger to do so, can query such an API as to whether the device is (a) present within a trip and/or (b) is being operated by a driver or a passenger.
  • a navigation application (e.g., Waze) can be configured to turn itself on or off automatically (with or without giving the user a chance to override), based on whether the device is determined to be within a trip or not (and the device can expend relatively little power to so determine).
  • Similarly, a telephony application (e.g., Viber, Skype, etc.) and/or a texting application (e.g., WhatsApp) running on the device can determine whether or not to allow incoming and/or outgoing calls based upon the in-trip and driver/passenger information/determination(s), as can be provided by the referenced API.
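  • The following is a hypothetical sketch of how applications might query such an API; the class name, its methods, and the return values are assumptions for illustration and do not correspond to any real library.

```python
# Hypothetical sketch of an application querying an on-device context API of
# the kind described above. ContextAPI, its methods, and the return values
# are assumptions, not a real interface.
class ContextAPI:
    """Stand-in for the on-device context service described above."""
    def in_trip(self) -> bool: ...
    def in_vehicle_role(self) -> str: ...   # "driver", "passenger", or "unknown"


def should_allow_destination_entry(ctx: ContextAPI) -> bool:
    # Navigation app: allow typing a destination unless the user is a driver in a trip.
    return not (ctx.in_trip() and ctx.in_vehicle_role() == "driver")


def should_allow_incoming_call(ctx: ContextAPI) -> bool:
    # Telephony/texting app: e.g., hold calls for drivers, allow them for passengers.
    return (not ctx.in_trip()) or ctx.in_vehicle_role() == "passenger"


if __name__ == "__main__":
    class FakeContext(ContextAPI):
        def in_trip(self) -> bool: return True
        def in_vehicle_role(self) -> str: return "driver"

    ctx = FakeContext()
    print(should_allow_destination_entry(ctx), should_allow_incoming_call(ctx))  # False False
```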
  • one or more aspects of the techniques/technologies described herein can be configured in relation to a determination of a likelihood that a user associated with a device is going to be in a trip. It should be understood that such a likelihood can, for example, be determined/estimated based on one or more inputs, factors, data elements, etc. (e.g., time, location, calendar information, user history, user or supervisor preference for tradeoff between latency, accuracy and power, etc.), such as those that may pertain to all users in a given population, pertain only to a segment of such users (e.g., sex, demographic, location), and/or pertain to a particular individual user (e.g., calendar information, historical user behavior, etc.).
  • relatively more resources can be employed, activated, and/or otherwise dedicated towards one or more determinations pertaining to whether a user is/was performing a certain activity (e.g., present/traveling within a moving vehicle) during or near rush hour (as determined based on data pertinent to an entire population), during or near the time children are being driven/traveling to school within a certain neighborhood (as determined based on data pertinent to a particular population segment), and/or before a meeting scheduled within the user's calendar (as determined based on data pertinent to the individual user) - in order to determine/identify such activity more accurately (by utilizing the additional resources referenced above) and/or with less latency, with respect to one or more of the referenced context(s).
  • a comparable configuration can also be employed to enable the usage/expenditure of relatively fewer resources in other context(s) to determine whether the same user was performing such an activity at a time when (s)he can be determined to be relatively less likely to do so, e.g., in a moving vehicle in the middle of the night.
  • the processor 110 executing one or more of software modules 130, including, preferably, determination module 170, can transform an operation state of mobile device 105 based in whole or in part on the one or more determination factor(s), such as the probability computed at step 630.
  • This operation can be further appreciated when employed in conjunction with a determination of an in-vehicle role of a user of mobile device 105, such as that depicted in FIGs. 2A-C and described in detail above.
  • a mobile device 105 upon determining (preferably to a certain minimum probability) that a mobile device 105 is under the control of a driver of a vehicle (such as by processing the inputs from accelerometer 145A and gyroscope 145B of mobile device 105 against those of other mobile devices 160 within the same vehicle, thereby identifying the driver of the vehicle, as described in detail herein), it can then be further determined whether mobile device 105, which has been determined to be under the control of a driver, is being operated in a handheld state (generally prohibited in most places) or in a non-handheld state (generally permitted).
  • In such a scenario, a transformation (substantially similar to that described in detail above with respect to step 240) can be employed.
  • processor 110 can coordinate various transformations and/or adjustments to the operation(s) of mobile device 105, as described in detail above with respect to step 240.
  • various of the referenced transformations can be employed only when either one or both of the probabilities (that is, the probability that the user of mobile device 105 is a driver and/or the probability that the handheld state of mobile device 105 is handheld) meet and/or exceed a certain minimum threshold.
  • a mobile device 105 that is positioned in a cradle or dock can be configured to employ a particular restriction and/or set of restrictions (such as those described in detail herein) thereto, such restriction/set of restrictions preferably being different than restrictions that are employed at a device that has also not been authenticated as being operated by a passenger but is not in a cradle.
  • For example, a restriction can be employed at a cradled device that still allows the user to make and/or receive calls in geographic locations (e.g., a particular state or country) that allow the use of hands-free devices for calling, whereas a comparable non-cradled device can be restricted from even making such calls.
  • restriction(s) can be employed, for example, on/at the mobile device, on the SIM card, and/or by the cellular carrier (that is, in relation to the mobile device).
  • the manner in which a device is being held/oriented can be determined based on a spectral analysis of the frequencies perceived/observed by the accelerometer and/or gyroscope of a device, such as in a manner known to those of ordinary skill in the art.
  • the spectral pattern perceived by the sensors of a handheld device (i.e., a device being held in the hand of a user) is likely to be identifiably different from that perceived with respect to a device that is placed in a dock/cradle, in that a handheld device will often perceive higher values (e.g., in a range of 3-7 Hz).
  • the manner in which a device is being held/oriented can be determined based on the orientation of the device (for example, as can be measured based on inputs originating at the accelerometer, magnetometer, and/or GPS of the device, such as in the manner described herein), in conjunction with a spectral analysis of the observed/perceived frequencies (such as of the accelerometer and/or gyroscope), as described herein.
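  • A minimal sketch of the spectral heuristic above, computing the fraction of accelerometer energy in roughly the 3-7 Hz band; the sampling rate, band edges, and threshold are illustrative assumptions.

```python
import numpy as np

# Sketch of the spectral heuristic: hand-held devices tend to show more
# accelerometer energy in roughly the 3-7 Hz band than cradled devices.
# Sampling rate, band edges, and threshold are illustrative assumptions.
def handheld_band_energy(accel_magnitude, sample_rate_hz=50.0, band=(3.0, 7.0)):
    x = np.asarray(accel_magnitude, dtype=float)
    x = x - x.mean()                                 # remove gravity/DC component
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / max(spectrum.sum(), 1e-12)


def looks_handheld(accel_magnitude, threshold=0.2) -> bool:
    return handheld_band_energy(accel_magnitude) > threshold


if __name__ == "__main__":
    t = np.arange(0, 4, 1 / 50.0)
    hand = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.02 * np.random.randn(len(t))  # ~5 Hz tremor
    cradle = 0.02 * np.random.randn(len(t))
    print(looks_handheld(hand), looks_handheld(cradle))  # typically: True False
```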
  • the referenced processing/determining of an orientation of a device based on inputs originating from the accelerometer of the device can encounter noise/interference when such determinations occur within a running or ignited vehicle (i.e., a vehicle having a running engine), as, for example, a running vehicle can impart various forces that are perceptible to the device and/or the sensors of the device and which can cause an accelerometer-measured/determined orientation to change, despite the fact that the actual orientation of the device may not change.
  • a device that is cradled/docked or otherwise held in an 'upright' orientation will generally demonstrate no acceleration on its X and Z axes, while demonstrating an acceleration of 1g on its Y axis, and the angle of the device, as determined from the accelerometers of the device, is 90 degrees.
  • For example, if the vehicle accelerates forward such that the acceleration perceived on the Y and Z axes is 1g each, the pitch angle of the device (as determined based on the trigonometric relationship between the accelerations of its axes) is 45 degrees - whereas the actual pitch angle of the device is 90 degrees.
  • one or more inputs originating at one or more sensors of a device can be used to filter out the noise introduced by vehicle acceleration events (as opposed to device- specific acceleration events). In doing so, the accuracy of the various determinations with respect to the orientation of the device can be increased/improved.
  • one or more acceleration events/instances can be detected/identified, such as using one or more other sensors, such as GPS, on-board car sensors, radio (e.g. Doppler, RSSI, cellular towers, WiFi, etc.), server-side techniques (e.g., OAO, TDOA), etc., which can be used to process the data originating from the accelerometer and/or gyroscope of the device in an improved/more accurate manner.
  • the GPS sensor of the device can measure/identify a change in speed of the vehicle that can be consistent with the referenced 1g forward acceleration, and it can be determined that the 1g acceleration observed with respect to the Z-axis of the accelerometer was present due to the acceleration of the vehicle and not from a change in the actual orientation of the device, and, as such, it can be further determined that the actual pitch angle of the device remains 90 degrees (and not 45 degrees). It should also be appreciated that in certain implementations, one or more restrictions can be employed at/in relation to a mobile device, based on a geographical determination, as referenced above.
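  • The following sketch illustrates compensating an accelerometer-derived pitch angle with a GPS-derived estimate of the vehicle's forward acceleration, per the 45-vs-90-degree example above; the axis convention and the simple subtraction model are assumptions.

```python
import math

# Sketch: subtract a GPS-derived vehicle acceleration from the accelerometer
# reading before computing pitch. Axis conventions (forward maps to +Z) and
# the simple subtraction model are assumptions.
G = 9.81


def naive_pitch_deg(ay: float, az: float) -> float:
    return math.degrees(math.atan2(ay, az))


def compensated_pitch_deg(ay: float, az: float,
                          speed_before_mps: float, speed_after_mps: float, dt_s: float) -> float:
    vehicle_accel = (speed_after_mps - speed_before_mps) / dt_s   # from GPS speed samples
    return math.degrees(math.atan2(ay, az - vehicle_accel))


if __name__ == "__main__":
    # Upright device: gravity (1g) on Y; vehicle accelerates forward at 1g on Z.
    ay, az = G, G
    print(round(naive_pitch_deg(ay, az)))                             # 45 (misleading)
    print(round(compensated_pitch_deg(ay, az, 10.0, 10.0 + G, 1.0)))  # 90 (corrected)
```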
  • one or more restrictions can be selectively employed at a mobile device based on a determination of the location of the device (e.g., based on inputs received from the GPS).
  • the appropriate corresponding restriction can be employed based on the location of the mobile device (as determined by the GPS, cell towers and/or WiFi transceivers seen/detected, as is known to those of ordinary skill in the art).
  • a device can be determined to be positioned in a cradle or dock based on the level of movement of the device. For example, a device that is held in a cradle will tend to move less than a device that is not held fixed in a cradle - as can be determined based on an analysis of inputs originating at the accelerometer and/or gyroscope of the device, and, for example, the tightness (e.g., standard deviation) of the distribution of accelerometer readings or changes thereto, such as in a manner described herein and known to those of ordinary skill in the art.
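  • A brief sketch of the "tightness of the accelerometer distribution" cradle check described above; the window and the standard-deviation threshold are illustrative assumptions that would be tuned empirically.

```python
import statistics

# Sketch of the cradle check based on the spread of accelerometer readings.
# The threshold (in g) is an illustrative assumption.
def appears_cradled(accel_magnitudes_g, max_std_g: float = 0.15) -> bool:
    """A cradled device tends to show a tighter spread of accelerometer readings."""
    return statistics.pstdev(accel_magnitudes_g) <= max_std_g


if __name__ == "__main__":
    cradled_window = [1.00, 1.02, 0.99, 1.01, 1.00, 0.98]   # close to 1 g, little spread
    handheld_window = [1.00, 1.35, 0.70, 1.20, 0.85, 1.40]  # larger spread
    print(appears_cradled(cradled_window), appears_cradled(handheld_window))  # True False
```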
  • the orientation of the device can be used to determine if a device is in a cradle or not.
  • For example, a device that is in a cradle will have a fixed and known orientation, in which, e.g., generally the Y-axis accelerometer (as shown in FIG. 9A) and/or the X-axis accelerometer will pick up a large component of gravity, as can be appreciated by those of ordinary skill in the art.
  • the presence/existence of connections to the device can be used to determine if a device is in a cradle or not.
  • the orientation of a device relative to the vehicle in which it is present can be used to determine if the device is in a cradle or not. If such orientation is fixed over some time period, then it can be determined that the device is fixed (e.g., is positioned in a cradle/dock). If such orientation is not fixed (e.g., the device is actually hand-held), then it can be determined that the device is not in a cradle.
  • the orientation of the device relative to the vehicle can be determined at any point in time using (a) the device's GPS bearing, which measures the direction of movement of the device with respect to the Earth (denoted G), regardless of the orientation of the device within the vehicle; and (b) the device's magnetometer/compass, which measures the orientation of the device with respect to the Earth's magnetic field (denoted M), which depends on the orientation of the device in the vehicle and the bearing of the vehicle.
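  • The following sketch illustrates the G/M technique above: the difference between the device's compass heading (M) and its GPS bearing (G) reflects the device's orientation relative to the vehicle, and if that difference stays essentially fixed while the vehicle turns, the device can be determined to be fixed (e.g., cradled); the spread threshold is an illustrative assumption.

```python
import statistics

# Sketch of the M-minus-G cradle test: a fixed offset between compass heading
# and GPS bearing while the vehicle turns suggests a cradled device. The
# 10-degree spread threshold is an illustrative assumption.
def angle_diff_deg(a: float, b: float) -> float:
    return (a - b + 180.0) % 360.0 - 180.0          # wrap difference to (-180, 180]


def appears_fixed_to_vehicle(compass_deg, gps_bearing_deg, max_spread_deg=10.0) -> bool:
    diffs = [angle_diff_deg(m, g) for m, g in zip(compass_deg, gps_bearing_deg)]
    return statistics.pstdev(diffs) <= max_spread_deg


if __name__ == "__main__":
    gps = [0, 20, 45, 90, 130, 180]                          # vehicle turning
    cradled = [g + 30 for g in gps]                          # constant 30 degree offset
    handheld = [g + off for g, off in zip(gps, [30, 5, 70, -20, 40, 100])]
    print(appears_fixed_to_vehicle(cradled, gps), appears_fixed_to_vehicle(handheld, gps))
```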
  • Routine 700 illustrates a broad aspect of a method of restricting operation of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • various of the steps and operations that make up routine 700 share substantial similarities to those described above in connection with FIGs. 2A-C, 3, 4, 5, and 6.
  • While the description of routine 700 is directed primarily to operations occurring at mobile device 105, such description is exemplary and intended for the sake of clarity and consistency.
  • any and/or all of the steps in routine 700 can be similarly employed at another device/machine, such as at central machine 168, such as in the manner described in detail above with respect to FIG. 4.
  • any and all steps, operations, and/or functions described herein with regard to a particular device and/or machine should be understood to be similarly capable of being employed at another device and/or machine (such as central machine 168), substantially in the manner described herein, without departing from the scope of the present disclosure.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, determines whether mobile device 105 is present within a vehicle, such as through one or more of the various determination methods described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 determines whether the vehicle is in motion, such as through one or more of the various determination methods described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 employs a first restriction at mobile device 105 and/or in relation to mobile device 105.
  • the first restriction is preferably one or more instructions that dictate at least one operation state of the mobile device. Examples of such restrictions include but are not limited to: instructions that disable a particular feature or functionality of a mobile device 105 (such as the ability to type text), instructions that disable multiple features or functionalities of a mobile device 105 (such as the ability to launch certain applications and the ability to receive text messages), and instructions that functionally "lock" mobile device 105 by effectively disabling many or all of the functionalities of the device.
  • the referenced first restriction is preferably a default restriction. That is, in such arrangements the first restriction is employed by default, such as upon powering on and/or activating mobile device 105. It should be appreciated that in certain arrangements such restriction can be employed in relation to mobile device 105, such as by a central machine 168, such as in the manner disclosed in detail herein, for example with respect to FIG. 4.
  • the referenced restriction can be imposed by a communications provider (which preferably operates central machine 168) to prevent transmission of one or more communications (e.g., SMS messages) to a mobile device 105, until an identification/determination is made, such as identifying that two or more users are in a vehicle, such as in the manner disclosed in detail herein.
  • the various restrictions employed at mobile device 105 are directed towards configuring mobile device 105 in such a manner that operation of and/or interaction with the device is difficult, inconvenient, and/or impossible (that is, it can be said that operation of mobile device 105 is impeded) for a user who is also simultaneously operating a vehicle.
  • such restrictions are also preferably configured to create minimal, if any, difficulty and/or inconvenience when operated by and/or interacted with by a user who is not simultaneously operating a vehicle.
  • such restrictions preferably impede operation of the mobile device by a user who is a driver more so than they impede operation of the mobile device by a user who is a passenger.
  • it can be preferable for mobile device 105 to initially determine that the device is present within a vehicle (such as through one or more of the various determination methods described in detail herein) prior to employing such a first restriction.
  • the various steps and operations described herein with reference to FIGs. 7-8 can be further implemented, in certain arrangements, in conjunction with one or more of the various other methods and systems described in detail herein, such as those described with reference to FIGs. 2A-6.
  • any one or more of the various steps, operations, routines, functions, and/or figures disclosed herein can preferably be employed in conjunction with any one or more of the various steps, operations, routines, functions, and/or figures disclosed herein.
  • the various restrictions described in conjunction with FIG. 7 can be employed in conjunction with the various determination operations described above.
  • one or more of the referenced restrictions can be employed before the occurrence of and/or in response to one or more of the determinations described in detail herein.
  • mobile device 105 preferably prompts one or more users to initiate and/or provide one or more stimuli that can be received as inputs at mobile device 105.
  • mobile device 105 can prompt each of the one or more users in a vehicle to repeat a particular word or series of words projected by mobile device 105.
  • Such a prompt can request that the words be repeated sequentially, while in other arrangements such a prompt can request that the words be repeated simultaneously, and in yet other arrangements the timing of the repetition is of no consequence. It should be appreciated that such prompting can request practically any stimulus that can be received and/or analyzed as an input in the manner described herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 receives at least a first input and a second input (e.g., the referenced stimuli), in the manner disclosed in detail herein.
  • each of the first input and the second input preferably originates at one or more of sensors 145, software modules 130, user interface 172, operating system 176, and/or communication interface 150, though it should be understood that the first input and the second input need not originate from the same source.
  • a first input corresponding to the audio tones of the voice of a first user can be received at microphone 145D
  • a second input corresponding to the audio tones of the voice of a second user can also be received at microphone 145D.
  • one or more of the various inputs can be received at and/or originate from a source external to mobile device 105, such as vehicle data system 164 and/or another mobile device 160.
  • vehicle data system 164 can provide an input to mobile device 105 (preferably received via communication interface 150) indicating the weight measured on one or more seats of a vehicle, and/or the usage of seat belts at one or more seats of a vehicle, etc.
  • while the first input and the second input have been described herein as being discrete inputs, such description is merely exemplary and for the sake of clarity and illustration. Accordingly, while in certain arrangements the first input and the second input are separate inputs in the conventional sense - that is, inputs that originate at two independent sources - in other arrangements the first input and the second input are actually aspects found within a single input.
  • a single audio input (such as an audio recording) that contains two distinct voices (such as the voices of a first user and a second user) can be processed (in the manner described herein) to identify such distinct voices within the single audio input, which are understood to be a first input and a second input within the context of the present disclosure.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, analyzes the first input and the second input. In doing so, the presence of at least one of two or more users and/or two or more mobile devices can be determined, such as a determination of the presence of a first user and the presence of a second user, such as in the manner described in detail herein.
  • the first and second inputs (e.g., a first input corresponding to the voice of a first user and a second input corresponding to the voice of a second user) can be analyzed to identify an audio signature for each of the respective inputs, in a manner known to those of ordinary skill in the art, and such audio signatures can then be compared to determine if they are substantially similar and/or identical (indicating that both inputs likely originate from the same source, i.e., the same user) or substantially dissimilar (indicating that each of the inputs likely originates from a different user).
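  • by way of illustration only (the disclosure does not mandate any particular signal-processing library or feature set), such a comparison can be sketched as extracting a coarse spectral signature for each input and measuring their similarity; the pooling scheme, the cosine-similarity measure, and the 0.9 threshold below are assumptions made for the example.

```python
# Hypothetical sketch: compare two audio inputs by a coarse spectral "signature".
# Feature choice (pooled magnitude spectrum) and the 0.9 threshold are illustrative.
import numpy as np

def audio_signature(samples: np.ndarray, bins: int = 64) -> np.ndarray:
    """Reduce a mono audio buffer to a normalized, coarsely pooled magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(samples))
    pooled = np.array([chunk.mean() for chunk in np.array_split(spectrum, bins)])
    norm = np.linalg.norm(pooled)
    return pooled / norm if norm > 0 else pooled

def likely_same_speaker(first_input: np.ndarray, second_input: np.ndarray,
                        threshold: float = 0.9) -> bool:
    """Return True if the two inputs appear to originate from the same source."""
    sig_a, sig_b = audio_signature(first_input), audio_signature(second_input)
    similarity = float(np.dot(sig_a, sig_b))  # cosine similarity (signatures are unit norm)
    return similarity >= threshold

# Example: two distinct synthetic "voices" (different fundamental frequencies).
t = np.linspace(0, 1.0, 8000, endpoint=False)
voice_1 = np.sin(2 * np.pi * 120 * t)          # lower-pitched source
voice_2 = np.sin(2 * np.pi * 210 * t)          # higher-pitched source
print(likely_same_speaker(voice_1, voice_1))   # True  -> likely one user
print(likely_same_speaker(voice_1, voice_2))   # False -> likely two users
```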
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 modifies an employment of at least one restriction such as the first restriction. That is, being that a determination (at step 720) that the device is in the presence of at least two users necessarily indicates that at least one of such users is not a driver of a vehicle, this conclusion can preferably trigger and/or initiate the modification of the first restriction. In certain arrangements, such modification can include the employment of a second restriction, strengthening of the first restriction, and/or the easing of the first restriction.
  • such a second restriction can include one or more instructions that dictate one or more operational states of the mobile device 105 with respect to one or more of the various sensors 145 of the device. That is, as noted above, such a restriction can configure mobile device 105 to operate in a manner that is relatively difficult/inconvenient for a driver while being relatively unobtrusive for a passenger. Put differently, it can be said that such restrictions impede operation of mobile device 105 by a user who is a driver moreso than the same restrictions impede operation of a mobile device 105 by a user who is a passenger.
  • restrictions include but are not limited to: requiring that the device only operate in 'landscape' mode (which generally requires two hands for efficient interaction/navigation - a demand that is relatively simple for a passenger to comply with but relatively difficult for a driver, who needs at least one hand to steer the vehicle), requiring that the device operate only at certain orientations (as detected by one or more of sensors 145, such as gyroscope 145B, accelerometer 145A, GPS 145C, and magnetometer 145E), such as a completely upright orientation, which is relatively simple for a passenger to comply with but inconvenient for a driver, who will not find such an orientation comfortable while driving and who will generally wish to hold the device at alternate orientations in order to obscure the device from the view of law enforcement officials, and requiring that the device not operate in a manner/pattern that is consistent with that of a driver (such as the various in-vehicle role determinations described in detail herein).
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 maintains the employment of the first restriction.
  • one or more input methods associated with such a device can be modified or changed, such as to different input method(s) that selectively restrict one or more aspects of the functionality the device. For example, based on a detection/determination that a device is present/operating within a moving vehicle, an on-screen keyboard can be replaced (or altered) such that the keyboard can only receive a single input/character during a defined time period (e.g., every 15 seconds).
  • Such input method(s) can be selectively altered or replaced in relation to instances during which one or more particular application(s), such as those identified as being distracting (e.g., texting, e-mailing, web browsing, social networking, etc.), are executing at the device.
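  • as a minimal sketch of the input-method restriction described above (the 15-second window follows the example given; the class and method names are hypothetical), an on-screen keyboard replacement might simply drop characters that arrive before the defined time period has elapsed:

```python
# Hypothetical sketch of a rate-limited input method: while a moving-vehicle
# determination is active, at most one character is accepted per time window.
import time

class RateLimitedKeyboard:
    def __init__(self, window_seconds: float = 15.0):
        self.window_seconds = window_seconds
        self._last_accepted = float("-inf")
        self.buffer = []

    def key_press(self, char: str, now: float = None) -> bool:
        """Accept the character only if the defined time period has elapsed."""
        now = time.monotonic() if now is None else now
        if now - self._last_accepted < self.window_seconds:
            return False                      # input silently dropped/deferred
        self._last_accepted = now
        self.buffer.append(char)
        return True

keyboard = RateLimitedKeyboard(window_seconds=15.0)
print(keyboard.key_press("h", now=0.0))    # True  -> first character accepted
print(keyboard.key_press("i", now=5.0))    # False -> within the 15 s window
print(keyboard.key_press("i", now=16.0))   # True  -> window elapsed
```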
  • a determination can be made as to when the device is present/operating within a moving vehicle and, based on such a determination, the device can initiate an operation mode that can selectively restrict one or more functionalities of the device ("Driver Mode"), such as in the manner described herein.
  • the voice(s) of one or more user(s) perceived at one or more device(s) can be processed/analyzed to determine, for example, whether a particular speaker is relatively likely to be a driver or a passenger.
  • Such determination(s) can be achieved, for example, by comparing response time, pitch, tones, intonations, intensity, volume and other speech characteristics between the speaker and a reference population and/or between the speaker and one or more references to the speaker himself (or herself) over time, such as in one or more different situations (e.g., when not in a moving vehicle, when in a moving vehicle when determined to be in different roles, e.g., driver/passenger, etc.), where the context of the situations may be known and/or unknown, etc.
  • one or more speech recognition, speaker recognition, intonation analysis, non-verbal communication, and/or voice emotion detection techniques (including spectral analysis, waveform analysis, and neural networks) can be employed in achieving such determinations.
  • one or more users can be determined to possess/exhibit certain elements or characteristics in his (or her) voice and intonations when the user is determined to be a driver (for example, at times a driver may pause for a relatively longer period of time before responding to a question), whereas such a user's voice doesn't possess/exhibit these elements (or possesses/exhibits them to a different extent or with a different frequency) when the user is determined to be a passenger or outside a vehicle.
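  • a minimal sketch of such a comparison against the speaker's own references is shown below; the single feature used (response latency) and the z-score threshold are illustrative assumptions rather than the disclosed method.

```python
# Hypothetical sketch: compare a speaker's current speech characteristics (e.g.,
# response latency before answering) against that speaker's own historical
# references to flag a "driver-like" pattern. Feature and threshold are illustrative.
from statistics import mean, stdev

def driver_like(current, history, z_threshold=1.5):
    """Return True if response latency is unusually long relative to the speaker's baseline."""
    baseline = [sample["response_latency_s"] for sample in history]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False
    z = (current["response_latency_s"] - mu) / sigma
    return z >= z_threshold   # e.g., drivers may pause longer before responding

history = [{"response_latency_s": v} for v in (0.6, 0.7, 0.5, 0.8, 0.6)]
print(driver_like({"response_latency_s": 1.6}, history))  # True  -> driver-like pause
print(driver_like({"response_latency_s": 0.7}, history))  # False -> passenger-like
```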
  • one such reaction is to configure the mobile device to give the driver an audible and/or visual "calm down" message
  • another example is to end a call or otherwise "harden” or strengthen the mobile device's Driver Mode policy (e.g., by selectively implementing additional restrictions in relation to the device), until, for example, the driver can be determined to have calmed down.
  • FIG. 42 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • a perception of one or more signals can be identified, such as in relation to a first device.
  • the one or more signals can originate at one or more other devices.
  • a performance of one or more authentication techniques can be enabled, such as in relation to the first device.
  • a performance of one or more authentication techniques can be enabled based on an identification of the one or more signals in relation to the first device.
  • one or more other devices can project or otherwise emit or provide a signal or notification (which, in various implementations, may be audible, inaudible, via Bluetooth, etc.) (referred to herein as a "Special Signal”).
  • such "Special Signals” can be configured such that they are perceptible only by other devices present within the same vehicle (as can be achieved by providing such signals at a relatively law power of transmission, such that, in most cases, the signals are unlikely to be perceptible to devices outside of the vehicle).
  • such signal(s) can be provided on a continuous and/or periodic basis (whether static, e.g., once every 15 seconds, or dynamic periodicity, e.g., at randomized intervals, at randomized frequencies, and/or in conjunction with randomized content generated by a remote server (such as in order to prevent tampering)).
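  • a simple sketch of such dynamic periodicity is shown below; the emit_signal and fetch_server_nonce helpers are hypothetical placeholders for the actual transmission mechanism and the server-side content generation.

```python
# Hypothetical sketch: emit a "Special Signal" at randomized intervals with
# randomized content, making the emission schedule harder to spoof or suppress.
import random
import time

def fetch_server_nonce() -> str:
    # Assumption: in a real system this content would come from a remote server.
    return format(random.getrandbits(64), "016x")

def emit_signal(payload: str) -> None:
    print(f"emitting Special Signal: {payload}")

def special_signal_loop(min_period: float = 5.0, max_period: float = 20.0, cycles: int = 3):
    for _ in range(cycles):
        emit_signal(fetch_server_nonce())
        time.sleep(random.uniform(min_period, max_period))  # dynamic periodicity

special_signal_loop(min_period=0.1, max_period=0.3, cycles=3)  # short demo values
```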
  • the referenced device can be configured (such as via the referenced 'Driver Mode') to enable an adjustment of such a state, such as changing/transitioning the device to another operational state such as a passenger mode ("Passenger Mode"), based on a determination that (and, in certain implementations, for as long as) the device is capable of perceiving a 'Special Signal' emitted by a driver device (e.g., a device determined to be operated by a driver, presumably originating from a device within the same vehicle).
  • a device operating in 'Driver Mode' can be transitioned to another state, such as to 'Passenger Mode' by successfully performing one or more passenger authentication technique(s), such as those described herein.
  • such a device can be configured to cease to emit/project the referenced 'Special Signals'.
  • information identifying the device that was passenger authenticated (e.g., a device ID such as IMEI, a SIM ID such as IMSI, UUID, telephone number, etc.) and/or the device that enabled such authentication can be collected, time-stamped, GPS-stamped, saved and/or analyzed.
  • information pertaining to when a passenger device ceased to receive a Special Signal and, therefore, left Passenger Mode (e.g., corresponding to a trip stopping, the driver and passenger separating, the driver turning off his/her device, etc.) can also be collected and/or analyzed.
  • one or more aspects of the referenced information collected over time can be analyzed.
  • instances in which one device is used to enable another device to operate in 'passenger mode,' and the device that enabled such operation is then observed to operate in a relatively limited manner (e.g., with respect to calls, texts, data sent/received, contacts present or changes thereto, applications run, etc.), can be identified.
  • Identifying such instances can be advantageous in order to identify drivers who may procure one or more additional devices to "sacrifice" as driver devices so that a second device can be authenticated as a passenger device (such as in the manner described herein) and used freely while driving.
  • a driver having a device that does not have the software/application capable of configuring the device to project/emit the referenced 'Special Signal' may have the option to (a) download or otherwise obtain a 'lite' version of the software that enables projection/emitting of such a Special Signal (thereby enabling passengers within the vehicle to transition their devices into Passenger Mode, such as in the manner described herein), (b) go to a website that plays a Special Signal, and/or (c) receive a Special Signal over a voice connection (e.g., by calling a phone number that plays a Special Signal).
  • device users can stop the emission of a Special Signal from their devices. In so doing, a passenger can prevent a driver from using his/her device in Passenger Mode (or attempting to).
  • the referenced Special Signal can be emitted by and/or amplified by hardware within the vehicle.
  • other devices that perceive a Special Signal can also relay (or "repeat") such Special Signal, thereby increasing its effective range of transmission.
  • a passenger device can be prompted to request for a driver device to emit a Special Signal, rather than have all devices within a vehicle emitting a Special Signal until they transition into Passenger Mode.
  • routine 800 illustrates a broad aspect of a method for restricting operation of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • various of the steps and operations that make up routine 800 share substantial similarities to those described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, determines whether mobile device 105 is present within a vehicle, such as through one or more of the various determination methods described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 determines whether the vehicle is in motion, such as through one or more of the various determination methods described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, employs one or more restrictions at mobile device 105 and/or in relation to mobile device 105, substantially in the manner described above with respect to step 705.
  • restriction(s) are preferably configured to impede operation of mobile device 105 by a user that is a driver moreso than the restriction(s) impede operation of mobile device 105 by a user that is a passenger, as described in detail herein.
  • examples where routine 800 can be advantageous include teenage drivers (wherein a parent/guardian wishes to employ such restrictions, which make it difficult to operate a mobile device 105 while driving, at all times) and/or phones that are fixed in vehicles, such as car phones (wherein it is always desirable to implement such restrictions). It should be appreciated that in certain arrangements such restriction can be employed in relation to mobile device 105, such as by a central machine 168, such as in the manner disclosed in detail herein, for example with respect to FIG. 4.
  • the referenced restriction can be imposed by a communications provider (which preferably operates central machine 168) to prevent transmission of one or more communications (e.g., SMS messages) to a mobile device 105, until an identification/determination is made, such as identifying that two or more users are in a vehicle, such as in the manner disclosed in detail herein.
  • such restriction can be further configured to impede operation of the mobile device, and/or be more likely to be applied to a mobile device used by a driver than to a mobile device used by a passenger.
  • a particular restriction is employed such that if the 'shake' perceived at mobile device 105 exceeds a certain threshold level, SMS messages cannot be sent from the device.
  • employment of such a restriction does not impede drivers more than passengers (being that, once employed, it will impede a driver and a passenger equally); however, such a restriction is more likely, on average, to be employed for drivers than for passengers (being that drivers, on average, shake their devices more than passengers). Further such examples are provided at EXAMPLE 4.
  • such restriction(s) can be configured to be applied to a mobile device as used by a first user moreso than such restrictions are applied to a mobile device used by a second user.
  • such restrictions can be configured to impede a user who uses the mobile device in an unauthorized operation state moreso than a user who uses the mobile device in an authorized operation state.
  • one such example, which is preferably directed to preventing students from using their mobile devices while they are in a classroom setting, can impose a restriction such that the mobile device is only operable and/or functional if the device is held upright and/or at a certain altitude (as can be determined based on one or more of sensors 145, as described in detail herein).
  • a device that is located in a vehicle can be used to determine whether the vehicle's engine is on or off.
  • One manner in which this is useful is when considering that a vehicle that has recently stopped moving, but whose engine is still on, may likely continue its present trip (e.g., stopped at a red light), whereas a vehicle that has recently stopped moving and whose engine is off has likely finished its present trip. Differentiating between these two states is useful, among other reasons, in order to know when usage restrictions on a driver's device should be lifted, such as in the manner described herein.
  • the device's accelerometer and/or gyroscope and/or magnetometer can be used to determine whether the engine is running or not.
  • the accelerometer and/or gyroscope and/or magnetometer show larger movements and/or movement at different frequencies, when the engine is running as opposed to not running.
  • the device's microphone(s) can be used to determine whether the engine is running or not.
  • the microphones will show signals at different frequencies, including harmonics of the base frequency which can be more easily detected by the microphones used on popular devices, when the engine is running than if the engine is not running.
  • the event of starting or stopping the ignition can be captured by the accelerometer, gyroscope and/or microphone.
  • the magnitude of the acceleration at the time the ignition is started or stopped is considerably larger than the previous and subsequent accelerations in a stationary vehicle.
  • the event of starting or stopping the ignition may be captured by the magnetometer because the magnetic field created by an electric car will change when the car is turned on or off.
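  • a rough sketch of such an engine-state determination is shown below; the 20-200 Hz vibration band, the energy-ratio threshold, and the use of a simple FFT over accelerometer or microphone samples are illustrative assumptions rather than the disclosed technique.

```python
# Hypothetical sketch: decide "engine running" by comparing spectral energy in an
# assumed engine-vibration band (~20-200 Hz) against an engine-off baseline.
import numpy as np

def band_energy(samples: np.ndarray, rate_hz: float, lo: float, hi: float) -> float:
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate_hz)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.sum(spectrum[mask] ** 2))

def engine_running(samples: np.ndarray, rate_hz: float,
                   baseline_energy: float, ratio_threshold: float = 5.0) -> bool:
    """True if in-band energy is several times the engine-off baseline."""
    return band_energy(samples, rate_hz, 20.0, 200.0) > ratio_threshold * baseline_energy

rate = 1000.0
t = np.arange(0, 2.0, 1.0 / rate)
noise = 0.05 * np.random.randn(len(t))
idle_vibration = 0.5 * np.sin(2 * np.pi * 30 * t) + noise     # engine idling ~30 Hz
baseline = band_energy(noise, rate, 20.0, 200.0)               # engine-off reference
print(engine_running(idle_vibration, rate, baseline))          # True
print(engine_running(noise, rate, baseline))                   # False (at baseline)
```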
  • one or more of the techniques described herein can be configured to determine whether a device is (or is likely to be) within a moving vehicle based on (a) signals that are measured by the device itself (e.g., by the internal accelerometer) or provided/imparted from external devices (e.g., cellular network, other terrestrial or non-terrestrial infrastructure, the vehicle or other vehicles, WiFi networks, GPS networks) and received at the device and/or (b) signals that are provided/imparted from the device (e.g., RF cellular signals) and picked up external to the device (e.g., the cellular network, other infrastructure, the vehicle or other vehicles), such as is described herein.
  • one or more transmitters within a vehicle can send a signal (e.g., BT, WiFi) when (and for as long as) the vehicle is moving (or, optionally, in a lower state where it is also able to move, e.g., "On" or "Not in Park") (the "In-Vehicle Signal").
  • Devices can be configured to 'wake up' from time to time to 'listen for' the In-Vehicle Signal and, when such a signal is perceived (e.g., for a long enough period of time), they can be determined to be in a moving vehicle.
  • the devices can be configured to 'listen' for this signal somewhere in their architecture stack (hardware, firmware, operating system, application, etc.) so as to reduce the power needed and/or provide the option to allow such functionality to operate without the user being able to control it (e.g., turn off or opt out).
  • the in-vehicle role of the device user can be determined (e.g., identifying the user as a driver or passenger, and, optionally, further differentiating between passengers that are near to the driver and whose device use might be distracting to her (e.g., front-seat passengers) and passengers who are less near to the driver (e.g., rear-seat passengers)).
  • the appropriate usage rule set can be applied to such device.
  • Such a technique can determine the device user's in-vehicle role and/or can, for example, determine/infer the in-vehicle role of the device user based on the in-vehicle location of the device. Such determination can be performed one or more times per trip.
  • one or more techniques for determining the in-vehicle role of a device user and the in-vehicle location of a device user can be employed in conjunction with one or more in-vehicle transmitters (e.g., transmitters employed within the vehicle), such as by configuring the device to 'listen' periodically for certain signals.
  • the in-vehicle location of the device can be determined such that an in-vehicle role of the device user can be determined/inferred and an appropriate usage rule set can be applied to the device.
  • the time that it takes a signal emitted from the transmitter to reach the device can provide an indication as to the in-vehicle location of the device, such as in order to determine the in-vehicle role of the device user.
  • based on such a determination (e.g., based on the determined distance between the device and the transmitter), the mobile device can be determined/inferred to be the driver's.
  • (a) the clock on the mobile device and the clock on the in-vehicle transmitter can be synchronized (e.g., within a certain margin of error), and/or (b) the mobile device can be configured to determine the time at which the Distance Signal is received with sufficient resolution (i.e., accuracy) to sufficiently determine the distance between the transmitter and the device, and/or (c) the device can be configured to differentiate between the original Distance Signal transmitted and myriad multipath signals that arrive at the device due to signal reflection or other indirect routes.
  • the device and the transmitter can be synchronized using existing protocols like NTP.
  • the two devices can synchronize between themselves, for example, by using a signal (the "Sync Signal", which can be part of the In-Vehicle Signal in certain embodiments) that can, for example, travel more quickly than the Distance Signal (through which the distance between the device and the transmitter will be measured).
  • the transmitter and the mobile device can synchronize by having the transmitter send a Sync Signal (e.g., an RF pulse via BT, WiFi etc.), once every 5 seconds, which a device (at a distance of 70cm) will receive approximately 2 nanoseconds later (assuming that the Sync Signal travels at the speed of light, 300,000,000 m/s).
  • the transmitter then sends the Distance Signal (e.g., an inaudible sound pulse that can be detected by the one or more microphones on the device), say, 3 seconds later and the device (at a distance of 70cm) can receive such signal about 2 milliseconds later (assuming that the Distance Signal travels at the speed of sound, 340 m/s).
  • a rule might state that, if a device receives the Distance Signal within 2ms (e.g., 70cm distance) of its being sent then it is determined to be a driver device. Otherwise (i.e., if it does not receive the Distance Signal within 2ms), it is determined to be a passenger device. Signals received after 2ms and until the next Distance Signal transmission (5 seconds later) can be disregarded.
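  • the arithmetic of the Sync Signal/Distance Signal example above can be sketched as follows; the helper names are hypothetical, while the 3-second offset, the 340 m/s speed of sound, and the 2 ms / 70 cm driver rule mirror the example.

```python
# Worked sketch of the Sync/Distance Signal arithmetic from the example above.
# The RF Sync Signal (speed of light) is treated as arriving essentially instantly,
# so the extra delay of the sound-based Distance Signal encodes distance.
SPEED_OF_SOUND_M_S = 340.0
DISTANCE_SIGNAL_OFFSET_S = 3.0   # Distance Signal is sent 3 s after the Sync Signal

def estimated_distance_m(sync_rx_time_s: float, distance_rx_time_s: float) -> float:
    flight_time = (distance_rx_time_s - sync_rx_time_s) - DISTANCE_SIGNAL_OFFSET_S
    return flight_time * SPEED_OF_SOUND_M_S

def is_driver_device(sync_rx_time_s: float, distance_rx_time_s: float,
                     driver_threshold_m: float = 0.70) -> bool:
    """Mirror of the example rule: ~2 ms of flight time (~70 cm) -> driver device."""
    return estimated_distance_m(sync_rx_time_s, distance_rx_time_s) <= driver_threshold_m

# Device 70 cm from the transmitter: sound flight time is about 0.7 / 340 = 0.00206 s.
print(round(estimated_distance_m(0.0, 3.00206), 2))   # ~0.7 m
print(is_driver_device(0.0, 3.00206))                  # True  -> driver device
print(is_driver_device(0.0, 3.004))                    # False -> ~1.36 m, passenger device
```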
  • the Sync Signal and/or the Distance Signal can be transmitted at strengths and in forms such that they can be received by the mobile devices in the vehicle with sufficient accuracy (e.g., in light of intra-vehicle or extra-vehicle noises or interference), whereby, in certain situations, the strength or the form (e.g., frequency, duration, encoding) of the signals may be modulated so as to ensure this. If the vehicle is the power source of the transmitter, then this is easy from a power standpoint, though in some implementations, form modulations (e.g., changes to the signal's frequency, duration or encoding) may be preferable.
  • two (or more) vehicles may be sufficiently close to one another such that one can 'hear' the signals transmitted by the other. This may be prevented or significantly minimized in various ways known to those skilled in the art, including time-slicing. For example, if Device A is in Vehicle A and constantly receives Sync Signals from the transmitter in Vehicle A approximately every 5 seconds, and Vehicle B, which is travelling in close proximity to Vehicle A, transmits a Sync Signal that Device A receives, say, 3 seconds after the last Sync Signal sent from Vehicle A, such a signal can be disregarded as "crosstalk" because Device A was scheduled to receive its next signal after 5 seconds and will restrict the acceptance of signals to those that arrive in a tight window around that 5-second time (e.g., 5 seconds +/- 3 ms) as legitimate Sync Signals (the probability of a Sync Signal arriving from another vehicle during such a +/- 3 ms window each 5 seconds is about 0.1%, assuming that there is a vehicle in such proximity whose signal can pass from one vehicle to another).
  • each device can transmit a random "identity" element encoded within its Sync Signal that can be received by the device and used to identify if that Sync Signal should be used (i.e., if it was received from the correct in-vehicle transmitter) or ignored.
  • the Sync Signal from the original vehicle can cease to be received in the expected time slot or with the expected identity (for one or more times), whereupon it can be determined that the device is no longer in its original vehicle. It will then begin to receive the In-Vehicle Signal from the new vehicle it has entered and begin the Sync Signal and Distance Signal processes in the new vehicle.
  • the probability of crosstalk between cars and devices can be further reduced, for example, by using two (or more) successive Sync Signals (or Distance Signals) to compute the distance between the device and the transmitter, where the length of time between the Sync Signals (or Distance Signals) can be randomly varied on each transmitter and conveyed to its devices (e.g., the next Sync Signal will occur t seconds after this Sync Signal, where t is randomly drawn over some interval (0,X]).
  • the detection and the time-stamping of the signals received can be done with sufficiently low latency and sufficiently high resolution so that the in-vehicle location can be determined with sufficient accuracy. In one embodiment this is accomplished by moving the detection and time-stamping processes to a location in the device's stack architecture (hardware, firmware, operating system, application, etc.) that supports the requisite latency and clock resolution.
  • the set of rules used to determine the in-vehicle role of the device user can be adjusted in different cases. For example, such rules can be adjusted based upon the geometry of a car (e.g., larger cars have a higher distance threshold) or the location of the in-vehicle transmitter. Such information can, for example, be stored in-vehicle (or on a server external to the vehicle) and communicated by the transmitter to the device (for example, as part of the Sync Signal) and used to adjust the in-vehicle role determination rules.
  • in order to save device power, even when a device is determined/known to be in a moving vehicle, the device can be configured not to process the Sync Signal and/or the Distance Signal described above (or the vehicle can be configured not to transmit such signals based on a determination that no other powered-on devices are present within the vehicle), until a certain device-based event occurs (e.g., the screen turns on, a phone call arrives, etc.), whereupon, before giving control to the user, the device can determine its in-vehicle location and determine/infer its user's in-vehicle role and apply the usage rules, such as by using the methods described above.
  • the Sync Signal can be configured to be sent more frequently than once every 5 seconds, or the device might demand an immediate Sync Signal from the vehicle (BT, WiFi, server-side) or the transmitter (in which case it would be a transceiver, able to receive certain signals too).
  • the vehicle and/or the device can be configured such that one or more aspects of the functionality described herein (a) is pre-installed or silently or conspicuously post-installed on devices; and/or (b) runs autonomously on devices, i.e., it will constantly function without the user being able to turn it off, uninstall it or opt out, except, perhaps, by powering off a device and/or a vehicle. It can be appreciated that such configurations can, for example, be dictated or incentivized by regulation, legislation, policy or decision.
  • the mobile device can be the transmitter and in-vehicle sensors can receive the signals (or a hybrid combination thereof, i.e., some in-vehicle sensors receive while others transmit).
  • the techniques described above can be implemented with respect to the classroom techniques described herein, whereby the devices in a particular classroom can be identified using the techniques described above where the transmitter can be the teacher's mobile device (or a separate in-class transmitter) and those mobile devices that receive the Distance Signal within a certain threshold period of time are determined to be in the teacher's class and can be controlled by signals sent from the teacher's phone as described herein.
  • two or more transmitters that emit signals can be incorporated within one or more hardware devices that can be installed within a vehicle.
  • Such transmitters can be designed/configured to transmit spherically and/or directionally (e.g., in a three-dimensional range that is a subset of a sphere, e.g., a hemisphere, through the use of appropriate hardware as is known to those of ordinary skill in the art).
  • one or more aspects pertaining to the in-vehicle position of a mobile device can be determined. For example, based on a determination that a mobile device is present in the front left section of a vehicle, one or more restrictions can be placed on/in relation to the device. In another example, based on a determination that a mobile device is present in the front right section of a vehicle, one or more restrictions (e.g., those that were placed on the device when it was determined to be in a vehicle) can be removed/relaxed from/in relation to the device. It should be understood that in locations where vehicles travel on the left side of the road (e.g., U.K.), such techniques can be reversed.
  • the strength of a signal emitted by a nearby transmitter as perceived on a mobile device can vary from device to device within the same location, orientation and time (e.g., because of receiver hardware, antenna placement, etc.), from device orientation to device orientation and from device context to device context (e.g., in a bag, in a pocket, in a cradle).
  • Measuring the difference in strength of a signal from two or more transmitters, transmitting with the same transmission power, e.g., the difference in their RSSIs, and, optionally, averaged over time, can provide more accurate location information.
  • the system can be reversed so that a mobile device emits one or more signals that are received by one or more hardware devices and, based upon the relative strength (or differences thereof) at which the hardware devices are determined to receive such signals, one or more aspects pertaining to the in-vehicle position of the mobile device can be determined.
  • transmitters can encode information about the direction in which they transmit (e.g., right front, front right) within the transmission packets (e.g., SSID, payload) and the receiver (and/or other devices) can use such information, in addition to the signal strengths, to determine the in-vehicle position/location of the device.
  • a hardware device placed on the ceiling in the approximate middle of a vehicle might have three directional transmitters in it.
  • One transmitter can emit signals pointed downward at the front left of the vehicle (e.g., 1/8 of a sphere).
  • a second transmitter can emit signals pointed downward at the front right of the vehicle (e.g., 1/8 of a sphere).
  • a third transmitter can emit signals pointed downward at the rear of the vehicle (e.g. 1/4 of a sphere).
  • the SSIDs of the transmitter can contain the suffix strings "FL", "FR", and "Rear", respectively (See FIG. 51).
  • if a mobile device within the vehicle determines that the strongest signal that it receives (over one or more samples) has "FL" in its SSID suffix, it can be determined that the device is likely to be in the front left of the vehicle.
  • the accuracy of such a technique can be further improved.
  • the accuracy can be further improved by requiring that the difference between the strength of two or more signals exceed one or more thresholds prior to determining its location.
  • a device near the front of the vehicle might only be determined to be in the front right of the vehicle if the difference in strength between the FR signal and the FL signal (e.g., RSSI_FR - RSSI_FL) was (a) greater than 0 on more than 90% of 10 samples, each 500ms apart; or (b) greater than 3 dBm, on average, over 10 seconds; or (c) had a mean that was more than one standard deviation above zero, over 7 seconds.
  • the hardware unit described above can be modified so that the "FL" 1/8-sphere is enlarged slightly while the "FR" 1/8-sphere is minimized slightly, and as soon as the FR signal is determined to be stronger than the FL signal (i.e., without the need for any additional margins of error), the device can be determined to be a passenger device (see FIG. 52).
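  • by analogy to rule (a) above, a front-left determination might require the FL signal to exceed the FR signal on more than 90% of recent samples; a sketch follows, in which the sample values and function name are illustrative assumptions.

```python
# Hypothetical sketch of a majority-vote RSSI rule: declare "front left" only if the
# FL signal is stronger than the FR signal on more than 90% of recent samples.
def in_front_left(rssi_samples, majority: float = 0.90) -> bool:
    """rssi_samples: list of (rssi_fl_dbm, rssi_fr_dbm) pairs, e.g. sampled 500 ms apart."""
    wins = sum(1 for fl, fr in rssi_samples if (fl - fr) > 0)
    return wins / len(rssi_samples) > majority

samples = [(-38, -45), (-37, -44), (-39, -46), (-38, -47), (-36, -44),
           (-39, -45), (-40, -46), (-38, -45), (-37, -43), (-38, -46)]
print(in_front_left(samples))   # True -> likely front-left (driver side where traffic drives on the right)

mixed = [(-42, -41), (-43, -40), (-41, -42), (-44, -40), (-42, -43),
         (-41, -41), (-43, -42), (-40, -44), (-42, -41), (-43, -40)]
print(in_front_left(mixed))     # False -> difference not consistent enough
```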
  • the two or more transmitters can be placed in two different units, for example, two in the front window ("FL", "FR") and one in the rear window ("Rear”). Such placement may be preferable in certain implementations because it can utilize solar energy to power the transmitter(s).
  • the physical structure of the hardware unit (e.g., relative to the wavelength of signal) can be designed or configured to prevent/minimize the RF signals from spreading outside the polygonal/conal direction in which they are directed/intended to be transmitted after they leave the hardware unit, as is known to those of ordinary skill in the art.
  • the transmission units may be installed as original equipment (OE) by the vehicle manufacturers or their subcontractors, or in the after-market by specialists or via 'do it yourself' kits.
  • crosstalk can be minimized by using unique identifiers (e.g., BSSID) for all the transmitters within the same vehicle (in addition to the directional information) and ignoring all signals but those that are determined to be the strongest (collectively across all the transmitters in a vehicle).
  • BSSIDs of the transmitters in the same device are 123456000001, 123456000002, and 123456000003.
  • if a device in the vehicle receives vehicle-location signals with a prefix 123456 and 654321, with strengths of -20 dBm, -30 dBm, -40 dBm, -40 dBm, -50 dBm, -60 dBm for 123456FL, 123456FR, 123456Rear, 654321FL, 654321FR, 654321Rear, respectively, it can compute the average signal strength from devices with the prefix 123456 and prefix 654321, i.e., -30 dBm and -50 dBm, respectively, and can choose to ignore the "crosstalking" signals, i.e., the signals with the 654321 prefix.
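  • the crosstalk-filtering computation from this example can be sketched as follows; grouping by the six-character prefix and keeping the strongest group follows the example, while the function name is illustrative.

```python
# Sketch of the crosstalk-filtering computation from the example above: group
# received vehicle-location signals by transmitter prefix, average their strength,
# and keep only the strongest (i.e., same-vehicle) group.
from collections import defaultdict

def strongest_prefix(signals):
    """signals: list of (ssid, rssi_dbm); returns (winning_prefix, kept_signals)."""
    groups = defaultdict(list)
    for ssid, rssi in signals:
        groups[ssid[:6]].append((ssid, rssi))           # e.g., '123456' or '654321'
    averages = {p: sum(r for _, r in sigs) / len(sigs) for p, sigs in groups.items()}
    winner = max(averages, key=averages.get)
    return winner, groups[winner]

received = [("123456FL", -20), ("123456FR", -30), ("123456Rear", -40),
            ("654321FL", -40), ("654321FR", -50), ("654321Rear", -60)]
prefix, kept = strongest_prefix(received)
print(prefix)   # '123456'  (average -30 dBm vs. -50 dBm for the crosstalking vehicle)
print(kept)     # only the same-vehicle signals are used for in-vehicle positioning
```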
  • a device can be restricted in a manner that reduces the danger likely to result from noises emitted by the device (e.g., when present within a vehicle), such as noises that may negatively affect a driver's ability to perform his primary task.
  • applications that use sounds which, if played while in a vehicle, may be confusing, distracting, dangerous, etc. (e.g., a honking sound, a siren, loud sounds) can be adjusted (e.g., by selecting different/less confusing sounds, using another feedback technique, e.g., haptic feedback, or by restricting the sounds entirely).
  • such applications can ask (e.g. poll) a trip detection module (e.g., on the mobile device) or receive a notification (e.g., subscribe to a callback) from a trip detection module when a trip starts and stops.
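  • a minimal sketch of the poll/subscribe pattern referenced above is shown below; the TripDetector and GameAudio interfaces are hypothetical and are not an existing platform API.

```python
# Hypothetical sketch of the poll/subscribe pattern described above. The
# TripDetector interface is an assumption, not an existing platform API.
class TripDetector:
    def __init__(self):
        self.in_trip = False          # applications may also poll this flag directly
        self._callbacks = []

    def subscribe(self, callback):
        self._callbacks.append(callback)

    def _set_trip_state(self, in_trip: bool):        # called by the detection logic
        self.in_trip = in_trip
        for callback in self._callbacks:
            callback(in_trip)

class GameAudio:
    def __init__(self, detector: TripDetector):
        self.muted_dangerous_sounds = False
        detector.subscribe(self.on_trip_change)       # subscribe instead of polling

    def on_trip_change(self, in_trip: bool):
        # Swap confusing sounds (sirens, honking) for e.g. haptics while in a trip.
        self.muted_dangerous_sounds = in_trip

detector = TripDetector()
audio = GameAudio(detector)
detector._set_trip_state(True)
print(audio.muted_dangerous_sounds)   # True  -> dangerous sounds restricted
detector._set_trip_state(False)
print(audio.muted_dangerous_sounds)   # False -> restrictions lifted at trip end
```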
  • routine 1200 illustrates a broad aspect of a method for restricting operation of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • various of the steps and operations that make up routine 1200 share substantial similarities to those described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 determines whether a first mobile device 105 is present within a vehicle, and/or receives one or more first inputs from at least one of a vehicle data system 164 and/or at least one of a second mobile device 160, the one or more first inputs pertaining to a presence of the first mobile device 105 within a vehicle, such as through one or more of the various determination methods described in detail herein.
  • mobile device 105 preferably prompts one or more users to initiate and/or provide one or more stimuli that can be received as inputs at mobile device 105 and/or receives one or more second inputs in response to the prompting, and/or receives one or more third inputs from vehicle data system 164, and/or receives one or more fourth inputs from at least one of the second mobile device 160, all in the manner described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, analyzes at least one of the first inputs, the second inputs, the third inputs, and the fourth inputs to determine a presence of at least one of more than one user, more than one mobile device 105, 160, and/or one or more users not in the set of users known to be users of the first mobile device, substantially in the manner described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 employs one or more restrictions at a mobile device 105, substantially in the manner described in detail herein.
  • routine 1300 illustrates a broad aspect of a method for restricting operation of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • various of the steps and operations that make up routine 1300 share substantial similarities to those described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, employs one or more restrictions at mobile device 105, substantially in the manner described above with respect to step 705.
  • such a restriction can be employed whereby the device can prompt/require the user to authenticate that s/he is a passenger in a vehicle (such as a moving vehicle) by performing an action or a set of actions such as a CAPTCHA, a game, a puzzle, a lock screen, etc., as described in detail herein.
  • Such authentication can be configured to require sufficient concentration/attention such that the authentication can be difficult to perform by a driver of a moving vehicle, who must concentrate on driving.
  • This authentication can be further strengthened by requiring that (a) in order to complete the action the user must use both hands (for example, by requiring multitouch input, as described in detail herein and illustrated with respect to FIG.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 receives one or more inputs, preferably from at least one of the mobile device 105, a vehicle data system 164, and/or one or more other mobile devices 160, substantially in the manner described above with respect to step 710.
  • such inputs correspond to one or more inputs provided by the user to the device in response to an authentication prompt (it should be understood that the term "authentication prompt" as used herein is intended to encompass one or more prompts, instructions, and/or directions that inform a user in some manner as to the manner in which inputs should be provided to a device, and/or that otherwise provide information to the user relating to the authentication of such a device).
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, analyzes at least one of the inputs. It should be understood that in certain implementations, such analysis can be performed in order to determine a presence of one or more users that are not known users of the first mobile device 105, substantially in the manner described in detail herein. In other implementations, such analysis can be performed in order to determine whether and/or to what degree one or more inputs (such as those received at step 1310) successfully and/or unsuccessfully authenticated a mobile device 105, as is also described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, modifies an employment of one or more restrictions at a mobile device 105, substantially in the manner described in detail herein.
  • routine 1400 illustrates a broad aspect of a method for orienting a coordinate system of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • routine 1400 can share substantial similarities to those described in detail herein. It should be understood that the various steps of routine 1400 will be appreciated with reference to EXAMPLE 3 below and FIGs. 9-11B, and their accompanying descriptions.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170 receives one or more inputs, preferably from at least one of (i) at least one of the user interface, the operating system, the accelerometer, the gyroscope, the GPS receiver, the microphone, the magnetometer, the camera, the light sensor, the temperature sensor, the altitude sensor, the pressure sensor, the proximity sensor, the NFC device, the compass, and the communications interface of the mobile device 105 and (ii) a vehicle data system 164, substantially in the manner described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170 computes, based on the one or more inputs, an orientation of the mobile device 105 relative to a coordinate system of a vehicle, such as a vehicle within which mobile device 105 is traveling.
  • processor 110 executing one or more of software modules 130, including, preferably, determination module 170 interprets one or more subsequent inputs of the mobile device 105 in relation to the coordinate system of the vehicle and/or transforms the one or more subsequent inputs originating at the first device into values that are comparable with the coordinate system of the vehicle. See, for example, FIGs. 11A-B and EXAMPLE 3, below.
  • it should be understood that mobile device 105 is preferably communicatively coordinated with the vehicle data system, that the vehicle data system is preferably configured (e.g., installed) with the vehicle (e.g., within the vehicle, such as a car), and/or that the mobile device is positioned within the vehicle, as described in detail herein.
  • the exact orientation of the device 105 can be determined relative to the ground (e.g., based on the gravitational force shown on the three accelerometers 145A, as is known to those of skill in the art based on such disciplines as trigonometry).
  • the inputs can be averaged over time and/or inputs from the gyroscope 145B can further assist this computation.
  • the orientation of the mobile device 105 can be detected relative to the car, for example, by using the angle between the device's magnetic north (e.g., from the 3-axis compass sensor) and the vehicle's GPS heading (as can be shown on the mobile device).
  • the value read by the z-accelerometer goes down (some of the gravity that it felt in stage one is handed over to the other accelerometers) and the X-accelerometer (for roll) and Y-accelerometer (for pitch) go up.
  • the root of the sum of the squares of the three accelerometer readings always equals gravity, so the exact orientation of the device with regard to the ground can be determined.
  • the north of device 105 (detected, e.g., via its compass sensor) can be compared with the vehicle's GPS heading (such as from vehicle data system 164), as read on the device. For example, if the device screen is facing up (i.e., the device is not upside down), its compass sensor shows that magnetic north is due north, and the GPS heading sensor shows the vehicle is travelling due west, then the device is rotated 90 degrees to the right with regard to the car. Accordingly, the exact orientation of the device with respect to the coordinates of the car can be determined, as disclosed herein and described in greater detail at EXAMPLE 3 and with regard to FIGs. 9-11B.
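  • a worked sketch of this computation is shown below: pitch and roll are recovered from the gravity components reported on the three accelerometer axes, and the device's rotation relative to the car is taken as the difference between the device's magnetic heading and the vehicle's GPS heading; the axis and sign conventions are assumptions made for the illustration.

```python
# Worked sketch of the orientation computation described above: pitch/roll from the
# gravity vector reported by the three accelerometer axes, and yaw relative to the
# vehicle from (device magnetic heading - vehicle GPS heading). Axis conventions
# here are assumptions for illustration.
import math

def pitch_roll_from_gravity(ax: float, ay: float, az: float):
    """Gravity components (m/s^2) on the device axes -> pitch and roll in degrees."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def yaw_relative_to_vehicle(device_heading_deg: float, vehicle_gps_heading_deg: float) -> float:
    """How far the device is rotated (about the vertical axis) with respect to the car."""
    return (device_heading_deg - vehicle_gps_heading_deg) % 360.0

# Device lying flat and level: all of gravity appears on the z axis.
print(pitch_roll_from_gravity(0.0, 0.0, 9.81))        # (0.0, 0.0)
# Device heading shows magnetic north (0 deg) while the car drives due west (270 deg):
print(yaw_relative_to_vehicle(0.0, 270.0))            # 90.0 -> rotated 90 deg to the right of the car
```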
  • the various mobile device(s) 105, 160 is (are) aligned with the vehicle within which they are traveling, such as shown in FIG. 11A. That is, the coordinate system of a particular mobile device 105, 160 should be understood to be coincident with the vehicle's coordinate system, as depicted in FIG. 11A and described in greater detail in EXAMPLE 3. It should be further recognized that in practice, such as in various arrangements, such as that shown in FIG. 11B, mobile device 105, 160 is rotated with respect to the coordinate system of the vehicle in up to three dimensions.
  • the rotation of the particular mobile device 105, 160 relative to the vehicle is preferably computed and the inputs originating at sensors 145 of the particular mobile device 105, 160 are preferably transformed into values for the coordinate system of the vehicle. This may be achieved in various ways, examples of which are provided below.
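  • one such example (among the various ways referenced) is to build a rotation matrix from the computed angles and rotate each device-frame sensor reading into the vehicle frame; the yaw-pitch-roll convention below is an assumption for illustration.

```python
# Hypothetical sketch of transforming device-frame sensor readings into the vehicle
# frame once the device-to-vehicle rotation is known. The angle convention used
# (intrinsic z-y-x, i.e., yaw-pitch-roll) is an assumption for illustration.
import numpy as np

def rotation_matrix(yaw_deg: float, pitch_deg: float, roll_deg: float) -> np.ndarray:
    y, p, r = np.radians([yaw_deg, pitch_deg, roll_deg])
    rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    return rz @ ry @ rx

def to_vehicle_frame(device_vector, yaw_deg, pitch_deg, roll_deg) -> np.ndarray:
    """Rotate a device-frame reading (e.g., accelerometer) into vehicle coordinates."""
    return rotation_matrix(yaw_deg, pitch_deg, roll_deg) @ np.asarray(device_vector)

# Device rotated 90 deg (yaw) relative to the car: a push along the device's x axis
# corresponds to a push along the vehicle's y axis.
print(np.round(to_vehicle_frame([1.0, 0.0, 0.0], yaw_deg=90, pitch_deg=0, roll_deg=0), 3))
# -> [0. 1. 0.]
```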
  • each user determination characteristic (e.g., error proportion, correlation of typing speed to acceleration, etc.) can be considered as a point in a K-dimensional space.
  • Classification algorithms based on supervised learning can then be applied to the resulting K-dimensional signature(s) to determine the probability that the in-vehicle role of the user of mobile device 105 is a driver or a passenger.
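  • a minimal sketch of such supervised classification is shown below; scikit-learn and logistic regression are assumptions made for the example (the disclosure does not name a particular classifier), and the training values are toy data.

```python
# Minimal sketch (assuming scikit-learn is available): treat each K-dimensional
# characteristic signature as a point and train a supervised classifier to output
# a driver/passenger probability.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: [typing error proportion, correlation of typing speed to acceleration]
X_train = np.array([
    [0.18, 0.72], [0.22, 0.80], [0.20, 0.65], [0.25, 0.75],   # labeled drivers
    [0.05, 0.10], [0.08, 0.05], [0.04, 0.15], [0.07, 0.12],   # labeled passengers
])
y_train = np.array([1, 1, 1, 1, 0, 0, 0, 0])                  # 1 = driver, 0 = passenger

model = LogisticRegression().fit(X_train, y_train)

new_signature = np.array([[0.21, 0.70]])                       # signature for a new user/session
print(model.predict_proba(new_signature)[0][1])                # probability the user is a driver
```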
  • Text Reading/Screen Viewing - User determination characteristic(s) can be identified based on patterns in the reading of text messages (or any other such text item such as an email or webpage, or any other such viewing of items on a display screen, such as during the playing of a video game) on a mobile device 105, thereby serving to distinguish between a driver and a passenger. For example, drivers tend to change the orientation of and/or move (e.g. rotate in his/her palm) mobile device 105 more frequently when attempting to read a message of a given length (in order to periodically glance back at the road), whereas a passenger will read such a message in a comparatively more constant state. This is especially true during road maneuvers that require more driver concentration, such as turns and accelerations.
  • This phenomenon can be observed as a high degree of correlation between vehicle accelerations and/or gyroscopic rotations as detected by accelerometer 145A and gyroscope 145B, respectively, of mobile device 105 and the changes in orientation of the mobile device 160 (unrelated to movements in the vehicle) as measured by one or more of accelerometer 145A, gyroscope 145B, GPS 145C and magnetometer 145E and, in particular, the presence or absence of (non-vehicle related) mobile device movements just prior to vehicle movements.
  • once this correlation reaches or exceeds a certain threshold, the in-vehicle role of the user of mobile device 105 can be determined to be a driver, and/or once this correlation falls below another certain threshold, the in-vehicle role of the user of mobile device 105 can be determined to be a passenger.
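  • a sketch of such a correlation test is shown below; the correlation measure and the two thresholds are illustrative assumptions.

```python
# Hypothetical sketch: correlate vehicle acceleration magnitude with device
# orientation changes; a high correlation suggests a driver repeatedly glancing
# between road and screen. Thresholds are illustrative assumptions.
import numpy as np

def infer_role(vehicle_accel: np.ndarray, orientation_change: np.ndarray,
               driver_threshold: float = 0.6, passenger_threshold: float = 0.2) -> str:
    corr = float(np.corrcoef(vehicle_accel, orientation_change)[0, 1])
    if corr >= driver_threshold:
        return "driver"
    if corr <= passenger_threshold:
        return "passenger"
    return "undetermined"

rng = np.random.default_rng(0)
accel = np.abs(rng.normal(0, 1, 200))                       # vehicle maneuvers over time
driver_motion = 0.8 * accel + 0.2 * rng.normal(0, 1, 200)   # device moves with maneuvers
passenger_motion = rng.normal(0, 1, 200)                    # device motion unrelated to maneuvers
print(infer_role(accel, driver_motion))     # 'driver'
print(infer_role(accel, passenger_motion))  # 'passenger' (or 'undetermined' for borderline data)
```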
  • Driver-Specific Movements can be detected by one or more of sensors 145 of mobile device 105 that can be determined to be unique to a driver.
  • a lack of perception of such unique forces, such as "signature" forces at a mobile device 105 can indicate that the user of such a device is not a driver and is thus a passenger.
  • a driver influences the movement of a mobile device 105 through driver-related actions that include pressing and releasing the gas/brake/clutch pedals and by moving his/her foot from one pedal to another over the course of driving.
  • acceleration and/or angular movement can be perceived slightly (in the 100's of milliseconds) in advance, such as at one or more of sensors 145, originating at the driver's body maneuver (such as the pressing of a gas pedal) that initiates the acceleration of the vehicle.
  • a driver also causes a mobile device 105 to move by rotating the steering wheel.
  • various of sensors 145, such as accelerometers 145A and/or gyroscope 145B of mobile device 105, can detect certain accelerations and rotations. Based on a retrospective analysis of such inputs - for instance, analyzing inputs corresponding to acceleration of a car together with inputs perceived immediately prior - it can be determined whether the user operating such a mobile device 105 is a driver or a passenger. If such unique/signature forces are perceived in close proximity to (generally, immediately before) the acceleration, etc., it can be determined that the user is a driver. Conversely, if such inputs are not detected immediately prior to acceleration, it can be determined that the user is a passenger (provided that the user is in physical contact or communication with mobile device 105).
  • This approach can also be applied to other driver movements (e.g., looking in the mirrors, turning on the directional signal), wherein the driver's movements will be detected on a mobile device 105 that is in contact with the driver slightly before another signal is detected on mobile device 105 (e.g., accelerometer 145A or gyroscope 145B for looking in mirrors, microphone 145D for turning on the directional signal), whereas these serial relationships will not be present if mobile device 105 is being operated by a passenger.
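  • a sketch of such a retrospective lead-lag check is shown below; the 300 ms look-back window, the motion threshold, and the 80% precursor rate are illustrative assumptions.

```python
# Hypothetical sketch: for each vehicle acceleration event, check whether the device
# registered a small "signature" movement in the few hundred milliseconds just before
# it (as when a driver's foot/hand motion precedes the vehicle's response).
import numpy as np

def precursor_rate(device_motion: np.ndarray, accel_event_indices, sample_rate_hz: float,
                   lead_window_s: float = 0.3, motion_threshold: float = 0.5) -> float:
    """Fraction of vehicle acceleration events preceded by above-threshold device motion."""
    lead = int(lead_window_s * sample_rate_hz)
    hits = 0
    for idx in accel_event_indices:
        window = device_motion[max(0, idx - lead):idx]
        if window.size and window.max() >= motion_threshold:
            hits += 1
    return hits / len(accel_event_indices)

rate_hz = 50.0
device_motion = np.zeros(500)
events = [100, 250, 400]                 # sample indices of detected vehicle accelerations
for idx in events:
    device_motion[idx - 10] = 1.0        # driver-like: motion ~200 ms before each event
likely_driver = precursor_rate(device_motion, events, rate_hz) >= 0.8
print(likely_driver)                     # True; with no precursor motion this would be False
```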
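  • The following is a minimal, non-limiting sketch (Python, using NumPy) of this precursor-detection idea: counting how often a distinct device movement is perceived within a short window immediately before a vehicle acceleration/maneuver event. The event representation, window size and function name are assumptions for illustration.

```python
import numpy as np

def driver_likelihood_from_precursors(device_motion_times, vehicle_event_times,
                                      window_s=0.5):
    """Fraction of vehicle events (accelerations, turns, etc.) preceded,
    within `window_s` seconds, by a distinct device movement.

    A high fraction suggests the device user initiated the maneuvers
    (driver); a low fraction suggests a passenger.  Both inputs are
    sequences of event timestamps in seconds; the window is illustrative.
    """
    device_motion_times = np.asarray(sorted(device_motion_times))
    hits = 0
    for t in vehicle_event_times:
        # any device movement in (t - window_s, t) counts as a precursor
        idx = np.searchsorted(device_motion_times, t)
        if idx > 0 and t - device_motion_times[idx - 1] < window_s:
            hits += 1
    return hits / max(len(vehicle_event_times), 1)
```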
  • a device determined to be located within a vehicle can process various inputs, such as in order to characterize/determine the nature of a particular movement of the vehicle. For example, various inputs can be processed in order to differentiate between a vehicle that has recently stopped moving and is likely to continue its present trip (e.g., stopped at a red light or stopped in traffic) from a vehicle that has recently stopped moving and is relatively likely to have finished its present trip.
  • FIG. 35 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more inputs can be received, such as in relation to a user device.
  • at least one of the one or more inputs can be processed.
  • at least one of the one or more inputs can be processed in order to determine one or more mobility characteristics of the device.
  • one of the one or more inputs can be processed based on a determination of a mobility stoppage, such as in relation to the user device.
  • one or more inputs that are chronologically proximate to the mobility stoppage can be processed, such as in order to determine one or more mobility characteristics of the device.
  • the one or more mobility characteristics can include at least one of (a) a permanent stop or (b) a temporary stop.
  • at least one of the one or more inputs can be processed in relation to one or more data items, such as in order to determine one or more mobility characteristics of the device.
  • the one or more data items can include at least one of: map data, traffic signal data, or traffic condition data.
  • FIG. 36 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • an indication of a completion of a trip can be received, such as in relation to a user device.
  • one or more inputs can be processed, such as in order to determine one or more mobility characteristics of the user device. In certain implementations, such one or more inputs can be processed based on the indication.
  • the one or more mobility characteristics can be processed, such as in order to determine a veracity of the indication. In certain implementations, the one or more mobility characteristics can be processed in relation to the indication.
  • one or more restrictions can be selectively adjusted, such as in relation to the user device. In certain implementations, one or more restrictions can be selectively adjusted based on the veracity. In certain implementations, one or more restrictions can be maintained, irrespective of a subsequent indication of a completion of a trip.
  • one or more inputs that correspond to the behavior or operation of a vehicle prior to its stop can be processed, for example, in order to determine whether the vehicle traveled in reverse one or more times just before it stopped (indicating parallel parking), and/or whether the vehicle performed a turn in the period of time just preceding its stop (indicating entry into a parking lot or a driveway).
  • the frequency and/or length of stops (as determined based on a processing of one or more of the inputs referenced herein) made in the time prior to the current stop can be used to differentiate between these two states (i.e., between a temporary and a permanent stop).
  • a vehicle determined to have made stops (of various lengths) in the recent past and/or in close proximity to the location of the current stop and/or is on a travel route determined to be consistent with the previous stops, can be determined to be relatively more likely to be stuck in traffic than a vehicle that has not.
  • the GPS of the device can be processed to determine (a) whether the vehicle is near a traffic light, (b) whether the acceleration/deceleration of the vehicle (i.e., stopping and going) correlates with the changes in the traffic lights (e.g., their green - yellow - red pattern) on the route that the vehicle is determined to be traveling, and/or (c) whether the vehicle is on a road that currently has heavy traffic that could be the cause of the stops observed, as can be aided by data originating at one or more traffic density services (e.g., Waze, DeCell, etc.).
  • FIG. 53 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more inputs can be received.
  • such inputs can be received in relation to a geographic location (e.g., one or more coordinates, a location on a map, an address, etc.).
  • the referenced inputs can correspond to an incidence of deceleration (e.g., slowing down, stopping, etc., such as of a vehicle).
  • the referenced geographic location can include, incorporate, and/or otherwise be associated with information, parameters, metadata, etc., such as may reflect or otherwise pertain to a presence of a stop sign, traffic light, a parking lot, parking spot, a non-temporary location such as an office or home, etc. (and/or any other such status or indication that may reflect a likelihood that a vehicle stopping there may be likely to maintain such a stop for a relatively short time, such as at a stop sign or traffic light, or a relatively longer time, such as at a parking spot or parking lot) at the geographic location.
  • the presence of such items at/in relation to the location can be used/accounted for in determining whether the incidence of deceleration (e.g., the stopping of a vehicle) is likely to be maintained for a relatively shorter time duration (e.g., in the case of a vehicle stopping at a stop sign) or a relatively longer time duration (e.g., in the case of a vehicle stopping in a parking lot or parking spot).
  • the referenced geographic location can include information pertaining to one or more previous incidences of deceleration at the geographic location. That is, as described herein, a history or log of previous deceleration/stopping instances can be utilized/accounted for in determining whether a deceleration instance (e.g., the stopping of a vehicle) in a particular location is likely to be for a relatively shorter time duration (e.g., temporary) or for a relatively longer time duration.
  • such previous incidences of deceleration can include one or more previous incidences of deceleration at the geographic location that are associated with a user that is associated with a device (e.g., the device with respect to which a restriction is to be modified, such as at 5330).
  • a user or driver's personal history or log can be accounted for in determining whether a location is a short or long term stopping location. For example, with respect to a location that may otherwise be determined to be a short term stopping location (e.g., near a stop sign), in a scenario in which multiple long term stopping instances have been observed/identified at such a location with respect to a particular user (e.g., on account of the fact that the user's driveway is nearby), such a location can be determined to be a long term stopping location with respect to that user.
  • the referenced inputs can be received in relation to an approach towards the geographic location (e.g., as described herein, when the referenced deceleration occurs in relation to an approach towards the location, as opposed to a departure from the location).
  • the referenced geographic location can be determined to be a relatively shorter term stopping location.
  • an implementation of a restriction can be maintained at the device, such as is described herein.
  • the referenced inputs can be received subsequent to a departure from the geographic location (e.g., an incidence of deceleration/stopping after a vehicle passes a stop sign or traffic light).
  • the referenced geographic location can be determined to be a relatively longer term stopping location.
  • an implementation of a restriction at the referenced device can be modified upon such a determination (e.g., a determination that the incidence of deceleration occurred subsequent to the departure from the geographic location).
  • information pertaining to one or more previous incidences of deceleration at a geographic location that are associated with a user that is associated with a device can be received or otherwise identified or accounted for based on a relative prevalence of other locations within a defined proximity to the geographic location that are determined to be relatively shorter term stopping locations. For example, as described herein, upon determining that there are several other short term stopping locations within a certain distance of a particular location, a stopping history associated with a user can be utilized/accounted for to determine if the particular location is (or is not) likely to be a short term stopping location (e.g., with respect to the user).
  • the one or more inputs can be processed.
  • such inputs can be processed in relation to a geographic location (such as a geographic location in relation to which they were received). In doing so, a relative likelihood that the incidence of deceleration is to be maintained for a relatively shorter time duration and/or a relatively longer time duration can be determined.
  • the referenced inputs can be processed in relation to one or more signals that are perceptible to a device (e.g., wireless signals such as WiFi, Bluetooth, etc.). In doing so, a relative likelihood that the incidence of deceleration is to be maintained for a relatively shorter time duration and/or a relatively longer time duration can be determined, such as is described herein.
  • the referenced inputs can be processed in relation to a geographic location and/or one or more chronological characteristics (e.g., a time of day, a day of week, or a day of month, reflecting, for example, that a particular location may be a temporary stopping location at a certain time/date - e.g., during rush hour on weekdays - but not at other times/dates). In doing so, a relative likelihood that the incidence of deceleration is relatively likely to be maintained for a relatively shorter (or longer) time duration can be determined, such as is described herein.
  • the referenced inputs can be processed in relation to one or more factors, characteristics, aspects, etc., including but not limited to: the geographic location, weather conditions, traffic conditions, date/time conditions, etc. (as described herein, a particular location may be a temporary stopping location when such factors are present, but a longer term stopping location when they are absent). In doing so, a relative likelihood that the incidence of deceleration is relatively likely to be maintained for a relatively shorter (or longer) time duration can be determined.
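  • A minimal sketch (Python) of how such factors might be combined into a single likelihood score is shown below; the factor names and weights are purely illustrative assumptions rather than part of the disclosure.

```python
def long_stop_likelihood(near_parking, near_traffic_light,
                         is_rush_hour, heavy_traffic, bad_weather):
    """Combine simple, illustrative evidence weights into a score in [0, 1];
    higher values favour a relatively longer (non-temporary) stop."""
    score = 0.5
    if near_parking:
        score += 0.3
    if near_traffic_light:
        score -= 0.25
    if is_rush_hour or heavy_traffic:
        score -= 0.15   # stop-and-go conditions favour a temporary stop
    if bad_weather:
        score -= 0.05
    return min(max(score, 0.0), 1.0)
```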
  • an implementation or application of one or more restriction(s) can be modified.
  • such restriction(s) may be employed at, on, and/or in relation to a device (e.g., a smartphone or any other such device).
  • an implementation or application of such restriction(s) can be modified based on a determination (such as at 5320) that the incidence of deceleration is relatively likely to be maintained for a relatively longer time duration.
  • a threshold can be adjusted. Such a threshold may define a chronological interval upon expiration of which the implementation of the restriction can be modified (e.g., eased or removed/disabled), such as is described herein.
  • a notification can be provided (e.g., at/in relation to the referenced device).
  • a notification can, for example, reflect a time duration (e.g., a countdown timer) upon expiration of which an implementation of the referenced restriction is to be modified.
  • a time duration can be defined based on a relative likelihood that the incidence of deceleration is to be maintained for a relatively shorter time duration (in which case a relatively shorter time duration can be utilized) and/or a relatively longer time duration (in which case a relatively longer time duration can be utilized).
  • a selectable control can be provided, e.g., at an interface of the device. When selected, such a control can initiate a modification of the implementation of the restriction, such as is described herein. In certain implementations, as described herein, such a control can be provided based on a determination that the referenced incidence of deceleration is relatively likely to be maintained for a relatively longer time duration, as well as a determination that the device is not moving faster than a defined speed threshold, and/or a determination that the perceptibility of one or more access points (e.g., WiFi access points, etc.) does not change over a defined chronological interval, such as is described herein.
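  • The following sketch (Python) illustrates one way the referenced conditions could be combined before offering such a selectable control; all names and thresholds are illustrative assumptions, and the likelihood input is assumed to come from a determination such as the one described above.

```python
def should_offer_unlock_control(long_stop_likelihood, speed_mps,
                                visible_aps_then, visible_aps_now,
                                likelihood_threshold=0.7,
                                speed_threshold_mps=1.0):
    """Decide whether to show a user-selectable 'end of trip' control.

    The control is offered only when (a) a longer-duration stop is likely,
    (b) the device is not moving faster than a defined speed threshold, and
    (c) the set of perceptible WiFi access points has not changed over the
    observation interval.  All thresholds are illustrative.
    """
    aps_unchanged = set(visible_aps_then) == set(visible_aps_now)
    return (long_stop_likelihood >= likelihood_threshold
            and speed_mps <= speed_threshold_mps
            and aps_unchanged)
```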
  • the location where a trip has stopped, as well as whether it has stopped, can be determined using one or more mobile device sensors (including, but not limited to, GPS, cellular radio, WiFi radio, Bluetooth, accelerometers, gyroscopes, etc., as are known to those of ordinary skill in the art), and a trip that is likely to end soon can be determined using one or more of such sensors, as well as historical information about the locational behavior of the device in question and/or other devices.
  • one or more applications can determine whether or not to continue implementing one or more restrictions (e.g., in relation to a device determined to be operated by a driver), such as during a temporary/short stop, or whether to remove one or more restrictions and/or to return full device access/operation (e.g., in relation to a device determined to be operated by a driver), such as during a relatively longer stop.
  • a determination as to which stops are more likely to be temporary/short and which are more likely to be longer can be achieved by associating various determinations of acceleration, deceleration, and/or stopping/starting a trip (and during the intervals between them) (e.g., using clustering, grouping, characterizing, classifying (e.g., kNN), etc., techniques, as are known to those of ordinary skill in the art) with the respective characteristics of the location (e.g., geographic location, one or more devices, e.g., RF/wireless devices, and/or signals that are present/perceptible in a location) in which they occur (e.g., as determined using one or more of GPS, WiFi, cellular, BT on a mobile device and/or the vehicle in which the device is traveling).
  • Such determinations can be made by using data acquired from a single device and/or multiple devices (e.g., via 'crowdsourcing'). In doing so, the locations of landmarks such as traffic lights, stop signs (e.g., if a sufficiently high time frequency sampling is used to identify stops of such length) and additional locations that may be less apparent/obvious (e.g., those that pertain to traffic patterns that cause temporary stops at locations other than traffic lights and stop signs), all of which can cause various types of 'stops' during a trip, can be identified.
  • the referenced input(s) (e.g., from one or more sensors on a device) can be processed to determine various characteristics of previous stops (e.g., quantity, frequency, mean length, median length, variability and other moments of such lengths, cumulative length, days with stops, etc.) determined to be within, for example, a 100m radius of the current location, and such characteristics can be utilized or otherwise accounted for in determining whether the location is a temporary stopping location ('TSL'), a long-term stopping location ('LSL') or neither (or, both).
  • the identity of various transmitters can be associated with TSLs and LSLs (e.g., if for each of the last 25 times a device perceived a WiFi AP having a MAC address of 12:34:56:78:9A:BC the device was also determined to have stopped for 1 hour, then upon subsequently perceiving this address it can be determined that it is at an LSL).
  • determining whether a particular stop is likely to be temporary/short or longer can be achieved by associating various incidences/determinations of stopping/starting during a trip (and the intervals between them) (e.g., using clustering, grouping, characterizing, etc., techniques, as are known to those of ordinary skill in the art) with the respective signals (e.g., wireless signals, such as WiFi, Bluetooth, etc.) that are perceptible to the device when they occur.
  • such signals can further be associated with the likely length of the stop (e.g., short or long). For example, if in 14 of the last 15 times that (i) an input from one or more sensors (such as those incorporated within the device) indicated that the likelihood that the device was stopped was sufficiently high (e.g., greater than a certain threshold value) and (ii) a particular signal was perceptible to the device, the stop was determined to be long, then upon subsequently perceiving that signal the stop can be determined to be relatively likely to be long.
  • a determination can be made without even considering inputs from the referenced sensors so that, for example, if the 14 of the last 15 times that the device was in the presence of BSSID 11 :22:33:44:55:66 with a signal strength greater than -80 dbm it was stopped and the device is now in the proximity of this BSSID again (and its signal strength is sufficiently high), it can be determined to be stopped.
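  • A minimal sketch (Python) of per-transmitter stop statistics of this kind is shown below; the class name and method names are assumptions, while the -80 dBm floor and the 14-of-15 ratio mirror the illustrative values given above.

```python
from collections import defaultdict

class BssidStopHistory:
    """Track, per BSSID, how often the device was stopped while that BSSID
    was perceptible above a signal-strength floor; illustrative only."""

    def __init__(self, rssi_floor_dbm=-80):
        self.rssi_floor = rssi_floor_dbm
        self.counts = defaultdict(lambda: [0, 0])  # bssid -> [stopped, seen]

    def record(self, bssid, rssi_dbm, was_stopped):
        if rssi_dbm >= self.rssi_floor:
            stats = self.counts[bssid]
            stats[1] += 1
            if was_stopped:
                stats[0] += 1

    def likely_stopped(self, bssid, rssi_dbm, min_obs=15, min_ratio=14 / 15):
        """E.g., if the device was stopped in 14 of the last 15 sightings of
        this BSSID at sufficient strength, infer that it is stopped again."""
        if rssi_dbm < self.rssi_floor:
            return False
        stopped, seen = self.counts[bssid]
        return seen >= min_obs and stopped / seen >= min_ratio
```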
  • Upon determining that a device is at a temporary stopping location ("TSL"), it can be further determined/assumed that the device is still within a trip (even though it is not currently moving), such as for a longer period of time than if the device were not at a TSL. Accordingly, upon determining that the device is present in/at a TSL, a relatively higher stopping threshold can be implemented before such a stop can be determined to be a long (i.e., non-temporary) stop.
  • such a configuration can enable an application to account for stops determined to be at TSLs differently than stops determined to have occurred at other locations, thereby reducing the incidence of incorrect, premature trip end determinations in scenarios where the device is stopped at a TSL and/or reducing the latency at which trip end determinations are made at locations that are not TSLs.
  • a restriction implemented in relation to a device while the device is being operated by a driver and/or is present in a trip can be removed/modified relatively more quickly at the non-temporary end of a trip (thereby improving the accuracy and overall user experience of applications that account for such stops), while also preventing unauthorized/unsafe operation of the device in scenarios where a device has only stopped temporarily.
  • stopping locations in which devices tend to stop for longer periods of time can be determined by virtue of the fact that moving devices that reach this location tend to stay there for more than a certain amount of time, e.g., a parking lot, a driveway.
  • upon determining that a device is at such a long stopping location ('LSL'), it can be determined to have finished a trip relatively faster (e.g., requiring a relatively shorter time threshold to have passed during which traveling is not perceived/determined) than would otherwise be assumed, such as in another location (e.g., a location not determined to be an LSL).
  • stops occurring in locations determined to be LSLs can be accounted for differently than stops occurring at other locations, thereby enabling determinations of trip ends with lower latency and reducing the likelihood of a false trip end determination, such as when the device is stopped at a TSL.
  • restrictions implemented at a device while driving/in a trip can be removed/modified relatively more quickly upon determining that the device has stopped at an LSL (in contrast to other locations with respect to which a relatively higher threshold, such as a time threshold, may be required before removing/modifying comparable restrictions).
  • TSL/LSL information that is specific to a particular device can be accounted for in conjunction with TSL/LSL information generated/associated with a larger population of devices. For example, if a particular intersection is identified as a TSL and a user happens to park in a parking lot that is located very close to such intersection, the user would be deemed to be at a TSL even though his (personal) stop was longer than temporary. This might cause, for example, a less positive user experience because, based on the device being stopped at a TSL, the device may be configured to take relatively longer to lift selective restrictions from the device (in light of being present in a TSL, as described herein).
  • however, if the parking-lot-near-intersection location was one which the device frequented regularly (e.g., it was the parking lot for the user's company, visited most weekdays), such a location can be identified as an LSL for that user, while still being identified as a TSL for the general population, and the restrictions can be lifted/modified relatively more quickly, thereby improving the experience for that user.
  • various configurations of the technologies described herein can be implemented based upon the TSL-density, the population density and/or some other density metric, of the location the device is in. For example, in a dense urban location, e.g., New York City, where there is a very dense packing of TSLs (e.g., many traffic lights), a system using TSLs/LSLs can be configured to place relatively less weight on information derived from a population of devices and place more weight on the information of a particular device than it otherwise would (e.g., in other locations).
  • knowing the direction in which a vehicle travelled past or approached a TSL can provide additional information which can, among other things, be used to improve the accuracy and latency of the trip-end determination, e.g., if a device in a trip crosses a TSL, e.g., from south to north and then, after 100m, stops, it is more likely to be at an actual trip end than a vehicle that approaches the TSL from the north and stops 100m north of the TSL but has not yet crossed it.
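  • The following sketch (Python) illustrates one way of checking whether a stop occurred past a TSL along the direction of travel, using a simple flat-earth approximation; the coordinate handling, names and the approximation itself are assumptions for illustration.

```python
import math

def stopped_past_tsl(heading_deg, tsl_lat, tsl_lon, stop_lat, stop_lon):
    """Return True if the stop location lies beyond the TSL along the
    direction of travel (i.e., the vehicle crossed the TSL before stopping).

    Uses a local flat-earth approximation, adequate over ~100 m scales.
    """
    # displacement from TSL to stop point, in metres (east, north)
    lat0 = math.radians(tsl_lat)
    d_north = (stop_lat - tsl_lat) * 111_320.0
    d_east = (stop_lon - tsl_lon) * 111_320.0 * math.cos(lat0)

    # travel direction as a unit vector (heading measured clockwise from north)
    h = math.radians(heading_deg)
    travel_east, travel_north = math.sin(h), math.cos(h)

    # positive projection => stop point is past the TSL in the travel direction
    return d_east * travel_east + d_north * travel_north > 0.0
```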
  • the described techniques can be further configured to define/associate locations as TSLs (or non-TSLs) only under or during specific conditions.
  • a certain location may only exhibit behavior as a TSL during certain times of days / days of week (e.g., during weekday rush hour), during certain climatic conditions (e.g., rain, snow) or during certain traffic conditions (e.g., real-time traffic conditions as provided by various mobile applications and/or road infrastructure).
  • an application that does not account for TSLs/LSLs may be configured to lift/modify a selective restriction (e.g., on driver devices) after the device has been determined not to have been in a trip for 2 minutes.
  • the result is that a driver who has permanently ended his/her trip will continue to be restricted from utilizing his/her device (e.g., in one or more ways that pertain to such restrictions) until 2 minutes have elapsed (a potentially frustrating result for many users), while a driver stopped at a red traffic light for three minutes (i.e., for a duration exceeding the referenced threshold) would be able to access his/her device during the latter part of the red traffic light (and perhaps for some period of time after the light changes and driving resumes), despite still being present in an active trip.
  • by way of contrast, if the device is determined to have stopped at an LSL, the selective restrictions implemented with respect to the device can be lifted/modified faster, for example, after one minute (resulting in a better experience for the user), whereas if the device is determined to have stopped at a TSL, the selective restrictions implemented with respect to the device can be lifted/modified more slowly, for example, after 3 minutes (resulting in a safer experience, both for the user and society).
  • at locations determined to be neither a TSL nor an LSL, a potentially different waiting period (for example, 2 minutes) can be implemented.
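  • The 1/3/2-minute scheme described above can be expressed as a simple lookup, as in the following illustrative sketch (Python); the minute values mirror the examples given and are not limiting.

```python
def trip_end_timeout_minutes(location_class):
    """Illustrative waiting periods before lifting selective restrictions
    after the device is determined to have stopped moving."""
    return {
        "LSL": 1,       # long-term stopping location: lift restrictions sooner
        "TSL": 3,       # temporary stopping location: wait longer
    }.get(location_class, 2)  # neither / unknown
```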
  • data originating from various sources (e.g., a database of traffic light locations or parking lots) can also be utilized in determining/defining TSLs and LSLs, such as in conjunction with the techniques described herein.
  • however, the level of detail of the information that is available through such databases may not include information such as the length of a light for travelers in each direction at different dates/times.
  • the speed at which new information becomes available may be higher with "field-determined" TSLs/LSLs using mobile devices than with numerous, possibly-governmental databases.
  • the various techniques described herein can be further configured to account for the fact that the locations of TSLs and LSLs may change over time, e.g., new traffic lights are installed, parking lots are built and destroyed etc. and the corresponding locations/associations that pertain to such TSL/LSL locations can be adjusted dynamically over time, such as in a manner known to those of ordinary skill in the art.
  • the various trip-end detection techniques described herein can be implemented with respect to parking assistance applications, whereby a determination as to whether a stop has occurred at a TSL (or an LSL) can enable a further determination as to whether a user has parked (and, for example, record and/or disseminate such a parked location [e.g., reflecting that one less space is available in a parking lot] and/or initiate a payment arrangement for such parking) or whether such a user has just stopped temporarily, in which case one or more of such determinations/functions may not be appropriate.
  • content (e.g., sponsored content such as advertisements) pertaining to a geographic location (e.g., the geographic location with respect to which the inputs received at 5310 are associated) can be identified based on a determination (such as at 5320) that the incidence of deceleration is relatively likely to be maintained for a relatively longer time duration.
  • content (such as the content identified at 5340) can be provided.
  • content can be provided at, on, and/or in relation to a device (such as the device with respect to which the inputs were received at 5310).
  • such content can be provided based on a determination (such as at 5320) that the incidence of deceleration is relatively likely to be maintained for a relatively longer time duration, such as is described herein.
  • determining whether a stop has occurred at a TSL can enable the providing of more appropriate/targeted advertisements (e.g., to the user). For example, a user determined to have stopped permanently may be relatively more interested in dining opportunities in the area in which they stopped, whereas the same would be far less appropriate (and perhaps even distracting and dangerous) for a user who has only stopped temporarily.
  • long term trip ends can also be determined/assumed to have occurred if the user initiates one or more actions with respect to parking (e.g., user pays for parking with app) or can be determined to be likely to soon occur (e.g., user looks for parking with app, user nears the location in which she has reserved a spot), such as if the user engages a parking application on the mobile device (e.g., launch, bring to foreground, reserves a space) and/or depending upon what actions that user takes in such application.
  • TSLs and/or LSLs can have different degrees/magnitudes. For example, at some TSLs (e.g., a long red light), the time that a device spends at or near that location is relatively longer (e.g., 5 minutes), whereas at other TSLs (e.g., a short red light) the time the device spends at/near that location is relatively shorter (e.g., 30 seconds).
  • in addition to (and/or instead of) determining or 'learning' the referenced TSLs and LSLs (e.g., using machine learning techniques as are known to those of ordinary skill in the art), the user (or another person knowledgeable about the user's travel patterns, e.g., an employer) can provide various input(s) as to one or more such locations at which she often makes LSLs (e.g., home driveway, work parking lot, etc.) or TSLs, and device restrictions can be modified based upon the presence of the device in or near such one or more locations (e.g., as determined by GPS, WiFi access points or other signals) and, in certain implementations, further in conjunction with other inputs (e.g., speed, motion, etc.).
  • one or more restrictions employed on a device that was previously determined to be in a trip can be relaxed upon determining that the device is in (or near) the work parking lot (where the work parking lot is contained in the list of user input LSLs) and the device is moving at less than 10 km/h for 15 seconds.
  • a device can be determined to be at an LSL when (or just before, or just after) it enters a region that is known to be an LSL for that user and/or for a group of users.
  • by monitoring the location of the device relative to LSLs and dynamically throttling/adjusting such monitoring (e.g., its frequency, intensity, and/or the sensor(s)/radio(s) used), the amount of power used to enable the device to determine that it is entering (or is about to enter or has just entered) an LSL "just-in-time" can be greatly reduced.
  • For example, if a device is 10 kilometers from its nearest known LSL, and the estimated time to reach that LSL is 15 minutes, there is no need to check whether the device is near the LSL for, perhaps, 14 minutes.
  • the check at 14 minutes can be done with a low power sensor (e.g., cellular radio, WiFi scan) to see how the device has progressed relative to the LSL, while a more accurate, higher power consuming sensor (e.g., GPS) can be reserved for when the device is determined to be close to the LSL.
  • one or more device restrictions can be modified immediately.
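  • A minimal sketch (Python) of such "just-in-time", power-aware scheduling of LSL proximity checks follows; the distance/speed figures, margin and sensor labels are illustrative assumptions based on the example above.

```python
def next_lsl_check(distance_to_lsl_km, current_speed_kmh,
                   margin_minutes=1.0, gps_radius_km=0.5):
    """Return (minutes_until_next_check, sensor_to_use).

    The device sleeps until shortly before it could plausibly reach the
    nearest known LSL, then checks with a low-power source (cell/WiFi scan),
    escalating to GPS only when close.  Values are illustrative.
    """
    speed = max(current_speed_kmh, 5.0)           # avoid divide-by-zero at rest
    if distance_to_lsl_km <= gps_radius_km:
        return 0.0, "gps"                          # close: confirm precisely now
    eta_minutes = 60.0 * distance_to_lsl_km / speed
    wait = max(eta_minutes - margin_minutes, 0.0)  # e.g., 15 min ETA -> ~14 min sleep
    return wait, "cell_or_wifi"
```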
  • an interface indicator/notification can be made visible or otherwise enabled when one or more condition(s) have been met (e.g., the device/vehicle is determined to be moving at a low speed, over a short distance, with few/small WiFi changes, cell tower changes, etc.).
  • Such an indicator/notification can, for example, notify or inform the user of the amount of time remaining until one or more restrictions (such as those that were employed based on a determination that the user was driving/the vehicle was moving) will be relaxed, removed, etc.
  • such an indicator/notification may not be visible or may be disabled when one or more condition(s) are met (e.g., high speed, long distance, many/large WiFi changes, cell tower changes, etc.).
  • the methods described above can also make use of information from vehicle data system 164 (e.g., OBDII), such as the speed of the vehicle and/or the gear in which the vehicle is engaged.
  • one or more 'trade-off(s)' or compromise(s) may need to be made with respect to how determinations pertaining to temporary stops/slow-downs in movement are processed in order to determine when a device is in a moving vehicle and thereby to selectively restrict functionality of the device on that basis (i.e., should the trip be determined to be over, in which case the selective restriction should be removed, or is the vehicle merely stopped, such as at a red light, in which case perhaps the selective restrictions should not be lifted).
  • one such approach is implementing a 'timeout' period (e.g., determining that a trip is over after a determination that a device has not moved for X minutes).
  • an "honor system” can be employed, whereby drivers can self- declare their trips to have ended by providing/inputting such a declaration/indication to the device (e.g., via touch, voice, visual (e.g., gesture), shake etc. input), based upon which one or more restrictions can be modified, eased, or otherwise removed from the device without having to wait until a determination is made that the device is static/slow for a certain period of time.
  • if such a self-declaration is subsequently determined to have been false (i.e., the trip had not actually ended), the user of such a device and/or the device itself can be given/ascribed a "strike."
  • upon the accumulation of a certain number of such strikes, the referenced self-declaration mechanism/technique can be disabled or otherwise cease to work (or will be made to work less effectively), such as with respect to the particular user and/or the particular device.
  • the referenced "strikes" may subsequently be canceled or otherwise "evaporate" based on different events (e.g., the passage of a certain amount of time, such as each strike expiring one week after it is created, or the travelling of a certain distance or duration, such as each strike expiring 10 driving hours after it was created), etc.
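  • The strike/expiry policy described above could be tracked as in the following illustrative sketch (Python); the limits and the time-based expiry are assumptions drawn from the examples given (an analogous ledger could be kept in driving hours rather than elapsed time).

```python
import time

class StrikeLedger:
    """Track false self-declarations ('strikes') that expire after a fixed
    interval; illustrative of one possible policy."""

    def __init__(self, max_strikes=3, ttl_seconds=7 * 24 * 3600):
        self.max_strikes = max_strikes
        self.ttl = ttl_seconds      # e.g., each strike 'evaporates' after one week
        self._strikes = []          # timestamps of strikes, in seconds

    def add_strike(self, now=None):
        self._strikes.append(now if now is not None else time.time())

    def active_strikes(self, now=None):
        now = now if now is not None else time.time()
        self._strikes = [t for t in self._strikes if now - t < self.ttl]
        return len(self._strikes)

    def self_declaration_allowed(self, now=None):
        return self.active_strikes(now) < self.max_strikes
```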
  • a driver who has completed a trip can self-declare a trip to be over (such as in the manner described above), and such a self-declaration can cause an immediate (or shortly thereafter) determination to be triggered by acquiring sensor information and/or other information to determine, as soon as possible, whether or not the trip has ended (i.e., sooner than would otherwise be determined using various other techniques).
  • Such a feature can be advantageous, for example, in settings where, in an attempt to save power, the various data acquired to determine whether or not a trip has ended are acquired with latency (e.g., with delay, in duty cycles) and/or at less than the fastest sampling rates possible.
  • one or more of the determinations that can be made in order to lift/remove the referenced selective restriction(s) can be initiated relatively faster/sooner.
  • such technique(s) can enable ongoing power-saving techniques to be applied more readily while reducing the degradation to the user experience.
  • a similar technique can be used to allow a user to cause the device to immediately (or relatively more quickly than it would ordinarily have) re-check or re-query whether or not it is in a trip.
  • Such a technique can be advantageous, for example, in situations in which the device was determined to be present within a trip when, in actuality, it was not (false positive) and/or in situations in which the device was determined not to be present within a trip when, in actuality, it was (false negatives).
  • the trip detection techniques employed with respect to devices determined to be operated by passengers can be eased relative to devices determined to be operated by drivers.
  • power can be saved by lowering the rate at which one or more sensors (e.g. GPS, accelerometer, cellular, Wifi, BT radios, etc.) on a device determined to be operated by a passenger are sampled in order to determine whether a trip has ended.
  • inputs originating from one or more low power sensors (e.g., accelerometer, gyroscope, etc.) of a device can be processed to determine an activity state of a device (e.g., stationary, walking, in-vehicle, etc.), such as based on one or more trends, patterns, etc., that can be identified within the various input(s).
  • one or more restrictions associated with such activity can be applied to and/or in relation to the device.
  • one or more additional sensors and/or radios and/or the same low power sensors sampling at higher rates can be activated, and inputs originating from and/or determined with respect/in relation to such sensors, etc., can be processed to determine whether the device is present within a vehicle and, based upon such determination, one or more restrictions associated with such activity can be applied.
  • inputs originating from one or more motion sensors can be processed to effectively identify trip starts (or trip stops) because the length of time for which the device sustains acceleration (or deceleration) during a vehicular trip start (or trip stop) (which may be for 20 seconds or more, in some vehicle classes), generally exceeds the length of time for which sustained acceleration can occur in any human-powered activity.
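  • A minimal sketch (Python, using NumPy) of the sustained-acceleration test described above follows; the 1 m/s^2 floor and 15-second duration are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def looks_like_vehicle_trip_start(accel_magnitude, sample_rate_hz,
                                  accel_floor=1.0, min_duration_s=15.0):
    """Return True if forward acceleration stays above `accel_floor` (m/s^2)
    for at least `min_duration_s` seconds, i.e., longer than a person can
    generally sustain under their own power.  Thresholds are illustrative.
    """
    above = np.asarray(accel_magnitude) > accel_floor
    needed = int(min_duration_s * sample_rate_hz)
    run = 0
    for flag in above:
        run = run + 1 if flag else 0
        if run >= needed:
            return True
    return False
```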
  • the mobile device of a driver will, on average, display larger movements (measurable by sensors 145 of mobile device 105) than that of a passenger, due to the fact that the driver is likely to be holding the mobile device 105 in only one hand, whereas a passenger is more likely to be using both hands to hold a mobile device 105, or is capable of increased focus even when using only one hand to operate mobile device 105.
  • This can preferably be done by taking the Fourier transform of a 3D acceleration function and integrating it (squared, i.e. L2-norms) over N disjoint frequency intervals, as is well known to those of ordinary skill in the art.
  • the resulting 3N numbers can preferably serve as a "signature".
  • the signature corresponding to a driver can be distinguished from that of a passenger using a classification algorithm, such as SVM, which has preferably been trained on a sufficiently large pre-classified signature bank, as is also known to those of ordinary skill in the art.
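  • The signature construction and classification described above could look roughly like the following sketch (Python, assuming NumPy and scikit-learn are available); the band count, kernel choice and function names are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def movement_signature(accel_xyz, n_bands=8):
    """Build a 3*N-element 'signature': squared-magnitude FFT energy of each
    acceleration axis, integrated over N disjoint frequency bands."""
    sig = []
    for axis in np.asarray(accel_xyz, dtype=float).T:   # iterate x, y, z columns
        spectrum = np.abs(np.fft.rfft(axis)) ** 2
        for band in np.array_split(spectrum, n_bands):
            sig.append(band.sum())                      # squared L2-norm per band
    return np.array(sig)

def train_classifier(signature_bank, labels):
    """Train an SVM on a pre-classified signature bank
    (labels: 1 = driver, 0 = passenger)."""
    clf = SVC(kernel="rbf")
    clf.fit(signature_bank, labels)
    return clf

# usage: role = clf.predict([movement_signature(window)])  # 1 = driver, 0 = passenger
```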
  • GPS - GPS 145C of mobile device 105 can be used, preferably, in certain arrangements, in conjunction with other sensors, to identify the in- vehicle position of mobile device 105. In certain arrangements this is achieved in part based on knowledge of the lane boundaries of the road on which the vehicle is driving (based on map data or computation/observation), together with a determination of mobile device's 105 location, using GPS 145C, to be on the right or left side of such lane. If mobile device 105 is in the left part of its current lane, then it can be determined to be on the left side of the vehicle within which it is traveling, while if it is in the right part of its current lane, then it is on the right side of the vehicle. Such in-lane location calculations can further be averaged over time to increase the accuracy of the location of the mobile device 105 within its then current lane and, as a result, the accuracy of the determination of the location of mobile device 105 inside the vehicle.
  • "turn" can refer to a turn of any angle and/or curvature and/or any change in lateral acceleration and/or gyroscopic yaw, no matter how large or small, and the comparisons described above can be applied discretely or continuously. It should also be appreciated that such inputs can be perceived at practically any time and/or interval, even those that do not necessarily correspond to "turns" as conventionally understood, and such inputs should be understood to be within the meaning of the term "turns" as used herein.
  • "bump" can refer to a change in the upward acceleration, irrespective of whether the change is positive or negative and irrespective of how large or small, and the comparisons and filtering described above can be applied discretely or continuously at regular or irregular sampling rates.
  • Magnetic Field - A vehicle's metallic and electrical parts influence the magnetic field in the vicinity of and inside such vehicle.
  • a 3-axis magnetometer 145E of mobile device 105 can be used to detect these influences by measuring such magnetic field(s) at various times before and during a vehicle's operation (e.g., a car that has not yet been started will have a different magnetic signature than one in which the electric systems are operating) and by comparing them with known magnetic signatures of different in- vehicle locations in order to determine the in-vehicle location of mobile device 105.
  • Such signatures can be universal and/or can depend on additional parameters such as vehicle model, vehicle location, etc.
  • in most vehicles (e.g., cars, buses), the major metallic component is the motor, and it is normally situated in the front part of the vehicle, near the center.
  • the magnetic field sensed by magnetometer 145E of mobile device 105 can be compared with the magnetic field that is otherwise present absent the magnetic disturbances - thereby indicating the direction of the motor.
  • the lateral component of that direction is preferably the opposite of the left-right in-car location of mobile device 105.
  • the values and signatures measured on and/or computed with and/or in relation to mobile device 105 are compared to baseline values (which are preferably stored in one or more databases 174, 162) in order to determine if mobile device 105 is that of a driver or a passenger.
  • baseline values can be independent of the user (e.g., the standard deviation of the time between keystrokes for all people in the country using a particular model phone), while in other arrangements such values can be user dependent (e.g., this mobile device 105 (or this user of this mobile device 105, if such is available) usually texts at 100 characters per minute, currently he is texting at the rate of 10 characters per minute - thus the person holding it is likely driving).
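  • A minimal sketch (Python) of the baseline comparison described above, using the texting-rate example, follows; the ratio threshold and function name are illustrative assumptions.

```python
def typing_rate_suggests_driver(current_chars_per_min, baseline_chars_per_min,
                                ratio_threshold=0.25):
    """Compare the observed text-entry rate against a stored baseline
    (user-specific if available, otherwise population-wide).

    A rate far below baseline (e.g., 10 cpm against a usual 100 cpm) suggests
    the user's attention is divided, i.e., that the user is likely driving.
    """
    if baseline_chars_per_min <= 0:
        return False
    return (current_chars_per_min / baseline_chars_per_min) < ratio_threshold
```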
  • In-Vehicle Location - In the United States and in most other countries in the world, drivers are the left-front most occupant in a vehicle, relative to the front end of the vehicle. By identifying whether a particular mobile device 105, 160 is or is not the left-front most device within a vehicle, a determination can be made that such device 105, 160 is or is not being operated by the driver.
  • the referenced in-vehicle identification/determination is preferably achieved in conjunction with communication between mobile device 105 and one or more of mobile devices 160, whether through direct communication or through network 166. It should also be appreciated that in certain arrangements such identification(s)/determination(s) can be performed in a server-side configuration, while in other arrangements such identification(s)/determination(s) can be performed in a client-side configuration. In one such server-side configuration, one or more software modules 130 are preferably executing at the various mobile devices 105, 160.
  • One or more of the modules configure each of the respective devices 105, 160 to transmit its absolute location coordinates (such as those provided by GPS 145C and/or an inertial navigation system (INS)) and/or its relative location (e.g., 3 meters from WiFi device #1234) to central machine 168.
  • Central machine 168 can then process the various locations coordinates and/or relative locations received from the various devices 105, 160 in order to determine which of the various devices 105, 160 are sufficiently close to one another, over a period of time (e.g., 1 minute, 1 hour, etc.), based on which it can be determined that such devices 105, 160 are present within the same vehicle.
  • the mobile devices 105, 160 communicate between one another (such as through communication interface 150), exchanging absolute location and/or relative location and determining which other devices 105, 160 are within the same vehicle, substantially in the manner described above with regard to the server-side configuration.
  • one of devices 105, 160 can emit a tone and/or signal (such as an audio tone), and only those devices 105, 160 that perceive the emitted tone are determined to be within close proximity of the device that emitted the tone.
  • sensor data (that is, data originating at one or more of sensors 145, such as location coordinates from GPS 145C, or lateral accelerations during a turn) from the various devices 105, 160 can be compared with one another to determine a relative in-vehicle location of one or more of the devices 105, 160.
  • Such relative location can be subsequently filtered to generate a real-time driver-passenger determination, providing increasing accuracy in driver/passenger identification.
  • the driver of a vehicle is generally better able to anticipate the movements of the vehicle he/she is driving as compared to the passengers because the driver is the initiator of many of the movements that the vehicle undergoes, and can thus anticipate the forces that are created as a result of the vehicle's movement.
  • Such predictive actions can be detected by one or more of sensors 145 of mobile devices 105, 160 (e.g., accelerometer 145A and/or gyroscope 145B), and can be further processed to identify whether a particular mobile device 105, 160 is being used by a driver or a passenger.
  • a driver instinctively tenses and/or flexes certain of his/her muscles to adjust for the vehicle movements that are about to occur, on average more adroitly (less suddenly, with less corrective body movement) and more quickly than a passenger does.
  • a driver anticipates and compensates for the forces experienced during a turn quicker and more accurately than a passenger in the vehicle does.
  • a driver anticipates and compensates for the forces experienced during sharp deceleration (braking) more quickly and more accurately than a passenger.
  • a driver also anticipates and compensates for the forces of a lane change more quickly and more accurately than a passenger.
  • the driver can be thought of as a dampening system which performs better than a corresponding "passenger" system, due to the driver's higher degree of consciousness, awareness, and/or anticipation.
  • one or more of the listed effects/phenomena can be detected/identified by processing one or more inputs from one or more sensors 145, such as by measuring the change in acceleration (i.e. the L2 norm of the derivative of the acceleration) over the relevant time window.
  • the acceleration is preferably further band-pass filtered to focus only on frequencies relevant to this determination, and to further exclude other driver- acceleration effects (e.g., handshaking, etc.) as discussed herein.
  • Magnetic Field - A vehicle's metallic and electrical parts influence the magnetic field in the vicinity of and inside such vehicle.
  • Inputs originating at a 3-axis magnetometer 145E of a mobile device 105, 160 can be used to detect and determine these influences by processing such inputs to determine a magnetic field at various times before and during such vehicle's operation (e.g., a car that has not yet been started will have a different magnetic signature than one in which the electric systems are operating) and by comparing them with known magnetic signatures of different in-vehicle locations in order to determine the in-vehicle location of such device 105, 160.
  • the presence of two or more devices within a single vehicle can influence each other's magnetic readings in a way that can be determined based on their comparison.
  • processing of the various inputs discussed herein is preferably enhanced by incorporating various additional processing operations which serve to further enhance the accuracy of the determinations that are made.
  • additional processing operations include, but are not limited to:
  • Clock Synchronization - In arrangements where inputs originating from multiple devices 105, 160 are processed together (such as several of those referenced above in EXAMPLE 2), it is preferable that simultaneous timing measurements originating at the respective devices 105, 160 are compared as well. In one arrangement, this can be effectively achieved by synchronizing the internal clocks of the respective devices 105, 160. By way of illustration, a relative displacement can be estimated, and this estimate can be used to process all relevant inputs such that they are synchronized to the same clock.
  • Examples of such synchronization methods include: (A) processing time inputs from GPS 145C to compute a mean time displacement between the GPS clock and the clock of each device 105, 160; the difference between those displacements can be determined to be the displacement between the devices. (B) Configuring one of the devices 105, 160 to emit a sound and receiving the sound at a second device (such as at microphone 145D), noting the time the respective events occurred at each device (that is, the time of the emitting of the sound and the time of the receipt of the sound), and then repeating the same process in reverse. The noted times can then be subtracted from one another, reflecting the time that it takes the sound to travel; such travel times cancel themselves out, leaving twice the relevant time displacement remaining.
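  • The two synchronization methods described above reduce to a few lines of arithmetic, sketched below (Python); the function names and the sign convention (offset = clock_B - clock_A) are assumptions for illustration.

```python
def clock_offset_from_gps(gps_minus_a_samples, gps_minus_b_samples):
    """Each sample is (GPS time - device time) for one fix.  With
    displacement_X = mean(GPS - X), device B reads ahead of device A by
    displacement_A - displacement_B."""
    mean_a = sum(gps_minus_a_samples) / len(gps_minus_a_samples)
    mean_b = sum(gps_minus_b_samples) / len(gps_minus_b_samples)
    return mean_a - mean_b                          # = clock_B - clock_A

def clock_offset_from_sound_roundtrip(t_emit_a, t_recv_b, t_emit_b, t_recv_a):
    """Estimate (clock_B - clock_A) from a two-way audio exchange.

    Device A emits a tone at t_emit_a (A's clock); device B hears it at
    t_recv_b (B's clock).  Then B emits and A hears at t_emit_b / t_recv_a.
    The sound's travel time cancels, leaving twice the clock displacement.
    """
    return ((t_recv_b - t_emit_a) - (t_recv_a - t_emit_b)) / 2.0
```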
  • FIG. 9 A depicts the relative coordinate system of mobile device 105, as is known to those of ordinary skill in the art and referenced herein.
  • FIG. 9B depicts the relative accelerations and gyroscopic rotations of a mobile device, as is known to those of ordinary skill in the art and referenced herein. It should be understood that although mobile device 105 is not shown in FIG. 9B for the sake of clarity, the various relative acceleration and rotations shown in this figure are relative to a mobile device in the same position as that shown in FIG. 9A.
  • FIG. 9C depicts the gyroscopic sign convention used herein, as is known to those of ordinary skill in the art and referenced herein.
  • FIG. 10 depicts the coordinate system used in relation to a vehicle (such as at vehicle data system 164), as is known to those of ordinary skill in the art and referenced herein.
  • FIGs. 11A-B depict mobile device 105 and its respective coordinate system in relation to a car and its respective coordinate system.
  • the respective coordinate systems can be transitioned, such that it is recognized, for example, that the +Z coordinate of the car corresponds to the +Y coordinate of the mobile device 105, and the +Y coordinate of the car corresponds to the -Z coordinate of the mobile device 105, as can be appreciated with reference to FIG. 11B.
  • Establishing the orientation of a mobile device 105, 160 within the coordinate system of a car can be accomplished in a number of ways.
  • the mean acceleration vector (which, over time, is dominated by gravity) can be determined and identified as the "down" axis.
  • the "forward" axis can be determined by comparing/processing inputs from GPS 145C that correspond to direction angles with inputs from magnetometer 145E that reflect 'north.' The third axis can be computed based on the first two determined axes using vector multiplication as is known to those of ordinary skill in the art.
  • that is, such an orientation can be established by processing inputs from the accelerometer 145A, the magnetometer 145E and the GPS 145C (e.g., heading data) together.
  • inputs originating at accelerometer 145 A, gyroscope 145B and/or other sensors 145 can be processed to identify real-time changes in the orientation of a device 105, 160.
  • combined acceleration/magnetic/GPS estimates can be generated, preferably using "sensor fusion" algorithms, as is known to those of ordinary skill in the art. In doing so, the above-referenced "static" approach can be utilized to dynamically determine the relative orientation of the device 105, 160.
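  • The following sketch (Python, using NumPy) illustrates the "static" axis-establishment approach described above: a gravity-derived "down" axis, a "forward" axis obtained from the GPS heading and magnetometer north, and a third axis from a vector product. Sensor sign conventions, input formats and names are assumptions for illustration.

```python
import numpy as np

def vehicle_axes_in_device_frame(mean_accel, gps_heading_deg, mag_field):
    """Estimate the vehicle's 'down', 'forward' and lateral axes, expressed
    in the device's own coordinate frame.

    'down' follows the mean acceleration vector (per the convention above;
    depending on the accelerometer's sign convention it may need negating),
    'forward' rotates horizontal magnetic north by the GPS course angle,
    and the third axis is obtained by a vector (cross) product.
    """
    down = np.asarray(mean_accel, dtype=float)
    down /= np.linalg.norm(down)

    # horizontal component of the magnetometer reading = magnetic north
    north = np.asarray(mag_field, dtype=float)
    north -= np.dot(north, down) * down
    north /= np.linalg.norm(north)
    east = np.cross(down, north)       # in a north/east/down frame, down x north = east

    h = np.radians(gps_heading_deg)    # course over ground, clockwise from north
    forward = np.cos(h) * north + np.sin(h) * east
    lateral = np.cross(down, forward)  # points to the vehicle's right
    return down, forward, lateral
```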
  • the gyroscopic sign convention adopted herein is preferably such that if an observer positioned on the positive part of the axis of rotation sees the rotation as counterclockwise, it is deemed to be positive.
  • Low-Pass Filtering - The values derived and/or computed from the various inputs originating at the various sensors 145 of mobile device 105, 160 can frequently be compromised by the vibration(s) present in the car's environment (originating at the car's engine, road bumps, imperfect wheels, wind blowing through the windows, or even car audio sounds). Such vibrations can inject "noise" into the inputs originating at the various sensors 145, and can adversely affect the precision of the processing of the various algorithms disclosed herein, both in terms of efficiency and final accuracy.
  • one or more of devices 105, 160 within the vehicle are attached to a dampening device.
  • a dampening device can include one or more weight(s) that can be attached to the mobile device 105, 160 to effectively increase its mass and thus make it more vibration resistant.
  • other dampening materials (e.g., sorbothane pads) can also be utilized for this purpose.
  • the inputs can preferably be processed with a bounded pass filter. One such example is an FIR filter with 128 taps and a Hamming window.
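  • A minimal sketch (Python, assuming SciPy is available) of such 128-tap FIR/Hamming-window filtering, applied here as a low-pass filter to suppress vibration, follows; the cutoff frequency is an illustrative assumption.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def smooth_sensor_stream(samples, sample_rate_hz, cutoff_hz=5.0, taps=128):
    """Suppress engine/road vibration by filtering the raw sensor stream with
    a 128-tap low-pass FIR filter designed with a Hamming window.  The cutoff
    is illustrative and would be tuned to the vibration environment."""
    coeffs = firwin(taps, cutoff_hz, window="hamming", fs=sample_rate_hz)
    return lfilter(coeffs, 1.0, np.asarray(samples, dtype=float))
```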
  • Sensor Fusion: As has already been noted and illustrated above, various determinations can be made by processing inputs from several sensors 145 together (e.g., forward velocity inputs originating at both the accelerometer 145A and GPS 145C).
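  • One simple fusion strategy can be sketched as a complementary filter: short-term changes in forward velocity come from integrating forward acceleration, while occasional GPS speed fixes correct long-term drift. The blend factor and sampling interval below are illustrative assumptions, not values from this disclosure.

      ALPHA = 0.98    # weight given to the accelerometer prediction between GPS fixes
      DT = 0.01       # assumed accelerometer sampling interval, in seconds (100 Hz)

      def fuse_forward_velocity(prev_velocity, forward_accel, gps_speed=None):
          """prev_velocity: last fused estimate (m/s); forward_accel: forward
          acceleration sample (m/s^2); gps_speed: latest GPS speed (m/s) if a
          new fix is available, otherwise None."""
          predicted = prev_velocity + forward_accel * DT
          if gps_speed is None:
              return predicted
          return ALPHA * predicted + (1.0 - ALPHA) * gps_speed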
  • biometric authentication methods include, but are not limited to, voice recognition, fingerprint recognition, face recognition, DNA identification, and retina identification.
  • (a) If talking, mobile device 105 is restricted to being held on the left side (right side for U.K.) of the head/face of the user and with an upright orientation, so that driver usage cannot be hidden from external observers. (b) If texting, mailing, browsing, etc., mobile device 105 is restricted to operating when having a straight orientation (no yaw) (adjustment can be necessary, in certain arrangements, for a vertical/horizontal keyboard) and at least close to an upright orientation (cannot be on a knee or low down, so that the driver cannot "hide" the device use from external observers).
  • the device 105 is restricted to operating in one or more ways only when camera 145F perceives a frequently moving background (e.g., be held high, not hidden low in the driver's lap or blocked by the steering wheel).
  • the device 105 is restricted to operating in one or more ways that can be determined to correspond to operation by a passenger (and/or correspond to operation by a passenger of a particular device), such as the various determinations described in detail herein.
  • no correlation or, alternatively, no negative correlation
  • the device 105 is restricted to operating only when it can be determined based on one or more inputs that the device is under the control of a passenger and/or under the control of a passenger using this particular device, such as the various determinations described in detail herein.
  • the device is under the control of a passenger, as a passenger has the ability to control "shake” by using both hands to steady the device - an option not always available to drivers who generally need their second hand to steer the vehicle.
  • the general type or class of the vehicle e.g., motorcycle, car, bus, train, boat, plane, etc.
  • a signature analysis that is, an analysis of various patterns in various inputs
  • an accelerometer 145A and/or gyroscope 145B and/or GPS 145C of mobile devices 105 and/or 160 that there is a high-likelihood that a particular mobile device 105, 160 is located on a train
  • that the mobile device 105, 160 can remain fully operational without any operation state restrictions (assuming that no restrictions apply to anyone on the train including the conductor).
  • restrictions can be applied (e.g., no phone use at all or just no texting), particularly if it is determined that the user of the mobile device 105, 160 is the driver of the car, and not a passenger.
  • the type or class of vehicle in which a mobile device 105, 160 is located can preferably be identified and/or determined by using one or more of sensors 145 of mobile device 105. In certain arrangements, this identification/determination can be improved by using the onboard sensors of other mobile devices 160 and/or the onboard sensors (e.g., vehicle data system 164) of the vehicle in which mobile device 105 is traveling.
  • a device can be authenticated (e.g., determined to be likely to be operated by a user who is a passenger) if it can be determined that the user of the device is able to perform one or more actions (such as providing certain inputs) and/or demonstrate/provide evidence of certain situations (such as providing photographic/videographic documentation of such situations) that a user who is simultaneously driving a vehicle would not be reasonably capable of doing.
  • examples of such methods of authentication include: if, while having determined that the vehicle (within which the mobile device is present) is in motion, the user of the device can be determined to be capable of (a) performing an action in a different part of the vehicle (such as in an area of the vehicle where the driver could not reasonably sit and/or reach); (b) holding his/her look/gaze (i.e., maintaining focus of his/her eyes) in a direction (such as towards the mobile device) that is not towards the road ahead, for a defined/sufficiently long period of time; (c) using/interacting with the device with two hands for a sufficiently long period of time/performing one or more tactile gestures that require substantially simultaneous use of both hands of the user (it should be noted that the terms "tactile gesture" and "tactile gestures" as used herein are intended to encompass inputs and/or interactions that are provided by a user in a tactile manner, such as through physical interaction with one or more media, elements, and/or components with one or more fingers or hands of the user).
  • the device user can be prompted to provide one or more inputs in order to authenticate that s/he is a passenger.
  • authentication methods/approaches include performing an action or a set of actions, such as providing one or more alphanumeric inputs in response to a CAPTCHA prompt (as is known to those of ordinary skill in the art), providing one or more inputs during the course of interacting with a game, providing one or more inputs in attempting to solve a puzzle, a lock screen, etc.
  • authentication approaches/methods can be configured to require a significant degree of concentration (and/or prolonged concentration) on the part of the user, such that such authentication approaches/methods are too difficult to be successfully performed (and/or consistently successfully performed) by a driver of a moving vehicle (who, presumably, must concentrate on the road ahead).
  • Such authentication approaches/methods can be further improved by configuring them to require that (a) the authentication can only be successfully completed (e.g., authenticating the user of the device as the passenger) when the user uses both hands simultaneously (as described in detail herein); and/or (b) the authentication can only be successfully completed when the required inputs are provided such that the user cannot simultaneously see the road ahead while performing the authentication (such as by making the user tilt his/her head and/or eyes down or up or right or left, e.g., by requiring that the device be held flat and/or placed in the user's lap or on the seat between the user's legs, as described herein, and/or by requiring that the user look directly in to the device in the manners described herein).
  • various restrictions can configure the mobile device to require that the device be placed or held flat (such that the z-accelerometer of the device indicates the approximate value of gravity, e.g., when the device is positioned in the user's lap or on the seat between the user's legs), and that the user tilts his/her head so that the camera(s) of the device can detect the eyes, gaze, face, and/or smile, etc., of a person (not necessarily limited to a particular person), preferably for a certain minimum period of time (controlling, in certain implementations, for blinking and similar effects).
  • a forward-facing camera (e.g., a camera on the side of the device that the screen is on in contemporary devices, such as the iPhone 4S produced by Apple of Cupertino, California, USA, and as is depicted in FIG. 15F, wherein 145F1 corresponds to a forward-facing camera and 145F2 corresponds to a rear-facing camera)
  • a rear-facing camera (e.g., a camera on the side of the device that the screen is not on in contemporary devices, such as the iPhone 4S produced by Apple of Cupertino, California, USA)
  • the authentication methods/approaches described herein can be configured to authenticate a user of a mobile device as a passenger upon determining the presence of the eyes/gaze/face/smile of the user, such as from within a visual capture, only when such a visual capture can be determined to have been recorded while the mobile device was positioned in a manner/orientation that precludes/prevents the user from simultaneously seeing/focusing on the road ahead.
  • a driver operating the mobile device would generally only be capable of positioning the device in a manner/orientation whereby the device is able to capture his/her eyes/gaze/face/smile etc.
  • the authentication methods/approaches described herein can be configured to authenticate a user of a mobile device as a passenger only if the user holds the device within a certain range of distance from her eyes/gaze/face/smile etc. In doing so, it will be difficult, if not impossible, for a driver to fraudulently authenticate him/herself as a passenger by attempting to authenticate the device by holding the device close to his/her head such that the resulting visual capture recorded by the device does not contain any part of the steering wheel (as described herein).
  • the distance can be determined, for example, based on the number of pixels in the eyes/gaze/face/smile etc., relative to the resolution of the camera and/or relative to the angle of incidence and/or using other distance measurement techniques known to those of ordinary skill in the art.
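  • As one illustrative sketch of such a distance estimate, a pinhole-camera relationship can be applied to the number of pixels spanned by the detected face; the average face width and the focal length in pixels below are assumptions, not disclosed values.

      AVG_FACE_WIDTH_M = 0.16    # assumed average human face width, in meters
      FOCAL_LENGTH_PX = 1400.0   # assumed camera focal length, in pixels

      def estimate_face_distance(face_width_px):
          """face_width_px: width of the detected face region, in pixels."""
          if face_width_px <= 0:
              raise ValueError("face not detected")
          return FOCAL_LENGTH_PX * AVG_FACE_WIDTH_M / face_width_px

      # e.g., a face spanning 280 pixels yields roughly 0.8 m, which could then
      # be compared against the range permitted for authentication.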
  • the authentication methods/approaches described herein can be configured to authenticate a user of a mobile device as a passenger upon determination that the user performed (and/or is capable of performing) one or more actions/provided one or more inputs that can be determined to require the use of one or two hands (e.g., swipe gestures at a touchscreen, including one or more times, such as in an implementation in which a forward-facing camera is used, and/or requiring a user to press a touchscreen with one or more fingers and/or touch one or more buttons on the device, irrespective of whether the camera is rear-facing or forward-facing), such as is described with reference to FIG. 15A.
  • the device when performing the referenced authentication, it can be useful to configure the device such that the user is required to (a) hold the device at a certain orientation, e.g., in landscape orientation, thereby making it more difficult for a driver trying to falsely authenticate/unlock the device (which would otherwise allow the driver to perform certain actions), and/or (b) to encourage the user to hold the device in such orientation by having the device's screen display in landscape mode (that is, displaying the user interface in landscape mode, as described in detail herein), regardless of whether the device is actually held in a landscape orientation. It can be appreciated that such a configuration is relatively less of a burden on a passenger, as described in detail herein.
  • the authentication methods/approaches described herein can be configured to authenticate a user of a mobile device as a passenger only if no part of a steering wheel is present in a visual capture, such as a visual capture of a portion of the user.
  • the mobile device can be configured to provide audio and/or vibrational/tactile feedback to the user (such as in the case of a forward-facing or rear-facing camera, whereby the user can be directed/instructed as to how to tilt/orient the device/camera in order to achieve the requisite angle required by the particular authentication method, as described in detail herein) and/or visual feedback (such as in the case of a rear-facing camera and/or a forward-facing camera) during the authentication process.
  • the user can be provided with guidance/feedback, as needed, in order to enable the user to achieve the requisite input (such as a particular visual capture) in order to authenticate the device, as described herein.
  • FIGS. 15D and 15E depict various examples of visual feedback that can be provided to a user during authentication.
  • upon determining that a device is in motion (or otherwise determining that a trip has begun) (such as in any number of ways described herein, and as depicted in FIGS. 15H and 15I), a user can be prompted or otherwise presented with an interface that can enable the user to initiate an authentication process, such as by interaction with a 'slide to unlock' interface on a touchscreen, as shown in FIG. 15J.
  • another interface can be provided which incorporates/implements any number of variable input techniques.
  • a keypad or keyboard, etc.
  • a position with respect to the touchscreen of the device can change based on one or more inputs.
  • FIGS. 15K-15P depict how the position of a keypad on a touchscreen can change based on and/or in relation to the angle/orientation of the device (as determined, for example, based on the accelerometer and/or gyroscope of the device).
  • the interface can be configured such that the keypad is centered within the touchscreen when the device is held in a 'flat' orientation (as shown in FIG. 15Q and described herein), while being positioned elsewhere on the screen (for example, in a manner that precludes a user from utilizing the entire keypad for input) when the device is held in other orientations (as shown in FIGS. 15K-P).
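  • A minimal sketch of how such a keypad position could be driven by device tilt (as reflected in the accelerometer) follows: the keypad is centered only when gravity is read predominantly on the Z axis (a 'flat' orientation) and is pushed off-center otherwise. The thresholds are illustrative assumptions.

      def keypad_offset(accel_x, accel_y, accel_z, screen_w, screen_h, flat_tol=1.5):
          """Returns (dx, dy) pixel offsets from the centered keypad position.
          Accelerations are in m/s^2; the device is treated as 'flat' when
          gravity falls almost entirely on the Z axis."""
          if abs(accel_x) < flat_tol and abs(accel_y) < flat_tol and abs(accel_z) > 8.0:
              return 0, 0                                  # flat: keypad centered
          # otherwise shift the keypad in proportion to the tilt, up to half a screen
          dx = int(max(-1.0, min(1.0, accel_x / 9.81)) * screen_w / 2)
          dy = int(max(-1.0, min(1.0, accel_y / 9.81)) * screen_h / 2)
          return dx, dy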
  • upon orienting the device in a 'flat' orientation (and thus centering the keypad), the user can be prompted to hold their thumb (or another finger) in a specific region of the touchscreen, such as in the manner described herein.
  • upon determining that the user is doing so (that is, maintaining his/her thumb/finger on the specified region), the user can be presented with a sequence of several numbers which are to be input into the keypad in order to authenticate the user to the device as a passenger, as shown in FIGS. 15R-T.
  • upon successfully inputting the presented sequence, the user can be authenticated as a passenger and various aspects/functions of the device can be activated/unrestricted, as described herein.
  • the sequence can allow for some degree of user error (e.g., mistyping of the presented numeric sequence into the keypad), while also preventing a user from authenticating in the event that the quantity of errors exceeds a defined threshold (as described herein).
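  • Such an error-tolerance check can be sketched as a position-by-position comparison of the entered digits against the presented sequence, with authentication allowed only while the number of mismatches stays at or below a defined threshold (the threshold value below is an assumption).

      def sequence_accepted(presented, typed, max_errors=1):
          """presented, typed: equal-length strings of the digits shown/entered."""
          if len(typed) != len(presented):
              return False
          errors = sum(1 for shown, entered in zip(presented, typed) if shown != entered)
          return errors <= max_errors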
  • the described authentication sequence is exemplary and that any number of modifications or adjustments or omissions to the sequence can be implemented and are within the scope of the present disclosure.
  • instead of a keypad, the user may be presented with a keyboard and a sequence of letters to input.
  • the referenced technique(s) can be employed in relation to an orientation whereby the device is oriented, for example, such that the Y axis of the device (as shown in FIGs. 9A-9B and described in detail herein) is substantially aligned with the direction that the vehicle is traveling in, and the X-axis (as shown in FIGs. 9A-9B) is aligned with the up/down of the vehicle (as shown, for example, in FIG.
  • that the orientation is consistent with the X-axis being up/down can be confirmed based on one or more inputs from the device's accelerometer, such as in the manner described herein.
  • the referenced techniques can be employed relatively consistently with respect to the various users (i.e., the same or a comparable orientation can be required of a user irrespective of whether he/she can be determined to be a driver or a passenger), in other implementations such an orientation can be selected based on a determination as to whether a particular user is likely to be a driver or a passenger (or a determination as to whether such a user is positioned on the right side or left side of the vehicle) as determined, for example, as described herein.
  • the user can be directed to orient the device in a particular way (e.g., such that the user must face towards the center of the vehicle in order to continue the referenced authentication process).
  • the referenced orientations are also exemplary and that any other such orientation can be similarly implemented.
  • the referenced technique(s) can be employed in relation to a substantially 'flat' device orientation (as determined, for example, as described herein) and an orientation in which the Y-axis (or X-axis) of the device is oriented substantially up/down and the X-axis (or Y-axis) is substantially collinear with the direction in which the vehicle is traveling.
  • FIG. 58 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • a degree of correlation between one or more of (a) an axis of a device and a direction with respect to which a vehicle (within which the device can be determined to be present) is determined to be traveling and/or (b) an axis of a device and a direction with respect to the earth/ground can be determined.
  • an authentication/verification option/task can be enabled and/or an authentication attempt can be approved, such as with respect to the mobile device.
  • an option can be enabled/attempt can be approved based on a determination (e.g., at 5810) that the degree to which the one or more axes and their respective directions correlate (e.g., with respect to a direction that the vehicle is traveling and/or the ground) meets or exceeds one or more defined thresholds.
  • a passenger authentication task may require that the orientation of the device must be determined to be one in which its y-axis (or x-axis) is within a certain range of the vehicle's heading (e.g., as measured by comparing the device's magnetometer with its GPS heading).
  • such an authentication task may require that the device's x-axis (or y-axis) be determined to be within a certain range of normal to the ground/earth (with respect to its roll and yaw).
  • the device's positive (or negative) y-axis must be determined to be pointed within +/- 20 degrees of the vehicle's heading and the device's positive (or negative) x-axis must be determined to be pointed up within +10 to -20 degrees of normal to the ground (roll) and +/- 5 degrees of normal to the ground (yaw), such as is shown in FIG. 50 (angle ranges not shown).
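  • The threshold test above can be sketched as follows, assuming the relevant angles (y-axis versus vehicle heading, and x-axis roll and yaw versus normal to the ground) have already been computed from the device sensors in degrees; the numeric ranges follow the example given.

      def wrap180(angle_deg):
          """Map an angle difference into the range [-180, 180) degrees."""
          return (angle_deg + 180.0) % 360.0 - 180.0

      def passes_orientation_check(y_vs_heading_deg, x_roll_vs_normal_deg, x_yaw_vs_normal_deg):
          heading_ok = abs(wrap180(y_vs_heading_deg)) <= 20.0      # +/- 20 degrees of heading
          roll_ok = -20.0 <= x_roll_vs_normal_deg <= 10.0          # +10 to -20 degrees (roll)
          yaw_ok = abs(x_yaw_vs_normal_deg) <= 5.0                 # +/- 5 degrees (yaw)
          return heading_ok and roll_ok and yaw_ok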
  • a window whose forward-facing edge is substantially vertical is indicative of a passenger window in most cars and is also prevalent in trains and buses. Moreover, the forward facing edges of windows of front seat occupants (drivers or passengers) are not substantially vertical. Accordingly, processing a visual capture provided by one or more device camera(s) (whether captured actively by a user and/or passively) together with the heading of the vehicle and the orientation of the device (e.g., using GPS, compass, magnetometer, accelerometer) can be used to identify the presence of window(s), their forward facing edges, as well as the degree of verticality of all or part of the forward facing edge (which, as noted, can be used to determine whether or not the user is likely to be a driver - or at least potentially a driver - or a passenger).
  • FIG. 47 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • a first visual capture can be received, such as from a first camera of a user device.
  • a second visual capture can be received, such as from a second camera of the user device. In certain implementations, the second visual capture can be received substantially concurrent with receipt of the first visual capture.
  • the first visual capture can be processed, such as to identify a presence of a face within the first visual capture.
  • the second visual capture can be processed, such as to identify a presence of a face within the second visual capture.
  • one or more operations can be initiated, such as with respect to the user device. In certain implementations, the one or more operations can be initiated based on (a) an identification of a face within the first visual capture and/or (b) an identification of a face within the second visual capture. In certain implementations, one or more restrictions employed with respect to the user device can be modified. In certain implementations, a user of the user device can be identified as a passenger.
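  • A minimal sketch of this flow, using a stock OpenCV face detector, is shown below: each of the two captures is scanned for a face, and an operation is initiated only if a face is identified in both. The lift_restrictions callback is hypothetical and stands in for whatever operation is initiated with respect to the user device.

      import cv2

      _detector = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

      def contains_face(frame_bgr):
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          faces = _detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
          return len(faces) > 0

      def process_dual_capture(front_frame, rear_frame, lift_restrictions):
          """front_frame, rear_frame: BGR images from the two cameras."""
          if contains_face(front_frame) and contains_face(rear_frame):
              lift_restrictions()    # e.g., modify restrictions / treat the user as a passenger
              return True
          return False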
  • passenger authentication task(s) including but not limited to numeric input-related tasks and video capture-based tasks, such as those described herein, including in scenarios where various other elements/aspects/components (such as those described herein in relation to various passenger authentication tasks such as landscape orientation, thumb/multi-finger tasks etc.) are included/incorporated.
  • an authentication task can be employed such that the authenticator (e.g., a passenger) can be prompted to use a camera of the device (such as a rear-facing camera) to take a visual capture that can be processed to identify the inclusion of the profile (or another angle shot) of another vehicle occupant's face/head (for example, in lieu of a straight-on shot of the authenticator's face as described herein).
  • the authenticator can be required to hold the device such that the screen is substantially parallel to the vehicle's side window(s) and/or in an orientation where the device's touchscreen is facing the passenger's window. It can be appreciated that in such an orientation a driver is relatively unlikely to be able to take such a picture.
  • one or more inputs originating at the device can be processed to determine that such input(s) indicate that a driver is likely to be attempting such an authentication task (and the authentication attempt can thus be denied).
  • the referenced authentication task can also incorporate one or more of the other elements described herein (e.g., landscape orientation, thumb/multi-finger tasks, number input tasks, etc.), thereby further reducing the likelihood that the referenced task(s) can be performed by a driver, while also avoiding substantially increasing the difficulty for a passenger to perform them.
  • an authentication task can be employed such that the authenticating user can be prompted to hold/orient the device such that the screen of the device is substantially parallel to the vehicle's side windows and is also positioned in an orientation where the device's screen is facing the passenger's window. In such a scenario, a driver is relatively unlikely to be able to take such pictures.
  • the one or more inputs originating at the device can be processed to determine that such input(s) indicate that a driver is likely to be attempting such an authentication task (and the authentication attempt can thus be denied).
  • the authenticating user can also be prompted to perform one or more of the other tasks, elements, etc. described herein (e.g., landscape orientation, thumb/multi-finger use, numeric input tasks, etc.), thereby further reducing the likelihood that the referenced task(s) can be performed by a driver, while also avoiding substantially increasing the difficulty for a passenger to perform them.
  • the authentication task can be employed such that the authenticating user (e.g., a passenger), can be prompted to use the device's front-facing and rear-facing camera simultaneously (or in sufficiently close chronological proximity to one another and/or interleaved) to take visual captures, each of which can be processed to identify a presence of a human face/head (e.g., a driver and a passenger) therein, whereby the front-facing camera (e.g., the user's camera) provides a video capture that includes a substantially straight-on face/head shot.
  • the authenticating user can be prompted to hold/maintain the position/orientation of the device such that the screen of the device is substantially parallel to the vehicle's side windows and is also positioned in an orientation where the device's screen is facing the passenger's window.
  • a driver is relatively unlikely to be able to take such pictures.
  • the one or more inputs originating at the device can be processed to determine that such input(s) indicate that a driver is likely to be attempting such authentication task (and the authentication attempt can thus be denied).
  • the authenticating user can also be prompted to perform one or more of the other tasks, elements, etc. described herein (e.g., landscape orientation, thumb/multi-finger use, numeric input tasks, etc.), further reducing the likelihood that the referenced task(s) can be performed by a driver, while also avoiding substantially increasing the difficulty for a passenger to perform them.
  • various aspects of the interface(s) depicted in FIGS. 15H-T can be configured to operate (e.g., be presented on a touchscreen) specifically/only in landscape mode/orientation. It can be appreciated that such an orientation can be more difficult for users who are drivers to interact with, while being relatively easier for users who are passengers to interact with.
  • any number of elements, operations, steps, or aspects of the referenced sequence(s) can be configured to determine that a user is likely to be a driver or a passenger at varying/intermittent stages of the sequence.
  • a sequence can be configured to terminate upon a determination that the device is not being held with at least a certain degree of stability (as can be determined as described herein, such as based on inputs from an accelerometer and/or gyroscope).
  • a sequence can be configured to terminate upon a determination that the sequence (or any number of the steps/aspects of the sequence) is not completed within a defined timeframe.
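  • Both termination conditions can be sketched together as follows: the sequence is aborted if recent accelerometer samples vary too much (insufficient stability) or if a defined timeframe has elapsed. Both thresholds are illustrative assumptions.

      import time
      import statistics

      STABILITY_LIMIT = 0.6    # max allowed standard deviation of |acceleration|, m/s^2
      TIME_LIMIT_S = 20.0      # max allowed duration of the authentication sequence

      def should_terminate(accel_magnitudes, started_at):
          """accel_magnitudes: recent |acceleration| samples; started_at: the
          time.time() value recorded when the sequence began."""
          too_shaky = statistics.pstdev(accel_magnitudes) > STABILITY_LIMIT
          too_slow = (time.time() - started_at) > TIME_LIMIT_S
          return too_shaky or too_slow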
  • a user of a mobile device can be authenticated as being a passenger even if the vehicle within which the device is present is not moving, e.g., by recording a visual capture within which the driver's hand and the steering wheel can be determined to be present, as described in detail herein.
  • the authentication tasks/techniques described herein can include one or more oral/audio tasks (as can be prompted/instructed by the device) which can be perceived/detected by one or more microphone(s) of the device, optionally in conjunction with one or more other device sensors, and which can be processed/analyzed in order to authenticate a user of the device as a passenger, for example.
  • the device can prompt/ask the user to hold his device with two hands in a flat orientation (as if it were lying flat on a table), in landscape mode, and speak a word that flashes/is presented on the device screen at some time over a 5 second time period.
  • if the voice input provided by the user and perceived by the microphone(s) can be recognized as being the correct word that was presented (or within a defined margin of error from it), while the accelerometer/gyroscope of the device also recognizes that the device was held flat/in the proper orientation, and the touchscreen recognizes that tactile contact was provided in one or more areas of the device (e.g., corners of the device screen) as instructed and in such a way that it can be determined to be likely that the device was being held with two hands, the user can be determined to be (and authenticated as) a passenger.
  • FIG. 122 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • an input can be received, such as in a manner described herein.
  • such an input can be a visual capture, such as a visual capture that includes a face of the user, a fingerprint of the user, etc.
  • such an input can be a biometric input which can include, for example, a biometric element of the user, a fingerprint input, etc.
  • the input can be processed. In doing so, an in-vehicle role of a user can be determined and an identity of the user can be authenticated. In certain implementations, the input can be processed to determine an in-vehicle role of a user and/or to authenticate an identity of the user, such as based on a motion of the user.
  • one or more actions can be initiated, such as with respect to the user and/or the device.
  • such actions can be initiated based on a determination that the in-vehicle role of the user is a passenger and an authentication of the identity of the user. In certain implementations, one or more restrictions can be adjusted, such as with respect to the first device.
  • it can be determined, in relation to the user, which data to log with respect to a trip.
  • data logged during a trip can be associated with a navigation history associated with the user.
  • a lock screen (e.g., secure or insecure)
  • driver mode (e.g., one or more restrictions)
  • the user can be prompted to complete only one task to fulfill both requirements.
  • this can be done by using a verification technique which includes: (a) a visual capture of the user's face, which visual capture can also be used to authenticate the user as an authorized user of the device (e.g., using facial recognition); and/or (b) a press of the user's finger, e.g., in a particular location, which finger press can also be used to authenticate the user as an authorized user of the device (e.g., fingerprint recognition); and/or (c) one or more of the user's motions (e.g., signature of movement, timing, pressure) being used to identify the user as an authorized user of the device.
  • the task may only be completed (or started) if the vehicle is moving (perhaps above a certain threshold speed), as determined, with latency, for example, using GPS, which requires time to get a first fix (TTFF).
  • routine 1500 that illustrates a broad aspect of a method for selectively restricting operation of a mobile device 105 in accordance with at least one embodiment disclosed herein.
  • such selective restriction of a mobile device can be employed in order to restrict operation of the device by a user who is a driver of a vehicle, while not restricting operation of the device (or, restricting operation of the device relatively less) by a user who is a passenger of a vehicle. Accordingly, it should be appreciated that various of the determinations and identifications described herein are directed towards one or more factors and/or aspects that are indicative in various ways of whether the user of the particular device is a driver or a passenger of the vehicle. Additionally, it should be further understood that the referenced method is preferably implemented in situations, scenarios, and/or settings where mobile device 105 is present within a vehicle, such as a moving vehicle. Various methods for determining a presence of the device within a vehicle and/or a moving vehicle are described in detail herein.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, employs one or more restrictions at mobile device 105 and/or in relation to mobile device 105, substantially in the manner described above with respect to step 705.
  • restriction(s) are preferably configured, on average, to impede operation of mobile device 105 by a user that is a driver more so than the restriction(s) impede operation of mobile device 105 by a user that is a passenger, as described in detail herein.
  • a single restriction can be understood and/or configured to be a combination and/or composite of multiple restrictions.
  • the methods described herein can be applied to a device that is already restricted, such as a device that is configured, by default, to operate in a restricted state.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 receives one or more inputs, such as one or more visual captures originating at mobile device 105 and/or mobile devices 160 (e.g., from camera 145F), such as an image, a series of images, and/or a video, substantially in the manner described above with respect to step 710.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171, processes at least one of the visual captures (such as those received at step 1510) to identify one or more indicators within the visual capture.
  • the terms "indicator" and/or "indicators" as used in the context of the referenced visual capture(s) are intended to encompass one or more items, elements, and/or aspects that can be distinguished, determined, and/or identified within a visual capture, as is known to those of ordinary skill in the art.
  • the visual captures (e.g., images and/or videos) can be analyzed using one or more image processing techniques, as are known to those of ordinary skill in the art.
  • one or more indicators can be identified within the visual capture, and such indicators can preferably be further utilized to determine if/how one or more restrictions are to be adjusted at/in relation to the mobile device 105, as will be described in greater detail below.
  • a visual capture can include an image of at least a portion of a face of a user, and such a visual capture can be processed to identify one or more indicators that reflect a steady gaze of the user. It can be appreciated that while a vehicle is in motion, a passenger in a vehicle is more likely to be able to maintain an ongoing steady gaze into a camera of a mobile device than a driver who will necessarily divert his/her gaze in order to see the road while driving.
  • a visual capture can include an image of at least a portion of a face of a user, and such a visual capture can be processed to identify an absence of a steering wheel in the visual capture. It can be appreciated that a visual capture that contains the presence of a steering wheel together with at least a portion of a face of a user indicates that it is likely that the user is in close proximity to the steering wheel, and is thus more likely to be a driver of the vehicle. Thus, in visual captures where the steering wheel has been determined to be absent, it can be determined that the user of the device which captured such a visual capture is likely to be a passenger.
  • a visual capture can include an image of one or more feet of a user, and such a visual capture can be processed to identify a position of the one or more feet. That is, it can be appreciated that a visual capture of the feet of a driver is likely to include one or more pedals (e.g., gas, brake, and/or clutch), and/or the positioning of the feet of the driver, either on or around such pedals. Conversely, the absence of such pedals, and/or of the foot positioning that they entail, indicates that the user of the device which captured the image capture is likely to be a passenger.
  • a visual capture can include an image of at least a portion of a body of a user, and such a visual capture can be processed to identify a presence of a fastened seatbelt in a passenger orientation, as depicted in FIG. 15B and as will be described in greater detail below with respect to step 1720.
  • a visual capture can include an image of an interior of a vehicle, and such a visual capture can be processed to identify at least two hands and a steering wheel, as will be described in greater detail below with respect to step 1542.
  • a video capture that reflects the scenery outside of a vehicle can be processed, such as using image/video analysis, to determine the orientation of the mobile device.
  • indicators such as the position of the sky and/or horizon can be identified within the visual capture, such as using image processing methods and approaches known to those of ordinary skill in the art.
  • various aspects of the visual capture will necessarily change based on the orientation of device 105 and/or the relative location of the device within the vehicle. It can thus be appreciated that in doing so, the orientation of mobile device 105 and/or the relative location of the device within a vehicle can be determined, as will be described in greater detail below.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 receives one or more inputs, preferably from at least one of the sensors 145 of mobile device 105, a vehicle data system 164, and/or one or more other mobile devices 160, substantially in the manner described above with respect to step 710.
  • various inputs can be received from multiple sources (e.g., various sensors, devices, etc.).
  • inputs from accelerometer 145A and/or gyroscope 145B can be received, such inputs corresponding to an orientation of mobile device 105.
  • one or more input(s) from one or more tactile sensor(s) 145N such as the simultaneous depressing of one or more buttons, and/or the simultaneous tactile interaction of multiple points and/or locations on a touchscreen display.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 processes the one or more inputs, such as those received at step 1522, to determine a presence of a passenger within a vehicle.
  • inputs can be received from vehicle data system 164, such as those originating at various sensors deployed within a vehicle, such as weight/heat sensors that are positioned within one or more seats, and/or sensors that can detect whether one or more seatbelts are/are not fastened.
  • Such inputs can be processed in order to determine a presence of, and/or the likelihood of the presence of, one or more passengers within a vehicle.
  • processor 110 executing one or more of software modules 130, including, preferably, restriction module 171 processes the one or more inputs with the visual capture and/or the indicators to determine a relationship between the inputs and the visual capture and/or the indicators. That is, it can be appreciated that the processing of inputs from one or more of sensors 145 (and/or from other sources) together with the visual capture/indicators can further enhance the accuracy of determinations made and/or actions taken, such as the determination of the orientation and/or location of device 105, as referenced above.
  • inputs from accelerometer 145A and/or gyroscope 145B can be processed together with the referenced visual capture/indicators in order to determine an orientation of the device 105 and/or a relative location of the device with increased accuracy. That is, it can be appreciated that if the visual capture/indicators reflect scenery outside the vehicle, such visual capture/indicators can be processed to determine the direction in which the mobile device 105 is traveling. For instance, if a stationary item (and/or an item moving at a speed slower than the vehicle within which the user is traveling) identified in the visual capture moves from left to right over the course of the visual capture (such as the progression from Frame 1 to Frame 3 as depicted in FIG.
  • a relationship can be determined wherein preferably while the vehicle is in motion, the user presses the device against the vehicle's right-side window with the device oriented so that the Y axis of the device (as shown in FIGs. 9A-9B and described in detail herein) is pointed in the forward direction that the car is traveling in, the device is oriented so that the positive part of the X-axis (as shown in FIGs. 9A-9B) is facing up and the rear-facing camera of the device is facing out (together referred to as the "required orientation", as shown in FIG.
  • Indicators can be identified, such as by processing the one or more visual captures originating from the camera of the device to determine if the movement of pixels or blocks of pixels or features of interest in one or more successive visual captures are consistent with the device being on the right window (e.g., if pixels in successive frame are moving from left-to-right as opposed to right- to-left if the device were on the left window), as are known to those of ordinary skill in the art.
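  • One such pixel-movement check can be sketched with dense optical flow between two successive grayscale frames: a predominantly left-to-right mean motion is treated as consistent with the device being held against the right-side window, and right-to-left motion with the left-side window. The particular library call and threshold are illustrative choices, not the disclosed implementation.

      import cv2
      import numpy as np

      def window_side_from_frames(prev_gray, next_gray, min_shift_px=1.0):
          """prev_gray, next_gray: successive grayscale frames of equal size."""
          flow = cv2.calcOpticalFlowFarneback(
              prev_gray, next_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
          mean_dx = float(np.mean(flow[..., 0]))       # mean horizontal pixel motion
          if mean_dx > min_shift_px:
              return "right-window"                    # scenery drifting left-to-right
          if mean_dx < -min_shift_px:
              return "left-window"                     # scenery drifting right-to-left
          return "inconclusive"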
  • the referenced visual capture and/or indicators can be processed with the one or more inputs to determine a relationship therebetween.
  • one or more inputs can also be processed (such as in the manner described in detail herein) in order to determine a stability of mobile device 105. It should be understood that any number of inputs and/or combinations of inputs can be processed in order to determine the stability of device 105.
  • the degree of stability of device 105 can be indicative of whether a user of the mobile device is a driver or a passenger in that in a scenario where a driver points device 105 towards the right-hand window of the vehicle, it is likely that the mobile device 105 is supported only by the hand of the driver within the interior of the vehicle (it should also be recognized that given the fact that the driver must remain on the extreme left-hand side of the vehicle in order to operate the vehicle whose controls are situated on that side, it is exceedingly difficult, if not impossible, for the driver to effectively operate the vehicle while also reaching across the width of the car to the right-hand window). Given this positioning, a mobile device 105 operated by a driver of a vehicle is generally subject to considerable instability, especially when the vehicle is moving.
  • a passenger, conversely, who can be seated on the right side of the vehicle, generally has the ability to hold and/or position his/her mobile device 105 against the window, wall panel, and/or door of the vehicle. Doing so provides an additional measure of stability which is generally unattainable by a driver, as discussed above. As such, it can be appreciated that by determining the stability of mobile device 105, and additionally, by identifying a relationship between the one or more inputs and the visual capture/indicators, the systems and methods disclosed herein can better identify and/or account for operation of the device by drivers and/or passengers.
  • any number of inputs can be analyzed in order to determine the stability of mobile device 105.
  • the visual capture/indicators themselves can be analyzed/processed in order to identify the degree of "shake" present, in a manner known to those of ordinary skill in the art. That is, it can be appreciated that multiple indicators can be identified from a single visual capture. For example, an amount or degree of "shake" can be detected in a visual capture (in addition to other indicators that can be identified in the same visual capture, such as those described in detail above).
  • in determining such an amount/degree of shake, it can be further determined that mobile device 105 is likely positioned in an unstable manner (such as by being held in a manner other than by being positioned against a window of the vehicle), and thus has a significant likelihood of being operated by a driver of the vehicle.
  • various other inputs, such as inputs from the gyroscope 145B and/or the accelerometer 145A, can also be processed/analyzed to determine the stability of the mobile device 105.
  • determinations can be further confirmed/verified by processing the visual capture(s)/indicator(s) with one or more other inputs in order to determine a relationship between them.
  • a determination that the mobile device is pressed against the window can be computed based on inputs originating at the accelerometer and/or gyroscope and/or other sensors.
  • any number of determinations can be computed, in part in order to confirm that the device is being held in a manner consistent with being pressed against a window. For example, (i) if it can be determined that the device moves/shakes relatively less than it would if it were held in a hand that is not supported by a window and/or door; (ii) if the device vibrates consistently with being held against a window; and/or (iii) if the visual capture(s) of the mobile device do not contain pixel blocks indicative of the device not being against a window (e.g., near-fixed pixels on the borders of the visual capture showing the door or the roof or the window frame of the vehicle, etc.) in the preferred/required orientation that is consistent with a user pointing the rear-facing camera of the device out the window of a vehicle (wherein the camera sees light, the camera sees moving pixels, and/or the inward facing light sensor shows an amount of light consistent with pointing inward in the required orientation, as is known to those of ordinary skill in the art).
  • the device being pressed against the right-side window of the vehicle in the required orientation can be verified by processing one or more inputs, such as those originating at the accelerometer, by tracking the movement of the device immediately prior to its being pressed against a window. If, before determining/validating that the device is pressed against a window in the required orientation, and/or after accounting for changes in the orientation of the device during such period of movement, the device moved to the right relative to the direction in which the vehicle is heading, then it can be determined that the device has moved to the right-side window.
  • This determination can preferably be computed based on inputs originating at the accelerometer, the gyroscope, the GPS and/or the magnetometer, as is known to those of ordinary skill in the art. It should be understood that such approaches can operate independently and/or in conjunction with one or more of the various implementations and approaches described in detail herein.
  • the in-vehicle role of a device user can be determined based on a processing of a visual capture from the device to identify various objects, indicators, and/or patterns and determine the in-vehicle location and/or in-vehicle role of the user based on such objects, indicators, and/or patterns. For example, identifying a gas pedal in the rear-facing camera (or any other camera) on the device (e.g., within an image captured by such camera) is relatively likely to indicate a driver is operating such a device.
  • identifying a seat belt going over a left shoulder of a user is relatively likely to indicate that a passenger is operating such a device (reverse for the UK and other left side of road driving countries).
  • identifying a window in the front facing camera(s) (e.g., within an image captured by such camera) to the left of a user is relatively likely to indicate that a passenger is operating such a device.
  • one or more visual captures captured using a front-facing camera(s) (or other device camera(s)) of a device can be processed to determine the head/face angle (that is, the angle at which the head or face of the user is oriented, such as with respect to a device/camera) and/or gaze angle (that is, the angle at which the eye(s) of the user are oriented, such as with respect to a device/camera, which may differ from the head/face angle) of the user, such as in relation to the device, and, based on such angles and/or based upon the variability of such angles over time, the in-vehicle role of the user of the device can be determined.
  • passengers are likely to exhibit certain head and/or gaze angles in relation to a device and/or in relation to a vehicle that drivers are relatively less likely to be able to exhibit.
  • passengers are relatively likely to be able to maintain certain head and/or gaze angles in relation to a mobile device and/or in relation to a vehicle for longer lengths of time and/or with less variability than drivers can (because, for example, passengers are less likely to have to look at the road ahead).
  • FIG. 57 is a flow diagram of a routine that illustrates aspects of one or more methods, such as those described in relation to one or more embodiments described herein.
  • one or more aspects of the referenced method can be performed by one or more hardware devices/components (such as those depicted in FIG. 1), one or more software elements/components (such as those depicted in FIG. 1), and/or a combination of both.
  • one or more visual captures can be captured and/or received.
  • the one or more visual captures can be processed.
  • a head/face angle and/or an eye gaze angle can be determined (e.g., using one or more facial recognition and/or image analysis techniques such as are known to those of ordinary skill in the art).
  • a head/face angle and/or an eye gaze angle can be determined in relation to a device (e.g., a smartphone or other mobile device, such as a device that was used to capture the visual captures).
  • the referenced visual captures can be processed to determine whether the head angle and/or the eye gaze angle are maintained with respect to the device (e.g., within a defined margin of error) for at least a defined chronological interval (e.g., 10 seconds). Doing so can ensure that a driver cannot simply momentarily orient his/her head/face or eyes at a certain angle (e.g., looking 'down' towards a mobile device that is lying flat) and then change his/her head/face/eye orientation.
  • the referenced visual captures can be processed to determine a head/face orientation angle in relation to a device and an eye gaze angle in relation to the head/face orientation (reflecting, for example, whether the eye gaze angle of a user is or is not relatively consistent with the head/face orientation angle). Doing so can be advantageous in that it can enable the identification of scenarios in which the head/face of a user is oriented at one angle but the eye gaze of the user is oriented at a different angle - reflecting that the user is looking away (e.g., in an attempt to authenticate him/herself as a passenger while looking away with their eyes and continuing to drive).
  • the defined chronological interval (such as the interval with respect to which the head/face and/or eye gaze angle is to be maintained, such as at 5720) can be adjusted.
  • such an interval can be adjusted based on a determined speed (e.g., the speed at which the device and/or the vehicle within which it is present, can be determined to be traveling).
  • the described technique(s) can be configured to account for the speed of the vehicle (as can be obtained, for example, through GPS or an in-vehicle system).
  • for example, if the device and/or vehicle is determined to be traveling at or below 40 miles per hour, the described technologies may be configured to instruct/require the user to maintain his/her head/face and/or eye gaze angle(s) for a 10 second interval, while if the device and/or vehicle is determined to be traveling above 40 miles per hour, the described technologies may be configured to instruct/require the user to maintain his/her head/face and/or eye gaze angle(s) for a 5 second interval.
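  • That adjustment can be sketched as a simple speed-dependent lookup, following the example above; treating 40 miles per hour as the boundary for the longer interval is an assumption.

      def required_gaze_interval_seconds(speed_mph):
          """Interval (seconds) for which the head/face and/or eye gaze angle
          must be maintained, given the determined vehicle speed."""
          return 10.0 if speed_mph <= 40.0 else 5.0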
  • an orientation angle of the device relative to the ground (or any other such surface) can be determined.
  • such an orientation angle can be required to be a substantially flat orientation (e.g., as depicted in FIG. 15G and described in detail herein).
  • the head/face angle and/or the eye gaze angle (such as are determined at 5720) can be compared to the orientation angle of the device (such as is determined at 5740). In doing so, a relationship between the head/face angle and/or the eye gaze angle and the orientation angle of the device can be determined.
  • a relative likelihood that a user of the device is a driver or a passenger can be computed.
  • a relative likelihood that a user of the device is a driver or a passenger can be computed based on (a) the head/face orientation angle in relation to the device (such as are determined at 5720), (b) the eye gaze angle in relation to the head orientation (such as is determined at 5720), and (c) an orientation angle of the device in relation to the ground (such as is determined at 5740).
  • an implementation of a restriction at a device can be modified.
  • such an implementation of a restriction can be modified based on the relative likelihood that a user of the device is a driver or a passenger (e.g., as computed at 5760).
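  • A minimal heuristic sketch (not the disclosed algorithm) of combining the three quantities referenced above into a passenger likelihood follows: the device should be near-flat, the head/face should be oriented toward the device, and the eye gaze should remain consistent with the head/face angle. All thresholds and weights are illustrative assumptions.

      def passenger_likelihood(head_angle_to_device_deg,
                               gaze_angle_to_head_deg,
                               device_angle_to_ground_deg):
          """Returns a score from 0.0 (driver-like) to 1.0 (passenger-like)."""
          score = 0.0
          if device_angle_to_ground_deg <= 15.0:    # device held roughly flat
              score += 0.4
          if head_angle_to_device_deg <= 20.0:      # face oriented toward the device
              score += 0.3
          if abs(gaze_angle_to_head_deg) <= 10.0:   # eyes consistent with the head angle
              score += 0.3
          return score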
  • requiring certain orientations of a mobile device when performing certain types of user authentication/verification techniques can serve as a surrogate for requiring that the user's face and/or eyes are directed in a certain manner (e.g., so that the user cannot simultaneously look at the road and successfully perform the authentication/verification task).
  • requiring that the direction, orientation, and/or angle of the head/face of a user and/or the eye gaze of the user be within (or not within) one or more range(s)
  • such verification techniques and tasks can be even more difficult to falsely authenticate.
  • the angle of the device can be determined (e.g., based on inputs originating from the accelerometer, compass, magnetometer, etc. of the device, such as in a manner known to those of ordinary skill in the art) and (ii) the angle of the head/face of the user relative to the device can be determined (e.g., via image analysis of visual captures originating at the camera of the device).
  • the angle of the orientation of the device relative to the ground/earth can be determined (e.g., based on inputs originating from the accelerometer, compass, magnetometer, etc. of the device, such as in a manner known to those of ordinary skill in the art), (ii) the angle of the head/face of the user relative to the device can be determined (e.g., via image analysis of visual captures originating at the camera of the device), and (iii) the angle of the eye gaze of the user relative to his/her face can be determined.
  • while visual capture(s) can be used to enable passengers to actively authenticate themselves as such, as described herein, such technologies can also be used to enable determination of an in-vehicle role and/or in-vehicle location of the user of a device and/or the likelihood thereof (e.g., in a passive manner).
  • a visual capture (e.g., together with the orientation of the capturing device relative to the earth and/or relative to the vehicle) can be processed to determine the in-vehicle location of the device.
  • a database of physical properties of vehicle interiors (which may include, for example, visual captures of such interiors, spacing information/specifications, object location, colorings, consistencies, ceiling, floor and side markings as provided, for example, by vehicle manufacturers and/or from crowdsourcing), captured images, videos, etc., including views of the vehicle from different device positions and/or orientations can be utilized, e.g., to compare with various patterns (e.g., feature extractions) within a visual capture originating from a device (e.g., in addition to one or more other techniques described herein or known to those of ordinary skill in the art) to determine the in-vehicle location of the device.
  • various machine learning techniques can be employed with respect to visual capture(s) using supervised (and/or unsupervised) training sets to better determine how to associate such captures with in-vehicle locations and/or in-vehicle roles.
  • Such techniques can be applied on a device-by-device basis, for a larger population, or for a combination of the two.
  • a device can learn the ceiling of a car in which a person regularly commutes and, for example, after collecting information over several trips, be able to identify the in-vehicle location of a device in that car, and/or it can do so by matching visual captures against a database of car ceilings/interiors (a minimal sketch of such matching appears after this list).
  • such a database may also include views (actual or generated) of the interior of a vehicle in different lighting conditions (e.g., day vs. night) and/or the ability to compare actual visual captures taken with simulations/projections of what such interiors are likely to look like under different lighting conditions (with or without ever actually rendering them).
  • comparable technique(s) can be applied to determine the mode of transportation and/or vehicle class that a device is present within (together with or without various other vehicle class determination techniques described herein or known to those of skill in the art).
  • the patterns and/or physical properties of interiors of different vehicle classes are likely to perceptibly differ from one another.
  • a visual capture originating from a device in a passenger car is likely to depict lower ceilings, a narrower interior, and smaller windows as compared to a comparable visual capture originating from a device present within a train or a bus.
  • the visual capture can be compared to a database of visual captures to determine the class of vehicle that a device is in.
  • a visual capture can be processed using similar techniques to determine whether a user is traveling on public transportation (e.g., a bus or train), in which case most/all surrounding devices can be determined to be likely to be passenger devices (and thus neither the in-vehicle location nor the in-vehicle role associated with such a device necessarily needs to be determined), or on non-public transportation (e.g., a car), in which case such determinations can be performed.
  • a visual capture can also be processed to improve (e.g., based on accuracy, power, latency, etc.) the techniques described herein (or those of others) that attempt to determine whether or not a device is present within a vehicle or a device (and/or its user) is present outside of a vehicle and/or engaged in another form of activity (with or without various other trip detection and/or activity recognition techniques described herein or known to those of ordinary skill in the art).
  • a visual capture originating from a device present outside a vehicle can be differentiated/distinguished from one taken inside a vehicle.
  • a visual capture that can be processed to identify a pattern associated with a sidewalk (as captured from a then downward-facing camera) or open sky (as captured from a then upward-facing camera) can indicate that the device is likely to be outside a vehicle.
  • identifying a pattern that shows a background consisting of office ceiling tiles with a certain geometry can indicate that the device is present indoors (i.e., not within a vehicle), while the presence of window patterns, headrests, seatbelts, a steering wheel or gas pedals, etc. can indicate that the device is likely to be in a vehicle.
  • Different levels of light can also be utilized to differentiate between devices inside vehicles and those outside vehicles, and different levels of light between multiple cameras on the same device can serve as a further indication of the same, such as is described herein (a minimal sketch of such a comparison appears after this list).
  • multiple cameras can be used "stereoscopically" to more accurately determine the device's in-vehicle location.
  • a default setting can indicate the device as belonging to a driver (or a passenger), e.g., until the visual capture is no longer impeded. This can be done with or without giving a warning notification to the user.
  • some of the techniques described herein may involve identifying objects and/or features or properties within a still image and/or analyzing changes in the same or different aspects over more than one image and, in some cases, in relation to other device sensors (for example, changes in images as compared to changes perceived in a position and/or orientation of a device, e.g., at the time the referenced images are captured, such as in order to better measure distance). For example, a device determined to have rotated two degrees, and in which an object within various visual captures can be determined to have moved by 200 pixels, can be used to further determine the distance between the device and the object which, as previously described, can be different for different vehicle classes (a minimal sketch of such an estimation appears after this list).
  • An accelerometer is used on many mobile devices (and/or in many mobile operating systems) to determine the orientation in which the device's screen should be displayed (e.g., portrait, landscape, negative portrait or negative landscape). For example, if the accelerometer perceives that sufficient gravity is falling on its y-axis, the screen displays in portrait (or negative portrait, depending on the sign of the y-axis accelerometer value). If the accelerometer perceives that sufficient gravity is falling on the x-axis, the screen displays in landscape (or negative landscape, depending on the sign of the x-axis accelerometer value). A minimal sketch of this logic appears below.
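
The following is a minimal sketch, in Python, of the speed-dependent hold requirement referenced earlier in this list (a 10-second versus 5-second interval around a 40 mph threshold). The function name, the use of miles per hour as the unit, and the assumption that speeds at or below the threshold map to the longer interval are illustrative choices, not specifics of this disclosure.

```python
def required_hold_interval_seconds(speed_mph: float) -> float:
    """Return how long (in seconds) the user must maintain the required
    head/face and/or eye gaze angle(s), given the vehicle's current speed."""
    # Above 40 mph the hold interval is shortened (5 s); otherwise the longer
    # 10 s interval is used, per the example above.
    return 5.0 if speed_mph > 40.0 else 10.0


# Example usage: speed as obtained, e.g., via GPS or an in-vehicle system.
print(required_hold_interval_seconds(30.0))  # 10.0
print(required_hold_interval_seconds(55.0))  # 5.0
```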
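Below is a minimal sketch of how the three angle measurements discussed above (the head/face angle relative to the device, the eye gaze angle relative to the head, and the device angle relative to the ground) might be combined into a relative driver/passenger likelihood, and how a restriction could be adjusted accordingly. The reference angles, tolerance, thresholds, and function names are all illustrative assumptions; the disclosure does not specify particular values or a particular scoring formula.

```python
def driver_likelihood(head_angle_deg: float,
                      gaze_angle_deg: float,
                      device_ground_angle_deg: float,
                      expected_head_deg: float = 30.0,
                      expected_gaze_deg: float = 20.0,
                      expected_device_deg: float = 60.0,
                      tolerance_deg: float = 25.0) -> float:
    """Return a value in [0, 1]; higher values mean the user is more likely the driver.

    Each observed angle is scored by how close it is to an (assumed) driver-typical
    reference angle, and the three scores are averaged."""
    def closeness(observed: float, expected: float) -> float:
        return max(0.0, 1.0 - abs(observed - expected) / tolerance_deg)

    scores = (closeness(head_angle_deg, expected_head_deg),
              closeness(gaze_angle_deg, expected_gaze_deg),
              closeness(device_ground_angle_deg, expected_device_deg))
    return sum(scores) / len(scores)


def restriction_level(likelihood: float) -> str:
    """Map the driver likelihood to an (illustrative) restriction setting."""
    if likelihood > 0.7:
        return "full"      # treat the user as a probable driver
    if likelihood > 0.4:
        return "partial"   # ambiguous; apply a reduced restriction
    return "none"          # treat the user as a probable passenger
```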
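The next sketch illustrates one way a visual capture could be compared against a small database of vehicle-interior reference images (e.g., ceilings), as discussed in the items on feature extraction and interior databases above. OpenCV's ORB detector with brute-force descriptor matching is an illustrative choice of technique; the database layout, the match-distance threshold, and the function names are assumptions rather than specifics of this disclosure. The returned label could encode, for example, an in-vehicle location or a vehicle class.

```python
import cv2
import numpy as np


def orb_match_score(capture_gray: np.ndarray, reference_gray: np.ndarray) -> int:
    """Return the number of close ORB descriptor matches between two grayscale images."""
    orb = cv2.ORB_create(nfeatures=500)
    _, des1 = orb.detectAndCompute(capture_gray, None)
    _, des2 = orb.detectAndCompute(reference_gray, None)
    if des1 is None or des2 is None:
        return 0  # too little texture to extract features
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    # Count only reasonably close descriptor matches (threshold is illustrative).
    return sum(1 for m in matches if m.distance < 40)


def best_interior_match(capture_gray: np.ndarray, interior_db: dict) -> tuple:
    """interior_db: non-empty dict mapping a label (e.g., vehicle model or in-vehicle
    position) to a grayscale reference image. Returns (best_label, score)."""
    scores = {label: orb_match_score(capture_gray, ref)
              for label, ref in interior_db.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]
```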
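This sketch shows how overall light levels, and the difference in light levels between two cameras on the same device, might serve as a weak inside-vehicle indicator, per the item on light levels above. The direction of the heuristic (a dim or strongly imbalanced scene counting as weak evidence of being inside a vehicle) and the thresholds are assumptions for illustration only.

```python
import numpy as np


def mean_brightness(gray_image: np.ndarray) -> float:
    """Average pixel intensity of a grayscale image (0-255)."""
    return float(np.mean(gray_image))


def likely_inside_vehicle(front_gray: np.ndarray, rear_gray: np.ndarray,
                          dark_threshold: float = 60.0,
                          imbalance_threshold: float = 80.0) -> bool:
    front = mean_brightness(front_gray)
    rear = mean_brightness(rear_gray)
    overall_dark = (front + rear) / 2.0 < dark_threshold
    strongly_imbalanced = abs(front - rear) > imbalance_threshold
    # A dim scene, or one camera much darker than the other (e.g., one facing a
    # seat or dashboard), is treated here as weak evidence of being inside a vehicle.
    return overall_dark or strongly_imbalanced
```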
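The sketch below illustrates the distance-estimation idea referenced above (a device rotating by a known amount while an object shifts by a measured number of pixels across captures). It uses a small-angle parallax model: the apparent angular motion explained by the device's own rotation is subtracted out, and the residual, together with an estimate of how far the camera itself translated, gives an approximate distance. The model, the parameter names, and the example values for focal length and camera translation are illustrative assumptions, not values from this disclosure.

```python
from math import radians
from typing import Optional


def estimate_distance_m(pixel_displacement: float,
                        focal_length_px: float,
                        device_rotation_deg: float,
                        camera_translation_m: float) -> Optional[float]:
    """Approximate the distance (in meters) to an object seen in two visual captures.

    pixel_displacement:   how far the object appears to have moved between captures.
    focal_length_px:      the camera focal length expressed in pixels.
    device_rotation_deg:  device rotation between captures (e.g., from gyro/compass).
    camera_translation_m: estimated sideways translation of the camera between captures.
    """
    observed_angle_rad = pixel_displacement / focal_length_px  # small-angle approximation
    rotation_rad = radians(device_rotation_deg)                # motion explained by rotation alone
    parallax_rad = observed_angle_rad - rotation_rad           # residual motion due to translation
    if abs(parallax_rad) < 1e-4:
        return None  # effectively no parallax: object very far away, or motion was pure rotation
    return abs(camera_translation_m / parallax_rad)


# Example: a 200-pixel shift with an assumed 1000-pixel focal length, a 2-degree
# device rotation, and an assumed 5 cm camera translation.
distance = estimate_distance_m(200.0, 1000.0, 2.0, 0.05)
```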
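Finally, a minimal sketch of the accelerometer-driven screen-orientation logic described in the last item above: whichever axis carries most of gravity selects the portrait or landscape family, and the sign of that axis selects the "negative" variant. Which sign maps to the normal versus the negative orientation depends on the platform's axis convention, so the mapping chosen below is an assumption.

```python
def screen_orientation(ax: float, ay: float) -> str:
    """ax, ay: accelerometer readings on the device's x and y axes (e.g., in m/s^2)."""
    if abs(ay) >= abs(ax):
        # Sufficient gravity on the y-axis -> portrait family; the sign picks the variant.
        return "portrait" if ay >= 0 else "negative_portrait"
    # Otherwise gravity falls mainly on the x-axis -> landscape family.
    return "landscape" if ax >= 0 else "negative_landscape"
```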

Abstract

Systems and methods are disclosed for transportation-related mobile device context inferences. In one implementation, a first visual capture can be received from a first visual capture component of a device; a second visual capture can be received from a second visual capture component of the device; and the first visual capture and the second visual capture can be processed by a processing device to determine an in-vehicle role of a user of the device. Various other technologies are also described.
PCT/US2015/047054 2012-06-21 2015-08-26 Interférences de contexte de dispositif mobile relatives au transport WO2016033252A2 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/506,327 US20170279957A1 (en) 2013-08-23 2015-08-26 Transportation-related mobile device context inferences
GB1704680.6A GB2547809A (en) 2014-08-26 2015-08-26 Transportation-related mobile device context inferences
US15/089,186 US9638537B2 (en) 2012-06-21 2016-04-01 Interface selection in navigation guidance systems
US15/583,140 US20170234691A1 (en) 2012-06-21 2017-05-01 Interface selection in navigation guidance systems

Applications Claiming Priority (24)

Application Number Priority Date Filing Date Title
US201462042244P 2014-08-26 2014-08-26
US62/042,244 2014-08-26
US201462047649P 2014-09-09 2014-09-09
US62/047,649 2014-09-09
US201462063152P 2014-10-13 2014-10-13
US62/063,152 2014-10-13
US201462066378P 2014-10-21 2014-10-21
US62/066,378 2014-10-21
US14/540,936 2014-11-13
US14/540,932 2014-11-13
US14/540,954 2014-11-13
US14/540,954 US9772196B2 (en) 2013-08-23 2014-11-13 Dynamic navigation instructions
US14/540,951 US20150168174A1 (en) 2012-06-21 2014-11-13 Navigation instructions
US14/540,936 US20150177010A1 (en) 2013-08-23 2014-11-13 Suppressed navigation instructions
US14/540,951 2014-11-13
US14/540,932 US20150141043A1 (en) 2013-08-23 2014-11-13 Corrective navigation instructions
US201462092929P 2014-12-17 2014-12-17
US62/092,929 2014-12-17
US201562128002P 2015-03-04 2015-03-04
US62/128,002 2015-03-04
US14/706,954 2015-05-07
US14/706,954 US9175967B2 (en) 2012-06-21 2015-05-07 Navigation instructions
US201562194761P 2015-07-20 2015-07-20
US62/194,761 2015-07-20

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/540,932 Continuation-In-Part US20150141043A1 (en) 2012-06-21 2014-11-13 Corrective navigation instructions

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/089,186 Continuation-In-Part US9638537B2 (en) 2012-06-21 2016-04-01 Interface selection in navigation guidance systems

Publications (2)

Publication Number Publication Date
WO2016033252A2 true WO2016033252A2 (fr) 2016-03-03
WO2016033252A3 WO2016033252A3 (fr) 2016-04-21

Family

ID=55400820

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/047054 WO2016033252A2 (fr) 2012-06-21 2015-08-26 Interférences de contexte de dispositif mobile relatives au transport

Country Status (1)

Country Link
WO (1) WO2016033252A2 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018023755A1 (fr) * 2016-08-05 2018-02-08 胡明祥 Procédé permettant d'empêcher une mauvaise opération sur un ordinateur par un enfant en fonction de la reconnaissance faciale, et système de reconnaissance
CN107894902A (zh) * 2016-09-30 2018-04-10 北京小米移动软件有限公司 设备控制方法及装置
US20190019133A1 (en) * 2017-07-14 2019-01-17 Allstate Insurance Company Controlling Vehicles Using Contextual Driver And/Or Rider Data Based on Automatic Passenger Detection and Mobility Status
US20190384295A1 (en) * 2015-02-10 2019-12-19 Mobileye Vision Technologies Ltd. Systems and methods for identifying landmarks
US10832261B1 (en) 2016-10-28 2020-11-10 State Farm Mutual Automobile Insurance Company Driver profiles based upon driving behavior with passengers
US20230026515A1 (en) * 2016-10-14 2023-01-26 Allstate Insurance Company Bilateral communication in a login-free environment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003098424A (ja) * 2001-09-25 2003-04-03 Fujitsu Ten Ltd 画像処理測距装置
JP4567630B2 (ja) * 2006-05-26 2010-10-20 富士通株式会社 車種判別プログラムおよび車種判別装置
US20130241720A1 (en) * 2012-03-14 2013-09-19 Christopher P. Ricci Configurable vehicle console
US9584735B2 (en) * 2010-11-12 2017-02-28 Arcsoft, Inc. Front and back facing cameras
PT3255613T (pt) * 2010-12-15 2022-12-02 Auto Telematics Ltd Método e sistema para registar o comportamento de veículos
US8538402B2 (en) * 2012-02-12 2013-09-17 Joel Vidal Phone that prevents texting while driving

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190384295A1 (en) * 2015-02-10 2019-12-19 Mobileye Vision Technologies Ltd. Systems and methods for identifying landmarks
US20190384294A1 (en) * 2015-02-10 2019-12-19 Mobileye Vision Technologies Ltd. Crowd sourcing data for autonomous vehicle navigation
US11599113B2 (en) * 2015-02-10 2023-03-07 Mobileye Vision Technologies Ltd. Crowd sourcing data for autonomous vehicle navigation
US11774251B2 (en) * 2015-02-10 2023-10-03 Mobileye Vision Technologies Ltd. Systems and methods for identifying landmarks
WO2018023755A1 (fr) * 2016-08-05 2018-02-08 胡明祥 Procédé permettant d'empêcher une mauvaise opération sur un ordinateur par un enfant en fonction de la reconnaissance faciale, et système de reconnaissance
CN107894902A (zh) * 2016-09-30 2018-04-10 北京小米移动软件有限公司 设备控制方法及装置
US20230026515A1 (en) * 2016-10-14 2023-01-26 Allstate Insurance Company Bilateral communication in a login-free environment
US10832261B1 (en) 2016-10-28 2020-11-10 State Farm Mutual Automobile Insurance Company Driver profiles based upon driving behavior with passengers
US11037177B1 (en) 2016-10-28 2021-06-15 State Farm Mutual Automobile Insurance Company Vehicle component identification using driver profiles
US11875366B2 (en) 2016-10-28 2024-01-16 State Farm Mutual Automobile Insurance Company Vehicle identification using driver profiles
US20190019133A1 (en) * 2017-07-14 2019-01-17 Allstate Insurance Company Controlling Vehicles Using Contextual Driver And/Or Rider Data Based on Automatic Passenger Detection and Mobility Status
US11928621B2 (en) * 2017-07-14 2024-03-12 Allstate Insurance Company Controlling vehicles using contextual driver and/or rider data based on automatic passenger detection and mobility status

Also Published As

Publication number Publication date
WO2016033252A3 (fr) 2016-04-21

Similar Documents

Publication Publication Date Title
US9638537B2 (en) Interface selection in navigation guidance systems
US9175967B2 (en) Navigation instructions
US9772196B2 (en) Dynamic navigation instructions
US20190349470A1 (en) Mobile device context aware determinations
US20170279957A1 (en) Transportation-related mobile device context inferences
US20190082047A1 (en) Device context determination
US20170302785A1 (en) Device context determination in transportation and other scenarios
US20220182482A1 (en) Restricting mobile device usage
US9800716B2 (en) Restricting mobile device usage
US20200349666A1 (en) Enhanced vehicle sharing system
WO2016033252A2 (fr) Interférences de contexte de dispositif mobile relatives au transport
US9378601B2 (en) Providing home automation information via communication with a vehicle
US20170067747A1 (en) Automatic alert sent to user based on host location information
US20140309789A1 (en) Vehicle Location-Based Home Automation Triggers
AU2012313395A1 (en) Restricting mobile device usage
CN110910190A (zh) 汽车共享和出租车服务的综合识别和认证

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 15506327

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 201704680

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20150826

122 Ep: pct application non-entry in european phase

Ref document number: 15836330

Country of ref document: EP

Kind code of ref document: A2