WO2008075082A1 - Mobile device and method of operation thereof - Google Patents

Mobile device and method of operation thereof

Info

Publication number
WO2008075082A1
WO2008075082A1 (PCT application no. PCT/GB2007/004948)
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
light
sensor
sound
emitter
Application number
PCT/GB2007/004948
Other languages
French (fr)
Inventor
Masao Kajihara
Original Assignee
Symbian Software Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from GBGB0625642.4A external-priority patent/GB0625642D0/en
Application filed by Symbian Software Limited filed Critical Symbian Software Limited
Publication of WO2008075082A1 publication Critical patent/WO2008075082A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present invention relates to a mobile device, and method of operation thereof, and in particular to a mobile device and method of operation in which the device is able to sense its present situation, and, optionally, adapt its behaviour in dependence thereon.
  • FIG. 1 is a block diagram of the internal elements of such a device, being a conventional mobile telecommunications device 10.
  • the mobile telecommunications device 10 comprises RF processor hardware 32, baseband processor hardware 34, and a power regulator 36, all of which deal with the telecommunications operations of the mobile telecommunications device 10, i.e. using telecommunications protocols to make, for example, voice calls, data connections, and the like.
  • the user interface and applications which run on the smart phone are run by the application processor 38, which runs the graphical user interface, as well as any applications requested by the user, and provides an interface to the telecommunications stack provided by the baseband processor.
  • secondary communication subsystems such as, for example, Bluetooth subsystem 40.
  • An infrared subsystem may also be provided.
  • the mobile telecommunications device 10 is further provided with various memory, such as ROM 42, RAM 44 and user data memory 46.
  • ROM 42 (read only memory) stores the graphical user interface and any data required therefor, such as icon images; RAM 44 (random access memory) typically stores any applications available on the device 10, as well as associated data.
  • user data memory 46 stores data which is accessible by the user, such as contact data, messages, images, user settings data, and the like.
  • the device 10 is typically provided with a speaker 20 for providing an audio output, and a corresponding microphone 22, which provides an audio input into the device.
  • the speaker 20 can be used to provide an audio output during the voice or video call, or can, in some devices of the prior art, also provide other audio output, such as for playing digitally encoded music tracks or the like.
  • the microphone 22 is provided to capture a user's voice during voice or video calls, and can also be used via other applications.
  • a screen 24 is typically provided and it is also common for a video camera 26, capable of capturing still or moving images, to be provided. Very often, the screen 24 is used to display the image presently being captured by the camera 26 i.e. the signal flow is from the camera 26 to the screen 24.
  • typical mobile telecommunications devices 10 of the prior art also include a vibrator 29, typically a pancake/coin motor, as is well known in the art.
  • Other components capable of producing a vibration are also known, such as piezoelectric vibration generation devices, which can also be used.
  • the usual operation is for the vibrator to be activated when an incoming call or message is received, to provide a physical movement of the device which can be felt by a user, for example if the device is in a user's pocket or hand.
  • United States patent application publication number 2006/0172706 describes a wireless communications device having a vibration motor therein, arranged to vibrate the wireless device for a predetermined period.
  • the wireless device is also provided with an accelerometer, at which an acceleration measurement can be taken during the period when the vibration motor is activated.
  • By combining an onboard accelerometer with an onboard vibrator, wireless devices are then given the means to detect if they are being held by users. Particularly, if a wireless device is on the table or is in the user's pocket and is not being held by anyone, the accelerometer can predictably measure the acceleration pattern that occurs when a vibrator is turned on.
  • the acceleration patterns differ when a wireless device is being held by a person and when it is not being held, and in particular the acceleration patterns that are measured by the accelerometer reflect the effective mass seen by the vibrator.
  • the effective mass is greater as it includes the mass of the wireless device and that of the user's hand and arm.
  • the wireless device provided with the vibrator and accelerometer can determine if it is being held by turning on its vibrator for a predetermined period of time and by reading the output of its accelerometer.
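  • As a purely illustrative sketch of this prior art test (not the actual implementation of US 2006/0172706; the vibrator and accelerometer driver objects, the sampling scheme, and the threshold are assumptions):

```python
import time
import statistics

HELD_RATIO = 0.5  # hypothetical: damping below this fraction of the
                  # free-device baseline is taken to mean the device is held

def is_held(vibrator, accelerometer, duration_s=0.5, n_samples=100):
    """Activate the vibrator for a predetermined period while sampling
    the accelerometer; a gripping hand increases the effective mass seen
    by the vibrator and damps the measured acceleration pattern."""
    readings = []
    vibrator.on()
    try:
        for _ in range(n_samples):
            readings.append(abs(accelerometer.read()))  # acceleration magnitude
            time.sleep(duration_s / n_samples)
    finally:
        vibrator.off()
    measured = statistics.mean(readings)
    return measured < HELD_RATIO * accelerometer.free_device_baseline
```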
  • US 2006/0172706 therefore describes how, using a vibrator and accelerometer, a wireless device is able to determine information pertaining to its surrounding environment, and adjust its operation accordingly.
  • However, the technique of US 2006/0172706 requires the wireless device to be fitted with an accelerometer, a component which is not usually included within such devices, and which adds to the cost. Moreover, the exclusive use of an accelerometer to detect the device's situation means that only a relatively small number of situations can be detected, i.e. whether the device is being held or not. Whilst the ability of a device to adapt its operation depending on its surrounding environment as described in US 2006/0172706 is useful, it would be more advantageous if such sensing of its surrounding environment could be performed without the need for additional components within the device, thus saving cost and component integration difficulties.
  • Embodiments of the invention provide a mobile device, such as, for example, a mobile telephone, which uses sensor components which are conventionally already found in such mobile devices, such as cameras and microphones, to perform a determination as to the present situation of the device.
  • a camera mounted on the mobile device (and which would typically be provided for other uses) is able to capture images of the surroundings of the device, and processing of those images is then undertaken to determine a present situation of the device.
  • a sound signal captured by a microphone can be processed to provide an estimate of the device's present situation.
  • a stimulus signal for measurement by the sensor can be provided, and this again is preferably provided by a component which is typically standard in such a device.
  • the mobile device screen can be caused to light, to attempt to light up the device's present surroundings.
  • the mobile device speaker (which would also be typically provided) can emit a test sound, and attenuation, distortion or other changes in the sound as measured by the microphone are used to inform the situation determination.
  • multiple sensors may be used together, and this allows for additional situations to be discriminated between.
  • the ability to distinguish between situations allows for the mobile device behaviour to be adapted in dependence on the discrimination. For example, the mobile device ringer volume may be changed, or the magnitude or frequency of vibration of the device vibrator altered.
  • the present invention provides a mobile device comprising: a light or sound sensor; and a processor for processing output signals from said light or sound sensor; the arrangement being such that said processor determines from said output signals a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found.
  • the possible situations comprise at least two or more selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface. These are typical situations in which a mobile device may be found.
  • the device behaviour is adapted in dependence on the determination of the present situation.
  • the adaptation may take the form, for example, of adapting a ringer volume, or a vibrator vibration intensity or frequency.
  • the screen brightness may be controlled.
  • the light sensor is a camera and the sound sensor is a microphone.
  • the mobile device has light and sound sensors, said determination being performed in dependence on output signals from both such sensors. Using multiple such sensors allows for discrimination of a larger number of situations than when using just one sensor.
  • the mobile device has a second light sensor in addition to said first light sensor, said determination being performed in dependence on output signals from both such sensors.
  • the second light sensor is provided on a different face of said device than said first light sensor.
  • the mobile device further comprises light or sound emitters, the arrangement being such that output signals from said light or sound sensor are obtained whilst said light or sound emitter is emitting.
  • said light or sound sensor input signal is compared by said processor with an output signal emitted by said light or sound emitter to perform said situation determination. This allows for a more accurate situation determination to be performed.
  • said light emitter is a display screen of said mobile device and said sound emitter is a speaker of said mobile device.
  • the mobile device further comprises: a vibrator; and an accelerometer arranged to measure vibrations of the said mobile device caused, at least in part, by said vibrator; the arrangement further being such that said processor receives an accelerometer output signal indicative of the measured vibrations, and said situation determination is performed in dependence thereon.
  • Providing such a vibrator and accelerometer sensor combination in addition to the light and/or sound sensors further increases the number of situations which may be discriminated between.
  • the present invention further provides a mobile device having a plurality of sensors for sensing one or more sensor media; a processor for processing signals produced by said sensors indicative of the present state of the sensor media; the arrangement being such that said processor determines from said produced signals a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found.
  • said processor determines from said produced signals a particular present situation of at least four predetermined possible situations in which said mobile device is to be presently found. The number of situations is therefore further increased.
  • the predetermined possible situations are selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface. These are typical situations in which a mobile device may be found.
  • the multiple sensors preferably comprise two or more sensors selected from the group comprising: a first light sensor, a second light sensor, a sound sensor, and a motion sensor.
  • a first light sensor is a first camera mounted on a first face of the mobile device.
  • said second light sensor is a second camera mounted on a second face of the mobile device. Again, such components are also common, and hence no additional components are used.
  • said sound sensor is a microphone and said motion sensor is an accelerometer.
  • the mobile device preferably further comprises at least one emitter for emitting energy in the form of at least one of the sensor media detected by at least one of the sensors. This allows for an active investigation of the device's present surroundings to be undertaken, thus increasing the ability and accuracy of the discrimination between situations which is undertaken.
  • said produced signals are obtained from at least one of said sensors whilst said at least one emitter is emitting energy.
  • said determination performed by said processor includes comparing a produced signal from at least one of the sensors which senses the sensor medium in which said emitter emits energy with an output signal of the emitter. Again, such functionality allows the number of situations which can be discriminated to be higher than has heretofore been the case.
  • the at least one emitter is preferably chosen from a group comprising: a light emitter, a sound emitter, and a motion generator.
  • the light emitter is a display screen of the mobile device, and preferably the sound emitter is a speaker of the mobile device.
  • Such components are typically provided in mobile devices for other uses, and hence component count and cost is not increased.
  • the invention provides a method of operating a mobile device provided with a light or sound sensor, comprising the steps of: obtaining output signals from said light or sound sensor indicative of the mobile device surroundings; processing said output signals to determine therefrom a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found.
  • the invention also provides a method of operating a mobile device having a plurality of sensors for sensing one or more sensor media, comprising the steps of: obtaining signals from said plurality of sensors, said signals being indicative of the present state of the sensor media; and processing said signals to determine therefrom a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found.
  • the invention additionally provides a computer program or suite of computer programs so arranged such that when executed by a computer processor they cause the computer to perform the steps of any of the third and fourth aspects above.
  • a machine readable storage medium storing the computer program or at least one of the suite of computer programs according to the fifth aspect.
  • the machine readable storage medium may be any medium known in the art, such as solid state memory, optical discs, magneto-optical discs, magnetic discs, or the like.
  • Figure 1 is a block diagram of a mobile communications device of the prior art
  • Figure 2 is a drawing illustrating a first situation in which such a device may be found
  • Figure 3 is a drawing illustrating a second situation in which such a device may be found
  • Figure 4a is a drawing illustrating a third situation in which such a device may be found
  • Figure 4b is a drawing illustrating a further situation in which such a device may be found
  • Figure 5 is a block diagram of a mobile communications device according to a first embodiment of the invention.
  • Figure 6 is a diagram illustrating how light emitted from a screen can be reflected into a camera
  • Figure 7 is a diagram illustrating how light emitted from a screen can be absorbed by surrounding material
  • Figure 8 is a flow diagram of a method of operation of a mobile communications device according to a first embodiment of the invention
  • Figure 9 is a diagram illustrating how sound emitted from a speaker can be picked up by a microphone
  • Figure 10 is a diagram illustrating how sound emitted via a speaker can be absorbed by surrounding material
  • Figure 11 is a flow diagram illustrating a method of operation of a mobile communications device according to a second embodiment of the invention
  • Figure 12 is a table illustrating how various outputs and inputs of a mobile communications device are affected by various mobile device situations
  • Figure 13 is a flow diagram of a method of operation of a mobile communications device according to another embodiment of the invention.
  • Figure 14 is a flow diagram of a method of operation of a mobile communications device according to yet another embodiment of the invention.
  • Embodiments of the invention to be described relate to a mobile device, such as a mobile communications device such as a telephone, or another mobile device such as a PDA, which is able to sense its surroundings, and then, optionally, adapt its behaviour in dependence on the sensed surroundings.
  • the mobile device may be any other type of device, such as, as mentioned, a PDA, or a laptop, media player, or the like.
  • Figures 2, 3 and 4 depict particular situations in which a mobile communications device 10 may be found. More particularly, Figure 2 illustrates how a mobile communications device 10 can commonly be found in an enclosed space, such as a user's pocket. In this case, the device 10 is in close contact with the material of the enclosed space 16, such that the material substantially surrounds the device 10. Commonly, the material of the enclosed space 16 would be a type of fabric.
  • Figure 3 illustrates a second situation in which a mobile device 10 may be found, in this case being held in the hand of a user 12. Typically, this would be when the device is being used, or about to be used.
  • Figures 4a and 4b illustrate a third situation which is commonly found, that is where the device 10 is simply placed on a surface, such as table top 14. In this case, as shown in Figure 4a the device may be placed face up, or, as shown in Figure 4b, face down.
  • the device may conveniently adapt its behaviour in dependence on the situation. For example, in the case of the device finding itself in an enclosed situation such as shown in Figure 2, where the device is commonly within a user's pocket, then the ringer of the device may be made to be louder, so that the user can hear the ringer more easily.
  • any vibrator provided in the device 10 may also be controlled to cause the device to vibrate with a greater magnitude and/or different (preferably higher) frequency, so that the user feels the vibrations caused by the device more clearly.
  • the device 10 upon detecting that it is in an enclosed space 16, may control the screen so that it is not lit. This has the advantage of saving device battery power, which is an important consideration for mobile devices.
  • the device finds itself in the second situation shown in Figure 3, wherein the device is in its user's hand, then the user has direct tactile communication with the device 10, which tactile communication can be used to attract the user's attention, for example without having to activate the ringer of the device.
  • the ringer may be controlled so as to be of reduced volume, or to be rendered mute, as not being necessary. Whilst in a user's hand 12, it is likely that the user may be looking at the device, and hence in this case it would be useful for the screen of the device 10 to be lit.
  • The user will also feel any vibrations caused by the vibrator in the device, but given the direct tactile communication between the device and the user's hand, it is not necessary for these vibrations to be of any great magnitude.
  • the magnitude of vibrations and/or frequency of vibrations produced by a vibrator in the device can be reduced.
  • muting the ring tone, and reducing the magnitude of vibrations produced by the vibrator saves battery power.
  • a third situation in which the device may find itself is that of Figure 4a, i.e. face up on a surface 14, such as a table top, or the like.
  • the device is not in contact with, or necessarily in close proximity to a user, and hence there is no need to provide a tactile output in the form of the vibrator.
  • the vibrator within the device can be disabled.
  • the ringer may be set at its default volume, or may, alternatively, be caused to be louder.
  • the user may be able to see the screen of the device, and hence the screen is preferably lit, or made brighter. In this way, only those outputs of the device which are able to attract the user's attention usefully are used.
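  • As a purely illustrative sketch of such an adaptation policy (the situation names follow Figures 2 to 4; the concrete output levels and the device API are assumptions summarising the behaviour described above, not values from the patent; the face down row follows the later observation that face down is treated like the pocket case):

```python
# Situation names follow Figures 2 to 4; the levels summarise the
# behaviour described in the surrounding text and are not prescriptive.
ADAPTATION_POLICY = {
    "enclosed_space":  {"ringer": "louder",            "vibrator": "stronger/higher frequency", "screen": "unlit"},
    "hand":            {"ringer": "reduced or muted",  "vibrator": "reduced",                   "screen": "lit"},
    "face_up_table":   {"ringer": "default or louder", "vibrator": "disabled",                  "screen": "lit/brighter"},
    "face_down_table": {"ringer": "louder",            "vibrator": "stronger",                  "screen": "unlit"},
}

def adapt_outputs(device, situation):
    """Apply the policy for the detected situation to the device's
    ringer, vibrator and screen (the device API is hypothetical)."""
    policy = ADAPTATION_POLICY[situation]
    device.ringer.configure(policy["ringer"])
    device.vibrator.configure(policy["vibrator"])
    device.screen.configure(policy["screen"])
```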
  • embodiments of the invention focus on using components which have become standard in mobile telecommunications devices, such as speakers and microphones, and cameras and screens, to enable sensing of the device's surroundings. In this way, reliance on relatively complicated and expensive components such as accelerometers is reduced.
  • FIG. 5 is a block diagram of a mobile communications device 10 according to the embodiments of the invention.
  • the mobile communications device 10 of Figure 5 is identical to the mobile communications device 10 of Figure 1 described previously, but with the difference that stored in the ROM 42 is an attitude detection program which may be run by the application processor 38 either periodically or constantly to detect the attitude i.e. the environment and circumstances of the mobile device 10, to permit adaptation of the device's outputs.
  • the steps performed by the attitude detection program stored in the ROM 42 when run by the application processor 38 in each of the embodiments will be described later. Note that each of the embodiments to be described is based upon the provision of the attitude detection program in the ROM 42, the differences between each embodiment lying in the steps performed by the mobile device 10 under the control of the attitude detection program of each embodiment, when run by the application processor 38.
  • Figures 6 and 7 illustrate how the screen 24, and camera 26 can be used together in the first embodiment to detect the mobile device 10 situation.
  • the first embodiment relies on the ability of the video camera 26 to collect images of the mobile device's situation, and in particular to discriminate whether the device is in a light or a dark place. Additionally, the first embodiment also relies on the use of the screen 24 as a light source, which can be used for lighting the surrounding environs of the device, for viewing by the camera. In particular, light emitted from the screen 24 can reflect off nearby objects and be captured by the camera 26, as well as ambient light, and any information thus obtained can be used to determine the mobile device 10 situation. Such an arrangement is shown in Figure 6, where light emitted by the screen 24 can reflect off any nearby objects, and be captured by the camera 26.
  • The attitude detection program stored in the ROM 42 uses the camera and screen to determine the device 10's situation, by first obtaining an image from the camera at step 8.2.
  • a determination is performed as to whether the image contains light. This determination may, for example, take the form of a thresholding operation looking at the grey scale values of the pixels of the image, and determining an average grey scale value, and comparing that average to a threshold.
  • If the average exceeds the threshold, the image is determined to contain light, i.e. is a light image.
  • the threshold may for example be set to be the median possible greyscale pixel value.
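  • A minimal sketch of this thresholding check, assuming 8-bit greyscale pixel values (the function name and the flat pixel list are illustrative assumptions):

```python
def image_contains_light(image_pixels, threshold=128):
    """Step 8.4-style check: average the greyscale pixel values and
    compare the average against a threshold. 128 is the median possible
    value for 8-bit greyscale, as the text suggests."""
    pixels = list(image_pixels)
    average = sum(pixels) / len(pixels)
    return average > threshold
```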
  • the mobile device preferably adapts its output such as the ringer and/or vibrator and/or screen to the detected hand or table top situation.
  • In the first embodiment it is not possible, using just a camera, to distinguish whether the device is in the user's hand or on the table top surface, but what is known at step 8.20 is that the device is not in an enclosed space such as a pocket. Therefore, the device knows that there is no need to increase the volume of the ringer, as the ringer will not be attenuated by the enclosed space. Therefore, the ringer volume can be kept at the default volume. Similarly, because the device knows that it is not in an enclosed space, there is a possibility of the user being able to see the screen of the device, and hence the device can cause the screen to be lit to alert the user to the receipt of a call or message.
  • When in a user's hand the vibrator can be activated at a reduced setting, and when on the table top it need not be activated at all.
  • Here, the device does not know whether it is in the user's hand or on a table top, and hence the vibrator may be activated at a reduced setting, i.e. taking the action which is most likely to attract the user's attention.
  • If the image captured by the camera at step 8.2 does not contain any light, i.e. the grey scale thresholding operation gives an average value below the threshold value, then at step 8.6 the device determines that it is necessary to perform a night check, and at step 8.8 lights the screen of the device.
  • The determination at step 8.12 may be identical to that performed at step 8.4, i.e. a thresholding operation performed on the average grey scale value of the image pixels.
  • the threshold may be set at a different, preferably lower, level than the threshold of step 8.4. The reason for this is that in step 8.4 an evaluation is performed as to whether the image contains ambient light i.e. daylight, which can be expected to be at a relatively high level.
  • At step 8.12 an evaluation is being performed as to whether the image contains light emitted from the screen 24 and reflected from nearby objects. Given the power of a typical screen 24 in a mobile device, and the amount of light emitted therefrom, the level of light detected at step 8.12 will therefore be relatively low. Hence, a lower threshold will typically be used at step 8.12 than at step 8.4.
  • the device can then infer at step 8.22 that it is either in the user's hand, or on the table top.
  • the device's outputs can be adapted at step 8.20, and in the same manner as before.
  • If it is determined at step 8.12 that the image does not contain any light, i.e. that the lower threshold value for the average grey scale of the image pixels is not met, then it is determined at step 8.14 that the phone must be in an enclosed space such as the user's pocket 16. In this case the attitude detection program then adapts the ringer and/or vibrator and/or screen output at step 8.16 to the pocket situation.
  • When in an enclosed space such as the user's pocket, the ringer is preferably caused to ring more loudly, and the vibrator to vibrate with a greater magnitude. However, it is not necessary to light the screen to attract the user's attention, as the user will be unable to see such a lit screen.
  • Thus, using just the camera 26 as a sensor and the screen 24 as a light source, it is possible for the mobile device 10 to detect whether or not it is in an enclosed situation such as a user's pocket, or whether it is either in the user's hand or on a surface.
  • the device 10 can then adapt its user alert outputs in dependence on the determined situation.
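  • Putting the above steps together, a minimal sketch of the Figure 8 flow (reusing the image_contains_light sketch above; the camera and screen objects, the recapture after lighting the screen, and the lower reflected-light threshold value are illustrative assumptions):

```python
AMBIENT_THRESHOLD = 128    # step 8.4: ambient (daylight) check
REFLECTED_THRESHOLD = 40   # step 8.12: dimmer screen-reflected light (illustrative)

def detect_situation_first_embodiment(camera, screen):
    """Day check on ambient light first; if dark, a night check with the
    screen lit as a light source (steps 8.2 to 8.22 of Figure 8)."""
    image = camera.capture()                                  # step 8.2
    if image_contains_light(image, AMBIENT_THRESHOLD):        # step 8.4
        return "hand_or_table"                                # adapt at step 8.20
    screen.light()                                            # steps 8.6-8.8
    lit_image = camera.capture()                              # recapture (assumed)
    screen.unlight()
    if image_contains_light(lit_image, REFLECTED_THRESHOLD):  # step 8.12
        return "hand_or_table"                                # step 8.22 -> 8.20
    return "enclosed_space"                                   # step 8.14 -> 8.16
```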
  • The great advantage of the first embodiment is that conventional mobile communications devices such as mobile telephones are typically provided with screens and cameras already, and hence no additional componentry is required within the device in order to operate according to the first embodiment. Instead, all that is required is additional software to cause the application processor 38 to control the screen 24 and camera 26, and to process the signals received therefrom. This is provided by the attitude detection program stored in the ROM 42.
  • Figure 9 illustrates how the speaker 20 can be controlled to emit sound waves, preferably of known characteristics, such as a test tone or the like.
  • the sound waves emanate cleanly from the speaker 20, and can be picked up by the microphone 22.
  • the signal detected by the microphone 22 should be substantially similar to the emitted test tone, or at least a known variation thereof.
  • Figure 11 is a flow diagram illustrating the operation of the attitude detection program in the ROM 42 according to the second embodiment, in performing such a determination.
  • the attitude detection program stores the test tone patterns, being the "bright", i.e. clean or undistorted, test tone pattern itself; information relating to a low attenuated version of the test tone pattern, i.e. corresponding to the test tone having undergone a low degree of attenuation and/or distortion; and a high attenuated version of the test tone pattern, i.e. a version of the test tone pattern which has undergone a high degree of attenuation and/or distortion.
  • the test tone patterns are stored as part of the data of the attitude detection program in the ROM 42.
  • the attitude detection program controls the application processor to cause the speaker 20 to emit the bright version of the test tone from the speaker.
  • the microphone 22 is controlled to record its input during the period while the test tone is being emitted, at step 11.6.
  • the recorded input from the microphone 22 is compared with the test tone patterns stored in the ROM 42, and a determination performed at step 11.10 as to which test tone pattern is most similar. How the comparison and determination steps 11.8 and 11.10 are performed is a matter of implementation detail, and is dependent upon the information representing the test tone patterns.
  • If the test tone patterns stored at step 11.2 represent actual signal patterns, then pattern matching techniques can be employed to compare the recorded input with the test tone patterns, and determine the pattern that is most similar.
  • Various conventional pattern matching techniques which may be used in this respect are known in the art, such as those used in speech recognition systems or the like.
  • Alternatively, a simpler comparison and determination can be performed.
  • the information relating to the low attenuated test tone and high attenuated test tone may, for example, simply be a signal threshold level which determines the degree of attenuation of the signal from the bright or unattenuated version.
  • the average power, or absolute average signal level, of the recorded input can be compared against the low attenuation and high attenuation threshold values, and a determination as to the degree of attenuation of the signal made based on this thresholding operation.
  • a distortion measurement of the signal may be used, rather than power or absolute signal level, with the distortion of the recorded input signal being compared with distortion values for the bright test pattern, the low attenuated test pattern, and the high attenuated test pattern. In further embodiments combinations of these measurements may be used to make the decision.
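  • A minimal sketch of the simpler, power-based comparison described above (the threshold fractions are illustrative assumptions, not values from the patent):

```python
def mean_power(signal):
    """Average power of a sampled audio signal (iterable of floats)."""
    samples = list(signal)
    return sum(s * s for s in samples) / len(samples)

def classify_attenuation(recorded, bright_power,
                         low_threshold=0.5, high_threshold=0.1):
    """Threshold-based version of steps 11.8-11.10: compare the power of
    the recorded microphone input against fractions of the power of the
    emitted ('bright') test tone."""
    ratio = mean_power(recorded) / bright_power
    if ratio >= low_threshold:
        return "bright"              # little or no attenuation
    if ratio >= high_threshold:
        return "low_attenuation"
    return "high_attenuation"
```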
  • Howsoever the determination is performed at step 11.10, at step 11.12 an evaluation is performed as to whether the bright pattern is most similar, and if this is not the case, then at step 11.18 an evaluation is performed as to whether the low attenuated test tone pattern was the most similar. If this evaluation returns negative, then at step 11.22 an evaluation as to whether the high attenuated pattern was the most similar is undertaken.
  • At step 11.24 the determination is made that, given the high degree of attenuation of the signal, the mobile device is in an enclosed situation such as the user's pocket. In such a case, processing proceeds to step 11.26, wherein the behaviour of the mobile device is then adapted to the pocket situation.
  • the adaptation of the mobile device outputs to the pocket situation at step 11.26 can be identical to that of step 8.16 of the first embodiment, described previously. That is, when determined to be in the pocket situation, the vibrator and ringer are caused to be of greater magnitude, and the screen is preferably not lit.
  • At step 11.18 an evaluation was performed as to whether the low attenuated pattern was most similar. If this is the case, then it is likely either that the phone is in an enclosed space such as a pocket (but the precise arrangement is such that a high attenuation of the signal has not occurred, although some attenuation or distortion has occurred) or that the phone is face down on a surface such as a table top. In this case, if the phone is face down then the speaker will likely be facing into the surface, and hence any sound waves emitted therefrom and subsequently recorded by the microphone will be distorted and/or attenuated by the surface.
  • At step 11.20 the determination is made that the mobile device is either face down on a table top, or in an enclosed situation such as a pocket, but it is not possible to distinguish further between these two situations.
  • There is no need to light the screen of the mobile device and, if the device is in fact face down on a table top, increasing the magnitude of the ringer and vibrator, as if the phone were in a pocket, will still attract the user's attention (and perhaps even more so). Therefore, in this circumstance, where it is not possible to distinguish between the phone being either face down on the table top or in a pocket, it is reasonable to adapt the phone's output to the pocket situation, and hence processing proceeds to step 11.26.
  • At step 11.12, if it was determined that the bright pattern was most similar, i.e. the recorded input from the microphone substantially reproduced the emitted test tone with very little attenuation or distortion, then it is possible for the mobile device to determine that it is either in the user's hand, or face up on a table top, and such determination is made at step 11.14. It is not possible to distinguish further between these two situations, but as in the first embodiment it is possible to reconcile this lack of information and adapt the behaviour of the device accordingly, to provide an output profile which is suitable to both the hand and table top situations. Therefore, at step 11.16 the device behaviour is adapted to the hand and table top situation, this adaptation being the same as previously described in step 8.20 of Figure 8 in respect of the first embodiment, i.e. the screen is caused to be lit, but both the ringer and vibrator outputs can be reduced in magnitude.
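  • The Figure 11 flow just described might be sketched as follows (reusing the classify_attenuation sketch above; the speaker and microphone objects and their methods are illustrative assumptions):

```python
def detect_situation_second_embodiment(speaker, microphone,
                                       test_tone, bright_power):
    """Emit the test tone (step 11.4), record the microphone input while
    it plays (step 11.6), classify the attenuation (steps 11.8-11.10),
    and map the result to a situation (steps 11.12-11.24)."""
    recorded = []
    speaker.play(test_tone)                    # step 11.4
    while speaker.is_playing():                # step 11.6
        recorded.append(microphone.read())
    category = classify_attenuation(recorded, bright_power)
    if category == "bright":                   # step 11.12 -> 11.14
        return "hand_or_face_up"               # adapt at step 11.16
    if category == "low_attenuation":          # step 11.18 -> 11.20
        return "pocket_or_face_down"           # adapt as pocket, step 11.26
    return "enclosed_space"                    # step 11.22 -> 11.24 -> 11.26
```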
  • Thus a mobile device 10 is able to use sound as a sensing medium, using a speaker to emit a known sound, and then recording that sound, as modified by the phone's surrounding environment, through the microphone. By then comparing the recorded sound with the original sound, a determination can be performed as to the mobile device's situation, and its behaviour adapted accordingly.
  • no additional componentry is required within the mobile device other than that which is conventionally provided. Instead, again as in the first embodiment, all that is required is additional software, in the form of the attitude detection program stored in the ROM 42 and, in this case, the additional data in the form of the test tone patterns or information against which the recorded signal can be compared.
  • Figure 12 is a table showing the degree to which sensor combinations experience attenuation or distortion depending on the situation in which the mobile device finds itself. For example, as shown in Figure 12 when the mobile device is in an enclosed situation, the sensor combination of speaker output and microphone input will suffer either a low, or a high degree of attenuation or distortion.
  • When the device is in a user's hand, the speaker output/microphone input sensor combination will not suffer any attenuation or distortion.
  • When the device is in an open situation, such as on the table, attenuation or distortion will be experienced depending on whether the device is face up, or face down. Similar considerations can be made for each of the other sensor combinations, as shown in the table.
  • the table also includes in this case the signal attenuation or distortion which would be suffered by a rear camera input provided on the mobile device.
  • It is common for many conventional mobile devices to be provided with two cameras, being one on the front face of the device, i.e. the same face as the screen, and another on the rear face of the device, i.e. the opposite face to the screen.
  • By rear camera input in the table, we mean the input image obtained from the rear camera on the opposite face of the device to the screen.
  • When the device is in an enclosed space, the rear camera input will be highly attenuated and/or distorted, i.e. will be dark.
  • When the device is on a surface, the rear camera input will be the opposite of the front camera input, i.e. it will be highly attenuated when the phone is in the face up position, such that the face on which the rear camera is mounted is down against the surface, and will not be attenuated at all when the mobile device is in the face down position, i.e. the face on which the rear camera is mounted is facing upwards.
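  • The signatures just described can be summarised in code; the following is a partial reconstruction of the Figure 12 table from the surrounding text only (entries the text does not state are left as None):

```python
# situation: (speaker output / microphone input, rear camera input)
EXPECTED_ATTENUATION = {
    "enclosed_space":  ("low or high attenuation", "high (dark image)"),
    "hand":            ("none",                    None),  # not stated in the text
    "face_up_table":   ("none",                    "high (dark image)"),
    "face_down_table": ("low attenuation",         "none (light image)"),
}
```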
  • At step 14.2 an image is obtained from the rear camera on the mobile device, where this is provided.
  • An evaluation is then performed as to whether the image from the rear camera is highly attenuated i.e. is the image dark or light, at step 14.4. As in the first embodiment, this evaluation can be performed by taking the grey scale values of the image pixels, and finding the average grey scale value.
  • If the rear camera image is found to be dark, the attitude determination program can conclude that, of the two candidate situations, i.e. face down on the table top or in a pocket, it is likely that the phone is in the user's pocket, and hence at step 14.12 the phone behaviour can then be adapted to the pocket situation.
  • This adaptation can preferably take the same form as the adaptation used in the same situation in the first and second embodiments, such as at step 11.26 or step 8.16.
  • If instead the rear camera image contains light, the attitude determination program can conclude at step 14.6 that the phone is probably face down on the table top, such that the rear camera input is then capturing ambient light. In this case, the program then proceeds, at step 14.8, to adapt the phone to the face down on table top situation. As described previously, here it is not necessary to light the screen, and neither is it necessary to operate the vibrator, as the phone is not in any way in tactile communication with the user. Instead, all that need be activated is the phone ringer, which may either be kept at the default volume, or increased in magnitude.
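  • A minimal sketch of this secondary determination (reusing the image_contains_light sketch above; the rear_camera object is an illustrative assumption):

```python
def refine_pocket_vs_face_down(rear_camera):
    """Figure 14 refinement: after a primary result of 'pocket or face
    down', a light rear-camera image indicates face down on a surface
    (steps 14.6-14.8); a dark one indicates a pocket (step 14.12)."""
    image = rear_camera.capture()        # step 14.2
    if image_contains_light(image):      # step 14.4
        return "face_down_table"         # step 14.6 -> 14.8
    return "enclosed_space"              # -> adapt to pocket at step 14.12
```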
  • the further embodiments based on Figure 14 provide the ability to further distinguish between the mobile device situations, and in particular as to whether the device is face down on a table top, or in a user's pocket.
  • the use of the rear camera means that the sensing medium is always light, as the front camera and screen are used to provide the initial determination, and then the image captured from the rear camera used to provide the secondary determination.
  • Alternatively, sound is used as the sensing medium to make the primary determination, and then light, in terms of the image captured from the rear camera, is used to make the secondary sensing determination. In both cases, however, multiple sensing devices are used to further distinguish the mobile device situation.
  • a vibrator and accelerometer sensor combination is used to provide further distinction between the mobile phone situations.
  • By combining the information obtainable by use of a vibrator and accelerometer sensor combination with other sensor combinations, such as the microphone and speaker combination, full distinction between the four possible situations shown previously in Figures 2 to 4 becomes possible.
  • Whilst the use of a vibrator and accelerometer sensor combination was previously disclosed in US 2006/0172706, mentioned previously, therein it was merely used to distinguish between two possible mobile device situations, i.e. whether the device was in the user's hand or not.
  • First, a test is performed using the speaker and microphone sensor combination.
  • This test is essentially the same as the second embodiment described previously, and involves emitting a test tone, recording the microphone input signal whilst the test tone is being emitted, and then comparing the recorded signal with test tone patterns to determine the degree of attenuation or distortion of the signal.
  • The recorded signal is examined to determine whether it is highly attenuated; if this is the case then the phone is likely in a user's pocket or other enclosed space, and hence at step 13.6 the phone behaviour is adapted to the pocket situation.
  • the adaptation to the pocket situation is preferably as described previously in the other embodiments.
  • At step 13.8 an evaluation is performed as to whether the signal suffered a low degree of attenuation or distortion. If this was the case, then as described previously in respect of the second embodiment, it is possible to conclude that the phone is likely either in a pocket, or face down on a table. This determination is made at step 13.10. To distinguish between the situations, it might be possible, as in the embodiment just described, to use an input image from a rear camera, if provided. However, in the present embodiment, the accelerometer and vibrator sensor combination is used to distinguish between the two situations, and this is performed at step 13.12.
  • the test involves, as in the prior art, activating the vibrator for a brief period of time, and at the same time using the accelerometer to record the vibration pattern experienced by the device. If the recorded pattern is highly attenuated compared to what was expected, then it is likely that the phone is being held close to the user's body or is in close contact with other material, and hence is likely to be in a pocket or other enclosed space. In contrast, if a low attenuation of the vibration is perceived by the accelerometer, then it is likely that the phone is on the table top.
  • If processing reaches step 13.20, it must be the case that the recorded sound was substantially similar to the test tone, suffering little or no distortion or attenuation. In this case, as in the second embodiment, it is possible to conclude at step 13.22 that the mobile device is either in the user's hand, or face up on a table. However, in the previous embodiments it was not possible to further distinguish between these situations.
  • Next, the mobile device situation is tested with the vibrator and accelerometer sensor combination, the testing process being substantially as described previously, that is, substantially the same as in the prior art. If the vibrations produced by the vibrator as measured by the accelerometer have suffered a high degree of attenuation, as evaluated at step 13.26, then, as in the prior art, it is possible to deduce that the phone is likely in the user's hand. In this case, as shown at step 13.28, the mobile device behaviour is then adapted to the hand situation as previously described.
  • If the evaluation at step 13.26 does not detect a high degree of attenuation of the vibrations produced by the vibrator as measured by the accelerometer, and there is either no attenuation, or a low degree of attenuation, as determined at step 13.30, then it is possible to conclude, as shown at step 13.32, that the phone is likely on the table top, face up.
  • the mobile device behaviour is then adapted to the face-up-on-table situation as described previously i.e. the vibrator is disabled, the screen is caused to be lit, and the ringer can be activated at either the default, or increased, volume.
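  • Putting the third embodiment together, a minimal sketch of the Figure 13 flow (the device methods stand in for the sound test of Figure 11 and the vibrator/accelerometer test, and are illustrative assumptions):

```python
def detect_situation_third_embodiment(device):
    """Speaker/microphone test first (steps 13.2-13.4), then the
    vibrator/accelerometer test to split the ambiguous cases."""
    category = device.run_sound_test()             # as in Figure 11
    if category == "high_attenuation":
        return "enclosed_space"                    # pocket, step 13.6
    if category == "low_attenuation":              # steps 13.8-13.10
        # step 13.12: heavily damped vibration => close contact => pocket
        if device.vibration_highly_attenuated():
            return "enclosed_space"
        return "face_down_table"
    # 'bright' sound: hand or face up (steps 13.20-13.22)
    if device.vibration_highly_attenuated():       # evaluated at step 13.26
        return "hand"                              # step 13.28
    return "face_up_table"                         # steps 13.30-13.32
```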

Abstract

Embodiments of the invention provide a mobile device, such as, for example, a mobile telephone, which uses sensor components which are conventionally already found in such mobile devices, such as cameras and microphones, to perform a determination as to the present situation of the device. For example, in one embodiment a camera mounted on the mobile device (and which would typically be provided for other uses) is able to capture images of the surroundings of the device, and processing of those images is then undertaken to determine a present situation of the device. Likewise, in another embodiment a sound signal captured by a microphone can be processed to provide an estimate of the device's present situation. In preferred embodiments a stimulus signal for measurement by the sensor can be provided, and this again is preferably provided by a component which is typically standard in such a device. Thus, for example, to aid situation determination by a camera, the mobile device screen can be caused to light, to attempt to light up the device's present surroundings. In preferred embodiments the ability to distinguish between situations allows for the mobile device behaviour to be adapted in dependence on the discrimination. For example, the mobile device ringer volume may be changed, or the magnitude or frequency of vibration of the device vibrator altered.

Description

Mobile Device and Method of Operation Thereof
Technical Field
The present invention relates to a mobile device, and method of operation thereof, and in particular to a mobile device and method of operation in which the device is able to sense its present situation, and, optionally, adapt its behaviour in dependence thereon.
Background to the Invention
Mobile devices such as mobile communications devices are well known in the art. Figure 1 is a block diagram of the internal elements of such a device, being a conventional mobile telecommunications device 10. Here, the mobile telecommunications device 10 comprises RF processor hardware 32, baseband processor hardware 34, and a power regulator 36, all of which deal with the telecommunications operations of the mobile telecommunications device 10, i.e. using telecommunications protocols to make, for example, voice calls, data connections, and the like. The user interface and applications which run on the smart phone are run by the application processor 38, which runs the graphical user interface, as well as any applications requested by the user, and provides an interface to the telecommunications stack provided by the baseband processor. Also typically provided are secondary communication subsystems, such as, for example, Bluetooth subsystem 40. An infrared subsystem may also be provided.
The mobile telecommunications device 10 is further provided with various memory, such as ROM 42, RAM 44 and user data memory 46. The graphical user interface and any data which is required therefor, such as icon images, and the like are stored in ROM 42. RAM 44 typically stores any applications available on the device 10, as well as associated data. User data memory 46 stores data which is accessible by the user, such as contact data, messages, images, user settings data, and the like.
Concerning the physical user interface of such a typical mobile telecommunications device 10, and in particular those elements thereof which can be used to alert the user to an incoming call or message, the device 10 is typically provided with a speaker 20 for providing an audio output, and a corresponding microphone 22, which provides an audio input into the device. As is well known, the speaker 20 can be used to provide an audio output during the voice or video call, or can, in some devices of the prior art, also provide other audio output, such as for playing digitally encoded music tracks or the like. The microphone 22 is provided to capture a user's voice during voice or video calls, and can also be used via other applications. To provide a video output, a screen 24 is typically provided and it is also common for a video camera 26, capable of capturing still or moving images, to be provided. Very often, the screen 24 is used to display the image presently being captured by the camera 26 i.e. the signal flow is from the camera 26 to the screen 24.
In addition to the above, typical mobile telecommunications devices 10 of the prior art also include a vibrator 29, typically a pancake/coin motor, as is well known in the art. Other components capable of producing a vibration are also known, such as piezoelectric vibration generation devices, which can also be used. The usual operation is for the vibrator to be activated when an incoming call or message is received, to provide a physical movement of the device which can be felt by a user, for example if the device is in a user's pocket or hand.
It has also been proposed within the prior art, although this is not typical, to include an accelerometer 28 within a mobile communications device 10, to detect movement of the device. A prior example of such a mobile telecommunications device incorporating a vibrator and an accelerometer will be discussed next.
Prior Art
United States patent application publication number 2006/0172706 describes a wireless communications device having a vibration motor therein, arranged to vibrate the wireless device for a predetermined period. The wireless device is also provided with an accelerometer, at which an acceleration measurement can be taken during the period when the vibration motor is activated. By combining an onboard accelerometer with an onboard vibrator, wireless devices are then given the means to detect if they are being held by users. Particularly, if a wireless device is on the table or is in the user's pocket and is not being held by anyone, the accelerometer can predictably measure the acceleration pattern that occurs when a vibrator is turned on. However, the acceleration patterns differ when a wireless device is being held by a person and when it is not being held, and in particular the acceleration patterns that are measured by the accelerometer reflect the effective mass seen by the vibrator. When the wireless device is being held, the effective mass is greater as it includes the mass of the wireless device and that of the user's hand and arm. Thus, the wireless device provided with the vibrator and accelerometer can determine if it is being held by turning on its vibrator for a predetermined period of time and by reading the output of its accelerometer.
US 2006/0172706 therefore describes how, using a vibrator and accelerometer, a wireless device is able to determine information pertaining to its surrounding environment, and adjust its operation accordingly. However, the technique of US 2006/0172706 requires the wireless device to be fitted with an accelerometer, a component which is not usually included within such devices, and which adds to the cost. Moreover, the exclusive use of an accelerometer to detect the device's situation means that only a relatively small number of situations can be detected, i.e. whether the device is being held or not. Whilst the ability of a device to adapt its operation depending on its surrounding environment as described in US 2006/0172706 is useful, it would be more advantageous if such sensing of its surrounding environment could be performed without the need for additional components within the device, thus saving cost and component integration difficulties.
Summary of the Invention
Embodiments of the invention provide a mobile device, such as, for example, a mobile telephone, which uses sensor components which are conventionally already found in such mobile devices, such as cameras and microphones, to perform a determination as to the present situation of the device. For example, in one embodiment a camera mounted on the mobile device (and which would typically be provided for other uses) is able to capture images of the surroundings of the device, and processing of those images is then undertaken to determine a present situation of the device. Likewise, in another embodiment a sound signal captured by a microphone can be processed to provide an estimate of the device's present situation. In preferred embodiments a stimulus signal for measurement by the sensor can be provided, and this again is preferably provided by a component which is typically standard in such a device. Thus, for example, to aid situation determination by a camera, the mobile device screen can be caused to light, to attempt to light up the device's present surroundings. Similarly, where a microphone is being used as the sensor device, the mobile device speaker (which would also be typically provided) can emit a test sound, and attenuation, distortion or other changes in the sound as measured by the microphone are used to inform the situation determination. In further embodiments, multiple sensors may be used together, and this allows for additional situations to be discriminated between. In preferred embodiments the ability to distinguish between situations allows for the mobile device behaviour to be adapted in dependence on the discrimination. For example, the mobile device ringer volume may be changed, or the magnitude or frequency of vibration of the device vibrator altered.
In view of the above, from a first aspect the present invention provides a mobile device comprising: a light or sound sensor; and a processor for processing output signals from said light or sound sensor; the arrangement being such that said processor determines from said output signals a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found. Given that light and sound sensors are typically already provided on mobile devices for other uses, the use of such sensors in the context of the present invention means that no additional device componentry is required.
Preferably, the possible situations comprise at least two or more selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface. These are typical situations in which a mobile device may be found.
Preferably, the device behaviour is adapted in dependence on the determination of the present situation. The adaptation may take the form, for example, of adapting a ringer volume, or a vibrator vibration intensity or frequency. Alternatively, the screen brightness may be controlled.
In preferred embodiments the light sensor is a camera and the sound sensor is a microphone. Such components are typically already found on mobile devices, and hence the component cost of such a device is kept lower. Preferably in some embodiments the mobile device has light and sound sensors, said determination being performed in dependence on output signals from both such sensors. Using multiple such sensors allows for discrimination of a larger number of situations than when using just one sensor.
Preferably the mobile device has a second light sensor in addition to said first light sensor, said determination being performed in dependence on output signals from both such sensors. Again, using two such sensors allows for more situations to be discriminated. Preferably, to aid the discrimination, the second light sensor is provided on a different face of said device than said first light sensor.
In preferred embodiments the mobile device further comprises light or sound emitters, the arrangement being such that output signals from said light or sound sensor are obtained whilst said light or sound emitter is emitting. In particular, preferably said light or sound sensor input signal is compared by said processor with an output signal emitted by said light or sound emitter to perform said situation determination. This allows for a more accurate situation determination to be performed.
In preferred embodiments said light emitter is a display screen of said mobile device and said sound emitter is a speaker of said mobile device. Again, these are components typically found on a mobile device, and hence component count and cost can be kept low.
In preferred embodiments the mobile device further comprises: a vibrator; and an accelerometer arranged to measure vibrations of the said mobile device caused, at least in part, by said vibrator; the arrangement further being such that said processor receives an accelerometer output signal indicative of the measured vibrations, and said situation determination is performed in dependence thereon. Providing such a vibrator and accelerometer sensor combination in addition to the light and/or sound sensors further increases the number of situations which may be discriminated between.
From another aspect the present invention further provides a mobile device having a plurality of sensors for sensing one or more sensor media; a processor for processing signals produced by said sensors indicative of the present state of the sensor media; the arrangement being such that said processor determines from said produced signals a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found. Thus, using multiple sensors a greater number of mobile device situations can be discriminated between than has heretofore been the case.
In preferred embodiments, said processor determines from said produced signals a particular present situation of at least four predetermined possible situations in which said mobile device is to be presently found. The number of situations is therefore further increased.
Preferably, the predetermined possible situations are selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface. These are typical situations in which a mobile device may be found.
In embodiments the multiple sensors preferably comprise two or more sensors selected from the group comprising: a first light sensor, a second light sensor, a sound sensor, and a motion sensor. Preferably said first light sensor is a first camera mounted on a first face of the mobile device. Such components are typical in mobile devices. Likewise, preferably said second light sensor is a second camera mounted on a second face of the mobile device. Again, such components are also common, and hence no additional components are used.
Similarly, preferably said sound sensor is a microphone and said motion sensor is an accelerometer.
In preferred embodiments the mobile device preferably further comprises at least one emitter for emitting energy in the form of at least one of the sensor media detected by at least one of the sensors. This allows for an active investigation of the device's present surroundings to be undertaken, thus increasing the capability and accuracy of the discrimination between situations.
Preferably said produced signals are obtained from at least one of said sensors whilst said at least one emitter is emitting energy. More preferably said determination performed by said processor includes comparing a produced signal from at least one of the sensors which senses the sensor medium in which said emitter emits energy with an output signal of the emitter. Again, such functions allow the number of situations which can be discriminated to be higher than has heretofore been the case.
In embodiments the at least one emitter is preferably chosen from a group comprising: a light emitter, a sound emitter, and a motion generator. Preferably the light emitter is a display screen of the mobile device, and preferably the sound emitter is a speaker of the mobile device. Such components are typically provided in mobile devices for other uses, and hence component count and cost is not increased.
From a third aspect the invention provides a method of operating a mobile device provided with a light or sound sensor, comprising the steps of: obtaining output signals from said light or sound sensor indicative of the mobile device surroundings; and processing said output signals to determine therefrom a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found.
The same advantages as previously described in respect of the first aspect are obtained. Moreover, the same further features and associated advantages as previously described in respect of the first aspect may also be provided.
From a fourth aspect the invention also provides a method of operating a mobile device having a plurality of sensors for sensing one or more sensor media, comprising the steps of: obtaining signals from said plurality of sensors, said signals being indicative of the present state of the sensor media; and processing said signals to determine therefrom a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found.
The same advantages as previously described in respect of the second aspect are obtained. Moreover, the same further features and associated advantages as previously described in respect of the second aspect may also be provided.

From another aspect the invention additionally provides a computer program or suite of computer programs so arranged such that when executed by a computer processor they cause the computer to perform the steps of any of the third and fourth aspects above. Moreover, additionally provided is a machine readable storage medium storing the computer program or at least one of the suite of computer programs according to the fifth aspect. The machine readable storage medium may be any medium known in the art, such as solid state memory, optical discs, magneto-optical discs, magnetic discs, or the like.
Brief Description of the Drawings
Further features and advantages of the present invention will become apparent from the following description of embodiments thereof, presented by way of example only, and by reference to the accompanying drawings, wherein like reference numerals refer to like parts, and wherein:
Figure 1 is a block diagram of a mobile communications device of the prior art;
Figure 2 is a drawing illustrating a first situation in which such a device may be found;
Figure 3 is a drawing illustrating a second situation in which such a device may be found;
Figure 4a is a drawing illustrating a third situation in which such a device may be found;
Figure 4b is a drawing illustrating a further situation in which such a device may be found;
Figure 5 is a block diagram of a mobile communications device according to a first embodiment of the invention;
Figure 6 is a diagram illustrating how light emitted from a screen can be reflected into a camera;
Figure 7 is a diagram illustrating how light emitted from a screen can be absorbed by surrounding material;
Figure 8 is a flow diagram of a method of operation of a mobile communications device according to a first embodiment of the invention;
Figure 9 is a diagram illustrating how sound emitted from a speaker can be picked up by a microphone;
Figure 10 is a diagram illustrating how sound emitted via a speaker can be absorbed by surrounding material;
Figure 11 is a flow diagram illustrating a method of operation of a mobile communications device according to a second embodiment of the invention;
Figure 12 is a table illustrating how various outputs and inputs of a mobile communications device are affected by various mobile device situations;
Figure 13 is a flow diagram of a method of operation of a mobile communications device according to another embodiment of the invention; and
Figure 14 is a flow diagram of a method of operation of a mobile communications device according to yet another embodiment of the invention.
Description of the Embodiments
Embodiments of the invention to be described relate to a mobile device, such as a mobile communications device, for example a telephone, or another mobile device such as a PDA or the like, which is able to sense its surroundings, and then, optionally, adapt its behaviour in dependence on the sensed surroundings. Within the description below focus is placed on describing the mobile device as being a telephone, but this is not to be taken as a limiting feature, and in other embodiments of the invention the mobile device may be any other type of device, such as, as mentioned, a PDA, or a laptop, media player, or the like.
Figures 2, 3 and 4 depict particular situations in which a mobile communications device 10 may be found. More particularly, Figure 2 illustrates how a mobile communications device 10 can commonly be found in an enclosed space, such as a user's pocket. In this case, the device 10 is in close contact with the material of the enclosed space 16, such that the material substantially surrounds the device 10. Commonly, the material of the enclosed space 16 would be a type of fabric.
Figure 3 illustrates a second situation in which a mobile device 10 may be found, in this case being held in the hand of a user 12. Typically, this would be when the device is being used, or about to be used. Figures 4a and 4b illustrate a third situation which is commonly found, that is where the device 10 is simply placed on a surface, such as table top 14. In this case, as shown in Figure 4a the device may be placed face up, or, as shown in Figure 4b, face down.
Depending on which of the above situations a mobile device finds itself in, in embodiments of the invention the device may conveniently adapt its behaviour in dependence on the situation. For example, in the case of the device finding itself in an enclosed situation such as shown in Figure 2, where the device is commonly within a user's pocket, then the ringer of the device may be made louder, so that the user can hear the ringer more easily. Likewise, any vibrator provided in the device 10 may also be controlled to cause the device to vibrate with a greater magnitude and/or different (preferably higher) frequency, so that the user feels the vibrations caused by the device more clearly. Similarly, however, because the device is in an enclosed space, and the user cannot see the screen of the device 10, it would not be useful for the screen to be lit whilst in the enclosed space. Therefore, the device 10, upon detecting that it is in an enclosed space 16, may control the screen so that it is not lit. This has the advantage of saving device battery power, which is an important consideration for mobile devices.
When the device finds itself in the second situation shown in Figure 3, wherein the device is in its user's hand, then the user has direct tactile communication with the device 10, which tactile communication can be used to attract the user's attention, for example without having to activate the ringer of the device. Thus, for example, where the device 10 finds itself in a user's hand 12, then the ringer may be controlled so as to be of reduced volume, or to be rendered mute, as not being necessary. Whilst in a user's hand 12, it is likely that the user may be looking at the device, and hence in this case it would be useful for the screen of the device 10 to be lit. Likewise, being in a user's hand, the user will feel any vibrations caused by the vibrator in the device, but given the direct tactile communication between the device and the user's hand, it is not necessary for these vibrations to be of any great magnitude. Thus, for example, the magnitude of vibrations and/or frequency of vibrations produced by a vibrator in the device can be reduced. Again, muting the ring tone, and reducing the magnitude of vibrations produced by the vibrator, saves battery power.

A third situation in which the device may find itself is that of Figure 4a, i.e. face up on a surface 14, such as a table top, or the like. In this case, the device is not in contact with, or necessarily in close proximity to, a user, and hence there is no need to provide a tactile output in the form of the vibrator. Thus, in this situation, the vibrator within the device can be disabled. With respect to the ringer, however, the ringer may be set at its default volume, or may, alternatively, be caused to be louder. Likewise, given that the device is face up, it is possible that the user may be able to see the screen of the device, and hence the screen is preferably lit, or made brighter. In this way, only those outputs of the device which are able usefully to attract the user's attention are used.
In the case of Figure 4b, where the device finds itself face down on the surface, then as in the situation of Figure 4a there is no need to activate the vibrator, as the device is not in tactile communication with a user, and the user will be unable to feel such vibrations. Again as with the situation of Figure 4a, however, it is necessary to activate the ringer, either at the default volume or louder if necessary. In contrast to Figure 4a, however, because the device is face down there is no need to activate the screen, as any light emitted by the screen will be blocked by the surface 14. Thus, in the situation of Figure 4b the screen does not need to be lit, thus further saving power.
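By way of illustration only, the four output adaptations described above can be summarised in a simple lookup structure. The following Python sketch is not taken from the embodiments themselves; the profile names and field values are assumptions chosen to match the behaviour described in the preceding paragraphs.

# Illustrative sketch only: one possible encoding of the output profiles
# described above; all names and values are expository assumptions.
from dataclasses import dataclass

@dataclass
class OutputProfile:
    ringer: str       # "off", "reduced", "default" or "loud"
    vibrator: str     # "off", "reduced" or "strong"
    screen_lit: bool

PROFILES = {
    "pocket":          OutputProfile(ringer="loud",    vibrator="strong",  screen_lit=False),
    "hand":            OutputProfile(ringer="reduced", vibrator="reduced", screen_lit=True),
    "face_up_table":   OutputProfile(ringer="default", vibrator="off",     screen_lit=True),
    "face_down_table": OutputProfile(ringer="default", vibrator="off",     screen_lit=False),
}

def adapt_outputs(situation: str) -> OutputProfile:
    # Look up the output profile for a detected situation.
    return PROFILES[situation]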
Having described the possible modifications to the mobile communication device's behaviour which can be performed in dependence on the sensing of the device's situation or circumstances, several embodiments illustrating how the device may sense its surroundings will now be described. In particular, embodiments of the invention focus on using components which have become standard in mobile telecommunications devices, such as speakers and microphones, and cameras and screens, to enable sensing of the device's surroundings. In this way, reliance on relatively complicated and expensive components such as accelerometers is reduced.
Figure 5 is a block diagram of a mobile communications device 10 according to the embodiments of the invention. The mobile communications device 10 of Figure 5 is identical to the mobile communications device 10 of Figure 1 described previously, but with the difference that stored in the ROM 42 is an attitude detection program which may be run by the application processor 38, either periodically or constantly, to detect the attitude, i.e. the environment and circumstances, of the mobile device 10, to permit adaptation of the device's outputs. The steps performed by the attitude detection program stored in the ROM 42 when run by the application processor 38 in each of the embodiments will be described later. Note that each of the embodiments to be described is based upon the provision of the attitude detection program in the ROM 42, the differences between each embodiment lying in the steps performed by the mobile device 10 under the control of the attitude detection program of each embodiment, when run by the application processor 38.
In view of the above, a first embodiment will be described with respect to Figures 6 to 8. Within the first embodiment, light is used as the sensing medium.
Figures 6 and 7 illustrate how the screen 24 and camera 26 can be used together in the first embodiment to detect the mobile device 10 situation. The first embodiment relies on the ability of the video camera 26 to collect images of the mobile device's situation, and in particular to discriminate whether the device is in a light or a dark place. Additionally, the first embodiment also relies on the use of the screen 24 as a light source, which can be used for lighting the surrounding environs of the device, for viewing by the camera. In particular, light emitted from the screen 24 can reflect off nearby objects and be captured by the camera 26, as well as ambient light, and any information thus obtained can be used to determine the mobile device 10 situation. Such an arrangement is shown in Figure 6, where light emitted by the screen 24 can reflect off any nearby objects, and be captured by the camera 26. In contrast, as shown in Figure 7, when the device 10 finds itself in an enclosed space, and is surrounded by, often dark, material such as that of the pocket 16, then in this case the camera 26 will not capture any ambient light, unlike in the case of Figure 6, and moreover, due to the close proximity of the material 16 to the screen 24, any light emitted by the screen 24 is either absorbed by, or reflected with a high degree of attenuation from, the material 16 of the enclosed space, such as a pocket. In this case, even when the screen 24 is lit, the camera 26 is unable to capture a light image.

The attitude detection program stored in the ROM 42 of the first embodiment is therefore able to use such image information obtainable from the camera to discriminate between the various situations in which the mobile device may find itself. An example operation of the attitude detection program stored in the ROM 42 in performing such a discrimination is shown in Figure 8. More particularly, with reference to Figure 8, the attitude detection program stored in the ROM 42 and run on the application processor 38 within the first embodiment uses the camera and screen to determine the device 10's situation, by first obtaining an image from the camera at step 8.2. At step 8.4 a determination is performed as to whether the image contains light. This determination may, for example, take the form of a thresholding operation looking at the grey scale values of the pixels of the image, determining an average grey scale value, and comparing that average to a threshold. Where the average grey scale value is above the threshold (where a higher grey scale value indicates white, and a lower grey scale value indicates black), the image is determined to contain light, i.e. is a light image. Here, the threshold may for example be set to be the median possible greyscale pixel value.
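As a minimal sketch, and assuming an 8-bit greyscale image supplied as a flat sequence of pixel values, the thresholding test just described might be implemented along the following lines in Python; the default threshold of 128 stands in for the median possible greyscale pixel value mentioned above, and is an assumption rather than a value taken from the embodiment.

from typing import Sequence

def image_contains_light(pixels: Sequence[int], threshold: int = 128) -> bool:
    # Average the greyscale pixel values (0 = black, 255 = white) and
    # compare the average against the threshold, as in step 8.4.
    average = sum(pixels) / len(pixels)
    return average > threshold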
In this case, if the image from the camera does contain light, i.e. the average greyscale pixel value is greater than the threshold value, then the device knows that it is not in an enclosed space, and must either be in a user's hand, or on a table top, as determined at step 8.18. With this knowledge, at step 8.20 the mobile device preferably adapts its output such as the ringer and/or vibrator and/or screen to the detected hand or table top situation.
In the first embodiment it is not possible, using just a camera, to distinguish whether the device is in the user's hand or on the table top surface, but what is known at step 8.20 is that the device is not in an enclosed space such as a pocket. Therefore, the device knows that there is no need to increase the volume of the ringer, as the ringer will not be attenuated by the enclosed space. Therefore, the ringer volume can be kept at the default volume. Similarly, because the device knows that it is not in an enclosed space, there is a possibility of the user being able to see the screen of the device, and hence the device can cause the screen to be lit to alert the user to the receipt of a call or message. With respect to the vibrator, we described previously that when in a user's hand the vibrator can be activated at a reduced setting, and when on the table top it need not be activated at all. Within the first embodiment at step 8.20 the device does not know whether it is in the user's hand or on a table top, and hence the vibrator may be activated at a reduced setting, i.e. taking that result which is most likely to attract the user's attention.

Returning to step 8.4, if the image captured by the camera at step 8.2 does not contain any light, i.e. the grey scale thresholding operation gives an average value below the threshold value, then at step 8.6 the device determines that it is necessary to perform a night check, and at step 8.8 lights the screen of the device. This will cause light to be emitted from the screen 24, as shown in Figures 6 and 7. At the same time as the screen is lit, a second image is obtained from the camera at step 8.10, and an evaluation as to whether the image contains light is performed at step 8.12. The evaluation at step 8.12 may be identical to that performed at step 8.4, i.e. a thresholding operation performed on the average grey scale value of the image pixels. However, in this case the threshold may be set at a different, preferably lower, level than the threshold of step 8.4. The reason for this is that in step 8.4 an evaluation is performed as to whether the image contains ambient light, i.e. daylight, which can be expected to be at a relatively high level. However, in step 8.12 an evaluation is being performed as to whether the image contains light emitted from the screen 24 and reflected from nearby objects. Given the power of a typical screen 24 in a mobile device, and the amount of light emitted therefrom, the level of light detected at step 8.12 will therefore be relatively low. Hence, a lower threshold will typically be used at step 8.12 than at step 8.4.
If, however, it is determined that the image does contain light at step 8.12 above the set threshold level, then this is likely because light emitted from the screen 24 has reflected off nearby objects i.e. has not been absorbed or highly attenuated by material in close proximity to the screen, as would be the case if the mobile device 10 was in a user's pocket or the like. Thus, the device can then infer at step 8.22 that it is either in the user's hand, or on the table top. In this case, the device's outputs can be adapted at step 8.20, and in the same manner as before.
Returning to step 8.12, if it is determined thereat that the image does not contain any light i.e. that the lower threshold value for the average grey scale of the image pixels is not met, then it is determined at step 8.14 that the phone must be in an enclosed space such as the user's pocket 16. In this case the attitude detection program then adapts the ringer and/or vibrator and/or screen output at step 8.16 to the pocket situation. As described previously, when in an enclosed space such as the user's pocket, the ringer is preferably caused to ring more loudly, and the vibrator to vibrate with a greater magnitude. However, it is not necessary to light the screen to attract the user's attention, as the user will be unable to see such a lit screen.
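Pulling the steps of Figure 8 together, the overall flow of the first embodiment might be sketched as follows, reusing the image_contains_light() helper above. The capture_image() and light_screen() callables are hypothetical stand-ins for the camera and screen drivers, and the lower night-check threshold of 32 is an assumed value, not one given in the description.

AMBIENT_THRESHOLD = 128   # step 8.4: ambient light check (assumed median value)
NIGHT_THRESHOLD = 32      # step 8.12: assumed lower threshold for the lit-screen check

def detect_situation(capture_image, light_screen) -> str:
    # Classify the device situation using only the front camera and the screen.
    pixels = capture_image()                                # step 8.2
    if image_contains_light(pixels, AMBIENT_THRESHOLD):     # step 8.4
        return "hand_or_table"                              # step 8.18, adapt at 8.20
    light_screen()                                          # steps 8.6-8.8: night check
    pixels = capture_image()                                # step 8.10
    if image_contains_light(pixels, NIGHT_THRESHOLD):       # step 8.12
        return "hand_or_table"                              # step 8.22, adapt at 8.20
    return "enclosed"                                       # step 8.14, adapt at 8.16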
Thus, according to the first embodiment, using just the camera 26 as a sensor and the screen 24 as a light source it is possible for the mobile device 10 to detect whether or not it is in an enclosed situation such as a user's pocket, or whether it is either in the user's hand or on a surface. The device 10 can then adapt its user alert outputs in dependence on the determined situation. The great advantage of the first embodiment is that conventional mobile communications devices such as mobile telephones are typically provided with screens and cameras already, and hence no additional componentry is required within the device in order to operate according to the first embodiment. Instead, all that is required is additional software to cause the application processor 38 to control the screen 24 and camera 26, and to process the signals received therefrom. This is provided by the attitude detection program stored in the ROM 42.
A second embodiment will now be described with respect to Figures 9 to 11. In the second embodiment, sound is used as the sensing medium for the mobile device 10, in that the speaker 20 of the mobile device 10 can be caused to emit a test signal, which is then recorded by the microphone 22. By comparing the signal recorded by the microphone with the expected test pattern emitted by the speaker, an estimate of the environmental situation of the mobile device, and in particular of which of the situations noted previously it finds itself in, can be made.
More particularly, Figure 9 illustrates how the speaker 20 can be controlled to emit sound waves, preferably of known characteristics, such as a test tone or the like. When the device 10 is not in an enclosed space such as a user's pocket, then the sound waves emanate cleanly from the speaker 20, and can be picked up by the microphone 22. Here, the signal detected by the microphone 22 should be substantially similar to the emitted test tone, or at least a known variation thereof.
In contrast, as shown in Figure 10, when the device 10 is in an enclosed space, then the sound waves emitted by the speaker 20 due to the test tone are absorbed, attenuated, or otherwise distorted by the material, such that the microphone 22 will pick up an attenuated and/or distorted signal corresponding to the test tone. By comparing the signal recorded by the microphone 22 during the emission of the test tone from the speaker 20 with the known test tone, an estimate can be made as to whether the device 10 is located within the enclosed situation, as shown in Figure 10, or in an open situation, as in Figure 9. Figure 11 is a flow diagram illustrating the operation of the attitude detection program in the ROM 42 according to the second embodiment, in performing such a determination.
More particularly, at step 11.2 the attitude detection program stores the test tone patterns, being the "bright", i.e. clean or undistorted, test tone pattern itself, as well as information relating to a low attenuated version of the test tone pattern, i.e. corresponding to the test tone having undergone a low degree of attenuation and/or distortion, and a high attenuated version of the test tone pattern, i.e. a version of the test tone pattern which has undergone a high degree of attenuation and/or distortion. The test tone patterns are stored as part of the data of the attitude detection program in the ROM 42.
In order to detect the situation of the device 10 using the speaker and microphone, at step 11.4 the attitude detection program controls the application processor to cause the speaker 20 to emit the bright version of the test tone from the speaker. At the same time, the microphone 22 is controlled to record its input during the period while the test tone is being emitted, at step 11.6. Then, at step 11.8 the recorded input from the microphone 22 is compared with the test tone patterns stored in the ROM 42, and a determination performed at step 11.10 as to which test tone pattern is most similar. How the comparison and determination steps 11.8 and 11.10 are performed is a matter of implementation detail, and is dependent upon the information representing the test tone patterns. For example, where the test tone patterns stored at step 11.2 represent actual signal patterns, then pattern matching techniques can be employed to compare the recorded input with the test tone patterns, and determine the pattern that is most similar. Various conventional pattern matching techniques which may be used in this respect are known in the art, such as those used in speech recognition systems or the like. In other embodiments, a simpler comparison and determination can be performed. For example, the information relating to the low attenuated test tone and high attenuated test tone may simply be a signal threshold level which determines the degree of attenuation of the signal from the bright or unattenuated version. In this case, the average power, or absolute average signal level, of the recorded input can be compared against the low attenuation and high attenuation threshold values, and a determination as to the degree of attenuation of the signal made based on this thresholding operation.
In another embodiment, a distortion measurement of the signal may be used, rather than power or absolute signal level, with the distortion of the recorded input signal being compared with distortion values for the bright test pattern, the low attenuated test pattern, and the high attenuated test pattern. In further embodiments combinations of these measurements may be used to make the decision.
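For the simpler, threshold-based variant just described, a hedged Python sketch might compare the average absolute level of the recorded signal against two attenuation thresholds; the threshold values below are illustrative assumptions, and a real implementation might instead use the pattern-matching or distortion measures mentioned above.

from typing import Sequence

LOW_ATTEN_RATIO = 0.5    # assumed: below this fraction of the emitted level -> low attenuation
HIGH_ATTEN_RATIO = 0.1   # assumed: below this fraction -> high attenuation

def classify_attenuation(recorded: Sequence[float], emitted_level: float) -> str:
    # Classify the recorded test tone as "bright", "low_attenuated" or
    # "high_attenuated" by comparing average absolute signal levels.
    recorded_level = sum(abs(s) for s in recorded) / len(recorded)
    ratio = recorded_level / emitted_level
    if ratio < HIGH_ATTEN_RATIO:
        return "high_attenuated"
    if ratio < LOW_ATTEN_RATIO:
        return "low_attenuated"
    return "bright"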
Howsoever the determination is performed at step 11.10, at step 11.12 an evaluation is performed as to whether the bright pattern is most similar, and if this is not the case, then at step 11.18 an evaluation is performed as to whether the low attenuated test tone pattern was the most similar. If this evaluation returns negative, then at step 11.22 an evaluation as to whether the high attenuated pattern was the most similar is undertaken. Here this must be the case, and hence at step 11.24 the determination is made that, given the high degree of attenuation of the signal, the mobile device is in an enclosed situation such as the user's pocket. In such a case, processing proceeds to step 11.26, wherein the behaviour of the mobile device is then adapted to the pocket situation. In this respect, the adaptation of the mobile device outputs to the pocket situation at step 11.26 can be identical to that of step 8.16 of the first embodiment, described previously. That is, when determined to be in the pocket situation, the vibrator and ringer are caused to be of greater magnitude, and the screen is preferably not lit.
Returning to step 11.18, here an evaluation was performed as to whether the low attenuated pattern was most similar. If this is the case, then it is likely either that the phone is in an enclosed space such as a pocket (but the precise arrangement is such that a high attenuation of the signal has not occurred, although some attenuation or distortion has occurred), or that the phone is face down on a surface such as a table top. In the latter case, if the phone is face down then the speaker will likely be facing into the surface, and hence any sound waves emitted therefrom and subsequently recorded by the microphone will be distorted and/or attenuated by the surface. Thus, at step 11.20 the determination is made that the mobile device is either in the face down on table top situation, or in an enclosed situation such as a pocket, but it is not possible to distinguish further between these two situations. In view of these situations, however, in either case there is no need to light the screen of the mobile device, and, if the device is in fact face down on a table top, increasing the magnitude of the ringer and vibrator, as if the phone were in a pocket, will still attract the user's attention (and perhaps even more so). Therefore, in this circumstance where it is not possible to distinguish between the phone being either face down on the table top or in a pocket, it is reasonable to adapt the phone's output to the pocket situation, and hence processing proceeds to step 11.26.
Returning to step 11.12, if it was determined that the bright pattern was most similar, i.e. the recorded input from the microphone substantially reproduced the emitted test tone with very little attenuation or distortion, then the mobile device can determine that it is either in the user's hand, or face up on a table top, and such determination is made at step 11.14. It is not possible to distinguish further between these two situations, but as in the first embodiment it is possible to reconcile this lack of information and adapt the behaviour of the device to provide an output profile which is suitable to both the hand and table top situations. Therefore, at step 11.16 the device behaviour is adapted to the hand and table top situation, this adaptation being the same as previously described in step 8.20 of Figure 8 in respect of the first embodiment, i.e. the screen is caused to be lit, but both the ringer and vibrator outputs can be reduced in magnitude.
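The decision logic of Figure 11 then reduces to a dispatch on that classification. The sketch below reuses classify_attenuation() from above; emit_test_tone() and record_microphone() are hypothetical driver calls, not part of the described device.

def detect_situation_by_sound(emit_test_tone, record_microphone,
                              emitted_level: float) -> str:
    # Figure 11 flow: emit the test tone, record the microphone input, and
    # map the degree of attenuation to a situation estimate.
    emit_test_tone()                                          # step 11.4
    recorded = record_microphone()                            # step 11.6
    pattern = classify_attenuation(recorded, emitted_level)   # steps 11.8-11.10
    if pattern == "bright":                                   # step 11.12
        return "hand_or_face_up"                              # step 11.14, adapt at 11.16
    if pattern == "low_attenuated":                           # step 11.18
        return "pocket_or_face_down"                          # step 11.20, adapt at 11.26
    return "enclosed"                                         # steps 11.22-11.24, adapt at 11.26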
Thus, within the second embodiment a mobile device 10 is able to use sound as a sensing medium, using a speaker to emit a known sound, and then recording that sound through the microphone, as modified by the phone's surrounding environment. By then comparing the recorded sound with the original sound a determination can be performed as to the mobile device's situation, and its behaviour adapted accordingly. Moreover, as with the first embodiment, no additional componentry is required within the mobile device other than that which is conventionally provided. Instead, again as in the first embodiment, all that is required is additional software, in the form of the attitude detection program stored in the ROM 42, and, in this case, the additional data in the form of the test tone patterns or information against which the recorded signal can be compared.

Within the first and second embodiments described previously, which use respectively light and sound as the sensing medium, it was possible to distinguish between some of the possible situations as shown in Figures 2 to 4 in which a mobile device might find itself, but not all of the situations. For example, within the first embodiment it was possible to determine that the phone is either: i) in the user's hand, or face up on the table top; or alternatively, ii) in a user's pocket, or face down on the table top. Similarly, in the second embodiment it was possible to determine whether the phone was either: i) in the user's pocket, or face down on the table top; ii) in the pocket; or iii) in the user's hand or face up on the table top. However, a definitive determination between all four possible situations was not possible.
In further embodiments to be described, however, further discrimination between situations becomes possible by combining the sensors, for example to use two or more sensor media, such as, for example, sound and light. Using multiple sensors in this manner enables a greater degree of differentiation between the mobile device situations to be performed. Before describing such embodiments, reference is first made to Figure 12, which is a table showing the degree to which sensor combinations experience attenuation or distortion depending on the situation in which the mobile device finds itself. For example, as shown in Figure 12, when the mobile device is in an enclosed situation, the sensor combination of speaker output and microphone input will suffer either a low, or a high, degree of attenuation or distortion. However, when the device is in a partially enclosed situation such as in a user's hand, then it is likely that the speaker output/microphone input sensor combination will not suffer any attenuation or distortion. Where the device is in an open situation, such as on the table, attenuation or distortion will be experienced depending on whether the device is face up, or face down. Similar considerations can be made for each of the other sensor combinations, as shown in the table.
Additionally, the table also includes in this case the signal attenuation or distortion which would be suffered by a rear camera input provided on the mobile device. In this respect, it is common for many conventional mobile devices to be provided with two cameras, being one on the front face of the device, i.e. the same face as the screen, and another on the rear face of the device, i.e. the opposite face to the screen. By "rear camera input" in the table, we mean the input image obtained from the rear camera on the opposite face of the device to the screen. In this respect, when the device is in an enclosed space such as a bag or pocket the rear camera input will be highly attenuated and/or distorted, i.e. will be dark. This will also be the case when the device is in a partially enclosed situation in a user's hand, as often the user's hand will obscure the rear camera input, and again a dark image will likely be obtained. However, when the device is in an open situation on a surface such as a table, then the input will be the opposite of the front camera input, i.e. will be highly attenuated when the phone is in the face up position, such that the face on which the rear camera is mounted is towards the surface, and will not be attenuated at all when the mobile device is in the face down position, i.e. the face on which the rear camera is mounted is facing upwards.
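Although the exact contents of the Figure 12 table are not reproduced here, the behaviour described in the two preceding paragraphs can be summarised approximately as follows; the entries are a reconstruction inferred from the prose, not the drawing itself.

# Reconstruction from the description above (an assumption, not the actual
# Figure 12): expected attenuation/distortion per sensor combination and situation.
EXPECTED_ATTENUATION = {
    #  situation:       (speaker -> microphone,  rear camera input)
    "enclosed":         ("low_or_high",          "high"),  # bag or pocket
    "hand":             ("none",                 "high"),  # hand often covers the rear camera
    "face_up_table":    ("none",                 "high"),  # rear camera faces the surface
    "face_down_table":  ("low",                  "none"),  # speaker into surface; rear camera faces up
}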
Using the additional input information provided by the rear camera, it is possible to provide further embodiments, based on either of the first and second embodiments, which allow for further distinction between the device situations. For example, as mentioned, in both the first and second embodiments a conclusion can be drawn at step 8.14 (for the first embodiment), or step 11.20 (for the second embodiment) that the device is either in the user's pocket, or face down on the table top, but no further distinction can be drawn therebetween. However, using additionally the information available from the rear camera input, then a further distinction can be drawn between these two situations. Figure 14 therefore illustrates the additional steps to be performed to provide further embodiments, based on either the first or second embodiment, and which uses an input signal from the rear camera to further distinguish between these two situations.
More particularly, with respect to Figure 14, following on from either step 8.14 (when based on the first embodiment using light) or step 11.20 (when based on the second embodiment using sound), wherein a determination has been made that the phone is either in the face down position on a surface or in the user's pocket, then in order to distinguish between these two situations, at step 14.2 an image is obtained from the rear camera on the mobile device, where this is provided. An evaluation is then performed at step 14.4 as to whether the image from the rear camera is highly attenuated, i.e. whether the image is dark or light. As in the first embodiment, this evaluation can be performed by taking the grey scale values of the image pixels, and finding the average grey scale value. This can then be compared with a threshold value predetermined in advance as to whether an image is light or dark. Here, the threshold will typically be the median grey scale value. If it is determined at step 14.4 that the image is highly attenuated, i.e. dark, then the attitude determination program can conclude that, of the two situations, i.e. face down on the table top or in a pocket, it is likely that the phone is in the user's pocket, and hence at step 14.12 the phone behaviour can then be adapted to the pocket situation. This adaptation can preferably take the same form as the adaptation used in the same situation in the first and second embodiments, such as at step 11.26 or step 8.16.
If, however, at step 14.4 it is determined that the image is not highly attenuated, i.e. the image is light, then the attitude determination program can conclude at step 14.6 that the phone is probably face down on the table top, such that the rear camera input is then capturing ambient light. In this case, the program then proceeds, at step 14.8, to adapt the phone to the face down on table top situation. As described previously, here it is not necessary to light the screen, and neither is it necessary to operate the vibrator, as the phone is not in any way in tactile communication with the user. Instead, all that need be activated is the phone ringer, which may either be kept at the default volume, or increased in magnitude.
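A sketch of the Figure 14 refinement, again reusing the image_contains_light() helper from the first embodiment; capture_rear_image() is a hypothetical driver call, and the median-greyscale threshold follows the text above.

def refine_pocket_or_face_down(capture_rear_image) -> str:
    # Figure 14: distinguish "in pocket" from "face down on surface"
    # using the rear camera input.
    pixels = capture_rear_image()                     # step 14.2
    if image_contains_light(pixels, threshold=128):   # step 14.4: is the image light?
        return "face_down_table"                      # step 14.6, adapt at 14.8
    return "pocket"                                   # dark image, adapt at 14.12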
Thus, the further embodiments based on Figure 14 provide the ability to further distinguish between the mobile device situations, and in particular as to whether the device is face down on a table top, or in a user's pocket. Moreover, when based upon the first embodiment then the use of the rear camera means that the sensing medium is always light, as the front camera and screen are used to provide the initial determination, and then the image captured from the rear camera used to provide the secondary determination. Conversely, when based upon the second embodiment, then sound is used as the sensing medium to make the primary determination, and then light in terms of the image captured from the rear camera used to make the secondary sensing determination. In both cases, however, multiple sensing devices are used to further distinguish the mobile device situation.
A final embodiment will now be described with respect to Figure 13. Here, within the embodiment to be described a vibrator and accelerometer sensor combination is used to provide further distinction between the mobile phone situations. By combining the information obtainable by use of a vibrator and accelerometer sensor combination with other sensor combinations, such as the microphone and speaker combination, then full distinction between the four possible situations shown previously in Figures 2 to 4 becomes possible. Whilst the use of a vibrator and accelerometer sensor combination was previously disclosed in US 2006/0172706 mentioned previously, therein it was merely used to distinguish between two possible mobile device situations i.e. whether the device was in the user's hand or not. In the present embodiment, however, by combining the information obtainable from the vibrator and accelerometer sensor combination with the information obtainable from other sensor combinations, it becomes possible to distinguish between more than two different mobile device situations, and, as will be shown, in the present embodiment between all four possible situations shown in Figures 2 to 4.
Referring now to Figure 13, in the present embodiment firstly the speaker and microphone sensor combination is used to perform a first determination, the results of which are then refined using the vibrator and accelerometer sensor combination. Therefore, at step 13.2 a test is performed using the speaker and microphone sensor combination. This test is essentially the same as the second embodiment described previously, and involves emitting a test tone, recording the microphone input signal whilst the test tone is being emitted, and then comparing the recorded signal with test tone patterns to determine the degree of attenuation or distortion of the signal. At step 13.4 the recorded signal is examined to determine whether it is highly attenuated, and if this is the case it is because the phone is likely in a user's pocket or other enclosed space, and hence at step 13.6 the phone behaviour is adapted to the pocket situation. In this respect, the adaptation to the pocket situation is preferably as described previously in the other embodiments.
If it is not determined at step 13.4 that the sound signal recorded by the microphone was highly attenuated or distorted, then at step 13.8 an evaluation is performed as to whether the signal suffered a low degree of attenuation or distortion. If this was the case, then as described previously in respect of the second embodiment, it is possible to conclude that the phone is likely either in a pocket, or face down on a table. This determination is made at step 13.10. To distinguish between the situations, then as in the embodiment just described it might be possible to use an input image from a rear camera, if provided, to distinguish between the two situations. However, in the present embodiment, the accelerometer and vibrator sensor combination is used to distinguish between the two situations, and this is performed at step 13.12. More particularly, here the test involves, as in the prior art, activating the vibrator for a brief period of time, and at the same time using the accelerometer to record the vibration pattern experienced by the device. If the recorded pattern is highly attenuated compared to what was expected, then it is likely that the phone is being held close to the user's body or is in close contact with other material, and hence is likely to be in a pocket or other enclosed space. In contrast, if a low attenuation of the vibration is perceived by the accelerometer, then it is likely that the phone is on the table top. Examination of the recorded vibration by the accelerometer is performed at step 13.14, and if it is determined that there is a high attenuation then at step 13.16 the determination is made that the phone is in the pocket, and the mobile device behaviour is then adapted to the pocket situation. Similarly, if at step 13.14 it is evaluated that there was a low attenuation or distortion of the vibration, then at step 13.18 the conclusion is reached that the phone is likely face down on the table top, and hence the mobile device behaviour is adapted to this situation, as previously described.
Returning to step 13.8, if it is determined that the sound did not suffer even low attenuation or distortion, then at step 13.20 it must be the case that the recorded sound was substantially similar to the test tone, suffering little or no distortion or attenuation. In this case, as in the second embodiment, it is possible to conclude at step 13.22 that the mobile device is either in the user's hand, or face up on a table. However, in the previous embodiments it was not possible to further distinguish between these situations.
In the present embodiment, it is possible to distinguish between these two situations by again using the vibrator and accelerometer sensor combination. Therefore, at step 13.24 the mobile device situation is tested with the vibrator and accelerometer sensor combination, the testing process being substantially as described previously, that is substantially the same as in the prior art. If the vibrations produced by the vibrator as measured by the accelerometer have suffered a high degree of attenuation, as evaluated at step 13.26, then, as in the prior art, it is possible to deduce that the phone is likely in the user's hand. In this case, as shown at step 13.28, the mobile device behaviour is then adapted to the hand situation as previously described. That is, because the user is in close tactile contact with the device and is likely looking at the device, to alert the user the screen can be lit, and a low magnitude of vibration produced by the vibrator. Such alerts are probably enough to alert the user, and hence it is not necessary to activate the ringer.
If step 13.26 does not detect a high degree of attenuation of the vibrations produced by the vibrator as measured by the accelerometer, and there is either no attenuation, or a low degree of attenuation, as determined at step 13.30, then it is possible to conclude, as shown at step 13.32, that the phone is likely on the table top face up. In this case, the mobile device behaviour is then adapted to the face-up-on-table situation as described previously i.e. the vibrator is disabled, the screen is caused to be lit, and the ringer can be activated at either the default, or increased, volume.
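Combining the sound test with the vibrator/accelerometer test, the full Figure 13 flow might be sketched as below. As before, this is an illustrative assumption rather than the actual implementation: vibrate_and_measure() is a hypothetical call returning "high" or "low" according to the attenuation of the vibration pattern seen by the accelerometer, and classify_attenuation() is reused from the second embodiment's sketch.

def detect_situation_full(emit_test_tone, record_microphone,
                          vibrate_and_measure, emitted_level: float) -> str:
    # Figure 13: resolve all four situations by combining the speaker/
    # microphone test with the vibrator/accelerometer test.
    emit_test_tone()                                            # step 13.2
    recorded = record_microphone()
    sound = classify_attenuation(recorded, emitted_level)
    if sound == "high_attenuated":                              # step 13.4
        return "pocket"                                         # adapt at 13.6
    if sound == "low_attenuated":                               # step 13.8
        # Pocket or face down on table (step 13.10); refine with vibration.
        vibration = vibrate_and_measure()                       # step 13.12
        return "pocket" if vibration == "high" else "face_down_table"  # steps 13.14-13.18
    # Bright: hand or face up on table (step 13.22); refine with vibration.
    vibration = vibrate_and_measure()                           # step 13.24
    return "hand" if vibration == "high" else "face_up_table"   # steps 13.26-13.32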
Thus, within this embodiment, using multiple sensors including the accelerometer it becomes possible to detect all four of the possible mobile device situations noted in Figures 2 to 4 previously. This allows for further and closer control of the mobile device behaviour, as already described. In particular, being able to control the device behaviour in the manners described allows for a more efficient alerting of the user to incoming calls or messages, or the like, whilst adapting the phone behaviour to the situations ensures that no unnecessary output is used, thus saving battery power.
Various further changes or modifications may be made to the above described embodiments to provide further embodiments. For example, within the previously described embodiments we have described adapting the vibrator magnitude, or simply lighting the screen or not. In further embodiments, however, closer control of the screen may be performed, for example to cause the screen to light more brightly in some situations than in others. For example, where the screen is one of the primary means of alerting the user, such as in the case where the phone is in the user's hand, then the screen may be caused to light more brightly than would otherwise be the default case in such a situation, to alert the user. Furthermore, various combinations of sensors may be used to provide further embodiments. Within the final embodiment described, a combination of sound and vibration as the sensor media was used to enable detection of all possible mobile phone situations. However, in other embodiments other combinations may be used, such as, for example, light and vibration, or light, sound and vibration. Examination of the information in the table of Figure 12 will indicate to the skilled person the various combinations of sensors which may be put together to distinguish between the mobile phone situations. Of course, in further embodiments additional mobile phone situations may be provided, to be distinguished between.
Various further modifications and variations may be made to provide further embodiments using the same inventive concept, any and all of which are intended to be encompassed by the appended claims.

Claims
1. A mobile device comprising: a light or sound sensor; and a processor for processing output signals from said light or sound sensor; the arrangement being such that said processor determines from said output signals a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found.
2. A mobile device according to claim 1, wherein the possible situations comprise at least two or more selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface.
3. A mobile device according to claims 1 or 2, wherein the device behaviour is adapted in dependence on the determination of the present situation.
4. A mobile device according to any of claims 1 to 3, wherein the light sensor is a camera.
5. A mobile device according to any of claims 1 to 4, wherein the sound sensor is a microphone.
6. A mobile device according to any of the preceding claims, wherein the mobile device has light and sound sensors, said determination being performed in dependence on output signals from both such sensors.
7. A mobile device according to any of the preceding claims, wherein the mobile device has a second light sensor in addition to said first light sensor, said determination being performed in dependence on output signals from both such sensors.
8. A mobile device according to claim 7, wherein the second light sensor is provided on a different face of said device than said first light sensor.
9. A mobile device according to any of the preceding claims, comprising light or sound emitters, the arrangement being such that output signals from said light or sound sensor are obtained whilst said light or sound emitter is emitting.
10. A mobile device according to claim 9, wherein said light or sound sensor input signal is compared by said processor with an output signal emitted by said light or sound emitter to perform said situation determination.
11. A mobile device according to claims 9 or 10, wherein said light emitter is a display screen of said mobile device.
12. A mobile device according to claims 9 to 11, wherein said sound emitter is a speaker of said mobile device.
13. A mobile device according to any of claims 9 to 12, wherein said mobile device is provided with both a light emitter and a sound emitter.
14. A mobile device according to any of the preceding claims, wherein said mobile device further comprises: a vibrator; and an accelerometer arranged to measure vibrations of the said mobile device caused, at least in part, by said vibrator; the arrangement further being such that said processor receives an accelerometer output signal indicative of the measured vibrations, and said situation determination is performed in dependence thereon.
15. A mobile device having a plurality of sensors for sensing one or more sensor media; a processor for processing signals produced by said sensors indicative of the present state of the sensor media; the arrangement being such that said processor determines from said produced signals a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found.
16. A mobile device according to claim 15, wherein said processor determines from said produced signals a particular present situation of at least four predetermined possible situations in which said mobile device is to be presently found.
17. A mobile device according to claims 15 or 16, wherein the predetermined possible situations are selected from the group comprising: in an enclosed space, in a user's hand, face up on a surface, face down on a surface.
18. A mobile device according to claims 15 to 17, wherein the multiple sensors comprise two or more sensors selected from the group comprising: a first light sensor, a second light sensor, a sound sensor, and a motion sensor.
19. A mobile device according to claim 18, wherein said first light sensor is a first camera mounted on a first face of the mobile device.
20. A mobile device according to claims 18 or 19, wherein said second light sensor is a second camera mounted on a second face of the mobile device.
21. A mobile device according to claims 18 to 20, wherein said sound sensor is a microphone.
22. A mobile device according to claims 18 to 21, wherein said motion sensor is an accelerometer.
23. A mobile device according to any of claims 15 to 22, and further comprising at least one emitter for emitting energy in the form of at least one of the sensor media detected by at least one of the sensors.
24. A mobile device according to claim 23, wherein said produced signals are obtained from at least one of said sensors whilst said at least one emitter is emitting energy.
25. A mobile device according to claims 23 or 24, wherein said determination performed by said processor includes comparing a produced signal from at least one of the sensors which senses the sensor medium in which said emitter emits energy with an output signal of the emitter.
26. A mobile device according to any of claims 23 to 25, wherein the at least one emitter is chosen from a group comprising: a light emitter, a sound emitter, and a motion generator.
27. A mobile device according to claim 26, wherein the light emitter is a display screen of the mobile device.
28. A mobile device according to claim 26 or 27, wherein the sound emitter is a speaker of the mobile device.
29. A mobile device according to claim 26, 27, or 28, wherein the motion generator is a vibrator of the mobile device.
30. A method of operating a mobile device provided with a light or sound sensor, comprising the steps of: obtaining output signals from said light or sound sensor indicative of the mobile device surroundings; processing said output signals to determine therefrom a particular one of a plurality of predetermined possible situations in which said mobile device is presently to be found.
31. A method of operating a mobile device having a plurality of sensors for sensing one or more sensor media, comprising the steps of: obtaining signals from said plurality of sensors, said signals being indicative of the present state of the sensor media; and processing said signals to determine therefrom a particular present situation of at least three or more predetermined possible situations in which said mobile device is to be presently found.
32. A computer program or suite of computer programs so arranged such that when executed by a computer processor they cause the computer to perform the steps of any of claims 30 and 31.
33. A machine-readable storage medium storing the computer program or at least one of the suite of computer programs according to claim 32.
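
For orientation, the situation determination of claims 15 to 22 (and the method of claims 30 and 31) amounts to a rule over a handful of sensor readings. The following Python fragment is a minimal illustrative sketch, not taken from the patent: the SensorReadings abstraction, the thresholds, and the decision rules are all assumptions, chosen only to show how two light sensors, a sound sensor, and a motion sensor could jointly select one of the four situations named in claim 17.

    # Illustrative, rule-based sketch of the multi-sensor classification in
    # claims 15-22. All names and thresholds are hypothetical assumptions.
    from dataclasses import dataclass

    @dataclass
    class SensorReadings:
        front_light: float   # normalised 0..1 luminance from the front camera
        back_light: float    # normalised 0..1 luminance from the back camera
        sound_level: float   # normalised 0..1 microphone amplitude
        motion: float        # normalised 0..1 accelerometer activity

    DARK = 0.1    # illustrative thresholds only
    QUIET = 0.1
    STILL = 0.05

    def classify_situation(r: SensorReadings) -> str:
        """Map raw sensor readings onto one of four predetermined situations."""
        if r.motion > STILL:
            # Sustained accelerometer activity suggests the device is being held.
            return "in user's hand"
        if r.front_light < DARK and r.back_light < DARK and r.sound_level < QUIET:
            # Both faces dark and little ambient sound: pocket, bag or drawer.
            return "in an enclosed space"
        if r.front_light >= r.back_light:
            # The front face sees more light than the occluded back face.
            return "face up on a surface"
        return "face down on a surface"

    # Example: bright front camera, dark back camera, device at rest.
    print(classify_situation(SensorReadings(0.8, 0.02, 0.3, 0.01)))
    # -> "face up on a surface"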
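Claims 23 to 29 add an active variant: the device emits energy in one of the sensor media (light from the display, sound from the speaker, motion from the vibrator) and compares what a sensor in that medium picks up against the emitter's own output. Below is a minimal sketch of that compare-while-emitting step, again with hypothetical names and thresholds; the set_emitter and read_sensor callables stand in for whatever drives the display, speaker, or vibrator and reads the paired sensor on a real device.

    # Minimal sketch of the active emit-and-sense comparison of claims 23-29.
    # The emitter/sensor callables are assumptions for illustration only.
    import time
    from typing import Callable

    def emitted_energy_reflected(set_emitter: Callable[[float], None],
                                 read_sensor: Callable[[], float],
                                 level: float = 1.0,
                                 samples: int = 5,
                                 threshold: float = 0.2) -> bool:
        """Pulse the emitter and report whether the sensor tracks the pulse.

        A sensed signal that rises with the emitted pulse suggests a nearby
        reflecting surface (e.g. the screen flashed while the device lies
        face down, or sound echoed inside an enclosed space).
        """
        baseline = sum(read_sensor() for _ in range(samples)) / samples
        set_emitter(level)    # e.g. drive the display fully bright
        time.sleep(0.05)      # allow the sensor to settle
        active = sum(read_sensor() for _ in range(samples)) / samples
        set_emitter(0.0)      # restore the emitter
        return (active - baseline) > threshold

    # Hypothetical usage: flash the screen, watch the front camera's luminance.
    # face_down = emitted_energy_reflected(set_screen_brightness, read_front_luma)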
PCT/GB2007/004948 2006-12-21 2007-12-21 Mobile device and method of operation thereof WO2008075082A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB0625642.4 2006-12-21
GBGB0625642.4A GB0625642D0 (en) 2006-12-21 2006-12-21 Mobile sensor feedback
GB0711759.1 2007-06-18
GB0711759A GB2445436A (en) 2006-12-21 2007-06-18 Mobile device which can sense its present situation

Publications (1)

Publication Number Publication Date
WO2008075082A1 (en) 2008-06-26

Family ID: 39319692

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2007/004948 WO2008075082A1 (en) 2006-12-21 2007-12-21 Mobile device and method of operation thereof

Country Status (1)

Country Link
WO (1) WO2008075082A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0774691A (en) * 1993-08-31 1995-03-17 Sanyo Electric Co Ltd Portable telephone set with folding mechanism
EP1109382A2 (en) * 1999-12-17 2001-06-20 Nokia Mobile Phones Ltd. Controlling a terminal of a communication system
WO2001086920A2 (en) * 2000-05-12 2001-11-15 Zvi Lapidot Apparatus and method for the kinematic control of hand-held devices
US20030157969A1 (en) * 2002-02-18 2003-08-21 Samsung Electronics Co., Ltd. Portable telephone, control method thereof, and recording medium therefor
EP1494440A2 (en) * 2003-07-04 2005-01-05 Lg Electronics Inc. Automatic control of image transmission in a mobile communication device
US20060172706A1 (en) * 2005-01-31 2006-08-03 Research In Motion Limited User hand detection for wireless devices

Cited By (122)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8213999B2 (en) 2007-11-27 2012-07-03 Htc Corporation Controlling method and system for handheld communication device and recording medium using the same
US8117471B2 (en) 2007-11-27 2012-02-14 Htc Corporation Power management method for handheld electronic device using G-sensor
US8886252B2 (en) 2008-12-22 2014-11-11 Htc Corporation Method and apparatus for automatically changing operating modes in a mobile device
EP2207331A1 (en) 2008-12-30 2010-07-14 HTC Corporation Method and apparatus for automatically changing operating modes in a mobile device
US10290202B2 (en) 2009-09-30 2019-05-14 Apple Inc. Self adapting alert device
CN104917885A (en) * 2009-09-30 2015-09-16 苹果公司 Self adapting haptic device
US11043088B2 (en) 2009-09-30 2021-06-22 Apple Inc. Self adapting haptic device
CN102714683A (en) * 2009-09-30 2012-10-03 苹果公司 Self adapting haptic device
JP2013507059A (en) * 2009-09-30 2013-02-28 アップル インコーポレイテッド Self-adaptive tactile device
US8487759B2 (en) 2009-09-30 2013-07-16 Apple Inc. Self adapting haptic device
US8552859B2 (en) 2009-09-30 2013-10-08 Apple Inc. Self adapting alert device
US8836499B2 (en) 2009-09-30 2014-09-16 Apple Inc. Self adapting alert device
KR101441769B1 (en) 2009-09-30 2014-09-17 애플 인크. Self adapting haptic device
US8860562B2 (en) 2009-09-30 2014-10-14 Apple Inc. Self adapting haptic device
US9934661B2 (en) 2009-09-30 2018-04-03 Apple Inc. Self adapting haptic device
US10629060B2 (en) 2009-09-30 2020-04-21 Apple Inc. Self adapting alert device
AU2010300576B2 (en) * 2009-09-30 2015-03-05 Apple Inc. Self adapting haptic device
US10475300B2 (en) 2009-09-30 2019-11-12 Apple Inc. Self adapting haptic device
JP2015119488A (en) * 2009-09-30 2015-06-25 アップル インコーポレイテッド Self adapting haptic device
US9691260B2 (en) 2009-09-30 2017-06-27 Apple Inc. Electronic device with orientation-based alert adjustment
US11605273B2 (en) 2009-09-30 2023-03-14 Apple Inc. Self-adapting electronic device
WO2011041535A1 (en) 2009-09-30 2011-04-07 Apple Inc. Self adapting haptic device
US9640048B2 (en) 2009-09-30 2017-05-02 Apple Inc. Self adapting haptic device
US9299244B2 (en) 2009-09-30 2016-03-29 Apple Inc. Self adapting alert device
KR20160042469A (en) * 2009-09-30 2016-04-19 애플 인크. Self adapting haptic device
KR101613776B1 (en) 2009-09-30 2016-04-19 애플 인크. Self adapting haptic device
US9984554B2 (en) 2009-09-30 2018-05-29 Apple Inc. Electronic device with orientation-based alert adjustment
KR101647941B1 (en) 2009-09-30 2016-08-11 애플 인크. Self adapting haptic device
CN108616660A (en) * 2009-09-30 2018-10-02 苹果公司 Adaptive haptic apparatus
US9202355B2 (en) 2009-09-30 2015-12-01 Apple Inc. Self adapting haptic device
US9288305B2 (en) 2010-02-09 2016-03-15 Nokia Corporation Method and apparatus for monitoring a characteristic of an object in mechanical contact with a mobile terminal
US10013058B2 (en) 2010-09-21 2018-07-03 Apple Inc. Touch-based user interface with haptic feedback
GB2484715A (en) * 2010-10-21 2012-04-25 Vodafone Ip Licensing Ltd Communication terminal with situation based configuration updating
US10120446B2 (en) 2010-11-19 2018-11-06 Apple Inc. Haptic input device
WO2012120346A1 (en) * 2011-03-07 2012-09-13 Sony Ericsson Mobile Communications Ab Electronic apparatus use environment detecting method, electronic apparatus performance optimizing method and electronic apparatus
CN102680736A (en) * 2011-03-07 2012-09-19 索尼爱立信移动通讯有限公司 Electronic equipment, operating environment detection method of same and performance optimization method of same
US8925399B2 (en) 2011-03-07 2015-01-06 Sony Corporation Electronic apparatus use environment detecting method, electronic apparatus performance optimizing method and electronic apparatus
US9911553B2 (en) 2012-09-28 2018-03-06 Apple Inc. Ultra low travel keyboard
US9997306B2 (en) 2012-09-28 2018-06-12 Apple Inc. Ultra low travel keyboard
US9178509B2 (en) 2012-09-28 2015-11-03 Apple Inc. Ultra low travel keyboard
JP2021192269A (ja) * 2013-02-07 2021-12-16 Apple Inc. Voice trigger for digital assistants
JP2020009463A (ja) * 2013-02-07 2020-01-16 Apple Inc. Voice trigger for digital assistant
KR20230048166A (en) * 2013-02-07 2023-04-10 애플 인크. Voice trigger for a digital assistant
KR20220106856A (en) * 2013-02-07 2022-07-29 애플 인크. Voice trigger for a digital assistant
US11862186B2 (en) 2013-02-07 2024-01-02 Apple Inc. Voice trigger for a digital assistant
US10978090B2 (en) 2013-02-07 2021-04-13 Apple Inc. Voice trigger for a digital assistant
US11636869B2 (en) 2013-02-07 2023-04-25 Apple Inc. Voice trigger for a digital assistant
KR102516577B1 (en) 2013-02-07 2023-04-03 애플 인크. Voice trigger for a digital assistant
KR102579086B1 (en) 2013-02-07 2023-09-15 애플 인크. Voice trigger for a digital assistant
AU2022224773B2 (en) * 2013-02-07 2023-06-01 Apple Inc. Voice trigger for a digital assistant
US11557310B2 (en) 2013-02-07 2023-01-17 Apple Inc. Voice trigger for a digital assistant
JP7177235B2 (en) 2013-02-07 2022-11-22 アップル インコーポレイテッド Voice trigger for digital assistant
EP2957109B1 (en) * 2013-02-12 2019-04-10 QUALCOMM Incorporated Speaker equalization for mobile devices
US9706303B2 (en) 2013-02-12 2017-07-11 Qualcomm Incorporated Speaker equalization for mobile devices
US9652040B2 (en) 2013-08-08 2017-05-16 Apple Inc. Sculpted waveforms with no or reduced unforced response
US9779592B1 (en) 2013-09-26 2017-10-03 Apple Inc. Geared haptic feedback element
US9886093B2 (en) 2013-09-27 2018-02-06 Apple Inc. Band with haptic actuators
US9928950B2 (en) 2013-09-27 2018-03-27 Apple Inc. Polarized magnetic actuators for haptic response
EP2854383A1 (en) * 2013-09-27 2015-04-01 Alcatel Lucent Method And Devices For Attention Alert Actuation
US10126817B2 (en) 2013-09-29 2018-11-13 Apple Inc. Devices and methods for creating haptic effects
US10236760B2 (en) 2013-09-30 2019-03-19 Apple Inc. Magnetic actuators for haptic response
US10651716B2 (en) 2013-09-30 2020-05-12 Apple Inc. Magnetic actuators for haptic response
US9317118B2 (en) 2013-10-22 2016-04-19 Apple Inc. Touch surface for simulating materials
US10459521B2 (en) 2013-10-22 2019-10-29 Apple Inc. Touch surface for simulating materials
US10276001B2 (en) 2013-12-10 2019-04-30 Apple Inc. Band attachment mechanism with haptic response
US9501912B1 (en) 2014-01-27 2016-11-22 Apple Inc. Haptic feedback device with a rotating mass of variable eccentricity
US10545604B2 (en) 2014-04-21 2020-01-28 Apple Inc. Apportionment of forces for multi-touch input devices of electronic devices
US9608506B2 (en) 2014-06-03 2017-03-28 Apple Inc. Linear actuator
US10069392B2 (en) 2014-06-03 2018-09-04 Apple Inc. Linear vibrator with enclosed mass assembly structure
US9830782B2 (en) 2014-09-02 2017-11-28 Apple Inc. Haptic notifications
US10417879B2 (en) 2014-09-02 2019-09-17 Apple Inc. Semantic framework for variable haptic output
US9564029B2 (en) 2014-09-02 2017-02-07 Apple Inc. Haptic notifications
US10977911B2 (en) 2014-09-02 2021-04-13 Apple Inc. Semantic framework for variable haptic output
US9830784B2 (en) 2014-09-02 2017-11-28 Apple Inc. Semantic framework for variable haptic output
US11790739B2 (en) 2014-09-02 2023-10-17 Apple Inc. Semantic framework for variable haptic output
US10490035B2 (en) 2014-09-02 2019-11-26 Apple Inc. Haptic notifications
US10504340B2 (en) 2014-09-02 2019-12-10 Apple Inc. Semantic framework for variable haptic output
US9928699B2 (en) 2014-09-02 2018-03-27 Apple Inc. Semantic framework for variable haptic output
US10089840B2 (en) 2014-09-02 2018-10-02 Apple Inc. Semantic framework for variable haptic output
US10353467B2 (en) 2015-03-06 2019-07-16 Apple Inc. Calibration of haptic devices
US10481691B2 (en) 2015-04-17 2019-11-19 Apple Inc. Contracting and elongating materials for providing input and output for an electronic device
US11402911B2 (en) 2015-04-17 2022-08-02 Apple Inc. Contracting and elongating materials for providing input and output for an electronic device
US10566888B2 (en) 2015-09-08 2020-02-18 Apple Inc. Linear actuators for use in electronic devices
WO2017071735A1 (en) * 2015-10-27 2017-05-04 Telefonaktiebolaget Lm Ericsson (Publ) Light sensor input for controlling device
US10331272B2 (en) 2015-10-27 2019-06-25 Telefonaktiebolaget Lm Ericsson (Publ) Light sensor input for controlling device
CN108605071B (en) * 2016-02-03 2019-10-25 高通股份有限公司 System, apparatus and method for proximity detection
CN108605071A (en) * 2016-02-03 2018-09-28 高通股份有限公司 System, apparatus and method for proximity detection
US10609677B2 (en) 2016-03-04 2020-03-31 Apple Inc. Situationally-aware alerts
US10039080B2 (en) 2016-03-04 2018-07-31 Apple Inc. Situationally-aware alerts
US10809805B2 (en) 2016-03-31 2020-10-20 Apple Inc. Dampening mechanical modes of a haptic actuator using a delay
US10268272B2 (en) 2016-03-31 2019-04-23 Apple Inc. Dampening mechanical modes of a haptic actuator using a delay
US10175759B2 (en) 2016-06-12 2019-01-08 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US10139909B2 (en) 2016-06-12 2018-11-27 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
AU2017100482B4 (en) * 2016-06-12 2017-11-09 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
JP2020119575A (ja) * 2016-06-12 2020-08-06 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US9984539B2 (en) 2016-06-12 2018-05-29 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US11037413B2 (en) 2016-06-12 2021-06-15 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US10692333B2 (en) 2016-06-12 2020-06-23 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US11735014B2 (en) 2016-06-12 2023-08-22 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US9996157B2 (en) 2016-06-12 2018-06-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US10156903B2 (en) 2016-06-12 2018-12-18 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
JP7240347B2 (en) 2016-06-12 2023-03-15 アップル インコーポレイテッド Devices, methods, and graphical user interfaces that provide haptic feedback
US10276000B2 (en) 2016-06-12 2019-04-30 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US11379041B2 (en) 2016-06-12 2022-07-05 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US11468749B2 (en) 2016-06-12 2022-10-11 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US9864432B1 (en) 2016-09-06 2018-01-09 Apple Inc. Devices, methods, and graphical user interfaces for haptic mixing
US10528139B2 (en) 2016-09-06 2020-01-07 Apple Inc. Devices, methods, and graphical user interfaces for haptic mixing
US10372221B2 (en) 2016-09-06 2019-08-06 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US10901513B2 (en) 2016-09-06 2021-01-26 Apple Inc. Devices, methods, and graphical user interfaces for haptic mixing
US10175762B2 (en) 2016-09-06 2019-01-08 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US10901514B2 (en) 2016-09-06 2021-01-26 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US11662824B2 (en) 2016-09-06 2023-05-30 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US11221679B2 (en) 2016-09-06 2022-01-11 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US10620708B2 (en) 2016-09-06 2020-04-14 Apple Inc. Devices, methods, and graphical user interfaces for generating tactile outputs
US11311022B2 (en) 2017-02-21 2022-04-26 Whitewave Services, Inc. System and method for mixing polyunsaturated fatty acids into a fluid food product
US11314330B2 (en) 2017-05-16 2022-04-26 Apple Inc. Tactile feedback for locked device user interfaces
US10622538B2 (en) 2017-07-18 2020-04-14 Apple Inc. Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body
US10691211B2 (en) 2018-09-28 2020-06-23 Apple Inc. Button providing force sensing and/or haptic output
US10599223B1 (en) 2018-09-28 2020-03-24 Apple Inc. Button providing force sensing and/or haptic output
US11763971B2 (en) 2019-09-24 2023-09-19 Apple Inc. Methods to control force in reluctance actuators based on flux related parameters
US11380470B2 (en) 2019-09-24 2022-07-05 Apple Inc. Methods to control force in reluctance actuators based on flux related parameters
US11809631B2 (en) 2021-09-21 2023-11-07 Apple Inc. Reluctance haptic engine for an electronic device

Similar Documents

Publication Publication Date Title
WO2008075082A1 (en) Mobile device and method of operation thereof
GB2445436A (en) Mobile device which can sense its present situation
US8401513B2 (en) Proximity sensor, in particular microphone for reception of sound signals in the human audible sound range, with ultrasonic proximity estimation
US9706321B2 (en) Electronic device including modifiable output parameter
CN110166890B (en) Audio playing and collecting method and device and storage medium
US20100279661A1 (en) Portable electronic device
CN106375676A (en) Photographing control method and device of terminal equipment, and terminal equipment
JP2006109460A (en) Electronic apparatus having environmental light sensor
US9912797B2 (en) Audio tuning based upon device location
CN111314560A (en) Method for adjusting sound loudness and communication terminal
CN110708630B (en) Method, device and equipment for controlling earphone and storage medium
CN113132863B (en) Stereo pickup method, apparatus, terminal device, and computer-readable storage medium
CN111048111A (en) Method, device and equipment for detecting rhythm point of audio frequency and readable storage medium
CN110659542A (en) Monitoring method and device
CN108267728A (en) Calibration method, device, equipment and the storage medium of range sensor
CN113542963B (en) Sound mode control method, device, electronic equipment and storage medium
CN107135305A (en) A kind of message prompt method, device and terminal
WO2022068304A1 (en) Sound quality detection method and device
CN108769364B (en) Call control method, device, mobile terminal and computer readable medium
JP5183790B2 (en) Mobile terminal device
KR101475354B1 (en) Portable terminal having pressure sensor and method for measuring pressure thereof
CN112817554A (en) Alert sound control method, alert sound control device, and storage medium
CN112532789B (en) Ring tone processing method and device, terminal and storage medium
CN109963246A (en) Mike's detection method and equipment
EP2608497B1 (en) Electronic device including modifiable output parameter

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07848671; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 07848671; Country of ref document: EP; Kind code of ref document: A1)