CN117128917A - Cover angle detection - Google Patents

Cover angle detection

Info

Publication number
CN117128917A
CN117128917A (application CN202310599926.6A)
Authority
CN
China
Prior art keywords
orientation
sensor unit
processor
state
component
Prior art date
Legal status
Pending
Application number
CN202310599926.6A
Other languages
Chinese (zh)
Inventor
F. Rizzardini
L. Bracco
Current Assignee
STMicroelectronics SRL
Original Assignee
STMicroelectronics SRL
Priority date
Filing date
Publication date
Priority claimed from US 18/183,464 (published as US20230384837A1)
Application filed by STMicroelectronics SRL
Publication of CN117128917A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B 21/22 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, for measuring angles or tapers; for testing the alignment of axes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 7/00 Measuring arrangements characterised by the use of electric or magnetic techniques
    • G01B 7/30 Measuring arrangements characterised by the use of electric or magnetic techniques for measuring angles or tapers; for testing the alignment of axes

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of the present disclosure relate to cover angle detection. The present disclosure relates to an apparatus and method for cover angle detection that is accurate even if the apparatus is activated in an upright position. First and second sensor units measure acceleration and angular velocity while the device is in a sleep state, and calculate the orientations of the respective cover members based on the acceleration and angular velocity measurements. After the device exits the sleep state, a processor estimates a cover angle using the calculated orientations, sets the estimated cover angle as an initial cover angle, and updates the initial cover angle using, for example: two accelerometers; two accelerometers and two gyroscopes; two accelerometers and two magnetometers; or two accelerometers, two gyroscopes, and two magnetometers.

Description

Cover angle detection
Technical Field
The present disclosure relates to cover angle detection.
Background
Cover angle detection involves determining an angle between two cover members of a foldable electronic device, such as a notebook computer and a foldable mobile device, that are folded over one another about a hinge or fold portion. Typically, one of the two cover parts comprises a display and the other of the two cover parts comprises another display or a user input device, such as a keyboard.
The angle between the two cover parts is often referred to as the cover angle or hinge angle. Typically, the cover angle of the foldable electronic device is equal to zero degrees when the foldable electronic device is in a closed state (e.g., the display of the first cover member faces the display of the second cover member), and is equal to 180 degrees when the foldable electronic device is in a fully open state (e.g., the display of the first cover member and the display of the second cover member face the same direction).
Current cover angle detection solutions are costly and have high power consumption. Furthermore, for foldable mobile devices, current cover angle detection solutions cannot accurately determine the cover angle when the foldable mobile device is activated in an upright position (e.g., when the hinge or fold portion of the foldable mobile device extends in a direction parallel to gravity) or in an unstable state (e.g., when the foldable mobile device is moving or shaking).
In particular, when the cover angle detection solution is activated, the cover angle cannot be determined if the foldable mobile device is in an upright position or in an unstable state. To manage these corner cases, the cover angle detection solution is kept running at all times (even when the foldable mobile device is in sleep mode). This ultimately results in high power consumption, because a high power processor is always active. Alternatively, Hall sensors or magnetometers are used to address the problem, increasing cost and power consumption.
As foldable electronic devices, and in particular foldable mobile phones, become more popular, manufacturers desire to incorporate accurate, low cost cover angle detection solutions in foldable electronic devices that also function when the device is activated in an upright position.
Disclosure of Invention
The present disclosure relates to cover or hinge angle detection for foldable devices such as foldable mobile phones. Unlike current detection methods, the cover angle detection disclosed herein is capable of detecting the cover angle even if the foldable device is activated in an upright position (e.g., when the hinge axis is parallel to gravity) or in an unstable state (e.g., when the foldable mobile device is moving or shaking). Furthermore, cover angle detection may continue to be performed when the device enters a sleep state.
The device comprises a high power application processor and low power first and second sensor units positioned in respective cover members. The application processor is the main processing unit of the device and enters a sleep state when the device is in the sleep state. The first and second sensor units are multi-sensor devices comprising a plurality of sensors (e.g., accelerometers, magnetometers, gyroscopes, etc.) and are capable of executing simple algorithms. In contrast to the application processor, the first and second sensor units remain in an on state even when the device is in the sleep state.
The first sensor unit and the second sensor unit measure acceleration and angular velocity when the device is in a sleep state, and calculate the orientation of the respective cover members based on the acceleration and angular velocity measurements. When the device and the application processor exit the sleep state, the application processor estimates a cover angle using the calculated orientation and sets the estimated cover angle to an initial cover angle. The application processor then uses one or more of the acceleration, magnetometer, or gyroscope measurements to update the initial cover angle.
Drawings
In the drawings, like reference numerals identify similar features or elements. The dimensions and relative positioning of features in the drawings are not necessarily drawn to scale.
Fig. 1 is an apparatus according to one embodiment disclosed herein.
Fig. 2 is a block diagram of an apparatus according to one embodiment disclosed herein.
FIG. 3 is a flow chart of a method according to one embodiment disclosed herein.
Detailed Description
In the following description, certain specific details are set forth in order to provide a thorough understanding of aspects of the disclosed subject matter. However, the disclosed subject matter may be practiced without these specific details. In some instances, well-known structures and methods of manufacturing electronic components, foldable devices, and sensors have not been described in detail to avoid obscuring descriptions of other aspects of the present disclosure.
Throughout the specification and the appended claims, unless the context requires otherwise, the word "comprise" and variations such as "comprises" and "comprising" will be construed in an open, inclusive sense as "including but not limited to".
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrase "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily referring to the same aspect. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more aspects of the disclosure.
As described above, current cover angle detection solutions are costly and have high power consumption. Furthermore, for foldable mobile devices, current cover angle detection solutions cannot determine the cover angle when the foldable mobile device is activated in an upright position (e.g., when the hinge or fold portion of the foldable mobile device extends in a direction parallel to gravity) or in an unstable state (e.g., when the foldable mobile device is moving or shaking).
The present disclosure relates to an apparatus and method for cover angle detection. The cover angle detection disclosed herein provides an accurate, low cost cover angle detection solution that also works when the foldable electronic device is activated in an upright position or in an unstable state.
Fig. 1 is an apparatus 10 according to one embodiment disclosed herein. In this embodiment, the device 10 is a foldable mobile device such as a portable smart device, a tablet computer, or a telephone. The device 10 may also be another type of device, such as a laptop computer. The device 10 includes a first cover member 12, a second cover member 14, and a hinge 18.
Each of the first and second cover members 12, 14 includes a housing or shell that houses the internal components of the device 10 (e.g., processor, sensor, capacitor, resistor, amplifier, speaker, etc.). As will be discussed in further detail below, the first and second sensor units 34, 36 are housed within the first and second cover members 12, 14, respectively.
The first and second cover members 12, 14 include first and second user interfaces 22, 24, respectively. In the embodiment shown in fig. 1 and discussed below, the first user interface 22 and the second user interface 24 are displays. However, each of the first user interface 22 and the second user interface 24 may be a display (e.g., monitor, touch screen, etc.), a user input device (e.g., buttons, keyboard, etc.), and/or other types of user interfaces. In one embodiment, the first user interface 22 and the second user interface 24 are two portions of a single flexible display.
Similar to a book, the first cover part 12 and the second cover part 14 fold over each other about the hinge 18. The first cover part 12 and the second cover part 14 rotate about the hinge axis 26. The hinge 18 may be any type of mechanical device that allows the first and second cover members 12, 14 to rotate about the hinge axis 26.
As will be discussed in further detail below, the apparatus 10 performs a cover angle detection to determine a cover angle 28 between the first cover member 12 and the second cover member 14. The cover angle 28 is the angle between a first surface 30 of the first cover member 12 (more specifically, the first user interface 22) and a second surface 32 of the second cover member 14 (more specifically, the second user interface 24). The cover angle 28 is equal to zero degrees when the foldable electronic device is in a closed state (e.g., the first surface 30 faces the second surface 32), and the cover angle 28 is equal to 180 degrees when the foldable electronic device is in a fully open state (e.g., the first surface 30 and the second surface 32 face the same direction).
Fig. 2 is a block diagram of device 10 according to one embodiment disclosed herein. The device 10 comprises a first sensor unit 34, a second sensor unit 36 and an application processor 38.
Each of the first sensor unit 34 and the second sensor unit 36 is a multi-sensor device that includes one or more types of sensors, including, but not limited to, accelerometers, gyroscopes, and magnetometers. An accelerometer measures acceleration along one or more axes. A gyroscope measures angular velocity about one or more axes. A magnetometer measures the magnetic field along one or more axes.
Each of the first sensor unit 34 and the second sensor unit 36 also includes its own on-board memory and processor. The processor is configured to process the data generated by the sensors and execute simple programs such as finite state machines and decision tree logic.
The first and second sensor units 34 and 36 are positioned in the first and second cover members 12 and 14, respectively. As will be discussed in further detail below, the first and second sensor units 34 and 36 determine the orientation of the first and second cover members 12 and 14, respectively, for cover angle detection.
The first sensor unit 34 and the second sensor unit 36 are energy-saving, low-power devices that remain on after the device 10 enters a sleep state. In one embodiment, each of the first sensor unit 34 and the second sensor unit 36 consumes 5 to 120 microamps for processing. In the sleep state, the application processor 38 and other electronic components (e.g., speaker, sensor, processor) of the device 10 are set to a low power or off state.
The application processor 38 is a general purpose processing unit. The application processor 38 may be any type of processor, controller, or signal processor configured to process data. In one embodiment, the application processor 38 is the general purpose processor of the device 10 itself and, in addition to processing data for the cover angle detection discussed below, processes data for the operating system, user applications, and other types of software of the device 10. As will be discussed in further detail below, the application processor 38 processes the orientations determined for the first cover member 12 and the second cover member 14 to obtain an initial cover angle value of the device 10, and performs cover angle detection to obtain a current cover angle value.
The application processor 38 may be positioned within the first cover member 12 along with the first sensor unit 34, or within the second cover member 14 along with the second sensor unit 36.
The application processor 38 is a high power processing unit that is set to a low power or off state when the device 10 enters a sleep state. In one embodiment, the application processor 38 consumes between one tenth and a few tenths of a milliamp during processing. When in the low power or off state, the application processor 38 is unable to receive sensor measurements from the first and second sensor units 34, 36, and is therefore unable to perform cover angle detection.
Fig. 3 is a flow chart of a method 40 according to one embodiment disclosed herein. The method 40 performs cover angle detection on the device 10.
In block 42, the device 10 detects whether a screen off event has occurred. The screen off event may be detected by the first sensor unit 34, the second sensor unit 36, the application processor 38, or another electronic component (e.g., a processor, a sensor, etc.) included in the device 10.
In a screen off event, the first user interface 22 and/or the second user interface 24 of the device 10 is set to a low power or off state and no image is displayed on the screen. In one embodiment, the screen off event occurs in response to a user activating a power button of the device 10, in response to the device 10 being in a closed state (e.g., the first surface 30 of the first cover member 12 facing the second surface 32 of the second cover member 14 in fig. 1), or in response to a determined amount of user inactivity. When the device 10 detects a screen off event, the method 40 moves to block 44.
In block 44, the device 10 is set to a sleep state. As described above, in the sleep state, the application processor 38 and other electronic components (e.g., speakers, sensors, processors) of the device 10 are set to a low power or off state.
When in the low power or off state, the application processor 38 is unable to receive sensor measurements from the first and second sensor units 34, 36, and is therefore unable to perform cover angle detection. In contrast, the first sensor unit 34 and the second sensor unit 36 remain on and operational even when the device 10 enters the sleep state. The method 40 then moves to blocks 46 and 48, which may be performed simultaneously.
Note that during blocks 46 and 48, device 10 is in a sleep state. Thus, the application processor 38 is in a low power or off state, while the first sensor unit 34 and the second sensor unit 36 remain on and operational. Block 46 and block 48 are performed by the first sensor unit 34 and the second sensor unit 36, respectively.
In block 46, the first sensor unit 34, more specifically, the processor of the first sensor unit 34, determines the orientation or position of the first cover member 12 (more specifically, the first surface 30 of the first cover member 12). As described above with respect to fig. 1, the first sensor unit 34 is positioned in the first cover member 12.
Similarly, in block 48, the second sensor unit 36, more specifically a processor of the second sensor unit 36, determines an orientation or position of the second cover member 14 (more specifically, the second surface 32 of the second cover member 14). As described above with respect to fig. 1, the second sensor unit 36 is located in the second cover member 14.
The first and second sensor units 34, 36 determine the orientation of the first and second cover members 12, 14, respectively, based on acceleration and angular velocity measurements along one or more axes. Furthermore, the orientation is expressed as a quaternion.
In the case where the first sensor unit 34 includes a 3-axis accelerometer that measures acceleration along an X-axis, a Y-axis transverse to the X-axis, and a Z-axis transverse to the X-axis and the Y-axis, and a 3-axis gyroscope that measures angular velocity about the X-axis, the Y-axis, and the Z-axis, the quaternion q_1 of the first cover part 12 is equal to (x_1, y_1, z_1), where x_1, y_1, z_1 are the vector components of the quaternion representing the orientation of the first cover part 12. In a similar manner, in the case where the second sensor unit 36 includes a 3-axis accelerometer and a 3-axis gyroscope, the quaternion q_2 of the second cover part 14 is equal to (x_2, y_2, z_2), where x_2, y_2, z_2 are the vector components of the quaternion representing the orientation of the second cover part 14.
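For illustration, the following Python sketch shows one conventional way a 6-axis sensor unit could propagate an orientation quaternion from gyroscope samples. The patent does not specify the sensor-fusion filter, so the simple body-rate integration below, the (w, x, y, z) component order, and the omission of accelerometer tilt correction are all assumptions.

```python
import numpy as np

def quat_multiply(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega_rad_s, dt):
    """One update step: rotate q by the small body-frame rotation omega*dt."""
    angle = np.linalg.norm(omega_rad_s) * dt
    if angle < 1e-12:
        return q
    axis = omega_rad_s / np.linalg.norm(omega_rad_s)
    dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
    q = quat_multiply(q, dq)
    return q / np.linalg.norm(q)  # re-normalize to limit rounding drift

q = np.array([1.0, 0.0, 0.0, 0.0])                      # identity orientation
q = integrate_gyro(q, np.array([0.0, 0.0, 0.5]), 0.01)  # one 10 ms sample
```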
The first and second sensor units 34, 36 repeatedly determine the orientation of the first and second cover members 12, 14, respectively, to ensure that the orientation is current and accurate. In one embodiment, the first and second sensor units 34, 36 determine the orientation of the first and second cover members 12, 14, respectively, at determined intervals (e.g., every 5, 10, 15 milliseconds, etc.).
Once the first sensor unit 34 determines the orientation of the first cover part 12 in block 46 and the second sensor unit 36 determines the orientation of the second cover part 14 at least once in block 48, the method 40 moves to block 49.
In block 49, the device 10 detects whether a screen on event has occurred. The screen on event may be detected by the first sensor unit 34, the second sensor unit 36, the application processor 38, or another electronic component (e.g., a processor, a sensor, etc.) included in the device 10.
In a screen on event, the first user interface 22 and/or the second user interface 24 of the device 10 is set to an on state and an image is displayed. In one embodiment, the screen on event occurs in response to a user activating a power button of the device 10, in response to the device 10 being in an open state (e.g., the first surface 30 of the first cover member 12 and the second surface 32 of the second cover member 14 facing in the same direction in fig. 1), or in response to a determined amount of user activity. When the device 10 detects a screen on event, the method 40 moves to block 50.
In block 50, the device 10 is set to an awake state. In contrast to the sleep state, in the awake state, the application processor 38 and other electronic components (e.g., speakers, sensors, processors) of the device 10 are set to an on state and fully operational. For example, the application processor 38 can receive sensor measurements from the first sensor unit 34 and the second sensor unit 36 and perform cover angle detection. The method 40 then moves to block 52. Note that during blocks 52 through 64, device 10 remains in the awake state.
In block 52, the application processor 38 retrieves the most recent orientations of the first and second cover members 12 and 14, determined in blocks 46 and 48 by the first and second sensor units 34 and 36, respectively. In one embodiment, the orientations determined by the first sensor unit 34 and the second sensor unit 36 are saved in their respective internal memories, and the application processor 38 retrieves the orientations directly from the first sensor unit 34 and the second sensor unit 36. In another embodiment, the orientations determined by the first sensor unit 34 and the second sensor unit 36 are saved to a shared memory that is shared between the first sensor unit 34, the second sensor unit 36, and the application processor 38, and the application processor 38 retrieves the orientations from the shared memory. The method 40 then moves to block 54.
In block 54, in order for the application processor 38 to process the orientation data generated by the first and second sensor units 34, 36, the application processor 38 converts the orientations of the first and second cover members 12, 14 into a format used by the application processor 38. For example, in one embodiment, the orientations determined by the first sensor unit 34 and the second sensor unit 36 are in a half-precision floating point format, and the application processor 38 converts the orientations to a single-precision floating point format.
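As a concrete illustration of this format conversion, the short NumPy sketch below widens half-precision quaternion components to single precision; the particular values and the use of NumPy are assumptions for illustration only.

```python
import numpy as np

# Hypothetical vector part (x, y, z) as stored by a sensor unit in
# half precision (float16).
q_vec_half = np.array([0.123, -0.456, 0.707], dtype=np.float16)

# Widened to single precision (float32) for the application processor.
q_vec_single = q_vec_half.astype(np.float32)
print(q_vec_half.dtype, "->", q_vec_single.dtype)  # float16 -> float32
```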
In the case where the quaternion q_1 is represented using only its vector components due to memory constraints, the quaternion q_1 of the first cover member 12 is converted to a quaternion q_1′ equal to (x_1′, y_1′, z_1′, w_1′) using the following equations (1) to (4), where the scalar component follows from the unit-norm constraint:

x_1′ = x_1    (1)
y_1′ = y_1    (2)
z_1′ = z_1    (3)
w_1′ = √(1 − x_1² − y_1² − z_1²)    (4)
Similarly, the quaternion q_2 of the second cover member 14 is converted to a quaternion q_2′ equal to (x_2′, y_2′, z_2′, w_2′) using the following equations (5) to (8):

x_2′ = x_2    (5)
y_2′ = y_2    (6)
z_2′ = z_2    (7)
w_2′ = √(1 − x_2² − y_2² − z_2²)    (8)
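A minimal Python sketch of this reconstruction, assuming the stored quaternions are unit quaternions normalized so that the scalar component is non-negative (the sign convention is an assumption):

```python
import numpy as np

def vector_part_to_quaternion(v):
    """Recover (x, y, z, w) from a stored vector part (x, y, z) of a
    unit quaternion, taking w >= 0 by convention."""
    x, y, z = (float(c) for c in v)
    # Clamp to zero to guard against rounding error pushing the
    # radicand slightly negative.
    w = np.sqrt(max(0.0, 1.0 - (x * x + y * y + z * z)))
    return np.array([x, y, z, w])

q1_prime = vector_part_to_quaternion([0.123, -0.456, 0.707])
```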
The method 40 then moves to block 56. Note that in the case where the first sensor unit 34, the second sensor unit 36, and the application processor 38 use the same data format, block 54 may be omitted from the method 40. In this case, the method 40 moves from block 52 directly to block 56.
In block 56, the application processor 38 determines a distance d between the orientation of the first cover member 12 and the orientation of the second cover member 14. The distance d represents the angular distance between the first cover part 12 and the second cover part 14, and is calculated using the following equation (9):

d = cos⁻¹(2(q_1′ · q_2′)² − 1)    (9)

where the dot operator denotes the dot product (inner product). The method 40 then moves to block 58.
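For illustration, equation (9) can be computed directly; the sketch below is a minimal version, where NumPy and the clamping guard against rounding error are assumptions.

```python
import numpy as np

def angular_distance_deg(q1_prime, q2_prime):
    """Equation (9): d = arccos(2*(q1' . q2')^2 - 1), in degrees.
    Both inputs are unit quaternions as 4-element arrays."""
    dot = float(np.dot(q1_prime, q2_prime))
    arg = np.clip(2.0 * dot * dot - 1.0, -1.0, 1.0)  # guard rounding error
    return float(np.degrees(np.arccos(arg)))
```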
In block 58, the application processor 38 remaps the distance d to an estimated cover angle lid_o of the device 10. Because the estimated cover angle lid_o is determined based on the most current orientations of the first and second cover members 12, 14 retrieved in block 52, the estimated cover angle lid_o is the estimated cover angle of the device 10 at the screen on event in block 49. As discussed above with respect to fig. 1, the cover angle is the angle between the first surface 30 of the first cover member 12 (and more specifically the first user interface 22) and the second surface 32 of the second cover member 14 (and more specifically the second user interface 24).

The distance d is remapped to the estimated cover angle lid_o such that lid_o is equal to zero degrees when the device 10 is in a closed state (e.g., when the first surface 30 faces the second surface 32), and equal to 180 degrees when the device 10 is in a fully open state (e.g., when the first surface 30 and the second surface 32 face in the same direction). The estimated cover angle lid_o is calculated using the following equation (10):

lid_o = 360° − (d + 180°)    (10)
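A minimal sketch of this remapping, with the two boundary cases checked (the function name is illustrative):

```python
def remap_open_reset(d_deg):
    """Equation (10): estimated cover angle from the angular distance d
    (degrees), for the default / fully-open reset convention."""
    return 360.0 - (d_deg + 180.0)

assert remap_open_reset(0.0) == 180.0    # covers aligned -> fully open
assert remap_open_reset(180.0) == 0.0    # covers opposed -> closed
```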
The method then moves to block 60.
In block 60, the application processor 38 sets the estimated cover angle lid_o as the initial cover angle of the device 10, which is the cover angle between the first surface 30 of the first cover part 12 and the second surface 32 of the second cover part 14 at the screen on event in block 49 and the awake state in block 50. The method 40 then moves to block 62.
Using the previously determined estimated cover angle lid_o as the initial cover angle of the device 10 is particularly useful in cases where cover angle detection is momentarily unreliable or inaccurate. For example, many cover angle detection solutions are inaccurate when the device 10 is activated in an upright position or in an unstable state.
In one embodiment, the estimated cover angle lid_o is set as the initial cover angle when the device 10 is activated in an upright position or in an unstable state. In the upright position, referring to fig. 1, the hinge axis 26 of the apparatus 10 is parallel to gravity. In the unstable state, the device 10 is moving, for example because it is being shaken or carried by a user.
If the device 10 is neither in an upright position (e.g., the hinge axis 26 is not parallel to gravity) nor in an unstable state (e.g., the device 10 is in a stable state), block 60 is not performed and the method 40 moves from block 58 to block 62. In another embodiment, if the device 10 is neither in an upright position nor in an unstable state, blocks 52, 54, 56, and 58 are not performed and the method 40 moves from block 50 to block 62.
The application processor 38 determines that the device 10 is in the upright position based on acceleration measurements, gyroscope measurements, or a combination thereof generated by one or more of the first sensor unit 34 and the second sensor unit 36. For example, the application processor 38 determines that the device 10 is in the upright position in response to acceleration measurements and/or gyroscope measurements indicating that the hinge axis 26 of the device 10 is parallel to gravity.
The application processor 38 determines that the device 10 is in an unstable state based on acceleration measurements, gyroscope measurements, or a combination thereof generated by one or more of the first sensor unit 34 and the second sensor unit 36. For example, the application processor 38 determines that the device 10 is in an unstable state in response to one or more of the following being greater than a respective threshold along one or more axes: the acceleration, the change in acceleration, the average acceleration, the difference between the current acceleration and the average acceleration, the angular velocity, the change in angular velocity, the average angular velocity, or the difference between the current angular velocity and the average angular velocity.
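One of these tests could look like the sketch below, which compares the deviation of the most recent sample from a window average against per-axis thresholds. The window layout and the threshold values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def is_unstable(acc, gyro, acc_thresh=0.5, gyro_thresh=0.35):
    """Sketch of an instability test. acc, gyro: (N, 3) arrays of recent
    samples (rows = samples, columns = axes). Thresholds (m/s^2, rad/s)
    are illustrative only."""
    acc_dev = np.abs(acc[-1] - acc.mean(axis=0))
    gyro_dev = np.abs(gyro[-1] - gyro.mean(axis=0))
    return bool((acc_dev > acc_thresh).any() or (gyro_dev > gyro_thresh).any())
```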
In block 62, the application processor 38 determines a current cover angle of the device 10. In one embodiment, the application processor 38 determines the current cover angle based on the initial cover angle determined in block 60. For example, the application processor 38 determines the current cover angle based on a detected change in cover angle relative to the initial cover angle.
Because the device 10 is in the awake state and is no longer limited to utilizing only the first sensor unit 34 and the second sensor unit 36, the device 10 may use any number of different techniques to determine the current cover angle, for example techniques utilizing: two accelerometers; two accelerometers and two gyroscopes; two accelerometers and two magnetometers; or two accelerometers, two gyroscopes, and two magnetometers. Additionally, any of these configurations may be combined with a Hall sensor and magnet. The configuration using two gyroscopes may also be implemented with a Hall sensor and a magnet (or, equivalently, a "switch" sensor that detects when the device is closed).
For example, the application processor 38 may recursively determine the current cover angle between the first cover part 12 and the second cover part 14 based on measurement signals generated by the first accelerometer of the first sensor unit 34, the second accelerometer of the second sensor unit 36, the first gyroscope of the first sensor unit 34, and the second gyroscope of the second sensor unit 36. In this example, the current cover angle is determined using weights that indicate the reliability of the measurement signals indicative of the cover angle between the first cover part 12 and the second cover part 14. In some cases, the application processor 38 may also generate a first intermediate calculation, indicative of the cover angle between the first cover part 12 and the second cover part 14, from the measurement signals generated by the first accelerometer and the second accelerometer; generate a second intermediate calculation, indicative of the cover angle, from the measurement signals generated by the first gyroscope and the second gyroscope; and determine the current cover angle as a weighted sum of the first intermediate calculation and the second intermediate calculation, as in the sketch below.
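The following complementary-filter style update is a minimal sketch of such a weighted fusion. The fixed weight, the function signature, and the assumption that both gyroscope rates are already expressed about the hinge axis are illustrative; the patent derives its weights from signal reliability rather than using a constant.

```python
def update_cover_angle(prev_deg, acc_deg, gyro1_hinge, gyro2_hinge, dt,
                       w_acc=0.02):
    """Sketch of a recursive accel/gyro fusion for block 62.
    acc_deg: hinge angle inferred from the two accelerometers (the first
    intermediate calculation); gyro1_hinge, gyro2_hinge: angular rates of
    the two cover parts about the hinge axis, in deg/s (the basis of the
    second intermediate calculation)."""
    gyro_deg = prev_deg + (gyro2_hinge - gyro1_hinge) * dt  # integrate rates
    return (1.0 - w_acc) * gyro_deg + w_acc * acc_deg       # weighted sum
```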
As another example, a first magnetometer of the first sensor unit 34 and a second magnetometer of the second sensor unit 36 may generate a first signal that is indicative of a measured value of the magnetic field external to the device 10 and of the relative orientation of the first cover member 12 with respect to the second cover member 14. The application processor 38 may then acquire the first signal; generate, from the first signal, calibration parameters indicative of calibration conditions of the first magnetometer and the second magnetometer; generate, from the first signal, a reliability value indicative of a reliability condition of the first signal; calculate an intermediate value of the current cover angle based on the first signal; and calculate the current cover angle based on the calibration parameters, the reliability value, and the intermediate value. To improve accuracy, the calibration parameters, reliability value, and intermediate value may also be used in combination with the current cover angle determined using the accelerometers and gyroscopes discussed above, as in the sketch below.
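A minimal sketch of such a combination: blend the accel/gyro angle with the magnetometer-derived intermediate value, weighted by the reliability value. The linear blend and the [0, 1] reliability scale are assumed combination rules, not ones specified by the patent.

```python
def blend_with_magnetometer(angle_acc_gyro, angle_mag, reliability):
    """Sketch: combine the accel/gyro cover angle with the magnetometer
    intermediate value, weighted by a reliability value in [0, 1] that
    the calibration logic would provide."""
    r = min(max(reliability, 0.0), 1.0)  # clamp to a valid weight
    return (1.0 - r) * angle_acc_gyro + r * angle_mag
```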
Once the current cover angle is determined, the function of the device 10 may be controlled based on the current cover angle. For example, the power state of the device and the user interfaces displayed on the first user interface 22 and the second user interface 24 may be adjusted based on the current cover angle.
The method 40 then moves to block 64. Note, however, that block 62 is repeatedly performed (e.g., every 5, 10, or 15 milliseconds, etc.), while block 64 is performed to ensure that the orientations of the first and second cover members 12, 14 remain accurate. Further, at this point, block 42 is performed in parallel with block 62 to detect whether another screen off event has occurred. The repeated execution of block 62 stops when a screen off event is detected.
In block 64, the application processor 38 resets the orientation processing logic of the first sensor unit 34 and the second sensor unit 36 (e.g., the processing logic used in blocks 46 and 48). Resetting the orientation processing logic improves accuracy, because measurement errors often accumulate over time and cause drift in the yaw estimation of the orientations of the first and second cover members 12, 14.
Upon determining that the device 10 is in a known state, a reset of the orientation processing logic of the first sensor unit 34 and the second sensor unit 36 is performed.
In a first embodiment, the resetting of the orientation processing logic is performed when the device 10 is in a steady state and a fully open state. When the first sensor unit 34 and the second sensor unit 36 are initialized, being in a stable state reduces errors caused by linear acceleration. Furthermore, the fully open state essentially forces the first sensor unit 34 and the second sensor unit 36 to start with the same yaw.
In the steady state, the device 10 is not moving or shaking. The application processor 38 determines that the device 10 is in a steady state based on acceleration measurements, gyroscope measurements, or a combination thereof generated by one or more of the first sensor unit 34 and the second sensor unit 36. For example, the application processor 38 determines that the device 10 is in a steady state in response to one or more of the following being less than a respective threshold along one or more axes: the acceleration, the change in acceleration, the average acceleration, the difference between the current acceleration and the average acceleration, the angular velocity, the change in angular velocity, the average angular velocity, or the difference between the current angular velocity and the average angular velocity.
In the fully open state, referring to fig. 1, the first surface 30 and the second surface 32 face in the same direction. The application processor 38 determines that the device 10 is in the fully open state based on the current cover angle determined in block 62. For example, the application processor 38 determines that the device 10 is in the fully open state in response to the current cover angle being within a threshold angle (e.g., 1, 2, or 3 degrees, etc.) of 180 degrees.
In response to determining that the device 10 is in the steady state and the fully open state, the application processor 38 transmits a reset signal to the first sensor unit 34 and the second sensor unit 36. Upon receipt of the reset signal, the orientation processing logic of the first sensor unit 34 and the second sensor unit 36 is reset.
In a second embodiment, the resetting of the orientation processing logic is performed when the device 10 is in (1) a steady state and (2) a fully open or closed state. As described above, when the first sensor unit 34 and the second sensor unit 36 are initialized, being in a stable state reduces errors caused by linear acceleration.
As described above, in the fully open state, referring to fig. 1, the first surface 30 and the second surface 32 face in the same direction. Conversely, in the closed state, the first surface 30 and the second surface 32 face each other. The application processor 38 determines that the device 10 is in the closed state based on the current cover angle determined in block 62. For example, the application processor 38 determines that the device 10 is in the closed state in response to the current cover angle being within a threshold angle (e.g., 1, 2, or 3 degrees, etc.) of 0 degrees.
In response to determining that the device 10 is in (1) the steady state and (2) the fully open state or the closed state, the application processor 38 transmits a reset signal to the first sensor unit 34 and the second sensor unit 36. Upon receipt of the reset signal, the orientation processing logic of the first sensor unit 34 and the second sensor unit 36 is reset.
Further, in the second embodiment, the configuration of the orientation processing logic of the first sensor unit 34 and/or the second sensor unit 36 changes based on whether the reset is responsive to the device 10 being in the fully open state or the closed state. More specifically, based on whether the reset is caused by the device 10 being in the fully open state or the closed state, the coordinate system (e.g., an East-North-Up (ENU) coordinate system) of the orientation processing logic of one of the sensor units is set to be aligned with the coordinate system of the orientation processing logic of the other sensor unit.
In the case where the orientation processing logic of the first sensor unit 34 and of the second sensor unit 36 use the same type of coordinate system, the coordinate systems of both are set to their respective default coordinate systems in response to a reset caused by the device 10 being in the fully open state. Conversely, the coordinate system of the orientation processing logic of one sensor unit is aligned with the coordinate system of the orientation processing logic of the other sensor unit in response to a reset caused by the device 10 being in the closed state. For example, in the next execution of the method 40, the coordinate system of the orientation processing logic of the first sensor unit 34 is changed to be aligned with the coordinate system of the orientation processing logic of the second sensor unit 36 by applying a transformation matrix to the coordinate system of the orientation processing logic of the first sensor unit 34.
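For illustration, one way such an alignment transform could be formed is sketched below. The rotation-matrix representation and naming are assumptions; the patent only states that a transformation matrix is applied.

```python
import numpy as np

def unit1_to_unit2_transform(R_w1, R_w2):
    """Sketch: if R_w1 and R_w2 are rotation matrices mapping world
    coordinates into sensor unit 1's and sensor unit 2's reference
    frames, then R_w2 @ R_w1.T re-expresses unit-1 coordinates in
    unit 2's frame, aligning the two orientation logics."""
    return R_w2 @ R_w1.T
```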
Additionally, in the second embodiment and in the next execution of method 40, the remapping in block 58 is customized based on whether the reset is responsive to the device 10 being in a fully open state or a closed state.
In the case of a reset caused by the device 10 being in the fully open state, the estimated cover angle lid_o is calculated using equation (10) as described above. Conversely, in the case of a reset caused by the device 10 being in the closed state, the estimated cover angle lid_o is calculated using the following equation (11):

lid_o = d    (11)
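A minimal sketch that selects between the two remappings; the flag name is illustrative:

```python
def remap_distance(d_deg, reset_in_closed_state):
    """Sketch: choose the remapping by the state that caused the last
    reset: equation (11) after a closed-state reset, equation (10)
    after a fully-open reset."""
    return d_deg if reset_in_closed_state else 360.0 - (d_deg + 180.0)
```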
In one embodiment, to avoid resetting the first sensor unit 34 and the second sensor unit 36 too often, the application processor 38 transmits a reset signal only if a threshold amount of time has elapsed since the previous reset signal transmission. For example, in response to determining that the device 10 is in the steady state and the fully open or closed state, the application processor 38 transmits a reset signal to the first sensor unit 34 and the second sensor unit 36 if a threshold amount of time (e.g., 30 seconds, 1 minute, etc.) has elapsed since the previous reset signal transmission. Conversely, in response to determining that the device 10 is in the steady state and the fully open or closed state, the application processor 38 skips transmitting (i.e., does not transmit) the reset signal to the first sensor unit 34 and the second sensor unit 36 if the threshold amount of time has not elapsed since the previous reset signal transmission.
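A minimal sketch of such a rate limit, assuming a monotonic clock and a callable that sends the reset; the 30-second default mirrors the example above, and the class name is illustrative.

```python
import time

class ResetThrottle:
    """Sketch: forward a reset only if at least min_interval_s has
    elapsed since the previous one."""
    def __init__(self, min_interval_s=30.0):
        self.min_interval_s = min_interval_s
        self._last = float("-inf")

    def maybe_reset(self, send_reset):
        now = time.monotonic()
        if now - self._last < self.min_interval_s:
            return False  # skip: too soon after the previous reset
        send_reset()
        self._last = now
        return True
```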
After block 64 is complete, method 40 is repeated. In other words, the method 40 returns to block 42.
Various embodiments disclosed herein provide an apparatus and method for cover angle detection. The first sensor unit and the second sensor unit measure acceleration and angular velocity when the device is in a sleep state, and calculate the orientations of the respective cover members based on the acceleration and angular velocity measurements. After the device exits the sleep state, the application processor estimates a cover angle using the calculated orientations, sets the estimated cover angle as an initial cover angle, and updates the initial cover angle using one or more of the acceleration, magnetometer, or gyroscope measurements. As a result, the initial cover angle is accurate even in the case where the apparatus is in an upright position or an unstable state after exiting the sleep state. Furthermore, estimating the respective cover orientations with the first and second sensor units while the device is in the sleep state reduces overall system current consumption, because the application processor does not have to remain in an active state.
An apparatus may be summarized as including: a first component comprising: a first user interface; and a first sensor unit including a first accelerometer, a first gyroscope, and a first processor configured to determine a first orientation of the first component based on measurements of the first accelerometer and the first gyroscope; a second component coupled to the first component and configured to fold onto the first component, the second component comprising: a second user interface; and a second sensor unit including a second accelerometer, a second gyroscope, and a second processor configured to determine a second orientation of the second component based on measurements of the second accelerometer and the second gyroscope; and a third processor configured to estimate an angle between the first component and the second component based on the first orientation and the second orientation.
The third processor may be configured to update the angle between the first component and the second component based on the measurements of the first accelerometer, the first gyroscope, the second accelerometer, and the second gyroscope.
The first processor may determine the first orientation and the second processor may determine the second orientation with the device in a sleep state, and the third processor may estimate the angle with the device in an awake state.
The third processor may be configured to set the estimated angle to an initial angle of the device, and the initial angle may be an angle between the first component and the second component after the device exits the sleep state and enters the awake state.
The third processor may set the estimated angle to the initial angle with the device in an upright position or an unstable state.
The third processor may be configured to: converting the first orientation and the second orientation from a first format to a second format different from the first format; determining a distance between the converted first orientation and the second orientation; and remapping the distance to the estimated angle.
The first orientation may be a first quaternion of the first component and the second orientation may be a second quaternion of the second component.
The third processor may be configured to: determining that the device is in a fully open state in which the first user interface and the second user interface face in the same direction; determining that the device is in a steady state; and resetting the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in a fully open state and a steady state.
The third processor may reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit when the device is in a fully open state, a steady state, and a threshold amount of time has elapsed since a previous reset of the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit.
One method can be summarized as: determining, by a first sensor unit, a first orientation of a first component of the device, the first component comprising a first user interface and the first sensor unit, the first sensor unit comprising a first accelerometer and a first gyroscope, the first sensor unit determining the first orientation based on measurements of the first accelerometer and the first gyroscope; determining, by a second sensor unit, a second orientation of a second component of the device, the second component configured to fold onto the first component, the second component comprising a second user interface and the second sensor unit, the second sensor unit comprising a second accelerometer and a second gyroscope, the second sensor unit determining the second orientation based on measurements of the second accelerometer and the second gyroscope; and estimating, by a third processor, an angle between the first component and the second component based on the first orientation and the second orientation.
The method may further include detecting a screen off event in which a screen of the first user interface or a screen of the second user interface is set to a low power or off state; and setting the device to a sleep state in which the third processor is set to a low power or off state, the first orientation and the second orientation being determined in response to the device being set to the sleep state.
The method may further comprise: detecting a screen-on event in which a screen of the first user interface or a screen of the second user interface is set to an on state; and setting the device to an awake state in which the third processor is set to an on state, the angle being determined in response to the device being set to the awake state.
The method may further comprise: the estimated angle is set by the third processor to an initial angle of the device, the initial angle being an angle between the first component and the second component after the device is set to the awake state.
The method may further comprise: the apparatus is determined to be in an upright position or in an unstable state, and the estimated angle is set to an initial angle in response to the apparatus being in the upright position or in the unstable state.
The method may further comprise: converting, by the third processor, the first orientation and the second orientation from a first format to a second format different from the first format; determining, by the third processor, a distance between the converted first orientation and the converted second orientation; and remapping, by the third processor, the distance to the estimated angle.
The method may further comprise: determining, by the third processor, that the device is in a fully open state in which the first user interface and the second user interface face in the same direction; determining, by the third processor, that the device is in a steady state; and resetting the first sensor unit and the second sensor unit by the third processor with the device in the fully open state and the steady state.
A device may be summarized as including a first user interface; a first sensor unit; a first housing comprising a first user interface and a first sensor unit configured to determine a first orientation of the first housing based on measurements generated by the first sensor unit; a second user interface; a second sensor unit; a second housing coupled to the first housing, the second housing configured to move between a first position in which the second housing is located on the first housing and a second position in which the second housing is spaced apart from the first housing, the second housing including a second user interface and a second sensor unit configured to determine a second orientation of the second housing based on measurements generated by the second sensor unit; and a processor configured to estimate an angle between the first housing and the second housing based on the first orientation and the second orientation.
The first sensor unit may determine a first orientation and the second sensor unit determines a second orientation with the device in a sleep state, and the processor may estimate the angle with the device in an awake state.
The processor may be configured to set the estimated angle to an initial angle of the device, and the initial angle may be an angle between the first housing and the second housing after the device exits the sleep state and enters the awake state.
An apparatus may be summarized as including: a first component comprising a first user interface; and a first sensor unit configured to determine a first orientation of the first component; a second component coupled to the first component and configured to fold onto the first component, the second component comprising: a second user interface; and a second sensor unit configured to determine a second orientation of the second component; and a processor configured to: estimating an angle between the first component and the second component based on the first orientation and the second orientation; determining that the device is in a steady state; determining whether the device is in a fully open state or a closed state, the first user interface and the second user interface facing in the same direction in the fully open state, the first user interface and the second user interface facing each other in the closed state; and resetting the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
The processor may be configured to: the coordinate system of the orientation processing logic of the first sensor unit and the coordinate system of the orientation processing logic of the second sensor unit are set to a default coordinate system with the device in a steady state and a fully open state.
The processor may be configured to: the coordinate system of the orientation processing logic of the first sensor unit is aligned with the coordinate system of the orientation processing logic of the second sensor unit with the device in a steady state and a closed state.
The coordinate system of the orientation processing logic of the first sensor unit may be aligned with the coordinate system of the orientation processing logic of the second sensor unit by a transformation matrix.
The first sensor unit may determine a first orientation and the second sensor unit determines a second orientation with the device in a sleep state, and the processor may estimate the angle with the device in an awake state.
The processor may be configured to set the estimated angle to an initial angle of the device, and the initial angle may be an angle between the first component and the second component after the device exits the sleep state and enters the awake state.
The processor may set the estimated angle to the initial angle with the device in an upright position or in an unstable state.
The processor may be configured to: converting the first orientation and the second orientation from a first format to a second format different from the first format; determining a distance between the converted first orientation and the second orientation; and remapping the distance to the estimated angle.
The estimated angle may be equal to 360° − (d + 180), where d is the distance, with the device in the steady state and the fully open state.
In the case of a device in a steady state and in a closed state, the estimated angle may be equal to d, where d is the distance.
The processor may be configured to reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit if a threshold amount of time has elapsed from a previous reset of the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit.
A method may be summarized as including: determining, by a first sensor unit, a first orientation of a first component of the device, the first component comprising a first user interface and the first sensor unit; determining, by a second sensor unit, a second orientation of a second component of the device, the second component being configured to fold onto the first component, the second component comprising a second user interface and the second sensor unit; estimating, by the processor, an angle between the first component and the second component based on the first orientation and the second orientation; determining, by the processor, that the device is in a steady state; determining, by the processor, that the device is in a fully open state or a closed state, the first user interface and the second user interface facing in the same direction in the fully open state, the first user interface and the second user interface facing each other in the closed state; and resetting, by the processor, the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
The method may further comprise: the coordinate system of the orientation processing logic of the first sensor unit and the coordinate system of the orientation processing logic of the second sensor unit are set by the processor to a default coordinate system with the device in a steady state and a fully open state.
The method may further comprise: the coordinate system of the orientation processing logic of the first sensor unit is aligned with the coordinate system of the orientation processing logic of the second sensor unit by the processor with the device in a steady state and a closed state.
The alignment may include applying a transformation matrix to a coordinate system of the orientation processing logic of the first sensor unit.
The method may further comprise: converting, by the processor, the first orientation and the second orientation from a first format to a second format different from the first format; determining, by the processor, a distance between the converted first orientation and the second orientation; and remapping, by the processor, the distance to the estimated angle.
In the case of the device being in the steady state and the fully open state, the remapping may include setting the estimated angle equal to 360° − (d + 180), where d is the distance.
In the case of a device in a steady state and a closed state, the remapping may include setting the estimated angle equal to d, where d is the distance.
An apparatus may be summarized as including: a first user interface; a first sensor unit; a first housing comprising a first user interface and a first sensor unit configured to determine a first orientation of the first housing based on measurements generated by the first sensor unit; a second user interface; a second sensor unit; a second housing coupled to the first housing, the second housing configured to move between a first position in which the second housing is located on the first housing and a second position in which the second housing is spaced apart from the first housing, the second housing including a second user interface and a second sensor unit configured to determine a second orientation of the second housing based on measurements generated by the second sensor unit; and a processor configured to: estimating an angle between the first housing and the second housing based on the first orientation and the second orientation; determining whether the device is in a steady state; determining whether the device is in a fully open state in which the first user interface and the second user interface face in the same direction or a closed state in which the first user interface and the second user interface face each other; and resetting the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
The processor may be configured to: setting a coordinate system of the orientation processing logic of the first sensor unit and a coordinate system of the orientation processing logic of the second sensor unit as default coordinate systems in a case where the device is in a steady state and a fully open state; and aligning the coordinate system of the orientation processing logic of the first sensor unit with the coordinate system of the orientation processing logic of the second sensor unit with the device in the steady state and the closed state.
The various embodiments described above may be combined to provide further embodiments. These and other changes can be made to the embodiments in light of the above-detailed description. In the following claims, in general, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the present disclosure.

Claims (39)

1. An apparatus, comprising:
a first component comprising:
a first user interface; and
a first sensor unit comprising a first accelerometer, a first gyroscope, and a first processor configured to determine a first orientation of the first component based on measurements of the first accelerometer and the first gyroscope;
a second component coupled to the first component and configured to fold onto the first component, the second component comprising:
a second user interface; and
a second sensor unit comprising a second accelerometer, a second gyroscope, and a second processor configured to determine a second orientation of the second component based on measurements of the second accelerometer and the second gyroscope; and
a third processor configured to estimate an angle between the first component and the second component based on the first orientation and the second orientation.
2. The apparatus of claim 1, wherein
the third processor is configured to update the angle between the first component and the second component based on measurements of the first accelerometer, the first gyroscope, the second accelerometer, and the second gyroscope.
3. The apparatus of claim 1, wherein
the first processor determines the first orientation and the second processor determines the second orientation with the device in a sleep state, and
the third processor estimates the angle with the device in an awake state.
4. The apparatus of claim 3, wherein
the third processor is configured to set the estimated angle to an initial angle of the device, and
the initial angle is an angle between the first component and the second component after the device exits the sleep state and enters the awake state.
5. The apparatus of claim 4, wherein the third processor sets the estimated angle to the initial angle with the apparatus in an upright position or in an unstable state.
6. The device of claim 1, wherein the third processor is configured to:
convert the first orientation and the second orientation from a first format to a second format, the second format being different from the first format;
determine a distance between the converted first orientation and the converted second orientation; and
remap the distance to the estimated angle.
7. The apparatus of claim 1, wherein the first orientation is a first quaternion of the first component and the second orientation is a second quaternion of the second component.
8. The device of claim 1, wherein the third processor is configured to:
determine that the device is in a fully open state in which the first user interface and the second user interface face in the same direction;
determine that the device is in a steady state; and
reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in the fully open state and the steady state.
9. The device of claim 8, wherein the third processor resets the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit if the device is in the fully open state and the steady state and a threshold amount of time has elapsed since a previous reset of the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit.
10. A method, comprising:
determining, by a first sensor unit, a first orientation of a first component of a device, the first component comprising a first user interface and the first sensor unit, the first sensor unit comprising a first accelerometer and a first gyroscope, the first sensor unit determining the first orientation based on measurements of the first accelerometer and the first gyroscope;
determining, by a second sensor unit, a second orientation of a second component of the device, the second component being configured to fold onto the first component, the second component comprising a second user interface and the second sensor unit, the second sensor unit comprising a second accelerometer and a second gyroscope, the second sensor unit determining the second orientation based on measurements of the second accelerometer and the second gyroscope; and
estimating, by a third processor, an angle between the first component and the second component based on the first orientation and the second orientation.
11. The method of claim 10, further comprising:
detecting a screen-off event in which a screen of the first user interface or a screen of the second user interface is set to a low power or off state; and
setting the device to a sleep state in which the third processor is set to a low power or off state, the first orientation and the second orientation being determined in response to the device being set to the sleep state.
12. The method of claim 10, further comprising:
detecting a screen-on event in which a screen of the first user interface or a screen of the second user interface is set to an on state; and
setting the device to an awake state in which the third processor is set to an on state, the angle being estimated in response to the device being set to the awake state.
13. The method of claim 12, further comprising:
setting, by the third processor, the estimated angle to an initial angle of the device, the initial angle being an angle between the first component and the second component after the device is set to the awake state.
14. The method of claim 13, further comprising:
determining that the device is in an upright position or in an unstable state, and in response to the device being in the upright position or in the unstable state, setting the estimated angle to the initial angle.
15. The method of claim 10, further comprising:
converting, by the third processor, the first orientation and the second orientation from a first format to a second format, the second format being different from the first format;
determining, by the third processor, a distance between the converted first orientation and the converted second orientation; and
remapping, by the third processor, the distance to the estimated angle.
16. The method of claim 10, further comprising:
determining, by the third processor, that the device is in a fully open state in which the first user interface and the second user interface face in the same direction;
determining, by the third processor, that the device is in a steady state; and
resetting, by the third processor, the first sensor unit and the second sensor unit with the device in the fully open state and the steady state.
17. An apparatus, comprising:
a first user interface;
a first sensor unit;
a first housing comprising the first user interface and the first sensor unit, the first sensor unit configured to determine a first orientation of the first housing based on measurements generated by the first sensor unit;
a second user interface;
a second sensor unit;
a second housing coupled to the first housing, the second housing configured to move between a first position in which the second housing is located on the first housing and a second position in which the second housing is spaced apart from the first housing, the second housing including the second user interface and the second sensor unit, the second sensor unit configured to determine a second orientation of the second housing based on measurements generated by the second sensor unit; and
a processor configured to estimate an angle between the first housing and the second housing based on the first orientation and the second orientation.
18. The apparatus of claim 17, wherein
the first sensor unit determines the first orientation and the second sensor unit determines the second orientation with the device in a sleep state, and
the processor estimates the angle with the device in an awake state.
19. The apparatus of claim 18, wherein
the processor is configured to set the estimated angle to an initial angle of the device, and
the initial angle is an angle between the first housing and the second housing after the device exits the sleep state and enters the awake state.
20. An apparatus, comprising:
a first component comprising:
a first user interface; and
a first sensor unit configured to determine a first orientation of the first component;
a second component coupled to the first component and configured to fold onto the first component, the second component comprising:
a second user interface; and
a second sensor unit configured to determine a second orientation of the second component; and
a processor configured to:
estimate an angle between the first component and the second component based on the first orientation and the second orientation;
determine that the device is in a steady state;
determine whether the device is in a fully open state or a closed state, the first user interface and the second user interface facing in the same direction in the fully open state, the first user interface and the second user interface facing each other in the closed state; and
reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
21. The device of claim 20, wherein the processor is configured to:
set the coordinate system of the orientation processing logic of the first sensor unit and the coordinate system of the orientation processing logic of the second sensor unit to a default coordinate system with the device in the steady state and the fully open state.
22. The device of claim 20, wherein the processor is configured to:
align the coordinate system of the orientation processing logic of the first sensor unit with the coordinate system of the orientation processing logic of the second sensor unit with the device in the steady state and the closed state.
23. The apparatus of claim 20, wherein the coordinate system of the orientation processing logic of the first sensor unit is aligned with the coordinate system of the orientation processing logic of the second sensor unit by a transformation matrix.
24. The apparatus of claim 20, wherein
the first sensor unit determines the first orientation and the second sensor unit determines the second orientation with the device in a sleep state, and
the processor estimates the angle with the device in an awake state.
25. The apparatus of claim 24, wherein
the processor is configured to set the estimated angle to an initial angle of the device, and
the initial angle is an angle between the first component and the second component after the device exits the sleep state and enters the awake state.
26. The apparatus of claim 25, wherein the processor sets the estimated angle to the initial angle with the apparatus in an upright position or in an unstable state.
27. The device of claim 20, wherein the processor is configured to:
convert the first orientation and the second orientation from a first format to a second format, the second format being different from the first format;
determine a distance between the converted first orientation and the converted second orientation; and
remap the distance to the estimated angle.
28. The apparatus of claim 27, wherein the estimated angle is equal to 360° - (d + 180), where d is the distance, with the apparatus in the steady state and the fully open state.
29. The device of claim 27, wherein the estimated angle is equal to d with the device in the steady state and the closed state, where d is the distance.
30. The device of claim 20, wherein the processor is configured to reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit if a threshold amount of time has elapsed since a previous reset of the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit.
31. A method, comprising:
determining, by a first sensor unit, a first orientation of a first component of a device, the first component comprising a first user interface and the first sensor unit;
determining, by a second sensor unit, a second orientation of a second component of the device, the second component being configured to fold onto the first component, the second component comprising a second user interface and the second sensor unit;
estimating, by a processor, an angle between the first component and the second component based on the first orientation and the second orientation;
determining, by the processor, that the device is in a steady state;
determining, by the processor, that the device is in a fully open state in which the first user interface and the second user interface face in the same direction or a closed state in which the first user interface and the second user interface face each other; and
resetting, by the processor, the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
32. The method of claim 31, further comprising:
setting, by the processor, the coordinate system of the orientation processing logic of the first sensor unit and the coordinate system of the orientation processing logic of the second sensor unit to a default coordinate system with the device in the steady state and the fully open state.
33. The method of claim 31, further comprising:
aligning, by the processor, the coordinate system of the orientation processing logic of the first sensor unit with the coordinate system of the orientation processing logic of the second sensor unit with the device in the steady state and the closed state.
34. The method of claim 31, wherein the aligning comprises applying a transformation matrix to the coordinate system of the orientation processing logic of the first sensor unit.
35. The method of claim 31, further comprising:
converting, by the processor, the first orientation and the second orientation from a first format to a second format, the second format being different from the first format;
determining, by the processor, a distance between the converted first orientation and the converted second orientation; and
remapping, by the processor, the distance to the estimated angle.
36. The method of claim 35, wherein the remapping includes setting the estimated angle equal to 360° - (d + 180), where d is the distance, with the device in the steady state and the fully open state.
37. The method of claim 35, wherein the remapping includes setting the estimated angle equal to d, where d is the distance, with the device in the steady state and the closed state.
38. An apparatus, comprising:
a first user interface;
a first sensor unit;
a first housing comprising the first user interface and the first sensor unit, the first sensor unit configured to determine a first orientation of the first housing based on measurements generated by the first sensor unit;
a second user interface;
a second sensor unit;
a second housing coupled to the first housing, the second housing configured to move between a first position in which the second housing is located on the first housing and a second position in which the second housing is spaced apart from the first housing, the second housing including the second user interface and the second sensor unit, the second sensor unit configured to determine a second orientation of the second housing based on measurements generated by the second sensor unit; and
a processor configured to:
estimate an angle between the first housing and the second housing based on the first orientation and the second orientation;
determine whether the device is in a steady state;
determine whether the device is in a fully open state in which the first user interface and the second user interface face in the same direction or a closed state in which the first user interface and the second user interface face each other; and
reset the orientation processing logic of the first sensor unit and the orientation processing logic of the second sensor unit with the device in (1) the steady state and (2) the fully open state or the closed state.
39. The device of claim 38, wherein the processor is configured to:
set the coordinate system of the orientation processing logic of the first sensor unit and the coordinate system of the orientation processing logic of the second sensor unit to a default coordinate system with the device in the steady state and the fully open state; and
align the coordinate system of the orientation processing logic of the first sensor unit with the coordinate system of the orientation processing logic of the second sensor unit with the device in the steady state and the closed state.
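To make the sleep/wake sequence of claims 11 through 13 above concrete, a minimal event-handler sketch follows; every name is hypothetical, and the orientation tracking during sleep is assumed to run inside the low-power sensor units while the third processor is off:

    #include <stdbool.h>

    /* Hypothetical API; the claims name no functions. */
    float read_angle_estimated_from_sleep_orientations(void);
    void  set_third_processor_power(bool on);
    void  set_initial_lid_angle(float degrees);

    static bool in_sleep_state = false;

    /* Claim 11: a screen-off event puts the device in the sleep state; the
     * first and second orientations are then determined by the sensor units. */
    void on_screen_off_event(void)
    {
        set_third_processor_power(false); /* third processor: low power or off */
        in_sleep_state = true;
    }

    /* Claims 12 and 13: a screen-on event puts the device in the awake state;
     * the angle estimated from the tracked orientations becomes the initial angle. */
    void on_screen_on_event(void)
    {
        set_third_processor_power(true);
        if (in_sleep_state) {
            set_initial_lid_angle(read_angle_estimated_from_sleep_orientations());
            in_sleep_state = false;
        }
    }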

Applications Claiming Priority (3)

Application Number              Priority Date    Filing Date    Title
US17/827,395                    2022-05-27
US18/183,464 (US20230384837A1)                   2023-03-14     Lid angle detection
US18/183,464                    2023-03-14

Publications (1)

Publication Number    Publication Date
CN117128917A          2023-11-28

Family

ID=88857125

Family Applications (1)

Application Number    Title                    Priority Date    Filing Date
CN202310599926.6A     Cover angle detection    2022-05-27       2023-05-25

Country Status (1)

Country    Link
CN         CN117128917A

Similar Documents

Publication    Title
US7162352B1 (en) Electronic apparatus and method of correcting offset value of acceleration sensor
EP2721368B1 (en) Motion determination
JP5934296B2 (en) Calibrating sensor readings on mobile devices
US9157736B2 (en) Portable electronic device adapted to provide an improved attitude matrix
US20080042973A1 (en) System for sensing yaw rate using a magnetic field sensor and portable electronic devices using the same
US20110307213A1 (en) System and method of sensing attitude and angular rate using a magnetic field sensor and accelerometer for portable electronic devices
US20150285835A1 (en) Systems and methods for sensor calibration
US10551211B2 (en) Methods and devices with sensor time calibration
US20120278024A1 (en) Position estimation apparatus and method using acceleration sensor
KR100501721B1 (en) Pen-shaped input device using magnetic sensor and method thereof
TWI428602B (en) Method and module for measuring rotation and portable apparatus comprising the module
TW201820077A (en) Mobile devices and methods for determining orientation information thereof
US9885734B2 (en) Method of motion processing and related mobile device and microcontroller unit
CN105910593B (en) A kind of method and device of the geomagnetic sensor of calibrating terminal
CN113551690A (en) Calibration parameter acquisition method and device, electronic equipment and storage medium
US20180267074A1 (en) Systems and methods for motion detection
CN114051717A (en) Foldable electronic device for detecting folding angle and operation method thereof
JP2012037405A (en) Sensor device, electronic apparatus, and offset correction method of angular velocity sensor
US20230384343A1 (en) Lid angle detection
US20230384837A1 (en) Lid angle detection
CN117128917A (en) Cover angle detection
JP2010152587A (en) Input device, control system, handheld device and calibration method
US20240085960A1 (en) Lid angle detection
CN113936044B (en) Method and device for detecting motion state of laser equipment, computer equipment and medium
KR100387768B1 (en) Virtual window control apparatus and control methods for portable electronic equipments

Legal Events

Code    Description
PB01    Publication
SE01    Entry into force of request for substantive examination