CN112527094A - Human body posture detection method and electronic equipment - Google Patents

Human body posture detection method and electronic equipment

Info

Publication number
CN112527094A
CN112527094A
Authority
CN
China
Prior art keywords
user
posture
electronic device
relative
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910883512.XA
Other languages
Chinese (zh)
Inventor
李令言
唐頔朏
张树本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910883512.XA
Priority to PCT/CN2020/105299 (WO2021052016A1)
Publication of CN112527094A
Legal status: Pending

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F9/451 Execution arrangements for user interfaces
    • G06V10/757 Matching configurations of points or features
    • G06V10/94 Hardware or software architectures specially adapted for image or video understanding
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a human body posture detection method and an electronic device. The method is applied to the electronic device and accurately identifies a user's unhealthy usage posture, so that the user can be prompted in time to use the electronic device healthily, improving the user experience. The method includes the following steps: when the electronic device is in a screen-on state, the electronic device obtains sensor data and image information captured by a front-facing camera; it then determines the device pose from the sensor data and determines the relative pose of the user's face with respect to the electronic device from the image information; finally, it determines the user's head pose from the relative pose of the face with respect to the electronic device and the device pose.

Description

Human body posture detection method and electronic equipment
Technical Field
The application relates to the technical field of terminals, in particular to a human body posture detection method and electronic equipment.
Background
With the continuous development of electronic devices, more and more devices with display screens, such as mobile phones, are widely used in people's daily life and work. It is also easy to observe that many people habitually operate their mobile phones with their heads bowed; while enjoying the convenience of the phone, their bodies are adversely affected. For example, if a user keeps the head lowered for a long time, considerable pressure is placed on the cervical vertebrae, which may cause cervical spondylosis. A solution is therefore needed that accurately identifies a user's unhealthy usage posture, so that the user can be prompted in time to use the electronic device healthily.
Disclosure of Invention
The present application provides a human body posture detection method and an electronic device, which are used to accurately identify a user's unhealthy usage posture so as to remind the user in time to use the electronic device healthily.
In a first aspect, an embodiment of the present application provides a human body posture detection method, including: when the electronic device is in a screen-on state, the electronic device obtains sensor data and image information captured by a front-facing camera; it then determines the device pose from the sensor data and determines the relative pose of the user's face with respect to the electronic device from the image information; finally, it determines the user's head pose from the relative pose of the face with respect to the electronic device and the device pose.
In this embodiment, the method accurately identifies the user's unhealthy usage posture, so that the user can be prompted in time to use the electronic device healthily, improving the user experience.
In one possible implementation, before the electronic device determines the device pose from the sensor data, the method further includes: the electronic device obtains system operation information and user operation information of the electronic device; it then determines, from the image information and at least one of the system operation information and the user operation information, that the electronic device is currently being used by a user.
In this embodiment, this step excludes scenarios in which the user is not interacting with the phone, which improves the accuracy of the result.
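As a concrete illustration, the gating step above might be sketched as follows. The Boolean signals and the way they are combined are assumptions; the patent only names the categories of information (image information, system operation information, user operation information) without fixing the logic:

```python
# Hypothetical sketch of the "in use" gate described above: posture detection
# only proceeds when the screen is on AND the image/operation signals suggest
# the user is actually interacting with the device.
def device_in_use(screen_on, face_detected, foreground_app_active, recent_touch):
    """Return True when posture detection should run.

    All arguments are illustrative booleans; the patent leaves the exact
    signals and their combination unspecified.
    """
    if not screen_on:
        return False
    # A detected face (image information) plus at least one of the
    # system-operation / user-operation signals, mirroring the claim wording.
    return face_detected and (foreground_app_active or recent_touch)

print(device_in_use(True, True, False, True))  # screen on, face visible, recent touch
```

A phone lying screen-up on a desk with no face in view, or a dark screen, would be filtered out before any pose computation runs, which is the power and accuracy benefit the paragraph above describes.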
In one possible implementation, the electronic device obtains key feature points of the face in the image information; it then matches and maps these key feature points to the feature points of a preset human head model, and determines the relative pose of the face with respect to the electronic device from the matching result.
In this embodiment, the method acquires its inputs from low-power always-on components, for example image information from the front-facing camera and gravitational-acceleration data from always-on accelerometer and gyroscope sensors, so the relative pose of the face can be computed continuously.
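The patent does not name a specific matching algorithm; in practice a perspective-n-point solver (for example OpenCV's solvePnP) is commonly used to fit 2D facial landmarks to a 3D head model. As a self-contained sketch of the underlying idea, the following assumes depth-resolved keypoints and recovers the face-to-device rotation and translation with a Kabsch alignment; the landmark coordinates are invented for illustration:

```python
import numpy as np

# Generic 3D facial landmarks of a "preset head model" (centimetres, made up):
# left eye, right eye, nose tip, left mouth corner, right mouth corner.
MODEL_PTS = np.array([[-3.0,  3.0, 0.0],
                      [ 3.0,  3.0, 0.0],
                      [ 0.0,  0.0, 2.0],
                      [-2.0, -3.0, 0.0],
                      [ 2.0, -3.0, 0.0]])

def relative_pose(model_pts, observed_pts):
    """Kabsch alignment: find R, t such that observed ≈ R @ model + t."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Synthetic check: rotate the model 20 degrees about x and shift it away.
a = np.radians(20.0)
R_true = np.array([[1, 0, 0],
                   [0, np.cos(a), -np.sin(a)],
                   [0, np.sin(a),  np.cos(a)]])
t_true = np.array([0.5, -0.2, 30.0])
observed = MODEL_PTS @ R_true.T + t_true
R_est, t_est = relative_pose(MODEL_PTS, observed)
```

With only 2D image points, the same role would be played by a PnP solver together with the camera intrinsics; assuming known depth is the main simplification in this sketch.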
In one possible implementation, the electronic device calculates, from the relative pose of the user's face with respect to the electronic device and the device pose, a rotation matrix of the face relative to the electronic device in an upright state; from this rotation matrix it calculates the attitude angles of the user's head in the geodetic coordinate system. The attitude angles include a pitch angle, a roll angle, and a yaw angle, and indicate the user's head pose.
In this embodiment, the method can compute the absolute attitude angles of the face with high precision in order to judge whether the user is in an unhealthy posture, and it covers a wide range of unhealthy-posture scenarios.
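To make the composition concrete, the sketch below composes an assumed device pose with the face's relative pose and extracts pitch, roll, and yaw. The Z-Y-X Euler convention and the axis layout are assumptions; the patent does not fix either:

```python
import numpy as np

def euler_angles(R):
    """Pitch, roll, yaw in degrees from a rotation matrix (Z-Y-X convention assumed)."""
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll  = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    yaw   = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return pitch, roll, yaw

# Device pitched forward 30 degrees (rotation about the y axis in this layout).
a = np.radians(30.0)
R_device = np.array([[ np.cos(a), 0, np.sin(a)],
                     [ 0,         1, 0        ],
                     [-np.sin(a), 0, np.cos(a)]])
R_face_rel = np.eye(3)            # face square to the screen
R_head = R_device @ R_face_rel    # head pose in the geodetic frame
pitch, roll, yaw = euler_angles(R_head)
```

Here a face held square to a 30-degree-tilted screen yields a 30-degree head pitch, which is exactly the chaining of "relative pose" and "device pose" that the paragraph above describes.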
In one possible implementation, when the head pose satisfies a set condition, the electronic device determines that the user is in an unhealthy posture and outputs posture-adjustment prompt information. The method uses this prompt information to remind the user in real time, and can also generate a health analysis report, which helps improve the user experience.
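The "set condition" is left open by the patent. As an illustrative sketch, one might require the head pitch to stay below a threshold for a sustained dwell time before prompting; both constants and the function shape below are assumptions:

```python
# Illustrative "set condition": threshold and dwell time are made-up values,
# not taken from the patent.
BAD_PITCH_DEG = -30.0   # head bowed more than 30 degrees below level
DWELL_SECONDS = 60      # ...sustained for at least a minute

def should_prompt(pitch_samples, sample_period_s=1.0):
    """Return True when every recent pitch sample shows a bowed head
    for the full dwell window."""
    needed = int(DWELL_SECONDS / sample_period_s)
    recent = pitch_samples[-needed:]
    return len(recent) >= needed and all(p < BAD_PITCH_DEG for p in recent)

print(should_prompt([-40.0] * 60))  # a full minute of bowed-head samples
```

Requiring a dwell window rather than a single bad sample avoids prompting on momentary glances downward, which fits the real-time-reminder behavior described above.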
In a second aspect, embodiments of the present application provide an electronic device including a sensor, a touch screen, a processor, and a memory. The memory is used to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is enabled to implement the method in any possible design of any of the above aspects.
In a third aspect, the present application further provides an apparatus including a module/unit for performing the method of any one of the possible designs of any one of the above aspects. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium including a computer program which, when run on an electronic device, causes the electronic device to perform the method in any possible design of any of the above aspects.
In a fifth aspect, the present application further provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method in any possible design of any of the above aspects.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
Fig. 1 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an android operating system provided in an embodiment of the present application;
fig. 3 is a schematic flow chart of a human body posture detection method according to an embodiment of the present application;
fig. 4 is a schematic view of a detection scenario provided in an embodiment of the present application;
fig. 5A and fig. 5B are schematic diagrams of a coordinate system transformation method according to an embodiment of the present application;
fig. 5C is a schematic flowchart of a method for determining a device pose according to an embodiment of the present application;
fig. 6A and fig. 6B are schematic diagrams of a user's face relative to an electronic device according to an embodiment of the present application;
fig. 6C is a schematic flowchart of a method for determining a relative posture according to an embodiment of the present application;
FIG. 7 is a schematic diagram of head pose recognition provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail below with reference to the drawings in the following embodiments of the present application.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of the associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Hereinafter, some terms referred to in the embodiments of the present application will be explained so as to be easily understood by those skilled in the art.
The embodiments of the present application relate to at least one, including one or more; wherein a plurality means greater than or equal to two. In addition, it is to be understood that the terms first, second, etc. in the description of the present application are used for distinguishing between the descriptions and not necessarily for describing a sequential or chronological order.
An application (app) related to the embodiments of the present application, referred to simply as an application, is a software program capable of implementing one or more specific functions. Generally, a plurality of applications may be installed on an electronic device, for example instant messaging applications, video applications, audio applications, image capture applications, and the like. Instant messaging applications include, for example, a short message application, WeChat, WhatsApp Messenger, Line, Instagram (photo sharing), Kakao Talk, and DingTalk. Image capture applications include, for example, a camera application (the system camera or a third-party camera application). Video applications include, for example, YouTube, Twitter, TikTok (Douyin), iQIYI, Tencent Video, and so on. Audio applications include, for example, Kugou Music, Xiami Music, QQ Music, and so on. An application mentioned in the following embodiments may be an application installed when the electronic device leaves the factory, or an application downloaded from a network or obtained from another electronic device while the user uses the electronic device.
An embodiment of the present application provides a human body posture detection method that can be applied to any electronic device, such as a mobile phone, a tablet computer, a wearable device (e.g., a watch, a bracelet, or a smart helmet), an in-vehicle device, a smart home device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA). In the human body posture detection method provided by this embodiment, the electronic device can recognize the device pose from the gyroscope, the gravitational-acceleration sensor, and other sensors; in addition, it can determine the user's head pose from the face image captured by the front-facing camera together with the device pose, and if the determined head pose is an unhealthy body posture, it reminds the user in time to correct it. The method makes the electronic device more intelligent to a certain extent, helps correct the user's unhealthy usage habits, and improves the user experience. The following embodiments mainly take a mobile phone as an example.
For example, fig. 1 shows a schematic structural diagram of a mobile phone.
As shown in fig. 1, the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller can be a nerve center and a command center of the mobile phone. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a mobile phone, or may be used to transmit data between the mobile phone and a peripheral device. The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to a mobile phone. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to a mobile phone, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the handset antenna 1 is coupled to the mobile communication module 150 and the handset antenna 2 is coupled to the wireless communication module 160, so that the handset can communicate with the network and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The display screen 194 is used to display the display interface of an application and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
The camera 193 is used to capture still images or video. In some embodiments, camera 193 may include at least one camera, such as a front camera and a rear camera.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the mobile phone by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system, the software code of at least one application (such as an iQIYI application or a WeChat application), and the like. The data storage area can store data (such as images and videos) generated while the mobile phone is used. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as pictures, videos, and the like are saved in an external memory card.
The mobile phone can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyroscope sensor 180B may be used to determine the motion attitude of the mobile phone. In some embodiments, the angular velocity of the mobile phone about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B.
The gyroscope sensor 180B may also be used for anti-shake during photographing. The air pressure sensor 180C is used to measure air pressure. In some embodiments, the mobile phone calculates altitude from the barometric pressure measured by the air pressure sensor 180C to assist in positioning and navigation. The magnetic sensor 180D includes a Hall sensor. The mobile phone can use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the mobile phone is a flip phone, it can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set automatic unlocking according to the open or closed state of the leather case or of the flip cover. The acceleration sensor 180E can detect the magnitude of the mobile phone's acceleration in various directions (typically along three axes), and when the phone is static it can detect the magnitude and direction of gravity. It can also be used to recognize the pose of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
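As an illustration of recognizing the device pose from the accelerometer, the tilt of a static device can be estimated from the measured gravity vector. The axis convention below (x right, y up along the screen, z out of the screen) is an assumption:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll of a static device, in degrees, from one accelerometer reading.

    Assumed axes: x right, y up along the screen, z out of the screen, so a
    phone lying flat and face up reads roughly (0, 0, 9.81) m/s^2.
    """
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    roll = math.degrees(math.atan2(-ax, az))
    return pitch, roll

flat = tilt_from_gravity(0.0, 0.0, 9.81)     # lying flat: no tilt
upright = tilt_from_gravity(0.0, 9.81, 0.0)  # held upright in portrait
```

This covers only the static case; while the device is moving, the gyroscope's angular-rate data would be fused in (for example with a complementary or Kalman filter), which this sketch omits.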
The distance sensor 180F is used to measure distance. The mobile phone can measure distance by infrared or laser. In some embodiments, when photographing a scene, the mobile phone may use the distance sensor 180F for ranging to achieve fast focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The mobile phone emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the mobile phone can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The mobile phone can use the proximity light sensor 180G to detect that the user is holding the phone close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The mobile phone may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the mobile phone is in a pocket, so as to prevent accidental touch. The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the reported temperature exceeds a threshold, the mobile phone reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the mobile phone heats the battery 142 to avoid an abnormal shutdown due to low temperature. In still other embodiments, when the temperature is below a further threshold, the mobile phone boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together they form what is commonly called a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone in the human vocal part. The bone conduction sensor 180M may also be placed against the human pulse to receive the blood pressure pulse signal.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The mobile phone may receive key input and generate key signal input related to user settings and function control of the mobile phone. The motor 191 may generate vibration cues. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The indicator 192 may be an indicator light that may be used to indicate a charging state or a change in battery level, or to indicate a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the mobile phone by being inserted into or pulled out of the SIM card interface 195.
It will be understood that the components shown in fig. 1 are not intended to be limiting, and that the handset may include more or fewer components than those shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used.
Fig. 2 shows a software structure block diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 2, the software structure of the electronic device may be a layered architecture, for example, the software may be divided into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer (FWK), an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 2, the application layer may include a camera, settings, a skin module, a user interface (UI), third-party applications, and the like. The third-party applications may include WeChat, QQ, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and so on.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions. As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction; for example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the top status bar of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. Notifications may also take the form of text prompts in the status bar, prompt tones, vibration of the electronic device, a blinking indicator light, and so on.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The hardware layer may include various sensors, such as an acceleration sensor, a gyro sensor, a touch sensor, and the like, which are referred to in the embodiments of the present application.
The following describes the work flow of software and hardware of a mobile phone by combining the human body posture detection method according to the embodiment of the present application.
As an example, after the sensors in the hardware layer (e.g., the gravity sensor, gyroscope, and acceleration sensor) acquire sensor data, the data may be sent to the system library through the kernel layer. The system library determines the current device posture of the mobile phone from the sensor data; in some embodiments, the system library layer may determine the pose angle of the handset in the geodetic coordinate system. In addition, after the image sensor in the hardware layer (such as the front camera) collects image data, the image data can be sent to the system library through the kernel layer. The system library determines the attitude angle of the user's face relative to the mobile phone from the image data. Finally, the mobile phone determines the attitude angle of the user's head in the geodetic coordinate system from the attitude angle of the face relative to the mobile phone and the device attitude angle.
Based on the above electronic device structure, an embodiment of the present application provides a human body posture detection method, which is applicable to the above electronic device, as shown in fig. 3. The method comprises the following steps.
Step 301, when the electronic device is in a bright screen state, the electronic device acquires sensor data and image information acquired by a front camera.
The sensor data may be data from sensors such as the gyroscope and the gravity accelerometer.
Step 302, the electronic device determines the device pose according to the sensor data, and determines the relative pose of the user face with respect to the electronic device according to the image information collected by the front-facing camera.
Specifically, when the electronic device is in a static state or a low-speed uniform linear motion state, the electronic device may calculate a device attitude angle of the electronic device by using data acquired by a gyroscope and a gravity accelerometer, where the device attitude angle refers to a device attitude angle of the electronic device in a geodetic coordinate system.
As shown in fig. 4, the electronic device may acquire a face image of the user through the front-facing camera, extract key feature points from the face image, and then determine the relative posture of the user's face with respect to the electronic device from those key feature points. In another possible embodiment, the electronic device may solve for the camera extrinsic parameters using the distance between the face and the electronic device, and then determine the relative posture of the user's face with respect to the electronic device from the camera extrinsic parameters and the key feature points in the face image. It should be noted that the distance between the face and the electronic device may be determined from infrared light emitted by an infrared sensor, or from ultrasonic waves emitted by an ultrasonic sensor; it may also be obtained from the focal length information associated with the face image. The embodiments of the present application do not limit this.
Step 303, the electronic device determines the head pose of the user according to the relative pose of the face of the user with respect to the electronic device and the device pose.
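The composition in step 303 can be sketched as chaining two rotation matrices; this is a minimal illustration, not the application's implementation, and the function names and axis conventions are assumptions (the detailed coordinate-system treatment appears later in this description):

```python
import numpy as np

def rot_x(angle_rad: float) -> np.ndarray:
    """Basic rotation about the x axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def head_pose_in_geodetic(r_face_to_device: np.ndarray,
                          r_device_to_geodetic: np.ndarray) -> np.ndarray:
    """Chain the two rotations: face frame -> device frame -> geodetic frame."""
    return r_device_to_geodetic @ r_face_to_device

# Example: device tilted 30 degrees about x, face looking straight at the screen.
r_device = rot_x(np.radians(30.0))
r_face = np.eye(3)  # face aligned with the device
r_head = head_pose_in_geodetic(r_face, r_device)
```

With the face aligned to the screen, the head pose in the geodetic frame is simply the device attitude, which matches the intuition behind step 303.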
It should be noted that before the electronic device acquires the sensor data and the image information collected by the front camera, the electronic device may first determine that it is currently being used by the user, according to at least one of the image captured by the front camera, system operation information, and user operation information. For example, if the system operation information of the mobile phone shows that the WeChat application is currently running and the mobile phone receives a voice operation input by the user, it can be determined that the mobile phone is currently being used. For another example, if the user's line-of-sight information is detected from the image captured by the front camera (for example, the eyes are detected to be gazing forward), it can be determined that the mobile phone is currently being used. In other possible implementations, the electronic device may determine the attitude angle of the face relative to the electronic device from the image information collected by the front camera, and, by combining this attitude angle with the gaze angle of the eyeballs in the image, determine that the user is watching the mobile phone screen, so that it can be concluded that the mobile phone is currently being used by the user.
In one possible implementation, when the front-facing camera does not detect the face image, the head pose of the user may also be determined according to the device pose.
After step 303 is executed, if the head posture is determined to be a bad posture, the electronic device outputs a posture adjustment prompt message.
It should be noted that, in a scenario where the child eye-protection mode of the electronic device is turned on, if the electronic device determines that the head posture is a bad posture, it further needs to determine whether the portrait captured by the front camera is a child's face; if so, the electronic device sends the posture adjustment prompt information to the user.
It should be noted that, in the embodiment of the present application, the specific manner of outputting the posture adjustment prompt information is not limited. For example, the prompt may be a voice output, a vibration output, an indicator-light output, or a specific sound output (such as a buzzer, a specific piece of music, a prolonged tone, etc.), or any combination thereof. The number of times the posture adjustment prompt information is output is also not limited; for example, it may be output once, or output periodically. When the output form is voice, this embodiment does not limit the specific content of the voice output, as long as it can remind the user to correct the posture. For example, the voice content may include the fatigue degree of the user's cervical vertebrae, or additionally include a suggested way to adjust the posture, and the like.
In one possible embodiment, to avoid excessive interference, the electronic device may show the user a pop-up box that includes a "remind me later" option and a "do not remind again" option, from which the user may select. If the user selects "remind me later" multiple times and the electronic device detects that the user remains in a bad posture, a blocking reminder (such as a full-screen reminder) is issued. In addition, the electronic device can record in the background the time the user spends using the handset, as well as the various head pose angles during use.
In one possible embodiment, the electronic device may also generate a health analysis report based on the head pose angles detected at various times; for example, the report may describe the relationship between head posture and human health. The electronic device may generate the report periodically according to a time set by the user, such as daily, weekly, or monthly. The report may include a head posture profile for the user-defined period, such as the proportion of time spent in normal posture, the proportion of time spent in poor posture, a health evaluation, possible health effects, and targeted recommendations.
For example, if it is detected that the head posture of the user is an adverse posture (the head-tilt or head-lowering angle exceeds a certain value), the electronic device may estimate the cervical vertebra pressure of the user in that posture, and when the pressure exceeds a set threshold, the electronic device shows the user a pop-up reminder, for example informing the user of the adverse posture and the corresponding correction angle. For another example, the electronic device determines, from the posture angle of the user's head, whether there is a bad posture such as head tilting or craning the neck forward, and how severely it affects health. Research shows that when the human body is upright, the weight borne by the cervical vertebrae is just the weight of the head, but when the head is inclined, the load on the cervical vertebrae changes. Since the weight of the head is constant, the extra load borne by the cervical vertebrae comes entirely from the tension generated by the muscles. When the head is lowered by 15 degrees, the cervical vertebrae bear 2 times the pressure of the vertical case; at 45 degrees, 3.7 times; and at 60 degrees, 4.5 times the vertical load. These results can therefore be used to set the preset angle intervals and the upper and lower limits of the cervical vertebra pressure for each interval.
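The mapping from head-lowering angle to cervical load described above can be tabulated and interpolated. The sketch below uses only the data points quoted in the text; the linear interpolation between them and the `ALERT_FACTOR` threshold for triggering the pop-up reminder are assumptions for illustration:

```python
import bisect

# Data points quoted in the text: (head-lowering angle in degrees,
# cervical load as a multiple of the upright load).
ANGLE_DEG = [0.0, 15.0, 45.0, 60.0]
LOAD_FACTOR = [1.0, 2.0, 3.7, 4.5]
ALERT_FACTOR = 3.0  # hypothetical threshold for the pop-up reminder

def cervical_load_factor(angle_deg: float) -> float:
    """Linearly interpolate the cervical load multiple for a head angle."""
    angle = max(ANGLE_DEG[0], min(angle_deg, ANGLE_DEG[-1]))
    i = bisect.bisect_right(ANGLE_DEG, angle) - 1
    if i >= len(ANGLE_DEG) - 1:
        return LOAD_FACTOR[-1]
    t = (angle - ANGLE_DEG[i]) / (ANGLE_DEG[i + 1] - ANGLE_DEG[i])
    return LOAD_FACTOR[i] + t * (LOAD_FACTOR[i + 1] - LOAD_FACTOR[i])

def should_remind(angle_deg: float) -> bool:
    """Trigger a reminder when the estimated load exceeds the threshold."""
    return cervical_load_factor(angle_deg) > ALERT_FACTOR
```

In practice the intervals and pressure limits would be set per the research results cited above rather than by simple interpolation.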
In the embodiment of the application, sensors such as the gyroscope and the gravity accelerometer are low-power always-on devices and have little impact on device power consumption. In addition, the handset can ask the user whether to agree to keep the front camera always on; when the user agrees, the front camera becomes an always-on device, so it can collect image information in real time and continuously acquire data, allowing the head posture to be determined in real time.
Therefore, the method provided by the embodiment of the application can accurately calculate the head posture and thus accurately judge whether the user is in a bad posture. It can cover a variety of usage scenarios, remind the user in real time, and generate a health analysis report, which helps to improve the user experience.
The gesture detection method provided by the embodiment of the present application will be described in detail below with reference to the accompanying drawings and application scenarios, taking an electronic device as a mobile phone as an example.
Before the device state is identified, the data collected by the built-in sensors of the mobile phone needs to be converted from the mobile phone coordinate system to a geodetic reference coordinate system. The reason is as follows: although the various sensors built into a smartphone, such as the acceleration sensor, gyroscope, magnetometer, and direction sensor, can sense different motions, directions, and external environments, their data are expressed in the coordinate system of the smartphone, so the collected data change when the position or orientation of the phone changes. In practice, users' habits vary: the phone may be held in the hand, or placed in a trouser pocket or a handbag, which directly affects the recognition of the device state. In other words, given the variety of user habits and the arbitrary placement of the phone, the data collected by the built-in sensors must be converted from the phone coordinate system to a uniform reference coordinate system (e.g., the geodetic coordinate system), so that the converted sensor data have a clearer physical meaning, which helps to accurately identify the device state of the electronic device.
As shown in FIG. 5A, one way of defining the geodetic coordinate system is as follows: the positive direction of the x axis is tangent to the ground at the current position of the mobile phone and points due east; the positive direction of the y axis is also tangent to the ground and points toward the magnetic north pole, so the plane spanned by the x and y axes is the horizontal plane; the positive direction of the z axis is perpendicular to the horizontal plane and points toward the sky.
As shown in fig. 5B, the mobile phone coordinate system is defined relative to the mobile phone screen. One way of defining it is as follows: the positive direction of the X axis points to the right from the center of the screen plane; the positive direction of the Y axis points upward from the center of the screen plane, perpendicular to the X axis; and the positive direction of the Z axis points outward from the screen plane, perpendicular to it. The negative directions are the opposites of these.
The embodiment of the application provides a conversion formula for converting a mobile phone coordinate system into a geodetic reference coordinate system, as shown in formula 1.
$$\begin{bmatrix} x \\ y \\ z \end{bmatrix} = R \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \qquad (1)$$

where X, Y, and Z are the sensor data in the mobile phone coordinate system, R is the rotation matrix, and x, y, and z are the sensor data in the geodetic coordinate system.
Wherein, R is formed by compounding three basic rotation matrixes, and R is shown as formula 2.
$$R = R_z(a)\,R_x(p)\,R_y(r) \qquad (2)$$

$$R_z(a) = \begin{bmatrix} \cos a & \sin a & 0 \\ -\sin a & \cos a & 0 \\ 0 & 0 & 1 \end{bmatrix},\quad R_x(p) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos p & \sin p \\ 0 & -\sin p & \cos p \end{bmatrix},\quad R_y(r) = \begin{bmatrix} \cos r & 0 & -\sin r \\ 0 & 1 & 0 \\ \sin r & 0 & \cos r \end{bmatrix}$$
The variables a, p, and r denote the azimuth, pitch, and roll angles, respectively: azimuth is the angle between magnetic north and the Y axis of the mobile phone coordinate system; pitch is the angle between the X axis of the mobile phone coordinate system and the horizontal plane; and roll is the angle between the Y axis of the mobile phone coordinate system and the horizontal plane.
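The composition of formula 2 can be sketched as follows. This is an illustration only: the exact axis order and sign conventions of the basic rotations are assumptions and should be matched to the sensor framework actually in use:

```python
import numpy as np

def rz(a):
    """Basic rotation about the z axis (azimuth a)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def rx(p):
    """Basic rotation about the x axis (pitch p)."""
    c, s = np.cos(p), np.sin(p)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

def ry(r):
    """Basic rotation about the y axis (roll r)."""
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def rotation_matrix(azimuth, pitch, roll):
    """Compound rotation matrix R of formula 2."""
    return rz(azimuth) @ rx(pitch) @ ry(roll)

# With all three angles zero, the phone frame coincides with the
# geodetic frame and R is the identity.
R0 = rotation_matrix(0.0, 0.0, 0.0)
R1 = rotation_matrix(0.2, 0.4, 0.1)
geodetic = R1 @ np.array([0.0, 0.0, 9.81])  # phone-frame sample -> geodetic frame
```

Because R is a product of rotations it is orthogonal, so the magnitude of a transformed sensor vector (e.g. gravity) is preserved.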
That is, based on this coordinate-system conversion, the mobile phone may determine its state in the geodetic coordinate system from the converted sensor data, for example a portrait state, a landscape state, or a portrait or landscape state with a certain inclination angle.
Specifically, the device attitude angle of the mobile phone in the geodetic coordinate system is determined from the converted data of the gyroscope sensor and the gravity sensor, and can be used to represent the device attitude of the mobile phone. The device attitude angle is the set of angles through which the three coordinate axes of the mobile phone coordinate system must be rotated to coincide with the geodetic coordinate system. As shown in fig. 5C, the mobile phone's gravity accelerometer collects gravitational acceleration data (the components along the three axes of the current mobile phone coordinate system). In fig. 5C, if the noise in the gravity accelerometer data is ignored, the rotation angle and rotation matrix of the current device state relative to the horizontally placed state of the mobile phone can be calculated directly from the components along the three axes.
Specifically, the device attitude calculation proceeds as follows. First, gravity acceleration data a are acquired from the accelerometer in real time; the oldest entry is removed from a queue of length N and the new acceleration value is stored in the queue. Second, the filtered acceleration value a' is calculated according to the smoothing-filter rule. Third, the processor of the mobile phone calculates the device attitude angles, namely the pitch angle and the roll (flip) angle, from the acceleration a'.
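The third step, recovering the attitude angles from the gravity components, can be sketched as below. The sign conventions and the exact angle formulas are assumptions for illustration; they follow the common accelerometer-tilt formulation rather than the application's specific implementation:

```python
import math

def attitude_from_gravity(gx: float, gy: float, gz: float):
    """Pitch and roll angles (in degrees) from the gravity components
    measured along the three axes of the phone coordinate system."""
    pitch = math.degrees(math.atan2(-gy, math.hypot(gx, gz)))
    roll = math.degrees(math.atan2(gx, gz))
    return pitch, roll

# Phone lying flat, screen up: gravity is reported along the Z axis only.
pitch_flat, roll_flat = attitude_from_gravity(0.0, 0.0, 9.81)
# Phone held upright in portrait: gravity lies along the Y axis.
pitch_up, _ = attitude_from_gravity(0.0, 9.81, 0.0)
```

Flat on a table both angles are zero; held upright the pitch magnitude reaches 90 degrees, which distinguishes the two placements.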
In practical applications, the output signal of the gravity accelerometer carries noise due to environmental influences. Relative to the accelerometer output, the noise is a high-frequency signal, so a sliding mean (moving average) filter can be used for smoothing. The moving average filter is a finite impulse response filter in the digital-filter family. Its principle is as follows: the N measured data are treated as a queue of fixed length N; each time a new sample arrives, it is placed at the tail of the queue and the datum at the head of the queue is removed, so the queue always holds the N latest data. Compared with the ordinary mean filter, which must read in N data each time, this greatly increases the processing speed. In the embodiment of the application, an improved sliding mean filter is adopted: the data with the largest and smallest modulus among the N data are removed, which improves the noise suppression. The improved filter is given by formula 3:
$$\bar{X} = \frac{\sum_{k=1}^{N} X_k - MAX - MIN}{N-2} \qquad (3)$$

where $\bar{X}$ is the filter output, $X_k$ is the k-th datum of the current sequence, MAX is the datum with the largest modulus among the N data, and MIN is the datum with the smallest modulus among the N data.
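The improved filter of formula 3 can be sketched directly. Scalar samples are used here for simplicity (the real filter runs per axis), and the class name and window length are illustrative:

```python
from collections import deque

class TrimmedSlidingMeanFilter:
    """Sliding-window mean that discards the samples with the largest
    and smallest absolute value before averaging (formula 3)."""

    def __init__(self, window: int):
        assert window >= 3, "need at least 3 samples to trim max and min"
        self.buf = deque(maxlen=window)

    def update(self, sample: float) -> float:
        self.buf.append(sample)  # newest at the tail, oldest dropped
        data = list(self.buf)
        if len(data) < 3:
            # Not enough samples to trim yet: plain mean.
            return sum(data) / len(data)
        trimmed = sorted(data, key=abs)[1:-1]  # drop largest/smallest modulus
        return sum(trimmed) / len(trimmed)

f = TrimmedSlidingMeanFilter(window=5)
outputs = [f.update(x) for x in [9.8, 9.9, 9.7, 12.0, 9.8]]
```

The outlier 12.0 is excluded from the last average, illustrating why trimming the extreme-modulus samples improves noise suppression over a plain moving average.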
In the embodiment of the application, the angle of the face when the user looks straight ahead is predefined as the 0-degree reference. As shown in fig. 6A, the pose angle of the face with respect to the mobile phone is the rotation angle of the face about the three coordinate axes of the camera coordinate system. In fig. 6A, the positive direction of the z axis is perpendicular to the face and points to the front of the user, the positive direction of the y axis is vertically upward, the positive direction of the x axis is horizontally to the user's left, and the x, y, and z axes form a right-handed system. Fig. 6B shows the 3 pose angles of the face with respect to the mobile phone, namely the pitch angle (Pitch), roll angle (Roll), and yaw angle (Yaw).
As shown in fig. 6C, the pose angle of the face with respect to the mobile phone may be calculated as follows: the front camera feeds the captured image to the face detection module of the mobile phone; the face detection module extracts the key points of the face region in the image; the coordinates of these key points are then matched and mapped to the point coordinates of a 3D head model; and the rotation angle of the face relative to the camera, together with the corresponding rotation matrix, is calculated by the non-iterative efficient perspective-n-point (EPnP) algorithm.
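A sketch of the last step of this pipeline: once a PnP solver (for example OpenCV's `cv2.solvePnP` with the `SOLVEPNP_EPNP` flag, whose usage is not shown here) returns a rotation matrix, it is converted into the pitch, yaw, and roll angles of fig. 6B. The decomposition below assumes a standard Tait-Bryan angle convention, which may differ from the one used in the application:

```python
import math
import numpy as np

def euler_from_rotation(R: np.ndarray):
    """Pitch (about x), yaw (about y), and roll (about z) in degrees,
    assuming R = Rz(roll) @ Ry(yaw) @ Rx(pitch)."""
    sy = math.hypot(R[0, 0], R[1, 0])
    if sy > 1e-6:
        pitch = math.atan2(R[2, 1], R[2, 2])
        yaw = math.atan2(-R[2, 0], sy)
        roll = math.atan2(R[1, 0], R[0, 0])
    else:  # gimbal lock: yaw near +/-90 degrees
        pitch = math.atan2(-R[1, 2], R[1, 1])
        yaw = math.atan2(-R[2, 0], sy)
        roll = 0.0
    return tuple(math.degrees(v) for v in (pitch, yaw, roll))

# Identity rotation: the face looks straight at the camera (the 0-degree
# reference defined above).
angles = euler_from_rotation(np.eye(3))
```

For a face looking straight at the camera all three angles are zero, matching the predefined 0-degree reference.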
Based on the device attitude angle and the pose angle of the face relative to the mobile phone, the electronic device can calculate the attitude angle of the face in the geodetic coordinate system relative to the upright human body, i.e., the head pose.
As shown in fig. 7, an inertial measurement unit (IMU) is fixed on the camera, and the target, i.e. the human face, can be regarded as a rigid body in motion. $O_Gx_Gy_Gz_G$ is the IMU coordinate system, $O_Wx_Wy_Wz_W$ is the world coordinate system of the target model, $O_Nx_Ny_Nz_N$ is the reference coordinate system, $O_Bx_By_Bz_B$ is the target (body) coordinate system, and $O_Vx_Vy_Vz_V$ is the camera coordinate system. At the initial moment the reference coordinate system coincides with the target coordinate system; once the target has rotated by some angle, the reference coordinate system separates from the target coordinate system. On the basis of the mobile phone rotation matrix, the following are calculated in turn: the attitude angle and rotation matrix of the face in the mobile phone coordinate system, the rotation matrix of the face coordinate system relative to the mobile phone coordinate system, the rotation matrix of the current attitude of the mobile phone relative to the horizontal state, and the rotation matrix of the horizontal state relative to the upright (vertical shooting) state of the mobile phone.

In the following, let $R_A^B$ denote the rotation matrix from coordinate system A to coordinate system B. The rotation from the world coordinate system $O_Wx_Wy_Wz_W$ to the camera coordinate system $O_Vx_Vy_Vz_V$ may be solved by the POSIT or PnP algorithm. Because the pose relationship between the world coordinate system $O_Wx_Wy_Wz_W$ and the target coordinate system $O_Bx_By_Bz_B$ is constant during the attitude measurement, the rotation matrix from the reference coordinate system $O_Nx_Ny_Nz_N$ to the target coordinate system at time $t_k$ can be expressed as

$$R_N^B(t_k) = R_W^B\,R_V^W(t_k)\,R_W^V(t_0)\,R_B^W$$

where $R_W^B$ (with inverse $R_B^W$) is the constant rotation matrix between the world coordinate system and the target coordinate system; $R_V^W(t_k)$ and $R_W^V(t_k)$ are the rotation matrices between the camera coordinate system and the world coordinate system at time $t_k$; and $R_W^V(t_0)$ and $R_V^W(t_0)$ are the corresponding rotation matrices at time $t_0$. From the correspondence between Euler angles and the rotation matrix, the roll, pitch, and yaw angles of $R_N^B(t_k)$ can then be obtained, i.e. the 3 Euler attitude angles of the rotation from the reference coordinate system to the target coordinate system at time $t_k$.

When the pose of the camera is not constant, so that the camera coordinate system changes relative to the reference coordinate system, a rotation matrix $R_N^V(t_k)$ from the reference coordinate system to the camera coordinate system at time $t_k$ must be added; this matrix can be obtained from the terminal attitude detection. The absolute attitude rotation matrix $\hat{R}_N^B(t_k)$ is then solved as

$$\hat{R}_N^B(t_k) = R_W^B\,R_V^W(t_k)\,R_N^V(t_k)$$

and from the correspondence between Euler angles and the rotation matrix, the corresponding roll (flip) angle and pitch angle of $\hat{R}_N^B(t_k)$ are obtained.
That is, from the rotation matrix of the face relative to the upright mobile phone, the attitude angle of the face relative to its orientation when the human body is upright, i.e., the head pose, is finally calculated. It can be seen that absolute face pose recognition is essentially a mapping between coordinate systems and a transformation between the rotation matrices of those coordinate systems.
In the embodiment of the application, the pitch angle and roll angle of the head can be obtained from the above calculation process, and from the values of these two angles it can be judged whether the user is currently lying flat, lying on one side, or in some other posture.
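The posture judgment above can be sketched as a simple threshold rule. The specific threshold values here are assumptions for illustration, not values taken from the application:

```python
def classify_posture(pitch_deg: float, roll_deg: float,
                     lying_thresh: float = 60.0,
                     side_thresh: float = 60.0) -> str:
    """Coarse posture label from the head pitch and roll angles.

    Thresholds are illustrative: a large pitch magnitude is read as
    lying (face up or down), a large roll magnitude as lying on one side.
    """
    if abs(pitch_deg) >= lying_thresh:
        return "lying"
    if abs(roll_deg) >= side_thresh:
        return "side-lying"
    return "other"
```

For example, a head pitched 75 degrees back would be labeled "lying", while small pitch and roll angles fall through to "other".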
According to the method, the bad using posture of the user can be accurately identified, so that the user can be reminded of using the electronic equipment healthily in time, and the using experience of the user is improved.
In other embodiments of the present application, an electronic device is disclosed. As shown in fig. 8, it may include: a touch screen 801, where the touch screen 801 includes a touch panel 807 and a display screen 808; one or more processors 802; a memory 803; one or more application programs (not shown); one or more computer programs 804; and sensors 805. The devices described above may be connected by one or more communication buses 806. The one or more computer programs 804 are stored in the memory 803 and configured to be executed by the one or more processors 802, and include instructions that may be used to perform the steps in the embodiment corresponding to fig. 3.
The embodiment of the present application further provides a computer storage medium storing computer instructions which, when run on an electronic device, cause the electronic device to execute the above relevant method steps to implement the human body posture detection method in the above embodiments.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps, so as to implement the human body posture detection method in the above embodiment.
In addition, embodiments of the present application also provide an apparatus, which may specifically be a chip, a component, or a module, and which may include a processor and a memory connected to each other. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the human body posture detection method in the above method embodiments.
In addition, the electronic device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical functional division, and in actual implementation there may be other divisions; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A human body posture detection method is applied to electronic equipment and is characterized by comprising the following steps:
acquiring sensor data and image information acquired by a front camera when the electronic equipment is in a bright screen state;
determining a device pose from the sensor data and a relative pose of a user face with respect to the electronic device from the image information;
and determining the head posture of the user according to the relative posture of the face of the user relative to the electronic equipment and the equipment posture.
2. The method of claim 1, prior to determining a device pose from the sensor data, further comprising:
acquiring system operation information and user operation information of the electronic equipment;
and determining, according to the image information and at least one of the system operation information and the user operation information, that the electronic equipment is currently in a state of being used by a user.
3. The method of claim 1 or 2, wherein determining the relative pose of the face of the user with respect to the electronic device from the image information comprises:
acquiring key characteristic points of the face part in the image information;
and matching and mapping the key characteristic points and the characteristic points in the preset human head model, and determining the relative posture of the human face relative to the electronic equipment according to the matching and mapping result.
4. The method of any one of claims 1 to 3, wherein determining the head pose of the user based on the relative pose of the face of the user with respect to the electronic device and the device pose comprises:
calculating a rotation matrix of the face relative to the electronic equipment in an upright state according to the relative posture of the face of the user relative to the electronic equipment and the equipment posture;
and calculating the attitude angle of the head of the user in the geodetic coordinate system according to the rotation matrix, wherein the attitude angle comprises a pitch angle, a flip angle and a yaw angle and is used for indicating the head attitude of the user.
5. The method of any of claims 1 to 4, after determining the head pose of the user, further comprising:
when the head posture meets a set condition, determining that the head posture is a bad posture;
and outputting the posture adjustment prompt information.
6. An electronic device comprising a display screen, a processor, and a memory;
the memory for storing one or more computer programs;
the memory stores one or more computer programs that, when executed by the processor, cause the electronic device to perform:
acquiring sensor data and image information acquired by a front camera when the electronic equipment is in a bright screen state;
determining a device pose from the sensor data and a relative pose of a user face with respect to the electronic device from the image information;
and determining the head posture of the user according to the relative posture of the face of the user relative to the electronic equipment and the equipment posture.
7. The electronic device of claim 6, wherein the one or more computer programs stored in the memory, when executed by the processor, cause the electronic device to further perform:
acquiring system operation information and user operation information of the electronic equipment;
and determining, according to the image information and at least one of the system operation information and the user operation information, that the electronic equipment is currently in a state of being used by a user.
8. The electronic device of claim 6 or 7, wherein the one or more computer programs stored in the memory, when executed by the processor, cause the electronic device to further perform:
acquiring key characteristic points of the face part in the image information;
and matching and mapping the key characteristic points and the characteristic points in the preset human head model, and determining the relative posture of the human face relative to the electronic equipment according to the matching and mapping result.
9. The electronic device of any of claims 6-8, wherein the one or more computer programs stored in the memory, when executed by the processor, cause the electronic device to further perform:
calculating a rotation matrix of the face relative to the electronic equipment in an upright state according to the relative posture of the face of the user relative to the electronic equipment and the equipment posture;
and calculating the attitude angle of the head of the user in the geodetic coordinate system according to the rotation matrix, wherein the attitude angle comprises a pitch angle, a flip angle and a yaw angle and is used for indicating the head attitude of the user.
10. The electronic device of any of claims 6-9, wherein the one or more computer programs stored in the memory, when executed by the processor, cause the electronic device to further perform:
when the head posture meets a set condition, determining that the head posture is a bad posture;
and outputting the posture adjustment prompt information.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the human gesture detection method of any one of claims 1 to 5.
12. A chip, wherein the chip is coupled with a memory for executing a computer program stored in the memory to perform the human body posture detection method of any one of claims 1 to 5.
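The method of claims 1 to 5 can be sketched end to end as follows. This is a hedged illustration, not the patented implementation: `rot_x` stands in for the real device pose (which would come from IMU sensor data) and the real face pose relative to the device (which would come from matching facial key feature points to a head model), and the composition order and the 45-degree limit are assumptions introduced here.

```python
import numpy as np

def rot_x(deg):
    """Rotation about the X axis; a stand-in for poses that would
    really come from sensor fusion or facial-landmark fitting."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])

def head_pose(R_device, R_face_rel_device):
    """Compose the device pose with the pose of the face relative to
    the device to obtain the head pose in the ground frame (claim 4)."""
    return R_device @ R_face_rel_device

def is_bad_posture(R_head, limit_deg=45.0):
    """Flag the pose when pitch or flip exceeds a limit (claim 5);
    the limit value is illustrative only."""
    pitch = np.degrees(np.arcsin(-R_head[2, 0]))
    flip = np.degrees(np.arctan2(R_head[2, 1], R_head[2, 2]))
    return abs(pitch) > limit_deg or abs(flip) > limit_deg

# Phone tilted 70 degrees while the face is tilted 10 degrees the
# other way relative to the screen: the net 60-degree head tilt
# exceeds the limit and would trigger a posture-adjustment prompt.
R = head_pose(rot_x(70.0), rot_x(-10.0))
print(is_bad_posture(R))  # True
```

Note that two rotations about the same axis compose additively, which is why the example above reduces to a 60-degree net head tilt.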
CN201910883512.XA 2019-09-18 2019-09-18 Human body posture detection method and electronic equipment Pending CN112527094A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910883512.XA CN112527094A (en) 2019-09-18 2019-09-18 Human body posture detection method and electronic equipment
PCT/CN2020/105299 WO2021052016A1 (en) 2019-09-18 2020-07-28 Body posture detection method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910883512.XA CN112527094A (en) 2019-09-18 2019-09-18 Human body posture detection method and electronic equipment

Publications (1)

Publication Number Publication Date
CN112527094A true CN112527094A (en) 2021-03-19

Family

ID=74883988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910883512.XA Pending CN112527094A (en) 2019-09-18 2019-09-18 Human body posture detection method and electronic equipment

Country Status (2)

Country Link
CN (1) CN112527094A (en)
WO (1) WO2021052016A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113221664A (en) * 2021-06-17 2021-08-06 杭州晓鸟科技有限公司 Key point detection-based sitting posture auxiliary system and auxiliary method
CN114469077A (en) * 2022-01-26 2022-05-13 北京国承万通信息科技有限公司 Health detection method, health detection system and wearable health detection equipment
CN115240385A (en) * 2022-07-12 2022-10-25 深圳闪回科技有限公司 Method and device for detecting placement position of mobile phone
WO2023207862A1 (en) * 2022-04-29 2023-11-02 华为技术有限公司 Method and apparatus for determining head posture
WO2024198082A1 (en) * 2023-03-31 2024-10-03 广东花至美容科技有限公司 Visual-based face area positioning method, apparatus, and wearable device

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112990055A (en) * 2021-03-29 2021-06-18 北京市商汤科技开发有限公司 Posture correction method and device, electronic equipment and storage medium
CN113190313B (en) * 2021-04-29 2024-09-03 北京小米移动软件有限公司 Display control method and device, electronic equipment and storage medium
CN113191319B (en) * 2021-05-21 2022-07-19 河南理工大学 Human body posture intelligent recognition method and computer equipment
CN115389927B (en) * 2021-05-24 2024-05-10 荣耀终端有限公司 Method and system for measuring and calculating motor damping
CN114071211B (en) * 2021-09-24 2024-01-09 北京字节跳动网络技术有限公司 Video playing method, device, equipment and storage medium
CN114038016A (en) * 2021-11-16 2022-02-11 平安普惠企业管理有限公司 Sitting posture detection method, device, equipment and storage medium
CN117942067A (en) * 2022-10-20 2024-04-30 华为技术有限公司 Body type measuring method and electronic equipment
CN117173382B (en) * 2023-10-27 2024-01-26 南京维赛客网络科技有限公司 Virtual digital human state correction method, system and storage medium in VR interaction

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103927250A (en) * 2014-04-16 2014-07-16 北京尚德智产投资管理有限公司 User posture detecting method achieved through terminal device
CN103955272A (en) * 2014-04-16 2014-07-30 北京尚德智产投资管理有限公司 Terminal equipment user posture detecting system
CN104239860A (en) * 2014-09-10 2014-12-24 广东小天才科技有限公司 Sitting posture detection and reminding method and device during use of intelligent terminal
CN110222651A (en) * 2019-06-10 2019-09-10 Oppo广东移动通信有限公司 A kind of human face posture detection method, device, terminal device and readable storage medium storing program for executing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9149210B2 (en) * 2010-01-08 2015-10-06 Medtronic, Inc. Automated calibration of posture state classification for a medical device


Also Published As

Publication number Publication date
WO2021052016A1 (en) 2021-03-25

Similar Documents

Publication Publication Date Title
CN112527094A (en) Human body posture detection method and electronic equipment
CN114816210B (en) Full screen display method and device of mobile terminal
CN110114747B (en) Notification processing method and electronic equipment
CN110874168A (en) Display method and electronic equipment
WO2020029306A1 (en) Image capture method and electronic device
EP4016274A1 (en) Touch control method and electronic device
CN112259191A (en) Method and electronic device for assisting fitness
CN113573390B (en) Antenna power adjusting method, terminal device and storage medium
CN112584037B (en) Method for saving image and electronic equipment
CN114095599B (en) Message display method and electronic equipment
CN111602108A (en) Application icon display method and terminal
CN110647731A (en) Display method and electronic equipment
WO2022267783A1 (en) Method for determining recommended scene, and electronic device
CN111902695B (en) Method and terminal for acquiring user motion trail
CN115150542B (en) Video anti-shake method and related equipment
CN114422686B (en) Parameter adjustment method and related device
CN116048831B (en) Target signal processing method and electronic equipment
CN111249728A (en) Image processing method and image processing device
WO2022017270A1 (en) Appearance analysis method, and electronic device
CN114637392A (en) Display method and electronic equipment
CN114740986A (en) Handwriting input display method and related equipment
CN114445522A (en) Brush effect graph generation method, image editing method, device and storage medium
CN114812381A (en) Electronic equipment positioning method and electronic equipment
CN114639114A (en) Vision detection method and electronic equipment
CN112149483A (en) Blowing detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210319)