CN111930228A - Method, device, equipment and storage medium for detecting user gesture - Google Patents


Info

Publication number
CN111930228A
Authority
CN
China
Prior art keywords
user
preset
gesture
face
acceleration value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010675662.4A
Other languages
Chinese (zh)
Inventor
李腾飞
李梓淳
方迟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010675662.4A
Publication of CN111930228A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a method, apparatus, device, and storage medium for detecting a user's posture. A method of detecting a user's posture comprises: acquiring an acceleration value of the terminal; determining whether the acceleration value meets a preset condition; acquiring an image containing the user's face to determine the direction of the user's face; and, if the acceleration value meets the preset condition, determining the included angle between the direction of the user's face and the direction of the terminal so as to determine the user's posture. The method and device can accurately determine the posture of a user using the terminal, so that an accurate prompt can be given to the user, improving the user experience.

Description

Method, device, equipment and storage medium for detecting user gesture
Technical Field
The present disclosure relates to the field of computer software technologies, and in particular to a method, an apparatus, a device, and a storage medium for detecting a user's posture.
Background
At present, mobile terminals of all kinds are used by a very broad population, and people often use them while lying down or lying on their side. Using a mobile terminal in such an incorrect posture for a long time can seriously harm eye health. This is especially true for teenagers, who own more and more electronic devices but often lack supervision and guidance; they readily use mobile phones and other mobile electronic terminals while lying down or on their side, which easily leads to health problems such as myopia and strabismus.
A conventional mobile phone cannot determine the posture in which it is being used and therefore has no way to prompt the user.
Disclosure of Invention
The present disclosure has been made to solve the above problems, and an object thereof is to provide a method, an apparatus, a device, and a storage medium for accurately detecting a user's posture. This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In order to solve the above technical problem, an embodiment of the present disclosure provides a method for detecting a user's posture, which adopts the following technical solution:
acquiring an acceleration value of the terminal;
determining whether the acceleration value meets a preset condition;
acquiring an image containing the user's face to determine the direction of the user's face;
and, if the acceleration value meets the preset condition, determining the included angle between the direction of the user's face and the direction of the terminal so as to determine the user's posture.
In order to solve the above technical problem, an embodiment of the present disclosure further provides a device for detecting a user's posture, which adopts the following technical solution and comprises:
an acceleration value acquisition module, used to acquire the acceleration value of the terminal;
a condition judgment module, used to determine whether the acceleration value meets a preset condition;
an image acquisition module, used to acquire an image containing the user's face so as to determine the direction of the user's face;
a posture determination module, used to determine, if the acceleration value meets the preset condition, the included angle between the direction of the user's face and the direction of the terminal so as to determine the user's posture;
wherein the acceleration value comprises at least one of three acceleration values: the horizontal transverse direction, the horizontal longitudinal direction, and the vertical direction;
the preset conditions comprise a first preset condition, namely that at least one of the three acceleration values in the horizontal transverse, horizontal longitudinal, and vertical directions falls within a preset acceleration value range;
and the preset conditions comprise a second preset condition, namely that the first preset condition remains satisfied for a preset duration.
In order to solve the above technical problem, an embodiment of the present disclosure further provides a computer device, which adopts the following technical solutions:
comprising a memory having a computer program stored therein and a processor implementing the method as described above when executing the computer program.
In order to solve the above technical problem, an embodiment of the present disclosure further provides a computer-readable storage medium, which adopts the following technical solutions:
the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the method as described above.
Compared with the prior art, the technical scheme of the present disclosure can accurately determine the posture of a user using the terminal, so that an accurate prompt can be given to the user and the user experience is improved.
Drawings
FIG. 1 is an exemplary system architecture diagram in which the present disclosure may be applied;
FIG. 2 is a flow diagram of one embodiment of a method of detecting a user's posture according to the present disclosure;
FIG. 3 is a schematic diagram of one embodiment of an apparatus for detecting a user's posture according to the present disclosure;
FIG. 4 is a schematic block diagram of one embodiment of a computer device according to the present disclosure.
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure; the terms "including" and "having," and any variations thereof, in the description and claims of this disclosure and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of the present disclosure or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to make the technical solutions of the present disclosure better understood by those skilled in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
[ System Structure ]
First, the structure of the system of one embodiment of the present disclosure is explained. As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, 104, a network 105, and a server 106. The network 105 serves as a medium for providing communication links between the terminal devices 101, 102, 103, 104 and the server 106.
In the present embodiment, an electronic device (e.g., the terminal device 101, 102, 103, or 104 shown in fig. 1) on which the method of detecting a user gesture is executed may perform transmission of various information through the network 105. Network 105 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. It should be noted that the wireless connection means may include, but is not limited to, a 3G/4G/5G connection, a Wi-Fi connection, a bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB connection, and other now known or later developed wireless connection means.
A user may use terminal devices 101, 102, 103, 104 to interact with a server 106 via a network 105 to receive or send messages or the like. Various client applications, such as a video live and play application, a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal device 101, 102, 103, or 104.
The terminal device 101, 102, 103, or 104 may be any of various electronic devices having a touch-screen display and/or supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 (MPEG-1 Audio Layer III) players, MP4 (MPEG-4 Part 14) players, head-mounted display devices, laptop computers, desktop computers, and the like.
The server 106 may be a server that provides various services, such as a background server that provides support for pages displayed on the terminal devices 101, 102, 103, or 104.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Here, the terminal device may implement the method of the embodiments of the present disclosure independently, or by running an application on the Android system in cooperation with other electronic terminal devices; it may also run an application on another operating system, such as iOS, Windows, or HarmonyOS (Hongmeng), to implement the method of the embodiments of the present disclosure.
[ method of detecting user posture ]
Referring to FIG. 2, a flow diagram of one embodiment of a method of detecting a user's posture in accordance with the present disclosure is shown. The method comprises the following steps:
s21, acquiring an acceleration value of the terminal;
In one or more embodiments, the acceleration value may be, for example, at least one of three acceleration values: the horizontal transverse direction x, the horizontal longitudinal direction y, and the vertical direction z;
Here, the changes in all three acceleration values x, y, and z of the acceleration sensor are used by way of example, but a change in a single acceleration value of the sensor, or a change across an interval of acceleration values, may of course be used instead.
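By way of illustration only, the three axis readings of this step can be wrapped in a small record; the stub driver and its `poll()` call below are assumptions made for the sketch, not an interface from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Acceleration:
    """One accelerometer sample along the terminal's three axes."""
    x: float  # horizontal transverse direction
    y: float  # horizontal longitudinal direction
    z: float  # vertical direction

def read_acceleration(sensor) -> Acceleration:
    """Read one sample from a hypothetical accelerometer driver whose
    poll() method returns an (x, y, z) tuple."""
    return Acceleration(*sensor.poll())
```

On Android, for example, the equivalent values would arrive through the platform's accelerometer sensor events rather than a `poll()` call.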
S22, determining whether the acceleration value meets the preset condition;
In one or more embodiments, the preset conditions include a first preset condition and a second preset condition: if the acceleration value meets the first preset condition, it is then determined whether the second preset condition is met. The first preset condition may be, for example, that at least one of the three acceleration values x, y, and z falls within a preset acceleration value range.
In one or more embodiments, the preset acceleration value range may, for example, be set to the ranges in the following table for a side-lying posture, but may be set to other ranges according to other postures of the user.

Acceleration | Normal range | Normal typical value | Side-lying range | Side-lying typical value
x            | -3 ~ +3      | 0                    | -10 ~ +10        | ±9.5
y            | -5 ~ +9      | +6                   | -2 ~ +2          | -1 ~ +1
z            | 0 ~ +10      | +6                   | -2 ~ +2          | 0
In one or more embodiments, the second preset condition is that the first preset condition is met continuously for a preset duration, for example at least 3 seconds. For instance, if the acceleration sensor reports an x acceleration of +9.5, a y acceleration of 0, and a z acceleration of 0, and this state lasts for 4 seconds, the second preset condition is met. Of course, the second preset condition may also be based on the change of a single acceleration value, for example any combination of the x acceleration reaching +10 for 3 seconds, the y acceleration reaching -2 for 5 seconds, or the z acceleration reaching -2 for 3 seconds; or it may be based on the change across an interval of acceleration values, for example the x acceleration changing from -3 to +10 with the state lasting more than 3 seconds.
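The two preset conditions described above can be sketched as follows; the side-lying ranges are taken from the table in this section, while the helper names and the 3-second hold are illustrative assumptions:

```python
# Side-lying ranges from the table above, used as the first preset
# condition; other postures would use other ranges.
SIDE_LYING_RANGES = {"x": (-10.0, 10.0), "y": (-2.0, 2.0), "z": (-2.0, 2.0)}

def meets_first_condition(x, y, z, ranges=SIDE_LYING_RANGES):
    """First preset condition: each acceleration value falls in its range."""
    return all(lo <= v <= hi
               for v, (lo, hi) in zip((x, y, z),
                                      (ranges["x"], ranges["y"], ranges["z"])))

class DurationGate:
    """Second preset condition: the first condition holds continuously
    for at least `hold` seconds (3 seconds in the example above)."""

    def __init__(self, hold=3.0):
        self.hold = hold
        self.since = None  # timestamp at which the condition began to hold

    def update(self, first_condition_met: bool, now: float) -> bool:
        if not first_condition_met:
            self.since = None  # the condition must hold *continuously*
            return False
        if self.since is None:
            self.since = now
        return now - self.since >= self.hold
```

Fed with the example stream (x = +9.5, y = 0, z = 0 held for 4 seconds), the gate opens once 3 seconds have elapsed.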
S23, acquiring an image containing the face of the user to acquire the face direction of the user;
In one or more embodiments, acquiring an image containing the user's face to determine the direction of the user's face includes capturing the image with a camera of the terminal, for example the front-facing camera.
In one or more embodiments, the direction of the user's face is determined, for example, by identifying the relative positions of the user's eyes and nose or mouth; the direction of the face may of course also be determined from the relative positions of other parts of the user.
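As a hedged sketch of this idea, the face's "up" direction can be estimated from two eye landmarks and a mouth landmark in image coordinates; the landmark detector itself is assumed to exist, and the function below only does the geometry:

```python
import math

def face_up_direction(left_eye, right_eye, mouth):
    """Unit vector pointing from the mouth toward the midpoint of the eyes,
    i.e. 'up' along the face, in image (x right, y down) coordinates.
    Landmarks are (x, y) tuples from some external face detector."""
    eye_mid_x = (left_eye[0] + right_eye[0]) / 2.0
    eye_mid_y = (left_eye[1] + right_eye[1]) / 2.0
    dx = eye_mid_x - mouth[0]
    dy = eye_mid_y - mouth[1]
    norm = math.hypot(dx, dy)
    return (dx / norm, dy / norm)
```

For an upright face the vector points toward negative image y; for a face lying on its side it points along the image x axis.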
S24, if the acceleration value meets the preset condition, determining the included angle between the direction of the user's face and the direction of the terminal to determine the user's posture;
In one or more embodiments, the included angle between the direction of the user's face and the direction of the terminal ranges from 0° to 180°: an angle of 0° or 180° indicates a parallel relationship, and an angle of 90° a perpendicular relationship. Here the parallel relationship means, for example, that the long side of the terminal is parallel to the up-down direction of the user's head, and the perpendicular relationship means that the long side of the terminal is perpendicular to the up-down direction of the user's head.
In one or more embodiments, determining the directional relationship between the user and the terminal includes capturing the user's direction through a camera of the terminal, which may be either a front camera or a rear camera. For example, the user's head is photographed with the front camera and its up-down direction is determined to be vertical; if the long side of the terminal is also currently vertical, the relationship between the terminal and the user is parallel.
When, for example, the up-down direction of the user's head is determined to be parallel to the long-side direction of the terminal, the user's current posture is judged to be, for example, lying on the side; when the up-down direction of the head is determined to be perpendicular to the long-side direction of the terminal, the user's posture is judged to be normal, with the mobile phone held in landscape orientation.
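The angle test of step S24 can be sketched as plain 2D vector geometry; the 20° tolerance and the label strings below are illustrative choices, not values from the disclosure:

```python
import math

def angle_between(face_up, terminal_long_side):
    """Included angle in degrees (0-180) between the face's up direction
    and the terminal's long-side direction, both given as 2D vectors."""
    dot = face_up[0] * terminal_long_side[0] + face_up[1] * terminal_long_side[1]
    norms = math.hypot(*face_up) * math.hypot(*terminal_long_side)
    cos = max(-1.0, min(1.0, dot / norms))  # clamp against rounding error
    return math.degrees(math.acos(cos))

def classify_posture(angle_deg, tol=20.0):
    """Near-parallel (about 0 or 180 degrees) suggests side-lying;
    near-perpendicular (about 90 degrees) suggests a normal landscape posture."""
    if angle_deg <= tol or angle_deg >= 180.0 - tol:
        return "side-lying"
    if abs(angle_deg - 90.0) <= tol:
        return "normal (landscape)"
    return "indeterminate"
```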
In one or more embodiments, the method further includes issuing a prompt if the user's posture is a preset posture.
For example, when the user's posture is determined to be the preset side-lying posture, a prompt is issued, such as a pop-up prompt, a voice prompt, a black-screen prompt, a vibration prompt, or another prompt mode.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, they may be performed in other orders. Moreover, at least some of the steps in the flowchart may comprise multiple sub-steps or stages, which need not be performed at the same moment but may be performed at different times, and whose order of execution is not necessarily sequential: they may be performed in turn or in alternation with other steps, or with at least part of the sub-steps or stages of other steps.
[ example method ]
The method of one embodiment of the present disclosure is described below, comprising the steps of:
step 1, acquiring the change conditions of the x acceleration value, the y acceleration value and the z acceleration value of an acceleration sensor of a terminal.
Step 2, determining whether the acceleration values fall within the preset acceleration value ranges, for example x in -3 ~ +3, y in -5 ~ +9, and z in 0 ~ +10 for a normal upright posture; the specific numerical ranges and preset acceleration value ranges can be preset according to the user's posture.
Step 3, if the acceleration values fall within the ranges, determining whether this state lasts for at least 3 seconds; for example, the acceleration sensor reports an x acceleration of +9.5, a y acceleration of 0, and a z acceleration of 0, and this state lasts for 4 seconds.
Step 4, when the acceleration values have persisted for at least 3 seconds, acquiring an image containing the user's face to determine the direction of the user's face; for example, an image containing the user's face is captured by the front camera.
Step 5, determining the included angle between the direction of the user's face and the direction of the terminal to determine the user's posture.
The included angle between the direction of the user's face and the direction of the terminal ranges from 0° to 180°: the two directions are parallel when the angle is 0° or 180° and perpendicular when the angle is 90°. When the included angle between the up-down direction of the user's head and the long-side direction of the terminal is determined to be 180° (parallel), the user's current posture is judged to be, for example, lying on the side; when the angle is determined to be 90° (perpendicular), the user's posture is judged to be normal, with the mobile phone held sideways in landscape orientation.
Step 6, determining whether the user's posture is a preset posture, and issuing a prompt if it is.
For example, when the user's posture is determined to be the preset side-lying posture, a prompt is issued, such as a pop-up prompt, a voice prompt, a black-screen prompt, a vibration prompt, or another prompt mode.
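Putting steps 1-6 together, a minimal end-to-end sketch might look like the following; the sensor samples and the included angle are passed in directly, and every threshold here is an assumption for illustration:

```python
SIDE_RANGES = {"x": (-10.0, 10.0), "y": (-2.0, 2.0), "z": (-2.0, 2.0)}

def in_side_ranges(x, y, z, r=SIDE_RANGES):
    """Step 2: check each axis against the preset (side-lying) range."""
    return all(lo <= v <= hi
               for v, (lo, hi) in zip((x, y, z), (r["x"], r["y"], r["z"])))

def detect_posture(samples, hold=3.0, face_angle_deg=None):
    """samples: iterable of (t, x, y, z) accelerometer readings.
    face_angle_deg: included angle between the face direction and the
    terminal direction, which in practice would come from the front-camera
    image (steps 4-5). Returns a prompt decision (step 6)."""
    since = None
    for t, x, y, z in samples:          # steps 1-3: range + duration check
        if in_side_ranges(x, y, z):
            if since is None:
                since = t
            if t - since >= hold:
                break
        else:
            since = None
    else:
        return "no prompt"              # acceleration condition never held
    if face_angle_deg is None:
        return "no prompt"
    if face_angle_deg <= 20.0 or face_angle_deg >= 160.0:
        return "prompt: side-lying"     # step 6: e.g. pop-up or vibration
    return "no prompt"
```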
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read Only Memory (ROM), or a Random Access Memory (RAM).
[ means for detecting user posture ]
As shown in fig. 3, in order to implement the technical solution in the embodiment of the present disclosure, the present disclosure provides an apparatus, which may be specifically applied to various electronic terminal devices.
The apparatus for detecting a user's posture according to this embodiment includes: an acceleration value acquisition module 301, a condition judgment module 302, an image acquisition module 303, a posture determination module 304, and a prompt module 305.
An acceleration value obtaining module 301, configured to obtain an acceleration value of the terminal;
In one or more embodiments, the acceleration value may be, for example, at least one of three acceleration values: the horizontal transverse direction x, the horizontal longitudinal direction y, and the vertical direction z;
Here, the changes in all three acceleration values x, y, and z of the acceleration sensor are used by way of example, but a change in a single acceleration value of the sensor, or a change across an interval of acceleration values, may of course be used instead.
A condition judgment module 302, configured to judge whether the acceleration value meets a preset condition;
In one or more embodiments, the preset conditions include a first preset condition and a second preset condition: if the acceleration value meets the first preset condition, it is then determined whether the second preset condition is met. The first preset condition may be, for example, that at least one of the three acceleration values x, y, and z falls within a preset acceleration value range.
In one or more embodiments, the preset acceleration value range may, for example, be set to the ranges in the following table for a side-lying posture, but may be set to other ranges according to other postures of the user.

Acceleration | Normal range | Normal typical value | Side-lying range | Side-lying typical value
x            | -3 ~ +3      | 0                    | -10 ~ +10        | ±9.5
y            | -5 ~ +9      | +6                   | -2 ~ +2          | -1 ~ +1
z            | 0 ~ +10      | +6                   | -2 ~ +2          | 0
In one or more embodiments, the second preset condition is that the first preset condition is met continuously for a preset duration, for example at least 3 seconds. For instance, if the acceleration sensor reports an x acceleration of +9.5, a y acceleration of 0, and a z acceleration of 0, and this state lasts for 4 seconds, the second preset condition is met. Of course, the second preset condition may also be based on the change of a single acceleration value, for example any combination of the x acceleration reaching +10 for 3 seconds, the y acceleration reaching -2 for 5 seconds, or the z acceleration reaching -2 for 3 seconds; or it may be based on the change across an interval of acceleration values, for example the x acceleration changing from -3 to +10 with the state lasting more than 3 seconds.
An image acquisition module 303, configured to acquire an image containing the user's face to determine the direction of the user's face;
In one or more embodiments, acquiring an image containing the user's face to determine the direction of the user's face includes capturing the image with a camera of the terminal, for example the front-facing camera.
In one or more embodiments, the direction of the user's face is determined, for example, by identifying the relative positions of the user's eyes and nose or mouth; the direction of the face may of course also be determined from the relative positions of other parts of the user.
A posture determination module 304, configured to determine, if the acceleration value meets the preset condition, the included angle between the direction of the user's face and the direction of the terminal to determine the user's posture.
In one or more embodiments, the included angle between the direction of the user's face and the direction of the terminal ranges from 0° to 180°: an angle of 0° or 180° indicates a parallel relationship, and an angle of 90° a perpendicular relationship. Here the parallel relationship means, for example, that the long side of the terminal is parallel to the up-down direction of the user's head, and the perpendicular relationship means that the long side of the terminal is perpendicular to the up-down direction of the user's head.
In one or more embodiments, determining the directional relationship between the user and the terminal includes capturing the user's direction through a camera of the terminal, which may be either a front camera or a rear camera. For example, the user's head is photographed with the front camera and its up-down direction is determined to be vertical; if the long side of the terminal is also currently vertical, the relationship between the terminal and the user is parallel.
When, for example, the up-down direction of the user's head is determined to be parallel to the long-side direction of the terminal, the user's current posture is judged to be, for example, lying on the side; when the up-down direction of the head is determined to be perpendicular to the long-side direction of the terminal, the user's posture is judged to be normal, with the mobile phone held in landscape orientation.
In one or more embodiments, the apparatus further includes the prompt module 305, which issues a prompt if the user's posture is a preset posture.
For example, when the user's posture is determined to be the preset side-lying posture, a prompt is issued, such as a pop-up prompt, a voice prompt, a black-screen prompt, a vibration prompt, or another prompt mode.
It should be understood that although each block in the block diagrams of the figures may represent a module, part of which comprises one or more executable instructions for implementing specified logical functions, the blocks are not necessarily executed sequentially. The modules and functional units in the device embodiments of the present disclosure may be integrated into one processing module, or each unit may exist alone physically, or two or more modules or functional units may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
[ device for detecting user posture ]
In order to solve the technical problem, an embodiment of the present disclosure further provides an electronic device. Referring now to fig. 4, a schematic diagram of an electronic device (e.g., a terminal device or a server in fig. 1) 400 suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, the electronic device 400 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 401 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. The RAM 403 also stores various programs and data necessary for the operation of the electronic device 400. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; output devices 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, etc.; storage devices 408 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 409. The communication device 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 409, or from the storage device 408, or from the ROM 402. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 401.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: an electrical wire, an optical cable, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (hypertext transfer protocol), and may interconnect with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an acceleration value of the terminal; judge whether the acceleration value meets a preset condition; acquire an image containing the face of the user to acquire the face direction of the user; and, if the acceleration value meets the preset condition, judge an included angle between the face direction of the user and the direction of the terminal to determine the posture of the user.
Computer program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself; for example, the acceleration value acquisition module may also be described as a "module for acquiring an acceleration value of the terminal".
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, the method comprising:
acquiring an acceleration value of a terminal;
judging whether the acceleration value meets a preset condition;
acquiring an image containing the face of the user to acquire the face direction of the user; and
if the acceleration value meets the preset condition, judging an included angle between the face direction of the user and the direction of the terminal to determine the posture of the user.
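The four steps above are recited at the claim level only; the following is a rough, non-authoritative sketch of the flow in Python, in which the acceleration band and the angle-to-posture mapping are invented placeholders rather than values from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    ax: float  # horizontal transverse acceleration
    ay: float  # horizontal longitudinal acceleration
    az: float  # vertical acceleration

def meets_preset_condition(r: Reading, lo: float = -2.0, hi: float = 2.0) -> bool:
    # First preset condition: at least one axis value falls in a preset range.
    # The [-2, 2] band is an illustrative assumption, not from the disclosure.
    return any(lo <= v <= hi for v in (r.ax, r.ay, r.az))

def determine_posture(r: Reading, face_terminal_angle_deg: float) -> str:
    # If the acceleration meets the preset condition, judge the included angle
    # between the user's face direction and the terminal's direction.
    if not meets_preset_condition(r):
        return "unknown"
    if face_terminal_angle_deg < 30.0:
        return "facing-screen"   # hypothetical band: roughly upright use
    if face_terminal_angle_deg < 120.0:
        return "side-lying"      # hypothetical band: lying on one side
    return "other"
```

The angle bands and posture labels would in practice be tuned per device and per use case.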
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, characterized in that,
the preset conditions comprise a first preset condition and a second preset condition, and if the acceleration value meets the first preset condition, it is further judged whether the second preset condition is met.
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, characterized in that,
the acceleration value comprises at least one of three acceleration values in a horizontal transverse direction, a horizontal longitudinal direction, and a vertical direction;
the first preset condition comprises that at least one of the three acceleration values in the horizontal transverse direction, the horizontal longitudinal direction, and the vertical direction falls within a preset acceleration value range.
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, characterized in that,
the second preset condition is that the first preset condition is continuously met for a preset duration.
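One plausible reading of the second preset condition is a timer over successive sensor samples: the first condition must hold without interruption for the preset duration. A sketch under that assumption (the 3-second default echoes the figure given later in the embodiments):

```python
import time
from typing import Optional

class DurationGate:
    """Reports True only once a condition has held continuously for at
    least `duration_s` seconds; any break in the condition restarts the clock."""

    def __init__(self, duration_s: float = 3.0):
        self.duration_s = duration_s
        self._since: Optional[float] = None  # when the condition became true

    def update(self, condition_met: bool, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        if not condition_met:
            self._since = None   # condition broke: restart the clock
            return False
        if self._since is None:
            self._since = now    # condition just became true
        return (now - self._since) >= self.duration_s
```

In a real terminal the gate would be fed once per accelerometer sample, with `now` left as the default monotonic clock.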
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, characterized in that,
the included angle between the face direction of the user and the direction of the terminal ranges from 0 to 180 degrees.
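The included angle between two direction vectors always lies in the 0 to 180 degree range. Assuming the face direction and the terminal direction are available as 3-D vectors from upstream estimators (an assumption; the disclosure does not specify the representation), the angle can be computed as:

```python
import math

def included_angle_deg(u, v):
    """Included angle between 3-D vectors u and v, in degrees (0..180)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp against floating-point drift just outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cos_t))
```

Because `acos` returns values in [0, pi], the result is inherently bounded to 0..180 degrees, matching the range stated above.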
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, characterized in that,
acquiring the image containing the face of the user to acquire the face direction of the user comprises acquiring the image containing the face of the user by an image pickup device of the terminal.
In accordance with one or more embodiments of the present disclosure, there is provided a method of detecting a user gesture, further comprising,
judging whether the posture of the user is a preset posture, and prompting if the posture of the user is the preset posture.
According to one or more embodiments of the present disclosure, there is provided an apparatus for detecting a user gesture, comprising:
the acceleration value acquisition module is used for acquiring the acceleration value of the terminal;
the condition judgment module is used for judging whether the acceleration value meets a preset condition;
the image acquisition module is used for acquiring an image containing the face of the user so as to acquire the face direction of the user;
the gesture determining module is used for judging an included angle between the face direction of the user and the direction of the terminal to determine the gesture of the user if the acceleration value meets the preset condition;
the acceleration value comprises at least one of three acceleration values in a horizontal transverse direction, a horizontal longitudinal direction, and a vertical direction;
the preset conditions comprise a first preset condition, the first preset condition being that at least one of the three acceleration values in the horizontal transverse direction, the horizontal longitudinal direction, and the vertical direction falls within a preset acceleration value range;
the preset conditions further comprise a second preset condition, the second preset condition being that the first preset condition is continuously met for at least 3 seconds.
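The four modules of the apparatus map naturally onto small cooperating components. The sketch below is structural only: every sensor, camera, and classifier interface is injected as a stub, and all of them are assumptions rather than the disclosed implementation:

```python
class GestureDetector:
    """Wires the four modules together: acceleration acquisition, condition
    judgment, face-direction acquisition, and gesture determination.
    All collaborators are injected callables (hypothetical interfaces)."""

    def __init__(self, accel_source, condition, face_angle_source, classify):
        self.accel_source = accel_source        # () -> (ax, ay, az)
        self.condition = condition              # (accel) -> bool
        self.face_angle_source = face_angle_source  # () -> included angle in degrees
        self.classify = classify                # (angle) -> gesture label

    def detect(self):
        accel = self.accel_source()
        if not self.condition(accel):
            return None  # preset condition not met: no gesture determination
        angle = self.face_angle_source()
        return self.classify(angle)
```

The dependency injection mirrors the module split in the apparatus: each constructor argument corresponds to one of the four modules recited above.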
In accordance with one or more embodiments of the present disclosure, there is provided an apparatus for detecting a user gesture, characterized in that,
the included angle between the face direction of the user and the direction of the terminal is 0-180 degrees;
acquiring the image containing the face of the user to acquire the face direction of the user comprises acquiring the image containing the face of the user by an image pickup device of the terminal.
In accordance with one or more embodiments of the present disclosure, there is provided an apparatus for detecting a user gesture, further comprising,
the prompting module is used for judging whether the gesture of the user is a preset gesture, and prompting the user if the gesture of the user is the preset gesture.
According to one or more embodiments of the present disclosure, there is provided a computer device comprising a memory in which a computer program is stored and a processor which implements the method according to any one of the above when executing the computer program.
According to one or more embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of the above.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to technical solutions formed by the particular combinations of the features described above, and also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by interchanging the above features with (but not limited to) features having similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (12)

1. A method of detecting a user gesture, comprising:
acquiring an acceleration value of a terminal;
judging whether the acceleration value meets a preset condition;
acquiring an image containing the face of the user to acquire the face direction of the user;
and if the acceleration value meets the preset condition, judging an included angle between the face direction of the user and the direction of the terminal so as to determine the posture of the user.
2. The method of detecting a user gesture as recited in claim 1,
the preset conditions comprise a first preset condition and a second preset condition, and if the acceleration value meets the first preset condition, it is further judged whether the second preset condition is met.
3. The method of detecting a user gesture as recited in claim 2,
the acceleration value comprises at least one of three acceleration values in a horizontal transverse direction, a horizontal longitudinal direction, and a vertical direction;
the first preset condition comprises that at least one of the three acceleration values in the horizontal transverse direction, the horizontal longitudinal direction, and the vertical direction falls within a preset acceleration value range.
4. The method of detecting a user gesture as recited in claim 2,
the second preset condition is that the first preset condition is continuously met for a preset duration.
5. The method of detecting a user gesture as recited in claim 1,
the included angle between the face direction of the user and the direction of the terminal ranges from 0 to 180 degrees.
6. The method of detecting a user gesture as recited in claim 1,
acquiring the image containing the face of the user to acquire the face direction of the user comprises acquiring the image containing the face of the user by an image pickup device of the terminal.
7. The method of detecting a user gesture as recited in claim 1, further comprising,
judging whether the posture of the user is a preset posture, and prompting if the posture of the user is the preset posture.
8. An apparatus for detecting a user gesture, comprising:
the acceleration value acquisition module is used for acquiring the acceleration value of the terminal;
the condition judgment module is used for judging whether the acceleration value meets a preset condition;
the image acquisition module is used for acquiring an image containing the face of the user so as to acquire the face direction of the user;
the gesture determining module is used for judging an included angle between the face direction of the user and the direction of the terminal to determine the gesture of the user if the acceleration value meets the preset condition;
the acceleration value comprises at least one of three acceleration values in a horizontal transverse direction, a horizontal longitudinal direction, and a vertical direction;
the preset conditions comprise a first preset condition, the first preset condition being that at least one of the three acceleration values in the horizontal transverse direction, the horizontal longitudinal direction, and the vertical direction falls within a preset acceleration value range;
the preset conditions further comprise a second preset condition, the second preset condition being that the first preset condition is continuously met for a preset duration.
9. The apparatus for detecting a user gesture according to claim 8,
the included angle between the face direction of the user and the direction of the terminal is 0-180 degrees;
acquiring the image containing the face of the user to acquire the face direction of the user comprises acquiring the image containing the face of the user by an image pickup device of the terminal.
10. The apparatus for detecting a user gesture according to claim 8, further comprising,
the prompting module is used for judging whether the gesture of the user is a preset gesture, and prompting the user if the gesture of the user is the preset gesture.
11. A computer device comprising a memory in which a computer program is stored and a processor which implements the method of any one of claims 1-7 when executing the computer program.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, carries out the method according to any one of claims 1-7.
CN202010675662.4A 2020-07-14 2020-07-14 Method, device, equipment and storage medium for detecting user gesture Pending CN111930228A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010675662.4A CN111930228A (en) 2020-07-14 2020-07-14 Method, device, equipment and storage medium for detecting user gesture


Publications (1)

Publication Number Publication Date
CN111930228A true CN111930228A (en) 2020-11-13

Family

ID=73313924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010675662.4A Pending CN111930228A (en) 2020-07-14 2020-07-14 Method, device, equipment and storage medium for detecting user gesture

Country Status (1)

Country Link
CN (1) CN111930228A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031773A (en) * 2021-03-24 2021-06-25 Oppo广东移动通信有限公司 Prompting method, electronic device and computer readable storage medium
CN113077614A (en) * 2021-03-19 2021-07-06 北京有竹居网络技术有限公司 Attitude detection device, control method, terminal, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102934157A (en) * 2011-03-04 2013-02-13 松下电器产业株式会社 Display device and method of switching display direction
CN103955272A (en) * 2014-04-16 2014-07-30 北京尚德智产投资管理有限公司 Terminal equipment user posture detecting system
WO2015158258A1 (en) * 2014-04-16 2015-10-22 苏州尚德智产通信技术有限公司 User posture detection method, device and system
CN107426423A (en) * 2017-07-17 2017-12-01 深圳天珑无线科技有限公司 Reminding method, terminal and the computer-readable storage medium of posture are used based on terminal
CN107993262A (en) * 2017-10-25 2018-05-04 深圳市金立通信设备有限公司 Terminal device uses posture reminding method, terminal and computer-readable recording medium
CN108089729A (en) * 2016-11-23 2018-05-29 中兴通讯股份有限公司 Terminal and its method for realizing reading posture detection
CN108563387A (en) * 2018-04-13 2018-09-21 Oppo广东移动通信有限公司 Display control method and device, terminal, computer readable storage medium
US20210192528A1 (en) * 2018-05-03 2021-06-24 Huawei Technologies Co., Ltd. Facial Recognition-Based Payment Method, Apparatus, and Terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xu, Yuncheng et al.: "Sitting posture detection method based on Kinect somatosensory interaction technology", Technology Innovation and Application *


Similar Documents

Publication Publication Date Title
US11546410B2 (en) Device and method for adaptively changing task-performing subjects
CN109993150B (en) Method and device for identifying age
CN110188719B (en) Target tracking method and device
CN111414543B (en) Method, device, electronic equipment and medium for generating comment information sequence
CN112488783B (en) Image acquisition method and device and electronic equipment
CN110633126B (en) Information display method and device and electronic equipment
CN110032978A (en) Method and apparatus for handling video
WO2021088790A1 (en) Display style adjustment method and apparatus for target device
CN111930228A (en) Method, device, equipment and storage medium for detecting user gesture
CN110795196A (en) Window display method, device, terminal and storage medium
CN111935442A (en) Information display method and device and electronic equipment
CN110046571B (en) Method and device for identifying age
CN111652675A (en) Display method and device and electronic equipment
CN110189364B (en) Method and device for generating information, and target tracking method and device
CN112256371A (en) Method and device for displaying information and electronic equipment
US20220245920A1 (en) Object display method and apparatus, electronic device, and computer readable storage medium
CN111586295B (en) Image generation method and device and electronic equipment
CN113191257B (en) Order of strokes detection method and device and electronic equipment
CN112153091B (en) Method and device for determining relevance of equipment
CN111444813A (en) Method, device, equipment and storage medium for identifying attribute classification of target object
CN112346630B (en) State determination method, device, equipment and computer readable medium
CN110197230B (en) Method and apparatus for training a model
CN110188712B (en) Method and apparatus for processing image
WO2023207360A1 (en) Image segmentation method and apparatus, electronic device, and storage medium
CN111327763A (en) Method, device, equipment and storage medium for quickly making call

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201113