CN113194165A - Method for determining direction of electronic equipment, electronic equipment and readable storage medium - Google Patents
- Publication number
- CN113194165A (application CN202110394992.0A)
- Authority
- CN
- China
- Prior art keywords
- face
- hardware
- parameter
- module
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a method for determining the direction of an electronic device, an electronic device, and a readable storage medium. The method includes: acquiring a hardware direction parameter, where the hardware direction parameter is collected by a direction sensor; controlling an image sensor to be turned on so as to acquire a face image by using the image sensor; determining a face direction parameter according to the face image; and determining the direction of the electronic device according to the hardware direction parameter and the face direction parameter. In this way, the power consumption of the electronic device can be reduced.
Description
Technical Field
The present disclosure relates to the field of electronic devices, and in particular to a method for determining the direction of an electronic device, an electronic device, and a readable storage medium.
Background
With the increasing popularity of electronic devices, they have become indispensable social and entertainment tools in people's daily lives. Electronic devices such as mobile phones and tablet computers often need to determine their own direction in order to implement functions based on that direction, such as intelligent screen rotation.
At present, the direction of an electronic device is mainly determined by combining a direction sensor and an image sensor. This requires the image sensor to be kept always on so that face images can be captured in time, which increases the power consumption of the electronic device.
Disclosure of Invention
A first aspect of the embodiments of the present application provides a method for determining the direction of an electronic device, where the method includes: acquiring a hardware direction parameter, where the hardware direction parameter is collected by a direction sensor; controlling an image sensor to be turned on so as to acquire a face image by using the image sensor; determining a face direction parameter according to the face image; and determining the direction of the electronic device according to the hardware direction parameter and the face direction parameter.
A second aspect of the embodiments of the present application provides an electronic device, where the electronic device includes a processor and a memory connected to the processor, where the memory is used to store program data, and the processor is used to execute the program data to implement the foregoing method for determining an orientation of the electronic device.
A third aspect of the embodiments of the present application provides a computer-readable storage medium, in which program data is stored, and when the program data is executed by a processor, the program data is used to implement the foregoing method for determining the orientation of an electronic device.
The beneficial effect of this application is as follows. Different from the prior art, the present application acquires a hardware direction parameter collected by a direction sensor, then controls an image sensor to be turned on and acquires a face image with the image sensor, determines a face direction parameter according to the face image, and finally determines the direction of the electronic device according to the hardware direction parameter and the face direction parameter. Because the image sensor is controlled to be turned on only after the hardware direction parameter is acquired, the image sensor can be prevented from remaining on for a long time, and the power consumption of the electronic device can be reduced.
Drawings
In order to illustrate the technical solutions in the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without inventive effort. Wherein:
fig. 1 is a schematic flowchart of an embodiment of a method for determining an orientation of an electronic device according to the present application;
FIG. 2 is a schematic flowchart of another embodiment of a method for determining the orientation of an electronic device according to the present application;
FIG. 3 is a schematic diagram illustrating an adjustment of a display orientation according to an orientation of an electronic device according to the present application;
FIG. 4 is another schematic flowchart of another embodiment of a method for determining the direction of an electronic device according to the present application;
FIG. 5 is a schematic diagram of the interaction of modules in the processor of the present application;
FIG. 6 is a block diagram of an embodiment of an electronic device provided herein;
FIG. 7 is a block diagram of an embodiment of a computer-readable storage medium provided herein.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first" and "second" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart of an embodiment of a method for determining the direction of an electronic device according to the present application. The electronic device includes, but is not limited to, a mobile phone, a computer, or a wearable device. The execution subject of this embodiment may be a processor in the electronic device.
The method may comprise the steps of:
step S11: acquiring a hardware direction parameter; the hardware direction parameters are acquired by a direction sensor.
The direction sensor (orientation sensor) can sense changes in the direction of the electronic device body and collect these changes to obtain the hardware direction parameter. The direction sensor may be a gyroscope, but is not limited thereto; it may also be a gravity sensor or another device capable of sensing changes in the orientation of the electronic device body.
Specifically, the processor may communicate with the direction sensor, so as to obtain the hardware direction parameter collected by the direction sensor.
According to the hardware direction parameter, the processor can determine the current relationship between the vertical direction of the electronic device and the horizon. The hardware direction parameter may range from 0° to 360°. Here, the vertical direction of the electronic device refers to its portrait direction, i.e. the length direction of the electronic device.
Generally, if the vertical direction of the electronic device is parallel to the horizon, i.e. the hardware direction parameter is 90° or 270°, the electronic device can be controlled to display horizontally (landscape); if the vertical direction of the electronic device is perpendicular to the horizon, i.e. the hardware direction parameter is 0°/360° or 180°, the electronic device can be controlled to display vertically (portrait), thereby meeting the user's need for intelligent rotation of the display direction. However, in some special scenarios, the direction of the electronic device determined using only the hardware direction parameter collected by the direction sensor is not accurate enough. For example, when the user lies on one side, the vertical direction of the electronic device is parallel to the horizon, but the direction in which the user views the electronic device is parallel to the vertical direction of the electronic device; the display direction the user actually needs is vertical display, and if the electronic device were controlled to display horizontally according to the hardware direction parameter alone, the accuracy of the intelligent rotation function would be reduced, degrading the user experience. It will be appreciated that "parallel" and "perpendicular" here allow some error, i.e. approximately parallel or perpendicular within a certain tolerance.
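For illustration only, the hardware-only rule just described can be sketched in Kotlin as follows; the enum, the function name, the ±5° tolerance, and the null fallback are assumptions made for the example and not part of the patent's implementation.

```kotlin
import kotlin.math.abs

enum class DisplayOrientation { PORTRAIT, LANDSCAPE }

// Hardware-only decision: values near 0°, 180° and 360° map to vertical
// (portrait) display, values near 90° and 270° map to horizontal (landscape)
// display. In between, null means "keep the previous display state".
fun orientationFromHardwareOnly(hardwareDeg: Float, toleranceDeg: Float = 5f): DisplayOrientation? {
    val a = ((hardwareDeg % 360f) + 360f) % 360f // normalize to [0, 360)
    val nearPortrait = listOf(0f, 180f, 360f).any { abs(a - it) <= toleranceDeg }
    val nearLandscape = listOf(90f, 270f).any { abs(a - it) <= toleranceDeg }
    return when {
        nearPortrait -> DisplayOrientation.PORTRAIT
        nearLandscape -> DisplayOrientation.LANDSCAPE
        else -> null
    }
}
```

As the side-lying example shows, this rule alone returns landscape at 90° even when the user actually needs vertical display, which is exactly the case the face direction parameter corrects.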
Therefore, in this embodiment the processor can control the image sensor to be turned on so as to acquire a face image with the image sensor, then determine a face direction parameter according to the face image, and finally determine the direction of the electronic device according to both the hardware direction parameter and the face direction parameter. Combining the hardware direction parameter and the face direction parameter improves the accuracy of the determined direction, which in turn improves the accuracy of functions based on that direction, such as intelligent rotation, and thereby the user experience.
Step S12: controlling the image sensor to be turned on so as to acquire a face image by using the image sensor.
An image sensor can acquire an image carrying two-dimensional information of objects in the environment. In this embodiment, the image sensor may be a charge-coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, but is not limited thereto. Generally, an electronic device (e.g., a mobile phone) includes a camera, and the camera includes an image sensor.
Optionally, one or more direction sensors may be included in the electronic device, and one or more image sensors may also be included. The hardware direction parameters can be obtained by synthesizing data collected by a plurality of direction sensors, and the face angle data can be obtained by synthesizing data collected by a plurality of image sensors. In some embodiments, in order to improve the accuracy of the face angle data, the processor may control the image sensor to continuously capture a plurality of face images within a preset time range, so that the face direction parameter can be determined according to the plurality of face images. The preset time range is, for example, 0.1 second, 0.5 second, 1 second, or the like.
In this embodiment, turning on the image sensor is triggered rather than continuous. Specifically, the processor controls the image sensor to be turned on after acquiring the hardware direction parameter, so the image sensor is prevented from staying on for a long time and the power consumption of the electronic device can be reduced. Moreover, because the processor turns on the image sensor as soon as it acquires the hardware direction parameter, a face image can be captured in time, so that the direction of the electronic device can also be determined in time using the face image.
In some embodiments, when the processor acquires a hardware direction parameter, it may compare the current hardware direction parameter with the previous one. If the two differ, the direction of the electronic device has changed, and only then is the image sensor controlled to turn on. This reduces the number of unnecessary activations of the image sensor and further reduces the power consumption of the electronic device.
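A minimal sketch of this triggered activation, assuming a hypothetical camera interface (FaceCamera, turnOn, captureFaceImage and turnOff are illustrative stand-ins, not a real camera API):

```kotlin
// Hypothetical camera abstraction used only for this sketch.
interface FaceCamera {
    fun turnOn()
    fun captureFaceImage(): ByteArray
    fun turnOff()
}

// The image sensor is turned on only when the newly received hardware
// direction parameter differs from the previous one, and is turned off again
// right after capture, keeping its on-time short.
class TriggeredFaceCapture(private val camera: FaceCamera) {
    private var lastHardwareDeg: Float? = null

    fun onHardwareOrientation(hardwareDeg: Float): ByteArray? {
        val changed = lastHardwareDeg == null || lastHardwareDeg != hardwareDeg
        lastHardwareDeg = hardwareDeg
        if (!changed) return null // direction unchanged: skip an unnecessary activation
        camera.turnOn()
        return try {
            camera.captureFaceImage()
        } finally {
            camera.turnOff() // close immediately after capture to save power
        }
    }
}
```

Turning the sensor off in the finally block keeps the on-time bounded even if the capture fails, which matches the power-saving goal described above.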
It is understood that after the image sensor collects the face image, the processor may control the image sensor to be turned off to reduce power consumption of the electronic device.
The processor may be in communication with the image sensor so that the image sensor may be controlled to turn on to capture an image of a human face using the image sensor. It is understood that the facial image captured by the image sensor may be stored in a memory, and the processor may be connected to the memory to obtain the facial image.
Step S13: determining the face direction parameter according to the face image.
Specifically, after the face image is acquired, the processor may determine the face direction parameter according to the face image. The processor can perform preprocessing such as correction and face recognition on the face image to obtain a preprocessed image, and then process the preprocessed image according to the related face direction detection model, so that face direction parameters can be obtained. It can be understood that, the specific content of determining the face direction parameter according to the face image may refer to related technologies, which are not described herein.
In this embodiment, the face direction parameter is an angle value of the face with respect to the vertical direction of the electronic device, and the range may be 0 ° to 360 °. In other embodiments, the face direction parameter may also be an angle value of the face with respect to a width direction or a diagonal direction of the electronic device, which is not limited herein.
In some embodiments, when a plurality of face images are captured, an initial face direction parameter may be determined from each face image, and the average of these initial parameters may be taken as the face direction parameter, which improves the accuracy of the face direction parameter.
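As a sketch of this averaging step: a plain arithmetic mean misbehaves around the 0°/360° wrap-around (the mean of 359° and 1° should be 0°, not 180°), so the example below uses a circular mean. Handling the wrap-around this way is an assumption added for illustration; the text itself only specifies an average.

```kotlin
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

// Circular mean of several per-image face direction estimates, in degrees.
fun averageFaceDirectionDeg(samplesDeg: List<Double>): Double {
    require(samplesDeg.isNotEmpty()) { "at least one face image is needed" }
    val sumSin = samplesDeg.sumOf { sin(Math.toRadians(it)) }
    val sumCos = samplesDeg.sumOf { cos(Math.toRadians(it)) }
    val meanDeg = Math.toDegrees(atan2(sumSin, sumCos))
    return (meanDeg + 360.0) % 360.0 // normalize to [0, 360)
}
```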
Step S14: determining the direction of the electronic device according to the hardware direction parameter and the face direction parameter.
Specifically, the face direction parameter is used to correct the hardware direction parameter. For example, when the user lies on one side, the vertical direction of the electronic device is parallel to the horizon and the measured hardware direction parameter is 90°; however, the user's face squarely faces the electronic device, so the measured face direction parameter is 0°, indicating that the electronic device is not rotated relative to the user's face. The hardware direction parameter therefore needs to be corrected to 0° to obtain the direction of the electronic device. Accordingly, the processor can adjust the display direction of the electronic device to the vertical display corresponding to 0°, which meets the user's actual viewing need and improves the user experience.
In this embodiment, a hardware direction parameter collected by the direction sensor is acquired, the image sensor is then controlled to turn on so that a face image is acquired with it, a face direction parameter is determined according to the face image, and finally the direction of the electronic device is determined according to the hardware direction parameter and the face direction parameter. Because the image sensor is controlled to turn on only after the hardware direction parameter has been acquired, it is prevented from staying on for a long time, and the power consumption of the electronic device can be reduced.
Referring to fig. 2 to 4, fig. 2 is a schematic flowchart illustrating a method for determining an orientation of an electronic device according to another embodiment of the present application, fig. 3 is a schematic diagram illustrating a display orientation being adjusted according to the orientation of the electronic device according to the present application, and fig. 4 is a schematic flowchart illustrating another embodiment of the method for determining the orientation of the electronic device according to the present application. The execution subject of the embodiment is a processor in an electronic device. The processor may include an ADSP module 31, a system framework layer 32, an AON module 33, and an image module 34.
The method may comprise the steps of:
step S21: and the ADSP module acquires the hardware direction parameters sent by the direction sensor.
Step S22: the ADSP module sends the hardware direction parameters to the system framework layer so that the system framework layer sends a first instruction to the AON module, and the first instruction is used for requesting the face direction parameters.
In the present embodiment, steps S21-S22 correspond to step S11 in the above-described embodiment.
The electronic device further includes hardware 35, such as the direction sensor and the image sensor. Specifically, the ADSP (Advanced Digital Signal Processor) module may communicate with the hardware 35: the ADSP module communicates with the direction sensor to obtain the hardware direction parameter sent by the direction sensor, and then sends the hardware direction parameter to the system framework layer 32 (Framework), so that the system framework layer 32 sends a first instruction to the AON (Always On) module. The ADSP module 31 performs digital signal processing and is located at the hardware abstraction layer (Hardware). The Android system provides the hardware abstraction layer to encapsulate access to the Linux drivers while exposing a uniform hardware interface to the upper layers.
Step S23: the AON module receives a first instruction sent by the system frame layer and sends a second instruction to the image module, and the second instruction is used for controlling the image sensor to be started so as to acquire a face image by using the image sensor.
In the present embodiment, step S23 corresponds to step S12 in the above-described embodiment.
The image module 34 is located at the hardware abstraction layer and communicates with the image sensor, so the image sensor can be controlled to turn on and acquire a face image. The image module 34 may further include an image application module 341 and an image driving module 342. It is understood that, due to the limitations of different programming languages, the AON module 33 cannot connect directly to the image driving module 342; it must connect through the image application module 341 so that the image driving module 342 drives the image sensor to capture the face image.
In the related art, the AON module 33 is located at the hardware abstraction layer and can communicate directly with the image driving module 342, so the AON module 33 can directly acquire the face image, determine the face direction parameter, and send it to the ADSP module 31, which then determines the direction of the electronic device according to the hardware direction parameter and the face direction parameter. However, this scheme depends on the technical capability of the SoC (System on a Chip): a bottom-layer communication channel between the image module 34 and the ADSP module 31 must be opened, and electronic devices produced earlier cannot provide this capability due to SoC limitations, so they cannot determine the direction of the electronic device by combining the hardware direction parameter and the face direction parameter.
In this embodiment, the AON module 33 is located at the system framework layer 32. The ADSP module 31 reports the hardware direction parameter to the system framework layer 32, which triggers the system framework layer 32 to request the face angle parameter from the AON module 33. Because the AON module 33 is placed at the system framework layer 32, the limitation of the chip platform is removed: even without a bottom-layer communication channel between the image module 34 and the ADSP module 31, the direction of the electronic device can still be determined according to the hardware direction parameter and the face direction parameter, improving the accuracy of the determined direction.
Step S24: the AON module processes the face image to obtain face direction parameters, wherein the face direction parameters are angle values of the face relative to the vertical direction of the electronic equipment.
After the image sensor acquires the face image, the AON module 33 may acquire the face image, process the face image to obtain face direction parameters, and then send the face direction parameters to the system framework layer 32, and the system framework layer 32 continues to perform data processing.
In the present embodiment, step S24 corresponds to step S13 in the above-described embodiment. It can be understood that, for the description of each step in this embodiment, reference may be made to the above embodiments, which are not described herein again.
Step S25: the system framework layer receives the face direction parameters sent by the AON module;
step S26: and the system framework layer determines the direction of the electronic equipment according to the hardware direction parameter and the human face direction parameter.
In the present embodiment, steps S25-S26 correspond to step S14 in the above-described embodiment.
In some embodiments, after determining the direction of the electronic device, the system framework layer 32 may report it to a business party that needs the data, so that the display direction of the electronic device can be adjusted. The business party may be an application 36, for example a video playing application (e.g., iQIYI or My Videos), an album application (e.g., My Album), or a navigation application (e.g., AMap navigation or Baidu navigation).
In some embodiments, depending on the orientation of the electronic device, adjusting the display orientation of the electronic device may be: if the direction of the electronic equipment is the first direction, adjusting the display direction of the electronic equipment to be horizontal display; and if the direction of the electronic equipment is the second direction, adjusting the display direction of the electronic equipment to be vertical display. The first direction and the second direction are two different directions. The first direction and the second direction may or may not be perpendicular.
In a specific implementation scenario, when the face squarely faces the screen in the upright state, the face angle value is 0°. Turning the face to the left gives values greater than 0° that increase toward 180°, and turning it to the right gives values less than 360° that decrease toward 180°. Specifically, with the face facing the screen, when the face turns 45° to the left, the face angle value returned by the AON module 33 is 45°; when the face turns 45° to the right, the returned face angle value is 315° (360° − 45°). In this embodiment, positive values are returned uniformly; in other embodiments, negative values may also be returned.
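A small sketch of this angle convention, assuming a signed rotation as input where a left turn is positive (that sign convention is an assumption made for the example):

```kotlin
// Left turn reported directly (45° left -> 45°), right turn reported as
// 360° minus the rotation (45° right -> 315°), so only positive values
// in [0, 360) are returned.
fun toReportedFaceAngle(signedRotationDeg: Double): Double =
    ((signedRotationDeg % 360.0) + 360.0) % 360.0

// Inverse mapping back to a signed rotation in (-180°, 180°].
fun toSignedRotation(reportedDeg: Double): Double =
    if (reportedDeg > 180.0) reportedDeg - 360.0 else reportedDeg
```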
Referring to fig. 3, in some implementation scenarios, a mobile phone includes a display screen having two parallel short sides and two parallel long sides connected to each other, and the vertical direction of the electronic device is the direction of the long sides. Two display states after the mobile phone is rotated are briefly described below: without reference to the face, and with reference to the face. First, when the hardware angle value of the mobile phone is 0°/360°, the display state after rotation without reference to the face is vertical display; the face actually squarely faces the display screen, i.e. it is not rotated relative to the display screen, and the face angle parameter corresponding to the face image is 0°/360°, so the display state after rotation with reference to the face is also vertical display. At this time the user is, for example, standing or sitting. Second, when the hardware angle value of the mobile phone is 90°, the display state after rotation without reference to the face would be horizontal display; but the face actually squarely faces the display screen and the face angle parameter corresponding to the face image is 0°/360°, so the display state after rotation with reference to the face is still vertical display. At this time the user is, for example, lying on one side.
In some embodiments, when the hardware direction parameter is 0°/360° and the face angle parameter is 0°/360°, 90°, or 270°, the direction of the electronic device is 0°/360°, so the system framework layer 32 may record the direction of the electronic device as a first parameter (for example, 0); correspondingly, the display state of the electronic device may be adjusted to vertical display according to the first parameter. Likewise, when the hardware direction parameter is 90°/270° and the face angle parameter is 0°/360°, the direction of the electronic device is 0°/360°, so the system framework layer 32 may record it as the first parameter, and the display state may again be adjusted to vertical display.
In some embodiments, when the hardware direction parameter is 270° and the face angle parameter is 90°/270°, the direction of the electronic device is 270°, so the system framework layer 32 may record the direction of the electronic device as a second parameter (for example, 1); correspondingly, the display state of the electronic device may be adjusted to horizontal display according to the second parameter.
In some embodiments, when the hardware direction parameter is 90° and the face angle parameter is 90°/270°, the direction of the electronic device is 90°, so the system framework layer 32 may record the direction of the electronic device as a third parameter (for example, 2); correspondingly, the display state of the electronic device may be adjusted to horizontal display according to the third parameter.
In some embodiments, when the hardware direction parameter is 180° and the face angle parameter is 0°/360°, 90°, or 270°, the direction of the electronic device is 0°/360°, so the system framework layer 32 may record the direction of the electronic device as a fourth parameter (for example, 3); correspondingly, the display state of the electronic device may be adjusted to vertical display according to the fourth parameter. In this case the electronic device is upside down, and in this embodiment the display direction of an upside-down electronic device is still vertical display. In some embodiments, when the electronic device is flipped upside down in the vertical direction, the album application can simply ignore the flip and keep the previous display state unchanged: if the previous state was horizontal display (the second or third parameter), it remains horizontal after the flip; if it was vertical display (the first parameter), it remains vertical after the flip.
It can be understood that the combinations of the hardware direction parameter and the face angle parameter are not limited to those listed above. For example, when the hardware direction parameter is 85°–95° or 265°–275° and the face angle parameter is 0°–5° or 355°–360°, the direction of the electronic device may be recorded as the first parameter; correspondingly, the display state of the electronic device may be adjusted to vertical display according to the first parameter. The correspondence between the direction of the electronic device and the display direction may thus be set according to the actual situation and is not limited here; a sketch of this decision logic is given below.
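The combinations above amount to a lookup with tolerance bands, sketched below. This is illustrative only: the helper names and the uniform ±5° tolerance are assumptions, and the returned integer follows the first/second/third/fourth parameter convention described above (0 and 3 correspond to vertical display, 1 and 2 to horizontal display).

```kotlin
// True when `angle` is within `tol` degrees of `target`, with wrap-around.
private fun near(angle: Double, target: Double, tol: Double = 5.0): Boolean {
    val d = ((angle - target) % 360.0 + 360.0) % 360.0
    return d <= tol || d >= 360.0 - tol
}

// Maps a hardware direction parameter and a face angle parameter to the
// recorded orientation parameter: 0 = vertical display, 1/2 = horizontal
// display, 3 = vertical display while upside down, null = keep previous state.
fun deviceOrientationParameter(hardwareDeg: Double, faceDeg: Double): Int? {
    val faceUpright = near(faceDeg, 0.0)                            // face not rotated w.r.t. the screen
    val faceSideways = near(faceDeg, 90.0) || near(faceDeg, 270.0)
    return when {
        near(hardwareDeg, 0.0) && (faceUpright || faceSideways) -> 0            // first parameter
        (near(hardwareDeg, 90.0) || near(hardwareDeg, 270.0)) && faceUpright -> 0 // e.g. user lying on one side
        near(hardwareDeg, 270.0) && faceSideways -> 1                            // second parameter
        near(hardwareDeg, 90.0) && faceSideways -> 2                             // third parameter
        near(hardwareDeg, 180.0) && (faceUpright || faceSideways) -> 3           // fourth parameter
        else -> null
    }
}
```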
In some embodiments, the system framework layer 32 includes a sensor management module (not shown), and the sensor management module is configured to receive the hardware direction parameter sent by the ADSP module 31 and send a first instruction to the AON module 33, and receive the face direction parameter sent by the AON module 33 and determine the direction of the electronic device according to the hardware direction parameter and the face direction parameter. For a detailed description, reference is made to the following examples.
Referring to fig. 5, fig. 5 is a schematic diagram illustrating interaction of modules in a processor according to the present application. This embodiment is described with reference to specific modules in a processor.
The AON module includes an AON service module 531 (AON Service) and an AON libs module 532. The AON libs module 532 contains the bottom-layer code that computes the face angle parameter from the face image. The system framework layer includes a sensor management module 521 (SensorManager). The ADSP module 51 and the image module 54 (Camera) are located at the hardware abstraction layer (Hardware), the sensor service module 57 and the AON libs module 532 are located at the native code layer (Native), and the sensor management module 521 and the AON service module 531 are located at the system framework layer.
The method for determining the direction of the electronic device may specifically include (1) to (7):
(1) The ADSP module 51 sends the hardware direction parameter (Hardware Orientation) to the sensor service module 57.
(2) After receiving the hardware direction parameters, the sensor service module 57 continues to send the hardware direction parameters to the sensor management module 521 (SensorManager).
(3) After receiving the hardware direction parameter, the sensor management module 521 generates a first instruction, and sends the first instruction to the AON service module 531, where the first instruction is used to request a Face angle parameter (Face angle).
(4) After receiving the first instruction, the AON service module 531 generates a second instruction, and sends the second instruction to the image module 54 through the AON libs module 532, so as to request a face image.
(5) After the image module 54 acquires the face image, it sends the face image to the AON libs module 532; the AON libs module 532 processes the face image to obtain the face angle parameter and sends the face angle parameter to the AON service module 531.
(6) The AON service module 531 receives the face angle parameter and returns the face angle parameter to the sensor management module 521.
(7) The sensor management module 521 combines the face angle parameter and the hardware direction parameter to determine the direction of the electronic device, and then sends the direction of the electronic device to the applications 56 (APP1, APP2, APP3, ...) in the application layer (Application). The applications can then implement functions such as intelligent rotation according to the direction of the electronic device; a condensed sketch of the whole sequence follows.
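For illustration, the call sequence (1)–(7) can be condensed into the following Kotlin sketch. All class and method names are invented stand-ins for the ADSP module, sensor service module, sensor management module, AON service/libs modules and image module; they are not real Android or vendor APIs, and deviceOrientationParameter refers to the earlier sketch.

```kotlin
class CameraModule {
    fun captureFaceImage(): ByteArray = ByteArray(0) // placeholder for the image-driver path
}

class AonLibs(private val camera: CameraModule) {
    fun requestFaceAngle(): Double {
        val image = camera.captureFaceImage()          // steps (4)-(5): drive the sensor, get the image
        return computeFaceAngleDeg(image)              // bottom-layer face angle calculation
    }
    private fun computeFaceAngleDeg(image: ByteArray): Double = 0.0 // stand-in for the real model
}

class AonService(private val libs: AonLibs) {
    fun onFaceAngleRequest(): Double = libs.requestFaceAngle()      // steps (4)-(6)
}

class SensorManagerLayer(private val aon: AonService, private val report: (Int) -> Unit) {
    fun onHardwareOrientation(hardwareDeg: Double) {                // step (3): request the face angle
        val faceDeg = aon.onFaceAngleRequest()                      // step (6): face angle comes back
        deviceOrientationParameter(hardwareDeg, faceDeg)?.let(report) // step (7): report to applications
    }
}

class SensorService(private val manager: SensorManagerLayer) {
    fun onAdspHardwareOrientation(hardwareDeg: Double) =            // steps (1)-(2): ADSP -> sensor service
        manager.onHardwareOrientation(hardwareDeg)
}
```

Wiring these together (SensorService → SensorManagerLayer → AonService → AonLibs → CameraModule) and calling onAdspHardwareOrientation mirrors the path the hardware direction parameter takes in FIG. 5.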
Referring to fig. 6, fig. 6 is a schematic diagram of a frame of an embodiment of an electronic device provided in the present application.
The electronic device 600 includes: a processor 610 and a memory 620 connected to the processor 610, the memory 620 being adapted to store program data, the processor 610 being adapted to execute the program data to implement the steps of any of the above-described method embodiments.
The electronic device 600 includes, but is not limited to, a television, a desktop computer, a laptop computer, a handheld computer, a wearable device, a head-mounted display, a reader device, a portable music player, a portable game console, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, or a virtual reality (VR) device.
In particular, the processor 610 is configured to control itself and the memory 620 to implement the steps of any of the method embodiments described above. Processor 610 may also be referred to as a CPU (Central Processing Unit). The processor 610 may be an integrated circuit chip having signal processing capabilities. The Processor 610 may also be a general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. In addition, the processor 610 may be implemented collectively by a plurality of integrated circuit chips.
Referring to fig. 7, fig. 7 is a block diagram illustrating an embodiment of a computer-readable storage medium according to the present application.
The computer readable storage medium 700 stores program data 710, which program data 710, when executed by a processor, is adapted to carry out the steps of any of the above-described method embodiments.
The computer-readable storage medium 700 may be a medium that can store a computer program, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, or may be a server that stores the computer program, and the server can send the stored computer program to another device for running or can run the stored computer program by itself.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.
Claims (10)
1. A method for determining orientation of an electronic device, the method comprising:
acquiring a hardware direction parameter; the hardware direction parameters are acquired by a direction sensor;
controlling an image sensor to be started so as to acquire a human face image by using the image sensor;
determining face direction parameters according to the face image;
and determining the direction of the electronic equipment according to the hardware direction parameter and the face direction parameter.
2. The method of claim 1,
the acquiring of the hardware direction parameter includes:
the ADSP module acquires the hardware direction parameters sent by the direction sensor;
and the ADSP module sends the hardware direction parameters to a system frame layer so as to enable the system frame layer to send a first instruction to the AON module, wherein the first instruction is used for requesting the face direction parameters.
3. The method of claim 2,
the control image sensor is started to collect a face image by using the image sensor, and the control image sensor comprises:
the AON module receives a first instruction sent by the system frame layer and sends a second instruction to the image module, and the second instruction is used for controlling the image sensor to be started so as to acquire a face image by using the image sensor.
4. The method of claim 3,
the determining of the face direction parameters according to the face image comprises:
the AON module processes the face image to obtain a face direction parameter, wherein the face direction parameter is an angle value of a face relative to the vertical direction of the electronic equipment.
5. The method according to any one of claims 2 to 4,
determining the direction of the electronic device according to the hardware direction parameter and the face direction parameter, including:
the system framework layer receives the face direction parameters sent by the AON module;
and the system frame layer determines the direction of the electronic equipment according to the hardware direction parameter and the face direction parameter.
6. The method of claim 5, wherein the system framework layer comprises a sensor management module,
the sensor management module is used for receiving the hardware direction parameters sent by the ADSP module, sending a first instruction to the AON module, receiving the face direction parameters sent by the AON module, and determining the direction of the electronic equipment according to the hardware direction parameters and the face direction parameters.
7. The method of claim 1,
after determining the direction of the electronic device according to the hardware direction parameter and the face direction parameter, the method further includes:
and adjusting the display direction of the electronic equipment according to the direction of the electronic equipment.
8. The method of claim 7,
the adjusting the display direction of the electronic device according to the direction of the electronic device includes:
if the direction of the electronic equipment is the first direction, adjusting the display direction of the electronic equipment to be horizontal display;
and if the direction of the electronic equipment is the second direction, adjusting the display direction of the electronic equipment to be vertical display.
9. An electronic device, comprising a processor and a memory coupled to the processor,
the memory is for storing program data, and the processor is for executing the program data to implement the method of determining orientation of an electronic device according to any of claims 1-8.
10. A computer-readable storage medium, in which program data are stored, which program data, when being executed by a processor, are adapted to carry out the method of determining the orientation of an electronic device according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110394992.0A CN113194165A (en) | 2021-04-13 | 2021-04-13 | Method for determining direction of electronic equipment, electronic equipment and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110394992.0A CN113194165A (en) | 2021-04-13 | 2021-04-13 | Method for determining direction of electronic equipment, electronic equipment and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113194165A true CN113194165A (en) | 2021-07-30 |
Family
ID=76975659
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110394992.0A Pending CN113194165A (en) | 2021-04-13 | 2021-04-13 | Method for determining direction of electronic equipment, electronic equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113194165A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107102801A (en) * | 2017-03-28 | 2017-08-29 | 北京小米移动软件有限公司 | Terminal screen spinning solution and device |
CN111885265A (en) * | 2020-07-31 | 2020-11-03 | Oppo广东移动通信有限公司 | Screen interface adjusting method and related device |
-
2021
- 2021-04-13 CN CN202110394992.0A patent/CN113194165A/en active Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |