CN116567141A - Data processing method and electronic equipment

Info

Publication number
CN116567141A
CN116567141A
Authority
CN
China
Prior art keywords
state
sensor
target
screen
acc
Prior art date
Legal status
Granted
Application number
CN202310836173.6A
Other languages
Chinese (zh)
Other versions
CN116567141B (en)
Inventor
刘铁良
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202310836173.6A
Publication of CN116567141A
Application granted
Publication of CN116567141B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
    • H04M1/0214 Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks


Abstract

A data processing method and an electronic device, applied in the field of terminal technology, allow a folding-screen phone in an intermediate state to display its target display interface accurately, improving user satisfaction. The method comprises the following steps: acquiring posture data, where the posture data indicates the folding posture of the terminal device and the display state of each screen in the terminal device, and the display state is either screen-on or screen-off; when the posture data indicates that the terminal device is in a target state, obtaining target acceleration data corresponding to a target display interface using the acceleration data of a first acceleration sensor, where the target display interface is the display interface corresponding to the target state and the display interface corresponding to the first acceleration sensor faces the opposite direction to the target display interface; and displaying the target display interface based on the target acceleration data.

Description

Data processing method and electronic equipment
Technical Field
The present disclosure relates to the field of terminal technology, and in particular to a data processing method and an electronic device.
Background
As folding-screen technology matures, more and more folding-screen phones are reaching the market. A folding-screen phone may include an A face, a B face, and a C face. As shown in fig. 1, the A face is the left display region of the phone's inner screen, the B face is the right display region of the inner screen, and the C face is the display interface of the outer screen. In general, under the stock Google Android system, a folding-screen phone can only obtain the acceleration data reported by the acceleration sensor corresponding to the B face.
When the folding-screen phone is in an intermediate state, for example the tent state shown in fig. 2 (a) and (b) or the desk calendar state shown in fig. 3 (a) and (b), its display interface is the C face. Because the phone can only obtain the acceleration data corresponding to the B face, in these intermediate states it cannot obtain the acceleration data corresponding to the C face. It therefore derives the display orientation of the C face from the B-face acceleration data, which degrades the final display effect and can leave the content displayed upside down.
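The effect can be illustrated with a small sketch (a hypothetical helper, not taken from the patent): a display's rotation is typically derived from the direction of gravity in its own accelerometer frame, so driving the C face with B-face readings, whose axes point the opposite way in the folded posture, flips the computed rotation.

```java
public final class RotationFromAccel {
    /** Returns a coarse rotation (0, 90, 180, or 270 degrees) from the
     *  gravity components, in m/s^2, measured in the screen's own frame. */
    public static int coarseRotation(float ax, float ay) {
        if (Math.abs(ay) >= Math.abs(ax)) {
            return ay >= 0 ? 0 : 180;   // upright portrait vs. upside down
        }
        return ax >= 0 ? 90 : 270;      // the two landscape orientations
    }

    public static void main(String[] args) {
        // Suppose the B-face sensor reads gravity as (0, +9.8) m/s^2 ...
        System.out.println(coarseRotation(0f, 9.8f));   // 0: upright
        // ... the C face points the other way and would read (0, -9.8),
        // so reusing B-face data for the C face yields an inverted result:
        System.out.println(coarseRotation(0f, -9.8f));  // 180: upside down
    }
}
```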
Disclosure of Invention
The embodiments of the present disclosure provide a data processing method and an electronic device that can display the target display interface accurately when a folding-screen phone is in a target state, thereby improving user satisfaction.
In order to achieve the above object, the embodiments of the present disclosure adopt the following technical solutions:
In a first aspect, the present disclosure provides a data processing method. The method comprises: first, acquiring posture data, where the posture data indicates the folding posture of a terminal device and the display state of each screen in the terminal device, and the display state is either screen-on or screen-off; when the posture data indicates that the terminal device is in a target state, obtaining target acceleration data corresponding to a target display interface using the acceleration data of a first acceleration sensor, where the target display interface is the display interface corresponding to the target state and the display interface corresponding to the first acceleration sensor faces the opposite direction to the target display interface; and displaying the target display interface based on the target acceleration data.
With the data processing method of the first aspect, the state of the terminal device can be determined from its folding posture and the display state of each screen. When the terminal device is in the target state, the display interface corresponding to the first acceleration sensor faces the opposite direction to the target display interface, so this opposing relationship, together with the first acceleration sensor's data, yields accurate target acceleration data for the target display interface. The target display interface is then adjusted using the target acceleration data, so that it is displayed correctly and the user experience improves.
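The flow of the first aspect can be summarized in a minimal, non-authoritative sketch; every class name, angle range, and threshold below is an illustrative assumption rather than the patent's implementation.

```java
public final class FirstAspectSketch {

    /** Posture data: folding posture plus the display state of each screen. */
    static final class Posture {
        float foldAngleDeg;
        boolean innerScreenOn;
        boolean outerScreenOn;
        Posture(float angle, boolean inner, boolean outer) {
            foldAngleDeg = angle; innerScreenOn = inner; outerScreenOn = outer;
        }
    }

    // Step 2: check the target state (inner screen off, outer screen on,
    // fold angle within a preset range; the range here is a placeholder).
    static boolean isTargetState(Posture p) {
        return !p.innerScreenOn && p.outerScreenOn
                && p.foldAngleDeg >= 45f && p.foldAngleDeg <= 80f;
    }

    // The first ACC's display faces away from the target display, so the
    // target acceleration is derived from its data by inversion.
    static float[] targetAcceleration(float[] firstAcc) {
        return new float[] { -firstAcc[0], -firstAcc[1], -firstAcc[2] };
    }

    public static void main(String[] args) {
        Posture p = new Posture(60f, false, true);   // step 1: posture acquired
        if (isTargetState(p)) {
            float[] t = targetAcceleration(new float[] { 0f, 9.8f, 0f });
            // Step 3: display the target interface using the derived data.
            System.out.printf("orient C face with accel (%.1f, %.1f, %.1f)%n",
                    t[0], t[1], t[2]);
        }
    }
}
```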
With reference to the first aspect, in another possible implementation, acquiring the posture data includes: acquiring the pose data of a first posture sensor and the display state data of a second state sensor; determining the folding posture of the terminal device according to the pose data; and determining the display state of each screen in the terminal device according to the display state data. Whether the terminal device is in the target state is judged from the posture data. Since the target state depends on both the folding posture and the display state data, in this implementation the folding posture of the terminal device is determined with the first posture sensor and the display state data with the second state sensor, so that whether the terminal device is in the target state can be determined more accurately.
With reference to the first aspect, in another possible implementation, the target state is any one of a first state, a second state, and a third state. In the first state, the folding posture of the terminal device is such that the folding angle of the inner screen is within a first target preset angle range, the inner screen is off, and the outer screen is on. In the second state, the folding angle of the inner screen is within a second target preset angle range, the inner screen is off, and the outer screen is on. In the third state, the folding angle of the inner screen is within a third target preset angle range, the inner screen is off, and the outer screen is on. This specifies the concrete content of the target state.
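A sketch of the state check might look as follows; the three angle ranges are invented placeholders for the first, second, and third target preset angle ranges, which the text leaves open.

```java
final class TargetStateSketch {
    enum TargetState { FIRST, SECOND, THIRD, NONE }

    static TargetState classify(float foldAngleDeg, boolean innerOn, boolean outerOn) {
        // All three target states require the inner screen off and the outer screen on.
        if (innerOn || !outerOn) return TargetState.NONE;
        if (foldAngleDeg >= 45f && foldAngleDeg < 80f)   return TargetState.FIRST;
        if (foldAngleDeg >= 80f && foldAngleDeg < 120f)  return TargetState.SECOND;
        if (foldAngleDeg >= 120f && foldAngleDeg < 150f) return TargetState.THIRD;
        return TargetState.NONE;
    }
}
```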
With reference to the first aspect, in another possible implementation, obtaining the target acceleration data corresponding to the target display interface using the acceleration data of the first acceleration sensor includes: inverting the acceleration data of the first acceleration sensor to obtain the target acceleration data, and storing the target acceleration data in a preset buffer location. Because the display interface corresponding to the first acceleration sensor faces the opposite direction to the target display interface, inverting the first acceleration sensor's data in this way yields accurate target acceleration data for the target display interface, which can then be used to display the target display interface correctly.
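A minimal sketch of the inverse processing plus caching, assuming a full three-axis sign flip; which axes actually flip in a real device follows from how the two faces' coordinate frames oppose each other across the hinge.

```java
import java.util.ArrayDeque;
import java.util.Deque;

final class InverseProcessing {
    private final Deque<float[]> presetBuffer = new ArrayDeque<>();

    /** Called for each raw sample of the first acceleration sensor. */
    void onFirstAccSample(float[] s) {
        float[] target = { -s[0], -s[1], -s[2] };  // inverse processing
        presetBuffer.addLast(target);              // store in the preset buffer
    }
}
```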
With reference to the first aspect, in another possible implementation, the preset buffer locations include a buffer location corresponding to the first acceleration sensor and a buffer location corresponding to the second acceleration sensor, and storing the target acceleration data in the preset buffer location means storing it in the buffer location corresponding to the second acceleration sensor.
In the related art, the terminal device only reads the acceleration data corresponding to the B face, while the target acceleration data derived for the target display interface (the C face) would normally be cached in the buffer location corresponding to the first acceleration sensor (the sub ACC), where the device cannot read it. To let the terminal device obtain the target acceleration data, it is stored instead in the buffer location corresponding to the second acceleration sensor (the main ACC), so the device can read it and provide a better display service to the user.
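The redirection can be sketched as follows; the slot names and the map-based store are assumptions standing in for the driver's real buffers.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

final class AccBuffers {
    static final String MAIN_ACC_SLOT = "main_acc";
    static final String SUB_ACC_SLOT  = "sub_acc";
    private final Map<String, float[]> slots = new ConcurrentHashMap<>();

    void publish(float[] targetAcc, boolean inTargetState) {
        // A stock Android reader only consults the main-ACC slot, so in the
        // target state the C-face data derived from the sub ACC goes there.
        slots.put(inTargetState ? MAIN_ACC_SLOT : SUB_ACC_SLOT, targetAcc);
    }

    float[] readMain() { return slots.get(MAIN_ACC_SLOT); }
}
```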
With reference to the first aspect, in another possible implementation, before acquiring the pose data of the first posture sensor and the display state data of the second state sensor, the method further includes: detecting the user's power-on operation, then monitoring the first posture sensor and the second state sensor to acquire the pose data of the first posture sensor and the display state data of the second state sensor. This provides one trigger condition for monitoring the two sensors.
With reference to the first aspect, in another possible implementation, before acquiring the pose data of the first posture sensor and the display state data of the second state sensor, the method further includes: in response to a preset event, monitoring the first posture sensor and the second state sensor to acquire the pose data of the first posture sensor and the display state data of the second state sensor. The preset event includes listening to the first acceleration sensor and the second acceleration sensor, or listening to the motion sensor. This provides another trigger condition for monitoring the two sensors, as sketched below.
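The two trigger paths can be sketched like this; the motion-sensor handle and all method names are assumptions (the ACC handles reuse the identifiers quoted later for fig. 6).

```java
final class MonitoringTriggers {
    static final int MAIN_ACC_HANDLE = 0x1;       // B-face ACC
    static final int SUB_ACC_HANDLE  = 0x20001;   // A-face ACC
    static final int MOTION_HANDLE   = 0x2;       // placeholder value

    private final Runnable startMonitoring;

    MonitoringTriggers(Runnable startMonitoring) {
        this.startMonitoring = startMonitoring;
    }

    /** Trigger 1: the user powers the device on. */
    void onBootCompleted() { startMonitoring.run(); }

    /** Trigger 2: the preset event, i.e. someone starts listening to the
     *  first or second acceleration sensor, or to the motion sensor. */
    void onClientListens(int sensorHandle) {
        if (sensorHandle == MAIN_ACC_HANDLE || sensorHandle == SUB_ACC_HANDLE
                || sensorHandle == MOTION_HANDLE) {
            startMonitoring.run();
        }
    }
}
```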
With reference to the first aspect, in another possible implementation, the terminal device is a folding-screen terminal device.
In a second aspect, an embodiment of the present disclosure provides a data processing apparatus, applicable to an electronic device, for implementing the method of the first aspect. The functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions, for example an acquisition module, a determination module, a display module, and a monitoring module.
The acquisition module is configured to acquire posture data, where the posture data indicates the folding posture of the terminal device and the display state of each screen in the terminal device, and the display state is either screen-on or screen-off.
The determination module is configured to, when the posture data indicates that the terminal device is in a target state, obtain target acceleration data corresponding to a target display interface using the acceleration data of the first acceleration sensor, where the target display interface is the display interface corresponding to the target state and the display interface corresponding to the first acceleration sensor faces the opposite direction to the target display interface.
The display module is configured to display the target display interface based on the target acceleration data.
With reference to the second aspect, in a possible implementation, the acquisition module is further configured to acquire the pose data of the first posture sensor and the display state data of the second state sensor. The determination module is further configured to determine the folding posture of the terminal device according to the pose data, and to determine the display state of each screen in the terminal device according to the display state data.
With reference to the second aspect, in a possible implementation, the target state is any one of a first state, a second state, and a third state. In the first state, the folding angle of the inner screen of the terminal device is within a first target preset angle range, the inner screen is off, and the outer screen is on; in the second state, the folding angle is within a second target preset angle range, the inner screen is off, and the outer screen is on; in the third state, the folding angle is within a third target preset angle range, the inner screen is off, and the outer screen is on.
With reference to the second aspect, in a possible implementation, the determination module is further configured to invert the acceleration data corresponding to the first acceleration sensor to obtain the target acceleration data, and to store the target acceleration data in a preset buffer location.
With reference to the second aspect, in a possible implementation, the preset buffer locations include a buffer location corresponding to the first acceleration sensor and a buffer location corresponding to the second acceleration sensor; the determination module is further configured to store the target acceleration data in the buffer location corresponding to the second acceleration sensor.
With reference to the second aspect, in a possible implementation, the data processing apparatus further includes a monitoring module. The monitoring module is configured to detect the user's power-on operation and to monitor the first posture sensor and the second state sensor so as to acquire the pose data of the first posture sensor and the display state data of the second state sensor.
With reference to the second aspect, in a possible implementation, the monitoring module is further configured to monitor the first posture sensor and the second state sensor in response to a preset event so as to acquire the pose data of the first posture sensor and the display state data of the second state sensor, where the preset event includes listening to the first acceleration sensor and the second acceleration sensor, or listening to the motion sensor.
With reference to the second aspect, in a possible implementation, the terminal device is a folding-screen terminal device.
In a third aspect, the present disclosure provides an electronic device comprising a memory, a display screen, and one or more processors, the memory, display screen, and processor being coupled. The memory stores computer program code comprising computer instructions; when the electronic device runs, the processor executes the computer instructions stored in the memory, causing the electronic device to perform the data processing method of any one of the first aspect.
In a fourth aspect, the present disclosure provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the data processing method of any one of the first aspect.
In a fifth aspect, the present disclosure provides a computer program product which, when run on an electronic device, causes the electronic device to perform the data processing method of any one of the first aspect.
In a sixth aspect, an apparatus is provided (for example, a chip system) comprising a processor configured to support a first terminal device in implementing the functions referred to in the first aspect. In one possible design, the apparatus further comprises a memory for holding the program instructions and data necessary for the first terminal device. When the apparatus is a chip system, it may consist of a chip alone or comprise the chip together with other discrete devices.
It should be appreciated that the advantages of the second to sixth aspects are as described for the first aspect and are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of a folding screen mobile phone according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of a scenario in which a folding screen mobile phone is in a tent state according to an embodiment of the present disclosure.
Fig. 3 is a schematic view of a scenario in which a folding screen mobile phone is in a desk calendar state according to an embodiment of the present disclosure.
Fig. 4 is a first schematic view of scenarios with different folding angles of a folding-screen phone according to an embodiment of the present disclosure.
Fig. 5 is a second schematic view of scenarios with different folding angles of a folding-screen phone according to an embodiment of the present disclosure.
Fig. 6 is a schematic diagram of a coordinate system of an acceleration sensor in a folding screen mobile phone according to an embodiment of the disclosure.
Fig. 7 is a schematic hardware structure of a terminal device according to an embodiment of the present disclosure.
Fig. 8 is a schematic software structure of a terminal device according to an embodiment of the present disclosure.
Fig. 9 is a schematic flow chart of a data processing method according to an embodiment of the disclosure.
Fig. 10 is a schematic diagram of a buffering scenario provided in an embodiment of the present disclosure.
Fig. 11 is a second flowchart of a data processing method according to an embodiment of the disclosure.
Fig. 12 is a third flowchart of a data processing method according to an embodiment of the disclosure.
Fig. 13 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure are described below with reference to the drawings. In the description of the present disclosure, unless otherwise specified, "/" indicates an "or" relationship between the associated objects; for example, A/B may mean A or B. "And/or" merely describes an association between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. Unless otherwise indicated, "a plurality of" means two or more. "At least one of" a list of items means any combination of those items, including any combination of single items or plural items; for example, at least one of a, b, or c may be: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c may each be singular or plural. In addition, to describe the technical solutions of the embodiments clearly, words such as "first" and "second" are used to distinguish identical or similar items with substantially the same functions and effects; those skilled in the art will understand that these words do not limit quantity or execution order, and do not necessarily denote different items. In the embodiments of the present disclosure, words such as "exemplary" or "for example" indicate an example, illustration, or explanation; any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, such words present related concepts in a concrete fashion that is easy to understand.
In addition, the network architecture and service scenarios described in the embodiments of the present disclosure are intended to describe the technical solutions more clearly and do not limit them. As those of ordinary skill in the art will appreciate, as network architectures evolve and new service scenarios emerge, the technical solutions provided here remain applicable to similar technical problems.
As inward-folding hinge technology matures, more and more folding-screen phones have entered the public eye. A folding-screen phone can be folded to a certain angle, which makes its display modes more varied. For example, when the phone is unfolded, it can display through the inner screen; when it is closed, it can display through the outer screen.
While being unfolded or closed, the folding-screen phone passes through various intermediate states, for example the tent state and the desk calendar state. For a phone folded left and right, fig. 2 (a) shows the display effect in the tent state and fig. 3 (a) the display effect in the desk calendar state. For a phone folded vertically, fig. 2 (b) shows the display effect in the tent state and fig. 3 (b) the display effect in the desk calendar state.
In some scenarios, the intermediate states of a folding-screen phone are defined by different folding angles, and the phone enters different states based on the folding angle.
Referring to fig. 4 (a), when the folding angle of the phone (i.e., the angle between the A face and the B face) is within a first preset angle range, the phone is considered to be in the unfolded state. The first preset angle range is, for example, 150° to 180°. Referring to fig. 4 (b), when the folding angle is within a second preset angle range, for example 45° to 150°, the phone is considered to be in the stand state. Referring to fig. 4 (c), when the folding angle is within a third preset angle range, for example 0° to 45°, the phone is considered to be in the folded state.
When the phone is in the stand state, its state can be subdivided by folding angle. Referring to fig. 5 (a), when the folding angle is within a fourth preset angle range, the B face lies horizontally, and this condition lasts longer than 1 s, the phone is considered to be in the mini-notebook state. The fourth preset angle range is, for example, 80° to 150°. When the folding angle is outside this range (greater than 150° or less than 80°), or the B face does not lie horizontally, the phone is not in the mini-notebook state.
Referring to fig. 5 (b), when the folding angle is within a fifth preset angle range, the A face and the B face form an inverted-V placement, and this condition lasts longer than 1 s, the phone is considered to be in the tent state. The fifth preset angle range is, for example, 45° to 80°. When the folding angle is outside this range (the angle between the A face and the B face is greater than 80° or less than 45°), or the A and B faces do not form an inverted-V placement, the phone is not in the tent state.
Referring to fig. 5 (c), when the phone is playing music or showing a schedule, the angle between the A face and the B face is within the fifth preset angle range, and this condition lasts longer than 1 s, the phone is considered to be in the desk calendar state. When the folding angle is outside the fifth preset angle range (greater than 80° or less than 45°), or the phone is not playing music or showing a schedule, the phone is not in the desk calendar state.
When the folding angle is within a sixth preset angle range, the screen stands vertically, and this condition lasts longer than 1 s, the phone is considered to be in the vertical-stand state. The sixth preset angle range is, for example, 45° to 150°. When the folding angle is outside this range (the angle between the A face and the B face is greater than 150° or less than 45°), or the screen does not stand vertically, the phone is not in the vertical-stand state.
It should be noted that the first to sixth preset angle ranges may be preset as required and adjusted at any time in actual use; the ranges above are not limiting.
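This angle-based classification can be sketched with the example ranges above (150° to 180° unfolded, 45° to 150° stand, 0° to 45° folded) and the "longer than 1 s" persistence check for the sub-states; all values remain configurable.

```java
final class FoldPostureSketch {
    enum Posture { UNFOLDED, STAND, FOLDED }

    static Posture classify(float angleDeg) {
        if (angleDeg >= 150f) return Posture.UNFOLDED;  // first preset range
        if (angleDeg >= 45f)  return Posture.STAND;     // second preset range
        return Posture.FOLDED;                          // third preset range
    }

    /** Sub-states (mini-notebook, tent, desk calendar, vertical stand) also
     *  require the pose to persist; a minimal "longer than 1 s" debounce. */
    static boolean persisted(long enteredAtMs, long nowMs) {
        return nowMs - enteredAtMs > 1000L;
    }
}
```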
Fig. 6 illustrates the coordinate systems of the acceleration sensors in a folding-screen phone. As shown in fig. 6, the phone includes a first acceleration sensor and a second acceleration sensor. The first acceleration sensor may also be called the secondary accelerometer (sub ACC), and the second acceleration sensor the main accelerometer (main ACC). The main ACC is located on the bottom face (the B face of the phone), and the sub ACC on the top face (the A face). The B-face acceleration sensor can be identified by 0x1 and the B-face gyro sensor by 0x4; the A-face acceleration sensor can be identified by 0x20001 and the A-face gyro sensor by 0x20003. It should be noted that the A-face acceleration and gyro sensors may also be represented by other identifiers, which the present disclosure does not limit.
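Collected as constants, the identifiers quoted above look like this; per the text, the A-face values are only one possible assignment.

```java
final class SensorIds {
    static final int B_FACE_ACC  = 0x1;       // main ACC (B face)
    static final int B_FACE_GYRO = 0x4;
    static final int A_FACE_ACC  = 0x20001;   // sub ACC (A face)
    static final int A_FACE_GYRO = 0x20003;
}
```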
Under the stock Google Android system, only the main ACC's acceleration data can be obtained on a folding-screen phone. When the phone is in an intermediate state (for example, the desk calendar state), its display interface is the C face, yet the acceleration sensor associated with the C face is the sub ACC, not the main ACC. In the related art, the phone adjusts the display orientation of the C face according to the main ACC's acceleration data, which can deviate the C-face display, for example inverting it, and directly harms the experience of using the phone.
The embodiments of the present disclosure therefore provide a data processing method with which a folding-screen phone in an intermediate state can accurately adjust its display orientation and correctly display the target display interface, giving the user a better experience when the phone switches between landscape and portrait display.
Fig. 7 shows a schematic hardware structure of a terminal device. As shown in fig. 7, the terminal device may include: a processor 710, an external memory interface 720, an internal memory 721, a universal serial bus (USB) interface 730, a charge management module 740, a power management module 741, a battery 742, an antenna 1, an antenna 2, a mobile communication module 750, a wireless communication module 760, an audio module 770, a speaker 770A, a receiver 770B, a microphone 770C, a headset interface 770D, a sensor module 780, keys 790, a motor 791, an indicator 792, a camera 793, a display screen 794, a subscriber identity module (SIM) card interface 795, and so on. The sensor module 780 may include a pressure sensor 780A, a gyro sensor 780B, an air pressure sensor 780C, a magnetic sensor 780D, an acceleration sensor 780E, a distance sensor 780F, a proximity light sensor 780G, a fingerprint sensor 780H, a temperature sensor 780J, a touch sensor 780K, an ambient light sensor 780L, a bone conduction sensor 780M, and the like.
It will be appreciated that the structure illustrated in this embodiment does not constitute a specific limitation on the terminal device. In other embodiments, the terminal device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of the two.
The processor 710 may include one or more processing units. For example, the processor 710 may include an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated in one or more processors.
The charge management module 740 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 741 is configured to connect the battery 742, the charge management module 740, and the processor 710. The power management module 741 receives input from the battery 742 and/or the charge management module 740 and supplies power to the processor 710, the internal memory 721, the display screen 794, the camera 793, the wireless communication module 760, and the like.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 750, the wireless communication module 760, the modem, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the terminal device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 750 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied to a terminal device.
The wireless communication module 760 may provide solutions for wireless communication applied to the terminal device, including wireless local area network (WLAN) (e.g., wireless fidelity, Wi-Fi), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on. The wireless communication module 760 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via the antenna 2, frequency-modulates and filters the signals, and sends the processed signals to the processor 710. It may also receive signals to be transmitted from the processor 710, frequency-modulate and amplify them, and convert them into electromagnetic waves radiated via the antenna 2.
The terminal device implements display functions through the GPU, the display screen 794, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 794 and the application processor.
The display screen 794 is used to display images, videos, and the like. A series of graphical user interfaces (GUIs) may be displayed on the display screen 794 of the terminal device.
The terminal device may implement a photographing function through an ISP, a camera 793, a video codec, a GPU, a display screen 794, an application processor, and the like.
The camera 793 is used to capture still images or video.
The external memory interface 720 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device.
Internal memory 721 may be used to store computer-executable program code that includes instructions. The processor 710 executes various functional applications of the terminal device and data processing by executing instructions stored in the internal memory 721.
The terminal device can implement audio functions, such as music playback and recording, through the audio module 770, the speaker 770A, the receiver 770B, the microphone 770C, the headset interface 770D, the application processor, and so on. The terminal device may also include the pressure sensor 780A, the air pressure sensor 780C, the gyro sensor 780B, the magnetic sensor 780D, the acceleration sensor 780E, the distance sensor 780F, the proximity light sensor 780G, the ambient light sensor 780L, the fingerprint sensor 780H, the temperature sensor 780J, the touch sensor 780K, the bone conduction sensor 780M, the keys 790, the motor 791, the indicator 792, and the like.
The SIM card interface 795 is used to connect a SIM card. A SIM card can be inserted into or withdrawn from the SIM card interface 795 to bring it into contact with or separate it from the terminal device. The terminal device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 795 may support Nano SIM cards, Micro SIM cards, and so on. The same SIM card interface 795 may accept multiple cards at once, and may also be compatible with external memory cards. The terminal device interacts with the network through the SIM card to implement calls, data communication, and other functions.
Further, an operating system runs on the above components, for example HarmonyOS, iOS, Android, or Windows. Applications may be installed and run on the operating system. In other embodiments, multiple operating systems may run within the terminal device.
It should be understood that the hardware modules of the terminal device shown in fig. 7 are described only as examples and do not limit its specific structure. In fact, the terminal device provided in the embodiments of the present disclosure may further include other hardware modules that interact with the illustrated ones, which is not specifically limited here. For example, the terminal device may further include a flash, a micro projection device, and so on. If the terminal device is a PC, it may further include a keyboard, a mouse, and so on.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present disclosure take an Android system with a layered architecture as an example to illustrate the software structure of the terminal device.
Fig. 8 is a block diagram of the software structure of a terminal device according to an embodiment of the present disclosure. The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. In some embodiments, the Android system is divided into six layers: from top to bottom, the application layer (APP), the application framework layer (FWK), the system libraries and Android Runtime, the hardware abstraction layer (HAL), the sensor subsystem framework layer, and the sensor subsystem driver layer.
The application layer may include a series of applications, for example music, calendar, maps, Bluetooth, games, video, camera, phone, navigation, and browser applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
By way of example, the application framework layer may include a view system, a window manager, a notification manager, a content provider, system services, an activity manager, and a resource manager, which the embodiments of the present disclosure do not limit in any way.
The window manager (WindowManagerService) manages the graphical user interface (GUI) resources used on screen, specifically: obtaining the screen size, creating and destroying windows, displaying and hiding windows, window layout, focus management, and input method and wallpaper management.
System services (SystemService) are used to implement system monitoring and management functions, resource allocation and scheduling functions, security and rights management functions, network communication and protocol support, user interface and interaction support functions, and background task execution functions, etc. The system service can ensure the normal operation of the Android system, and provides an efficient, stable and safe processing environment for users. Through the collaborative work of system services, the Android system can effectively meet the demands of users and complete various tasks.
As shown in fig. 8, a sensor service may be provided among the system services. An application in the application layer may start the sensor service by calling a preset API. During operation, the sensor service may interact with the Sensor Hardware Interface Definition Language (Sensor HIDL) service in the hardware abstraction layer (HAL).
The activity manager (Activity Manager) manages the lifecycle of each application. Applications typically run in the operating system in the form of Activities. For each Activity, the activity manager keeps a corresponding ActivityRecord that records the state of the application's Activity; the activity manager uses this record as an identifier to schedule the application's Activity processes.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The system library may include a plurality of functional modules, for example: the surface manager (Surface Manager), the media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), and two-dimensional image engines (e.g., SGL).
The surface manager manages the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of common audio and video formats as well as still image files, and support many audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library implements three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The two-dimensional image engine is a drawing engine for 2D drawing.
The Android Runtime includes a core library and virtual machines, and is responsible for the scheduling and management of the Android system. The core library consists of two parts: the functions that the Java language needs to call, and the Android core library. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs object lifecycle management, stack management, thread management, security and exception management, garbage collection, and other functions.
The HAL layer includes the Sensor HIDL and a Qualcomm inter-core communication interface (the QMI inter-core communication module). The Sensor HIDL includes a main acceleration sensor module (main ACC module) and a secondary acceleration sensor module (sub ACC module). Through the QMI inter-core communication module and the Qualcomm communication interface (the QMI communication module), the main ACC module and the sub ACC module receive the target acceleration data corresponding to the main ACC and to the sub ACC sent by the sensor client management module, so that applications can subsequently adjust the display orientation of the target display interface based on this data and the target display interface is displayed correctly. The QMI inter-core communication module implements communication between different processors in the Android system.
The sensor subsystem framework includes several functional modules, for example the QMI communication module, the sensor client management module (sensor client manager), and the event distribution management module.
The QMI communication module is a standard interface implementing data transmission, network connection management, communication protocol support, asynchronous event handling, and the like. In the present disclosure, the QMI communication module and the QMI inter-core communication module enable interaction between the Sensor HIDL in the HAL layer and the sensor client management module.
The sensor client management module is a software component that implements sensor detection and identification, sensor data reading, sensor event monitoring, sensor parameter setting, power management, and the like in the terminal device, allowing the terminal device to operate and use its sensors conveniently and realize more functions. In the present disclosure, the sensor client management module sends the instructions of the terminal device, or of applications in the terminal device, to the event distribution management module so as to obtain the corresponding sensor data. Exemplary sensor data includes at least one of acceleration data, gyroscope data, light data, motion data, and posture data.
The event distribution management module is a software component for managing and handling events. Its main functions include event registration and monitoring, event distribution, event handling, event filtering and forwarding, event priority and order management, and event thread management, helping applications implement event-driven behavior. In the present disclosure, in response to instructions from the sensor client management module, the event distribution management module may direct the acceleration virtual sensor to monitor the posture sensor and the common sensors. The event distribution management module may also obtain, through the exchange module, the target acceleration data corresponding to the main acceleration sensor and to the secondary acceleration sensor.
The sensor subsystem driver includes several functional modules, for example: the acceleration virtual sensor, the posture sensor, the common sensors, the main acceleration sensor, the secondary acceleration sensor, the motion sensor, and the exchange module.
The acceleration virtual sensor can respond to instructions from the event distribution management module by monitoring the posture sensor and the common sensors, so as to determine the posture data of the terminal device from the pose data of the posture sensor and the display state data of the common sensors.
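Its role can be sketched as a simple fan-in; the interface and callback names are assumptions, not the driver's real API.

```java
final class AccVirtualSensor {
    interface Sink {
        void onPosture(float foldAngleDeg, boolean innerOn, boolean outerOn);
    }

    private final Sink sink;
    private float foldAngleDeg;
    private boolean innerOn, outerOn;

    AccVirtualSensor(Sink sink) { this.sink = sink; }

    /** Callback from the posture sensor: the current fold angle. */
    void onPostureSensor(float angleDeg) { foldAngleDeg = angleDeg; publish(); }

    /** Callback from a common sensor: per-screen display states. */
    void onDisplayState(boolean inner, boolean outer) {
        innerOn = inner; outerOn = outer; publish();
    }

    private void publish() { sink.onPosture(foldAngleDeg, innerOn, outerOn); }
}
```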
The posture sensor senses the tilt angle, rotation angle, orientation, and the like of the terminal device. It may be composed of several sensors, such as an acceleration sensor, a gyro sensor, and a magnetometer.
A common sensor is a common type of sensor. For example, common sensors include: the acceleration sensor, the gyro sensor (Gyroscope), the magnetometer (Magnetometer), the light sensor (Ambient Light Sensor), the distance sensor (Proximity Sensor), the global positioning system (GPS), the fingerprint sensor (Fingerprint Sensor), the barometer (Barometer), and so on.
The main acceleration sensor detects the acceleration and movement direction of the terminal device's B face along three axes. The secondary acceleration sensor detects the acceleration and movement direction of the A face along three axes.
The motion sensor senses and records the motion state and activity of the terminal device, acquiring motion information by detecting parameters such as acceleration, angular velocity, and magnetic field.
The exchange module receives notification messages from the event distribution management module and, in response, processes the acceleration data corresponding to the main ACC and to the sub ACC to obtain the target acceleration data corresponding to each. The exchange module also uploads the target acceleration data corresponding to the main ACC and to the sub ACC.
A data processing method provided by an embodiment of the present disclosure is described in detail below with reference to the drawings. The method can be applied to a terminal device, for example a folding-screen phone. As shown in fig. 9, the method includes the following steps.
Step 901: the terminal device detects the user's power-on operation, boots, and sends a first notification to the acceleration virtual sensor (ACC virtual sensor).
The power-on operation triggers the terminal device to boot.
The power-on operation may be, for example, a long press of the power key, a double tap, a knuckle tap, or a multi-finger selection. In some examples, when a user needs the terminal device, the user triggers it to boot through the power-on operation; after booting, the device can provide the user with multiple functions through its applications.
After booting, the terminal device can send the first notification to the ACC virtual sensor in order to obtain the posture data of the terminal device.
In some examples, the first notification may be a snoop request. The monitoring request is used for indicating the ACC virtual sensor to monitor the gesture sensor and the common sensor so as to acquire the gesture data corresponding to the gesture sensor and the display state data corresponding to the common sensor. And finally, acquiring the gesture data of the terminal equipment according to the gesture data corresponding to the gesture sensor and the display state data corresponding to the common sensor.
The pose data of the terminal device includes pose data and display state data of the terminal device. The pose data of the terminal device is used for representing the folding pose of the terminal device. The display state data of the terminal device is used for representing the display state of each screen in the terminal device. For example, the display state data of the terminal device includes data corresponding to the display state of each screen in the terminal device, data corresponding to whether or not in the sleep state, data corresponding to the duration of maintaining a certain state, and the like. Wherein each screen in the terminal device comprises an inner screen (i.e. a face and a B face) and an outer screen (i.e. a C face).
In some examples, the folded pose of the terminal device may be derived from pose data corresponding to the pose sensor. As can be seen by way of example in connection with fig. 2-5, the collapsed position of the terminal device includes an expanded position, a stand position, a collapsed position, a mini-notebook position, a tent position, a desk calendar position, a vertical screen stand position, and the like.
In some examples, a common sensor is not specific to a particular sensor, but refers to a common class of sensors. Different sensors may be used to measure, monitor or detect various physical quantities or environmental parameters. Exemplary, common sensors include: acceleration Sensor, gyroscope Sensor (gyroscillope), magnetometer Sensor (Magnetometer), light Sensor (Ambient Light Sensor), distance Sensor (Proximity Sensor), global positioning system (Global Positioning System, GPS), fingerprint Sensor (Fingerprint Sensor), barometer Sensor (Barometer), and the like.
The acceleration sensor is used for detecting the acceleration and the movement direction of the terminal equipment in three axial directions. The gyro sensor is used to detect the rotational speed and angle of the terminal device. Magnetometer sensors, also known as electronic compasses, are used to detect the magnetic field around a terminal device in order to obtain directional information of the terminal device. The light sensor is used for detecting the change of the ambient light intensity around the terminal equipment so as to adjust the screen brightness. The distance sensor is used to detect the distance between the terminal device and the object, typically to automatically turn off the screen or adjust the brightness when the earpiece is close. The GPS sensor determines the geographic location of the terminal device using a satellite positioning system. The fingerprint sensor (Fingerprint Sensor) is used for fingerprint identification and unlocking of the terminal device. An air pressure sensor (Barometer) is used to measure the atmospheric pressure. Typically, atmospheric pressure is used to determine altitude. It will be appreciated that the above plurality of sensors is only one example, and that common sensors may also include other sensors, as the disclosure is not limited in this regard.
According to the display state data corresponding to the common sensor, the display state of each screen in the terminal device can be obtained. Illustratively, the display state data of the terminal device further includes data corresponding to time stamp information, data corresponding to timing information recorded by a timer, and the like, which is not limited by the present disclosure.
In some examples, the terminal device sends the first notification to the ACC virtual sensor when the terminal device is powered on. The process of sending the first notification to the ACC virtual sensor includes the following steps. First, the terminal device sends a listening request to the sensor service (SensorService) in the system service (SystemService), where the listening request is used for requesting the gesture data of the terminal device. The SensorService receives the listening request. Then, in response to the listening request, the SensorService transmits a listening state request to the Sensor HIDL service, and the Sensor HIDL service transmits the listening state request to the Sensor client manager through the QMI inter-core communication module and the QMI communication module. After receiving the listening state request, the Sensor client manager sends it to the event distribution management module. After receiving the listening state request, the event distribution management module sends the first notification to the ACC virtual sensor, so that the ACC virtual sensor listens to the pose sensor and the common sensor to obtain the gesture data of the terminal device. The Sensor HIDL service includes a main ACC module and a secondary ACC module, which are used for receiving the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, respectively.
Step 902, the ACC virtual sensor receives a first notification.
Step 903, in response to the first notification, the ACC virtual sensor listens to the pose sensor and the common sensor so as to obtain the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor.
After receiving the first notification, the ACC virtual sensor responds to it by listening to the pose sensor and the common sensor to obtain the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor. The gesture data of the terminal device can then be determined based on the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor.
Step 904, the ACC virtual sensor sends pose data corresponding to the pose sensor and display state data corresponding to the common sensor to the event distribution management module.
Step 905, the event distribution management module receives the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor, and determines the gesture data of the terminal device according to the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor.
In some examples, the event distribution management module may determine the folding pose of the terminal device according to the pose data corresponding to the pose sensor, and may determine the display state of each screen in the terminal device according to the display state data corresponding to the common sensor. The gesture data of the terminal device is then obtained from the folding pose of the terminal device and the display state of each screen.
For example, the event distribution management module may determine, according to the pose data corresponding to the pose sensor, that the terminal device is in the unfolded state, and determine, according to the display state data corresponding to the common sensor, that the inner screen of the terminal device is in the screen-off state and the outer screen is in the screen-on state. The resulting gesture data of the terminal device is then: the terminal device is in the unfolded state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state.
Step 906, the event distribution management module determines that the terminal device is in a target state based on the gesture data of the terminal device.
After determining the gesture data of the terminal device, the event distribution management module compares it with the state data corresponding to the target state. If the gesture data of the terminal device is consistent with the state data corresponding to the target state, the terminal device is determined to be in the target state; otherwise, the terminal device is determined not to be in the target state.
In some examples, the target state includes three states: a first state, a second state, and a third state. The first state is that the terminal device is in the tent state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state. The second state is that the terminal device is in the stand state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state. The third state is that the terminal device is in the unfolded state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state. It is to be understood that the target state is not limited to these three states; other states may be added according to actual requirements, and the present disclosure is not limited in this regard.
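Continuing the sketch above, step 906 can be viewed as matching the gesture data against a small table of target states. The table contents mirror the three states just described, while the function and variable names remain illustrative assumptions of this sketch.

```cpp
#include <array>

// One target state: a folding pose plus required screen states.
struct TargetState {
    FoldPose pose;
    ScreenState innerScreen;
    ScreenState outerScreen;
};

// First, second and third target states: tent / stand / unfolded,
// each with the inner screen off and the outer screen on.
constexpr std::array<TargetState, 3> kTargetStates{{
    {FoldPose::Tent,     ScreenState::Off, ScreenState::On},
    {FoldPose::Stand,    ScreenState::Off, ScreenState::On},
    {FoldPose::Unfolded, ScreenState::Off, ScreenState::On},
}};

// Returns true when the gesture data matches any configured target state.
bool isInTargetState(const DevicePosture& p) {
    for (const auto& s : kTargetStates) {
        if (p.pose == s.pose && p.innerScreen == s.innerScreen &&
            p.outerScreen == s.outerScreen) {
            return true;
        }
    }
    return false;
}
```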
Step 907, the event distribution management module sends a second notification to the exchange module in the case that the terminal device is in the target state.
The second notification may be an exchange request. The exchange request instructs the exchange module to process the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC according to a preset rule, so as to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
As described above, the terminal device can only acquire the acceleration data reported by the main ACC, where the main ACC is the ACC corresponding to the B face. When the terminal device is in the target state, the target display interface of the terminal device is on the C face. The coordinate system corresponding to the target display interface is not the same coordinate system as that of the main ACC, while it has an opposite relationship with the coordinate system of the secondary ACC. Therefore, the target acceleration data corresponding to the target display interface can be obtained from the secondary ACC.
Specifically, a preset rule is created according to the opposite relationship between the coordinate system corresponding to the target display interface and the coordinate system of the secondary ACC, and the target acceleration data corresponding to the target display interface is obtained based on this preset rule. The preset rule is that, when the terminal device is in the target state, the acceleration data corresponding to the secondary ACC is subjected to inverse processing to obtain processed acceleration data, and the processed acceleration data is used as the target acceleration data corresponding to the target display interface.
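Continuing the sketch above, the preset rule can be written as a single mapping. The axis treatment follows Table 1 below (X and Z negated, Y kept), and the Accel type is a hypothetical illustration rather than the patent's actual data structure.

```cpp
// Acceleration sample; a hypothetical type for illustration only.
struct Accel { float x, y, z; };

// Inverse processing under the preset rule: map a secondary-ACC sample
// into the coordinate system of the target display interface (C face).
// Per Table 1, X and Z are negated while Y is kept.
Accel toTargetDisplayFrame(const Accel& sub) {
    return Accel{-sub.x, sub.y, -sub.z};
}
```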
Because the terminal device can only acquire the acceleration data reported by the main ACC, the exchange module needs to exchange the target acceleration data corresponding to the target display interface (i.e., the processed acceleration data) with the acceleration data corresponding to the main ACC, so that the terminal device can acquire the target acceleration data corresponding to the target display interface.
In some examples, a cache unit is embedded in the exchange module. The cache unit includes a cache pool corresponding to the main ACC and a cache pool corresponding to the secondary ACC. The exchange module exchanges the target acceleration data corresponding to the target display interface with the acceleration data corresponding to the main ACC, so that the target acceleration data corresponding to the target display interface is stored in the cache pool corresponding to the main ACC, and the acceleration data corresponding to the main ACC is stored in the cache pool corresponding to the secondary ACC. The data in the two cache pools can then be reported to the event distribution management module.
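A minimal sketch of this exchange follows, reusing the Accel type and toTargetDisplayFrame above; the two-pool layout and function names are illustrative assumptions, not the patent's implementation.

```cpp
#include <deque>

// Cache unit embedded in the exchange module: one pool per ACC.
struct CacheUnit {
    std::deque<Accel> mainPool;  // drained on the main-ACC interrupt
    std::deque<Accel> subPool;   // drained on the secondary-ACC interrupt
};

// In the target state, the processed secondary-ACC sample goes into the
// main pool (so the system reads usable data from the "main ACC"), and
// the raw main-ACC sample goes into the secondary pool.
void exchange(CacheUnit& cache, const Accel& mainSample, const Accel& subSample) {
    cache.mainPool.push_back(toTargetDisplayFrame(subSample));
    cache.subPool.push_back(mainSample);
}
```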
Illustratively, the relationship between the target state and the cached data is shown in Table 1 below.

TABLE 1

Target state                                          Cache pool of the main ACC    Cache pool of the secondary ACC
Tent state, inner screen off, outer screen on         -X_sub1, Y_sub1, -Z_sub1      -X_main1, Y_main1, Z_main1
Stand state, inner screen off, outer screen on        -X_sub2, Y_sub2, -Z_sub2      -X_main2, Y_main2, Z_main2
Unfolded state, inner screen off, outer screen on     -X_sub3, Y_sub3, -Z_sub3      -X_main3, Y_main3, Z_main3
According to Table 1, when the target state is that the terminal device is in the tent state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state, the cached data in the cache pool corresponding to the main ACC in the cache unit is -X_sub1, Y_sub1 and -Z_sub1, and the cached data in the cache pool corresponding to the secondary ACC is -X_main1, Y_main1 and Z_main1. Here, X_sub1, Y_sub1 and Z_sub1 denote the acceleration data acquired by the secondary ACC, in the coordinate system of the secondary ACC, when the terminal device is in the tent state with the inner screen off and the outer screen on; X_main1, Y_main1 and Z_main1 denote the acceleration data acquired by the main ACC, in the coordinate system of the main ACC, in the same state.
When the target state is that the terminal device is in the stand state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state, the cached data in the cache pool corresponding to the main ACC is -X_sub2, Y_sub2 and -Z_sub2, and the cached data in the cache pool corresponding to the secondary ACC is -X_main2, Y_main2 and Z_main2. Here, X_sub2, Y_sub2 and Z_sub2 denote the acceleration data acquired by the secondary ACC in the stand state with the inner screen off and the outer screen on, in the coordinate system of the secondary ACC; X_main2, Y_main2 and Z_main2 denote the acceleration data acquired by the main ACC in the same state, in the coordinate system of the main ACC.
When the target state is that the terminal device is in the unfolded state, the inner screen is in the screen-off state, and the outer screen is in the screen-on state, the cached data in the cache pool corresponding to the main ACC is -X_sub3, Y_sub3 and -Z_sub3, and the cached data in the cache pool corresponding to the secondary ACC is -X_main3, Y_main3 and Z_main3. Here, X_sub3, Y_sub3 and Z_sub3 denote the acceleration data acquired by the secondary ACC in the unfolded state with the inner screen off and the outer screen on, in the coordinate system of the secondary ACC; X_main3, Y_main3 and Z_main3 denote the acceleration data acquired by the main ACC in the same state, in the coordinate system of the main ACC.
Step 908, the exchange module receives the second notification.
Step 909, in response to the second notification, the exchange module processes the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC according to the preset rule, so as to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
After receiving the second notification, the exchange module responds to it by processing the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC. From these target acceleration data, the terminal device can acquire the target acceleration data corresponding to the target display interface, so that the target display interface is displayed correctly.
In some examples, this processing may include the following. First, the exchange module performs inverse processing on the acceleration data corresponding to the secondary ACC according to the preset rule to obtain the processed acceleration data, which is the target acceleration data corresponding to the target display interface. The acceleration data corresponding to the main ACC is then exchanged with the processed acceleration data, so that the target acceleration data corresponding to the main ACC is the processed acceleration data, and the target acceleration data corresponding to the secondary ACC is the original acceleration data corresponding to the main ACC.
In other examples, in order to keep the target acceleration data corresponding to the secondary ACC consistent with the target acceleration data corresponding to the main ACC, the target acceleration data corresponding to the secondary ACC may also be obtained by performing inverse processing on the acceleration data corresponding to the main ACC.
In general, after the main ACC acquires its acceleration data, it stores the data in the cache pool corresponding to the main ACC; after the secondary ACC acquires its acceleration data, it stores the data in the cache pool corresponding to the secondary ACC. When the main ACC interrupt arrives, the exchange module reads the acceleration data corresponding to the main ACC from the cache pool corresponding to the main ACC; when the secondary ACC interrupt arrives, the exchange module reads the acceleration data corresponding to the secondary ACC from the cache pool corresponding to the secondary ACC.
As shown in fig. 10, if the exchange module receives the second notification, the exchange module may cause the main ACC to store its acceleration data into the cache pool corresponding to the secondary ACC. The acceleration data corresponding to the main ACC in the cache pool corresponding to the secondary ACC is then used as the target acceleration data corresponding to the secondary ACC. When the secondary ACC interrupt arrives, the exchange module reads the target acceleration data corresponding to the secondary ACC from the cache pool corresponding to the secondary ACC and uploads it to the event distribution management module.
Meanwhile, the exchange module causes the secondary ACC to store its acceleration data into the cache pool corresponding to the main ACC. The exchange module performs inverse processing on this data according to the preset rule to obtain the processed acceleration data (i.e., the target acceleration data corresponding to the target display interface), replaces the original data with the processed acceleration data, and uses the processed acceleration data as the target acceleration data corresponding to the main ACC. When the main ACC interrupt arrives, the exchange module reads the target acceleration data corresponding to the main ACC from the cache pool corresponding to the main ACC and uploads it to the event distribution management module.
In other examples, if the exchange module receives the second notification, after the main ACC stores its acceleration data in the cache pool corresponding to the secondary ACC, the exchange module may further perform inverse processing on that data to obtain the target acceleration data corresponding to the secondary ACC.
In some examples, after the secondary ACC acquires its acceleration data, if the exchange module has received the second notification, the exchange module performs inverse processing on the acceleration data corresponding to the secondary ACC to obtain the processed acceleration data, uses it as the target acceleration data corresponding to the main ACC, and stores it in the cache pool corresponding to the main ACC. When the main ACC interrupt arrives, the exchange module reads the target acceleration data corresponding to the main ACC from the cache pool corresponding to the main ACC and uploads it to the event distribution management module.
When the main ACC acquires its acceleration data, the exchange module uses the acceleration data corresponding to the main ACC as the target acceleration data corresponding to the secondary ACC and stores it in the cache pool corresponding to the secondary ACC. When the secondary ACC interrupt arrives, the exchange module reads the target acceleration data corresponding to the secondary ACC from the cache pool corresponding to the secondary ACC and uploads it to the event distribution management module.
In other examples, when the main ACC acquires its acceleration data, the exchange module may instead perform inverse processing on the acceleration data corresponding to the main ACC to obtain the target acceleration data corresponding to the secondary ACC, and store it in the cache pool corresponding to the secondary ACC.
In some examples, the arrival times of the main ACC interrupt and the secondary ACC interrupt may be preset. Illustratively, the exchange module presets the main ACC interrupt and the secondary ACC interrupt to occur at a fixed period, for example, 100 ms.
In some examples, the exchange module may implement the above processing of the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC based on a software first-in first-out (First Input First Output, FIFO) mechanism, so as to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
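A sketch of this soft FIFO reporting path follows, continuing the CacheUnit sketch above; the fixed 100 ms period is assumed to be driven elsewhere, and all names remain illustrative.

```cpp
#include <optional>

// Called on the main-ACC interrupt (e.g. every 100 ms): drain the oldest
// sample from the main pool and upload it to the event distribution
// management module.
std::optional<Accel> onMainAccInterrupt(CacheUnit& cache) {
    if (cache.mainPool.empty()) return std::nullopt;
    Accel sample = cache.mainPool.front();  // first in, first out
    cache.mainPool.pop_front();
    return sample;
}

// Called on the secondary-ACC interrupt, symmetrically.
std::optional<Accel> onSubAccInterrupt(CacheUnit& cache) {
    if (cache.subPool.empty()) return std::nullopt;
    Accel sample = cache.subPool.front();
    cache.subPool.pop_front();
    return sample;
}
```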
Step 910, the exchange module sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the event distribution management module.
Step 911, the event distribution management module receives and reports the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
After receiving the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, the event distribution management module reports them, so that the terminal device adjusts the display direction of the target display interface according to the target acceleration data, and the target display interface is displayed correctly.
In some examples, after the terminal device receives the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, the terminal device may use them directly. When the terminal device has a specific processing mechanism, it may decide on its own whether to adjust the display direction of the target display interface using the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, or using the original acceleration data corresponding to the main ACC and the original acceleration data corresponding to the secondary ACC. The present disclosure is not limited in this regard. The specific processing mechanism may be used to solve the problems presented in the background.
In some examples, the reporting, by the event distribution management module, of the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC may include the following. First, the event distribution management module sends a reply message to the Sensor client manager, the reply message including the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC. The Sensor client manager receives the reply message and, in response, transmits an acceleration data reply message to the Sensor HIDL service through the QMI inter-core communication module and the QMI communication module, the acceleration data reply message including the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC. The Sensor HIDL service receives the acceleration data reply message and, in response, sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the SensorService in the SystemService. The terminal device subsequently performs the corresponding adjustment operation on the display interface based on the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC in the SensorService.
In some examples, after the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC are received, if the display direction of the target display interface needs to be adjusted, it may be adjusted using the target acceleration data corresponding to the main ACC. If no adjustment is needed, the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC may be released.
Therefore, when the terminal device is powered on, the folding pose of the terminal device and the display state of each screen can be obtained through the pose sensor and the common sensor, and the state of the terminal device can be obtained from the folding pose and the display state of each screen. When the terminal device is in the target state, the display interface corresponding to the first acceleration sensor has an opposite relationship with the target display interface, so the target acceleration data corresponding to the target display interface can be accurately obtained using this opposite relationship and the acceleration data corresponding to the secondary ACC. If the target acceleration data is subsequently used, the target display interface can be adjusted with it so as to be displayed correctly, thereby improving the user experience.
The following describes a flow of a data processing method according to an embodiment of the present disclosure, as shown in fig. 11, specifically including:
Step 1101a, in response to a user operation, the application to be processed initiates a first listening event.
The first listening event is used for indicating to listen to the main ACC and the secondary ACC.
By way of example, the user operation may be any one of a single-click operation, a double-click operation, a knuckle tap, and a multi-finger selection operation. The application to be processed is an application installed in the terminal device, and may be a system application or a third-party application. For example, the application to be processed may be a video application, a photographing application, a conference application, a sports application, a music application, and so on. The present disclosure does not limit the application to be processed.
In some examples, when the application to be processed is a video application, the user operation may be a video playing operation. In response to the video playing operation, the application to be processed recognizes the direction of the terminal device by listening to the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC. The video can then be displayed and played for the user correctly on the target display interface according to the direction of the terminal device.
In some examples, when the application to be processed is a photographing application, the user operation may be a photographing operation. In response to the photographing operation, the application to be processed recognizes the direction of the terminal device by listening to the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC, so that the photographed object is displayed correctly for the user on the photographing interface (i.e., the target display interface), and the user can photograph based on the object displayed on the photographing interface.
Step 1101b, in response to the user operation, the application to be processed initiates a second listening event.
The second listening event is used for indicating to listen to the motion sensor.
In some examples, in response to the user operation, the application to be processed initiates the second listening event to listen to the motion sensor in the terminal device. The motion state of the terminal device can be detected by the motion sensor, and the target display interface of the terminal device can then be adjusted according to the motion state of the terminal device.
Step 1102, sending a first notification to the ACC virtual sensor in the case that the application to be processed triggers the first listening event or the second listening event.
When the application to be processed triggers the first listening event or the second listening event, the first notification may be sent to the ACC virtual sensor, so that the gesture data of the terminal device is obtained based on the ACC virtual sensor. In practical applications, if other scenarios also need to listen to the pose sensor and the common sensor through the ACC virtual sensor, other listening events may be added. The present disclosure does not limit the listening events.
For sending the first notification to the ACC virtual sensor in the embodiment of the present disclosure, reference may be made to step 901 in the embodiment shown in fig. 9, which is not described herein again.
Step 1103, the ACC virtual sensor receives a first notification.
Step 1104, in response to the first notification, the ACC virtual sensor listens to the pose sensor and the common sensor to obtain the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor.
For the ACC virtual sensor receiving the first notification and, in response, listening to the pose sensor and the common sensor to obtain the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor, reference may be made to steps 902 to 903 in the embodiment shown in fig. 9, which are not described herein again.
Step 1105, the ACC virtual sensor sends pose data corresponding to the pose sensor and display state data corresponding to the common sensor to the event distribution management module.
Step 1106, the event distribution management module receives the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor, and determines the gesture data of the terminal device according to the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor.
For the ACC virtual sensor sending the pose data corresponding to the pose sensor and the display state data corresponding to the common sensor to the event distribution management module, and the event distribution management module receiving these data and determining the gesture data of the terminal device accordingly, reference may be made to steps 904 to 905 in the embodiment shown in fig. 9, which are not described herein again.
Step 1107, the event distribution management module determines that the terminal device is in a target state based on the gesture data of the terminal device.
For the event distribution management module determining, based on the gesture data of the terminal device, that the terminal device is in the target state, reference may be made to step 906 in the embodiment shown in fig. 9, which is not described herein again.
Step 1108, the event distribution management module sends a second notification to the exchange module in the case that the terminal device is in the target state.
For the event distribution management module sending the second notification to the exchange module in the case that the terminal device is in the target state, reference may be made to step 907 in the embodiment shown in fig. 9, which is not described herein again.
Step 1109, the exchange module receives a second notification.
Step 1110, in response to the second notification, the exchange module processes the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC according to the preset rule, so as to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
For the exchange module receiving the second notification and, in response, processing the acceleration data corresponding to the main ACC and the acceleration data corresponding to the secondary ACC to obtain the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, reference may be made to steps 908 to 909 in the embodiment shown in fig. 9, which are not described herein again.
Step 1111, the exchange module sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the event distribution management module.
Step 1112, the event distribution management module receives the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC.
After receiving the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, the event distribution management module sends them to the application to be processed. The application to be processed adjusts the display direction of the target display interface according to the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, so that the target display interface is displayed correctly.
Step 1113, the event distribution management module sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the application to be processed.
In some examples, this sending may include the following. First, the event distribution management module sends a reply message to the Sensor client manager, the reply message including the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC. The Sensor client manager receives the reply message and, in response, transmits an acceleration data reply message to the Sensor HIDL service through the QMI inter-core communication module and the QMI communication module, the acceleration data reply message including the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC. The Sensor HIDL service receives the acceleration data reply message and, in response, sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the SensorService in the SystemService. After receiving them, the SensorService sends the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC to the application to be processed, so that the application to be processed subsequently performs the corresponding adjustment operation on the display direction of the target display interface based on these data.
Step 1114, the application to be processed receives the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, and adjusts the display direction of the target display interface based on them to obtain the adjusted display direction.
After receiving the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, the application to be processed may adjust the display direction of the target display interface according to the target acceleration data corresponding to the main ACC to obtain the adjusted display direction, and display the target display interface correctly using the adjusted display direction.
In some examples, adjusting the display direction of the target display interface based on the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC may specifically be adjusting it based on the target acceleration data corresponding to the main ACC alone, so as to obtain the adjusted display direction.
In other examples, this adjustment may be implemented as follows: a target direction is determined based on the target acceleration data corresponding to the main ACC, and when the target direction differs from the current display direction of the target display interface, the display direction is adjusted to the target direction (i.e., the adjusted display direction). Illustratively, adjusting the display direction to the target direction may be switching landscape display to portrait display, switching portrait display to landscape display, rotating the display direction by 180 degrees, and so on.
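As one possible illustration of how a target direction could be derived from the target acceleration data, the following sketch uses simple gravity-axis thresholding; real orientation logic typically adds hysteresis and tilt checks, and none of these names come from the present disclosure.

```cpp
#include <cmath>

enum class Orientation { Portrait, PortraitFlipped, Landscape, LandscapeFlipped };

// Derive a candidate display direction from one acceleration sample.
// When the device is held still, gravity dominates the X/Y components:
// the larger-magnitude axis picks landscape vs. portrait, its sign the flip.
Orientation orientationFromAccel(const Accel& a) {
    if (std::fabs(a.x) > std::fabs(a.y)) {
        return a.x > 0 ? Orientation::Landscape : Orientation::LandscapeFlipped;
    }
    return a.y > 0 ? Orientation::Portrait : Orientation::PortraitFlipped;
}
```

Under this assumption, the application would compare the derived direction with the current display direction and rotate the target display interface only when the two differ.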
In some examples, when the application to be processed has a specific processing mechanism, it may decide on its own, according to actual use, whether to adjust the display direction of the target display interface using the target acceleration data corresponding to the main ACC and the target acceleration data corresponding to the secondary ACC, or using the original acceleration data corresponding to the main ACC and the original acceleration data corresponding to the secondary ACC. The present disclosure is not limited in this regard. The specific processing mechanism may be used to solve the problems presented in the background.
Therefore, when the application to be processed initiates the first listening event or the second listening event, the folding pose of the terminal device and the display state of each screen can be obtained through the pose sensor and the common sensor, and the state of the terminal device can be obtained from the folding pose and the display state of each screen. When the terminal device is in the target state, the display interface corresponding to the first acceleration sensor has an opposite relationship with the target display interface, so the target acceleration data corresponding to the target display interface can be accurately obtained using this opposite relationship and the acceleration data corresponding to the secondary ACC. The application to be processed can then adjust the target display interface using the target acceleration data so that it is displayed correctly, thereby improving the user experience.
For ease of understanding, a data processing method provided by an embodiment of the present disclosure is described below with reference to fig. 12. As shown in fig. 12, the data processing method may include the following steps 1201 to 1203.
Step 1201, acquiring gesture data, where the gesture data is used to indicate the folding pose of the terminal device and the display state of each screen in the terminal device, and the display state includes a screen-on state and a screen-off state.
In some examples, acquiring the gesture data includes: acquiring pose data of a first pose sensor and display state data of a second state sensor; determining the folding pose of the terminal device according to the pose data; and determining the display state of each screen in the terminal device according to the display state data.
The first pose sensor in the embodiment of the present disclosure corresponds to the pose sensor in the embodiment shown in fig. 9, and the second state sensor corresponds to the common sensor in the embodiment shown in fig. 9; the second state sensor may therefore correspond to a plurality of sensors. For acquiring the gesture data, where the gesture data indicates the folding pose of the terminal device and the display state of each screen (including the screen-on state and the screen-off state), reference may be made to steps 901 to 905 in the embodiment shown in fig. 9, which are not described herein again.
Step 1202, when it is determined according to the gesture data that the terminal device is in the target state, obtaining the target acceleration data corresponding to the target display interface using the acceleration data corresponding to the first acceleration sensor, where the target display interface is the display interface corresponding to the target state, and the display interface corresponding to the first acceleration sensor is opposite to the target display interface.
The first acceleration sensor in the embodiment of the present disclosure corresponds to the secondary ACC in the embodiment shown in fig. 9. The acceleration data corresponding to the first acceleration sensor corresponds to the acceleration data corresponding to the secondary ACC in the embodiment shown in fig. 9. The target acceleration data corresponds to the target acceleration data corresponding to the main ACC in the embodiment shown in fig. 9.
For obtaining, when it is determined according to the gesture data that the terminal device is in the target state, the target acceleration data corresponding to the target display interface using the acceleration data corresponding to the first acceleration sensor, reference may be made to steps 906 to 909 in the embodiment shown in fig. 9, which are not described herein again.
In some examples, the target state includes any one of a first state, a second state, and a third state. The folding pose of the terminal device includes the folding angle of the inner screen of the terminal device. The first state is that the folding angle of the inner screen is within a first target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state. The second state is that the folding angle of the inner screen is within a second target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state. The third state is that the folding angle of the inner screen is within a third target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state.
The first target preset angle range in the embodiment of the present disclosure corresponds to the fifth preset angle range corresponding to the tent state in the embodiment shown in fig. 5. The second target preset angle range corresponds to the fifth preset angle range corresponding to the desk calendar state in the embodiment shown in fig. 5. The third target preset angle range corresponds to the first preset angle range corresponding to the unfolded state in the embodiment shown in fig. 5.
In some examples, obtaining the target acceleration data corresponding to the target display interface using the acceleration data corresponding to the first acceleration sensor includes: performing inverse processing on the acceleration data corresponding to the first acceleration sensor to obtain the target acceleration data, and storing the target acceleration data in a preset cache location.
For performing inverse processing on the acceleration data corresponding to the first acceleration sensor to obtain the target acceleration data and storing it in the preset cache location, reference may be made to step 909 in the embodiment shown in fig. 9, which is not described herein again.
In some examples, the preset cache location includes a cache location corresponding to the first acceleration sensor and a cache location corresponding to a second acceleration sensor. Storing the target acceleration data in the preset cache location includes: storing the target acceleration data in the cache location corresponding to the second acceleration sensor.
The cache location corresponding to the first acceleration sensor in the embodiment of the present disclosure corresponds to the cache pool corresponding to the secondary ACC in the embodiment shown in fig. 9, and the cache location corresponding to the second acceleration sensor corresponds to the cache pool corresponding to the main ACC. Storing the target acceleration data in the cache location corresponding to the second acceleration sensor corresponds to storing the target acceleration data corresponding to the main ACC in the cache pool corresponding to the main ACC in the embodiment shown in fig. 9.
For storing the target acceleration data in the cache location corresponding to the second acceleration sensor, reference may be made to step 909 in the embodiment shown in fig. 9, which is not described herein again.
Step 1203, displaying the target display interface based on the target acceleration data.
For displaying the target display interface based on the target acceleration data, reference may be made to step 1114 in the embodiment shown in fig. 11, which is not described herein again.
In some examples, before acquiring the pose data of the first pose sensor and the display state data of the second state sensor, the method further includes: detecting a power-on operation of the user, and listening to the first pose sensor and the second state sensor to acquire the pose data of the first pose sensor and the display state data of the second state sensor.
For listening to the first pose sensor and the second state sensor upon detecting the power-on operation of the user, so as to acquire the pose data of the first pose sensor and the display state data of the second state sensor, reference may be made to steps 901 to 903 in the embodiment shown in fig. 9, which are not described herein again.
In some examples, before acquiring the pose data of the first pose sensor and the display state data of the second state sensor, the method further includes: in response to a preset event, listening to the first pose sensor and the second state sensor to acquire the pose data of the first pose sensor and the display state data of the second state sensor, where the preset event includes listening to the first acceleration sensor and the second acceleration sensor, or listening to the motion sensor.
Listening to the first acceleration sensor and the second acceleration sensor in the embodiment of the present disclosure corresponds to the first listening event in the embodiment shown in fig. 11, and listening to the motion sensor corresponds to the second listening event in the embodiment shown in fig. 11.
For listening to the first pose sensor and the second state sensor in response to the preset event to acquire the pose data of the first pose sensor and the display state data of the second state sensor, reference may be made to steps 1101a, 1101b and 1102 to 1104 in the embodiment shown in fig. 11, which are not described herein again.
In some examples, the terminal device is a folding screen terminal device.
Corresponding to the methods in the foregoing embodiments, an embodiment of the present disclosure further provides a data processing apparatus. The data processing apparatus may be applied to an electronic device to implement the methods in the foregoing embodiments. The functions of the data processing apparatus may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions.
For example, fig. 13 shows a schematic structural diagram of a data processing apparatus 1300, and as shown in fig. 13, the data processing apparatus 1300 may include: an acquisition module 1301, a determination module 1302, a display module 1303, and the like.
The acquisition module 1301 is configured to acquire gesture data, where the gesture data is used to indicate the folding pose of the terminal device and the display state of each screen in the terminal device, and the display state includes a screen-on state and a screen-off state.
The determination module 1302 is configured to obtain, when it is determined according to the gesture data that the terminal device is in the target state, the target acceleration data corresponding to the target display interface using the acceleration data corresponding to the first acceleration sensor, where the target display interface is the display interface corresponding to the target state, and the display interface corresponding to the first acceleration sensor is opposite to the target display interface.
The display module 1303 is configured to display the target display interface based on the target acceleration data.
In one possible implementation, the acquisition module 1301 is further configured to acquire the pose data of the first pose sensor and the display state data of the second state sensor. The determination module 1302 is further configured to determine the folding pose of the terminal device according to the pose data, and determine the display state of each screen in the terminal device according to the display state data.
In one possible implementation, the target state includes any one of a first state, a second state, and a third state. The first state is that the folding angle of the inner screen of the terminal device is within the first target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state; the second state is that the folding angle of the inner screen is within the second target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state; the third state is that the folding angle of the inner screen is within the third target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state.
In one possible implementation, the determination module 1302 is further configured to perform inverse processing on the acceleration data corresponding to the first acceleration sensor to obtain the target acceleration data, and store the target acceleration data in the preset cache location.
In one possible implementation, the preset cache location includes the cache location corresponding to the first acceleration sensor and the cache location corresponding to the second acceleration sensor; the determination module 1302 is further configured to store the target acceleration data in the cache location corresponding to the second acceleration sensor.
In one possible implementation, the data processing apparatus further includes a listening module 1304. The listening module 1304 is configured to detect the power-on operation of the user and listen to the first pose sensor and the second state sensor, so as to acquire the pose data of the first pose sensor and the display state data of the second state sensor.
In one possible implementation, the listening module 1304 is further configured to listen to the first pose sensor and the second state sensor in response to the preset event, so as to acquire the pose data of the first pose sensor and the display state data of the second state sensor, where the preset event includes listening to the first acceleration sensor and the second acceleration sensor, or listening to the motion sensor.
In one possible implementation, the terminal device is a folding screen terminal device.
It should be understood that the division of units or modules (hereinafter referred to as units) in the above apparatus is merely a division of logical functions; in actual implementation they may be fully or partially integrated into one physical entity or physically separated. The units in the apparatus may all be implemented in the form of software called by a processing element, or all in the form of hardware; alternatively, some units may be implemented in the form of software called by a processing element and the others in the form of hardware.
For example, each unit may be a separately established processing element, or may be integrated in a certain chip of the apparatus, or may be stored in a memory in the form of a program whose function is called and executed by a certain processing element of the apparatus. Furthermore, all or some of these units may be integrated together or implemented independently. The processing element, which may also be referred to as a processor, may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each unit above, may be implemented by an integrated logic circuit of hardware in the processor element, or in the form of software called by the processing element.
It should be understood that the steps in the above method embodiments provided by the present disclosure may be accomplished by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of the methods disclosed in connection with the embodiments of the present disclosure may be embodied as being directly executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
For another example, when the units in the apparatus may be implemented in the form of a scheduler of processing elements, the processing elements may be general-purpose processors, such as CPUs or other processors that may invoke programs. For another example, the units may be integrated together and implemented in the form of a system on chip SOC.
In one implementation, the above means for implementing each corresponding step in the above method may be implemented in the form of a processing element scheduler. For example, the apparatus may comprise a processing element and a storage element, the processing element invoking a program stored in the storage element to perform the method of the above method embodiments. The memory element may be a memory element on the same chip as the processing element, i.e. an on-chip memory element.
In another implementation, the program for performing the above method may reside in a storage element on a different chip from the processing element, that is, an off-chip storage element. In that case, the processing element loads the program from the off-chip storage element into the on-chip storage element, and then invokes and executes the method of the above method embodiments.
For example, embodiments of the present disclosure may also provide an apparatus such as an electronic device, which may include a processor and a memory for storing instructions executable by the processor. The processor is configured to, when executing the instructions, cause the electronic device to implement the data processing method of the previous embodiments. The memory may be located inside or outside the electronic device, and there may be one or more processors.
In yet another implementation, the units implementing the steps of the above method may be configured as one or more processing elements disposed in the above electronic device, where the processing elements may be integrated circuits, for example: one or more ASICs, one or more DSPs, one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, an embodiment of the present disclosure also provides a chip, which may be applied to the above electronic device. The chip includes one or more interface circuits and one or more processors, interconnected by lines; the processor receives, through the interface circuit, computer instructions from the memory of the electronic device, and executes them to implement the methods of the above method embodiments.
Embodiments of the present disclosure also provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by an electronic device, cause the electronic device to implement the data processing method described above.
Embodiments of the present disclosure also provide a computer program product comprising computer instructions to be executed by the electronic device described above; when the computer instructions run in the electronic device, they cause the electronic device to implement the data processing method described above.

From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division into functional modules is illustrated as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into modules or units is merely a logical function division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Moreover, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and a part shown as a unit may be one physical unit or multiple physical units, may be located in one place, or may be distributed over multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may physically exist alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present disclosure may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes instructions for causing a terminal device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present disclosure. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, or the like.
The foregoing describes only specific embodiments of the disclosure, but the protection scope of the disclosure is not limited thereto; any change or substitution within the technical scope of the disclosure shall be covered by the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A data processing method, comprising:
acquiring gesture data, wherein the gesture data is used for indicating a folding posture of a terminal device and a display state of each screen in the terminal device, and the display state includes a screen-on state and a screen-off state;
in a case where it is determined, according to the gesture data, that the terminal device is in a target state, obtaining target acceleration data corresponding to a target display interface by using acceleration data corresponding to a first acceleration sensor, wherein the target display interface is the display interface corresponding to the target state, and the display interface corresponding to the first acceleration sensor is opposite to the target display interface; and
displaying the target display interface based on the target acceleration data.
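(Illustrative only, not part of the claims.) A minimal Kotlin sketch of the flow of claim 1, under the assumption, made explicit only in claim 4 below, that the inverse processing negates each acceleration axis; the function and parameter names are invented, and a render callback stands in for displaying the target display interface.

    // Hypothetical sketch of claim 1's data flow.
    fun handleAccSample(
        inTargetState: Boolean,        // result of the posture/state check
        firstAcc: FloatArray,          // 3-axis sample from the first acceleration sensor
        render: (FloatArray) -> Unit,  // stands in for displaying the target display interface
    ) {
        if (!inTargetState) return
        // The display interface of the first acceleration sensor is opposite
        // to the target display interface, so each axis is inverted to obtain
        // the target acceleration data.
        val targetAcc = FloatArray(3) { i -> -firstAcc[i] }
        render(targetAcc)
    }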
2. The method of claim 1, wherein acquiring the gesture data comprises:
acquiring pose data of a first pose sensor and display state data of a second state sensor;
determining the folding posture of the terminal device according to the pose data; and
determining the display state of each screen in the terminal device according to the display state data.
3. The method of claim 1, wherein the target state includes any one of a first state, a second state, and a third state; the first state includes that the folding posture of the terminal device is such that a folding angle of an inner screen of the terminal device is within a first target preset angle range, the inner screen is in the screen-off state, and an outer screen is in the screen-on state; the second state includes that the folding angle of the inner screen is within a second target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state; and the third state includes that the folding angle of the inner screen is within a third target preset angle range, the inner screen is in the screen-off state, and the outer screen is in the screen-on state.
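(Illustrative only, not part of the claims.) Claim 3 amounts to classifying the inner screen's folding angle into one of three preset ranges while requiring the inner screen to be off and the outer screen to be on. The Kotlin sketch below uses invented placeholder ranges, since the claim does not fix concrete values.

    enum class TargetState { FIRST, SECOND, THIRD, NONE }

    // Hypothetical classifier; the three angle ranges are placeholders.
    fun classifyTargetState(
        foldAngleDeg: Float,
        innerScreenOn: Boolean,
        outerScreenOn: Boolean,
    ): TargetState {
        // All three target states require the inner screen off and the outer screen on.
        if (innerScreenOn || !outerScreenOn) return TargetState.NONE
        return when {
            foldAngleDeg < 60f -> TargetState.FIRST    // assumed first preset range
            foldAngleDeg < 120f -> TargetState.SECOND  // assumed second preset range
            foldAngleDeg <= 180f -> TargetState.THIRD  // assumed third preset range
            else -> TargetState.NONE
        }
    }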
4. The method of claim 1, wherein obtaining the target acceleration data corresponding to the target display interface by using the acceleration data corresponding to the first acceleration sensor comprises:
performing inverse processing on the acceleration data corresponding to the first acceleration sensor to obtain the target acceleration data, and storing the target acceleration data in a preset buffer position.
5. The method of claim 4, wherein the preset buffer position includes a buffer position corresponding to the first acceleration sensor and a buffer position corresponding to a second acceleration sensor; and storing the target acceleration data in the preset buffer position comprises:
storing the target acceleration data in the buffer position corresponding to the second acceleration sensor in the preset buffer position.
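(Illustrative only, not part of the claims.) A Kotlin sketch of claims 4 and 5 taken together, assuming the inverse processing is a per-axis sign inversion and modeling the preset buffer position as a pair of per-sensor slots; all names are invented.

    // Hypothetical per-sensor slots making up the preset buffer position.
    class PresetBuffer {
        val firstAccSlot = FloatArray(3)
        val secondAccSlot = FloatArray(3)
    }

    // Inverse-processes the first sensor's sample and stores the result in the
    // slot corresponding to the second acceleration sensor, so a reader of that
    // slot receives data already aligned with the target display interface.
    fun storeTargetAcc(buffer: PresetBuffer, firstAcc: FloatArray) {
        require(firstAcc.size == 3) { "expected a 3-axis sample" }
        for (i in 0 until 3) {
            buffer.secondAccSlot[i] = -firstAcc[i]
        }
    }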
6. The method of claim 2, wherein before acquiring the pose data of the first pose sensor and the display state data of the second state sensor, the method further comprises:
detecting a start-up operation by a user, and monitoring the first pose sensor and the second state sensor to acquire the pose data of the first pose sensor and the display state data of the second state sensor.
7. The method of claim 2, wherein before acquiring the pose data of the first pose sensor and the display state data of the second state sensor, the method further comprises:
monitoring, in response to a preset event, the first pose sensor and the second state sensor to acquire the pose data of the first pose sensor and the display state data of the second state sensor, wherein the preset event includes the first acceleration sensor and the second acceleration sensor being monitored, or a motion sensor being monitored.
8. The method according to any one of claims 1 to 7, wherein the terminal device is a folding screen terminal device.
9. An electronic device, comprising a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to, when executing the instructions, cause the electronic device to implement the method of any one of claims 1 to 8.
10. A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by an electronic device, cause the electronic device to implement the method of any one of claims 1 to 8.
CN202310836173.6A 2023-07-10 2023-07-10 Data processing method and electronic equipment Active CN116567141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310836173.6A CN116567141B (en) 2023-07-10 2023-07-10 Data processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310836173.6A CN116567141B (en) 2023-07-10 2023-07-10 Data processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN116567141A 2023-08-08
CN116567141B CN116567141B (en) 2023-11-07

Family

ID=87498589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310836173.6A Active CN116567141B (en) 2023-07-10 2023-07-10 Data processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116567141B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117630414A (en) * 2024-01-25 2024-03-01 荣耀终端有限公司 Acceleration sensor calibration method, folding electronic device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109840061A (en) * 2019-01-31 2019-06-04 华为技术有限公司 The method and electronic equipment that control screen is shown
CN110299100A (en) * 2019-07-01 2019-10-01 努比亚技术有限公司 Display direction method of adjustment, wearable device and computer readable storage medium
KR102137638B1 (en) * 2020-01-15 2020-07-27 주식회사 사피엔반도체 Brightness controlable display apparatus
CN112860359A (en) * 2019-11-28 2021-05-28 华为技术有限公司 Display method and related device of folding screen
CN115686407A (en) * 2022-06-30 2023-02-03 荣耀终端有限公司 Display method and electronic equipment
CN116107531A (en) * 2023-02-07 2023-05-12 维沃移动通信有限公司 Interface display method and device
WO2023103951A1 (en) * 2021-12-10 2023-06-15 华为技术有限公司 Display method for foldable screen and related apparatus


Also Published As

Publication number Publication date
CN116567141B (en) 2023-11-07

Similar Documents

Publication Publication Date Title
CN110389802B (en) Display method of flexible screen and electronic equipment
CN111316333B (en) Information prompting method and electronic equipment
KR102520225B1 (en) Electronic device and image capturing method thereof
WO2021213164A1 (en) Application interface interaction method, electronic device, and computer readable storage medium
WO2021000881A1 (en) Screen splitting method and electronic device
WO2021052279A1 (en) Foldable screen display method and electronic device
WO2021063237A1 (en) Control method for electronic device, and electronic device
US10848669B2 (en) Electronic device and method for displaying 360-degree image in the electronic device
CN114217699B (en) Method for detecting pen point direction of handwriting pen, electronic equipment and handwriting pen
CN112114912A (en) User interface layout method and electronic equipment
CN110839128B (en) Photographing behavior detection method and device and storage medium
CN112130788A (en) Content sharing method and device
CN116567141B (en) Data processing method and electronic equipment
KR20150027934A (en) Apparatas and method for generating a file of receiving a shoot image of multi angle in an electronic device
CN113384880A (en) Virtual scene display method and device, computer equipment and storage medium
CN111437600A (en) Plot showing method, plot showing device, plot showing equipment and storage medium
CN108492339B (en) Method and device for acquiring resource compression packet, electronic equipment and storage medium
CN116723257A (en) Image display method and electronic equipment
CN111770196B (en) Information synchronization method, device and storage medium
CN112822544A (en) Video material file generation method, video synthesis method, device and medium
CN112181915A (en) Method, device, terminal and storage medium for executing service
WO2020233581A1 (en) Height measuring method and electronic device
CN113688043A (en) Application program testing method, device, server, iOS device and medium
JP2023546870A (en) Interface display method and electronic device
CN115964231A (en) Load model-based assessment method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant