CN117687501A - Display method for switching horizontal screen and vertical screen and related device - Google Patents


Info

Publication number
CN117687501A
Authority
CN
China
Prior art keywords
electronic device
user interface
video playing
playing area
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310821107.1A
Other languages
Chinese (zh)
Inventor
黄士俊
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310821107.1A
Publication of CN117687501A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a display method for switching between landscape and portrait screens, and a related device. In the method, the electronic device may set a rotation animation for the video playing area while applying no rotation animation to the elements of the user interface other than the video playing area. When the electronic device is in a target video playing scene, it can switch between landscape and portrait, and during the switch the user interface is displayed with the rotation animation applied to the video playing area. With this method, the content displayed by the electronic device transitions smoothly during the landscape-portrait switch, and the user's attention is not drawn to elements outside the video playing area, so the user can remain immersed in the video. The user can therefore watch the video without interference from other elements, which improves the user experience.

Description

Display method for switching horizontal screen and vertical screen and related device
Technical Field
The application relates to the field of terminal technologies, and in particular, to a display method for switching between landscape and portrait screens and a related device.
Background
Currently, electronic devices may support landscape-portrait switching. When the electronic device switches between the portrait state and the landscape state, the content and layout of the interface change. How to make the interface display transition smoothly while the electronic device switches between landscape and portrait is a problem to be solved in the field.
Disclosure of Invention
The application provides a display method for switching between landscape and portrait screens, and a related device, which enable the content displayed by the electronic device to transition smoothly during the switch. The method also lets the user observe the rotation of the video playing area while the electronic device switches between landscape and portrait, so that the user can stay immersed in the video.
In a first aspect, the present application provides a display method for switching between landscape and portrait screens. The method includes: the electronic device displays a first user interface in a portrait state, where the first user interface includes a first video playing area and a first element outside the first video playing area. After the electronic device detects a first operation, it plays a first rotation animation, in which the first video playing area in the first user interface gradually rotates by 90 degrees around the center of the first video playing area relative to the electronic device, while the first element does not rotate relative to the electronic device. The electronic device then displays a second user interface in a landscape state. The second user interface includes a second video playing area that plays the same video as the first video playing area, and the size of the second video playing area is larger than that of the first video playing area. The display direction of the first user interface points from a first side of the electronic device to a second side, and the display direction of the second user interface points from a third side of the electronic device to a fourth side; the first side and the second side are each perpendicular to the third side and the fourth side.
By implementing the method provided by the application, the user can observe the rotation of the video playing area relative to the screen while the electronic device switches between landscape and portrait, and only the video playing area rotates, so the user can remain immersed in the video. This reduces the chance that other elements distract the user while watching, which improves the user experience.
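The first rotation animation can be sketched as a per-frame interpolation in which only the video playing area receives a rotation and scale value while every other element keeps an angle of zero. This is an illustrative sketch only: the linear interpolation and the example sizes are assumptions, since the patent does not specify an easing curve.

```python
def first_rotation_frame(t, start_size, end_size):
    """One frame of the portrait-to-landscape rotation animation.

    t: normalized animation progress in [0.0, 1.0].
    Only the video playing area rotates (0 -> 90 degrees about its own
    center) and grows toward its landscape size; all other user-interface
    elements keep a rotation of 0 relative to the device.
    """
    if not 0.0 <= t <= 1.0:
        raise ValueError("t must be within [0, 1]")
    video_angle = 90.0 * t        # the video playing area rotates
    other_angle = 0.0             # other elements never rotate
    w = start_size[0] + (end_size[0] - start_size[0]) * t
    h = start_size[1] + (end_size[1] - start_size[1]) * t
    return video_angle, other_angle, (w, h)
```

In a real implementation the angle and size would drive a view transform on each animation frame; a non-linear interpolator could be substituted without changing the structure.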
With reference to the first aspect, in some implementations, before the electronic device plays the first rotation animation, the method further includes: the electronic device obtains a screenshot of the first user interface, which is used to keep the first element from rotating relative to the electronic device; the electronic device draws the second user interface; and the electronic device determines, according to the first user interface and the second user interface, that it is in a target video playing scene, where the target video playing scene includes a scene in which the first user interface and the second user interface each include a video playing area.
With reference to the first aspect, in some implementations, the electronic device draws the second user interface, which specifically includes: the electronic device draws the second user interface according to drawing rules provided by the application program to which the first user interface belongs.
With reference to the first aspect, in some implementations, the target video playing scene specifically includes: a scene in which the width of the first video playing area in the first user interface is greater than its height, the width of the second video playing area in the second user interface is greater than its height, and the proportion of the screen occupied by the second video playing area is greater than a first value.
With reference to the first aspect, in some implementations, the target video playing scene specifically includes: a scene in which the width of the first video playing area in the first user interface is greater than its height, the height of the first video playing area in the first user interface is greater than a first height threshold, the width of the second video playing area in the second user interface is greater than its height, and the proportion of the screen occupied by the second video playing area is greater than a first value.
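The two variants of the target video playing scene described above can be combined into a single predicate. A minimal sketch, where `min_height` and `min_screen_ratio` are hypothetical stand-ins for the patent's unnamed "first height threshold" and "first value":

```python
def is_target_video_scene(portrait_video, landscape_video, screen,
                          min_height=200, min_screen_ratio=0.9):
    """Heuristic check for the 'target video playing scene'.

    portrait_video / landscape_video: (width, height) of the video
    playing area in the portrait and landscape user interfaces.
    screen: (width, height) of the landscape screen.
    min_height and min_screen_ratio are illustrative values; the patent
    does not name concrete numbers for its thresholds.
    """
    pw, ph = portrait_video
    lw, lh = landscape_video
    sw, sh = screen
    screen_ratio = (lw * lh) / (sw * sh)
    return (pw > ph                        # portrait video wider than tall
            and ph > min_height            # and tall enough to matter
            and lw > lh                    # landscape video wider than tall
            and screen_ratio > min_screen_ratio)  # nearly fills the screen
```

For example, a 360x203 portrait video area that becomes a 2340x1080 full-screen landscape area on a 2340x1080 screen satisfies all four conditions, while a portrait-oriented (taller-than-wide) video does not.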
With reference to the first aspect, in some implementations, the method further includes: the electronic device determines whether the first user interface contains a SurfaceView, and determines, according to the drawn second user interface, whether the second user interface contains a SurfaceView. When the first user interface and the second user interface contain the same SurfaceView, the electronic device determines that it is in the target video playing scene.
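This check can be sketched as pure logic; using plain identifiers for the SurfaceViews is an illustrative stand-in, since on Android the check would compare the actual SurfaceView objects found in the two view hierarchies:

```python
def in_target_scene_by_surfaceview(portrait_views, landscape_views):
    """True if the two drawn interfaces contain the same SurfaceView.

    portrait_views / landscape_views: sets of identifiers for the
    SurfaceView instances found in the portrait and landscape user
    interfaces (identifiers are a hypothetical stand-in for the real
    view objects).
    """
    # Any SurfaceView present in both interfaces indicates the same
    # video surface before and after the switch.
    return bool(portrait_views & landscape_views)
```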
With reference to the first aspect, in some implementations, before the electronic device detects the first operation, the electronic device turns off a function of screen orientation locking.
With reference to the first aspect, in some implementations, the content displayed in the first video playing area in the first rotating animation is content displayed in the first video playing area in the first user interface when the electronic device detects the first operation.
With reference to the first aspect, in some implementations, a size of the first video playing area in the first rotation animation gradually increases.
In some implementations, the first rotation animation rotates by 90 degrees, or its rotation differs from 90 degrees by no more than a first threshold.
In some implementations, the duration of the first rotation animation may be set to 400 ms. In addition, a black mask layer may be applied to the content of the user interface other than the video playing area; the mask may run between 0 ms and 150 ms of the first rotation animation, with its opacity changing from 0% to 100%. That is, as the first rotation animation plays, the content of the user interface other than the video playing area becomes darker and darker.
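The mask timing above can be sketched as a simple piecewise function. The 400 ms duration, the 0-150 ms fade window, and the 0%-100% opacity endpoints come from the text; linear fading within the window is an assumption:

```python
def black_mask_opacity(t_ms, duration_ms=400, fade_end_ms=150):
    """Opacity (in percent) of the black mask over non-video content.

    During the first rotation animation, the mask fades from 0% to 100%
    over the first 150 ms of the 400 ms animation and then stays fully
    opaque, so everything except the video playing area darkens while
    the video rotates.
    """
    if not 0 <= t_ms <= duration_ms:
        raise ValueError("t_ms is outside the animation")
    if t_ms >= fade_end_ms:
        return 100.0          # fully opaque for the rest of the animation
    return 100.0 * t_ms / fade_end_ms  # linear fade-in (assumed)
```

The second rotation animation described later would use the reverse profile (opacity falling from 100% to 0%) so that the non-video content gradually reappears.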
With reference to the first aspect, in some implementations, the first operation includes: an operation of rotating the electronic device in a plane that is not parallel to the ground, or a click operation on a full-screen play control in the first video playing area.
With reference to the first aspect, in some implementations, after the electronic device detects the first operation, the method further includes: the electronic device performs a freeze-screen operation.
With reference to the first aspect, in some implementations, after the first rotation animation is played, the method further includes: the electronic device performs a screen-unfreezing operation.
With reference to the first aspect, in some implementations, after the electronic device displays the second user interface in the landscape state, the method further includes:
after the electronic device detects the second operation, a second rotation animation is played, in which the second video playing area in the second user interface is gradually rotated by 90 degrees around the center of the first video playing area relative to the electronic device, and the first element is not rotated relative to the electronic device. The electronic equipment displays a third user interface in a vertical screen state, wherein the third user interface comprises the first video playing area, and elements except the first video playing area in the first user interface, and the display direction of the third user interface points to the second side from the first side of the electronic equipment.
With reference to the first aspect, in some implementations, before the electronic device plays the second rotational animation in response to the second operation, the method further includes: the electronic device draws the third user interface according to drawing rules provided by the application program to which the second user interface belongs.
With reference to the first aspect, in some implementations, the content displayed in the second video playing area in the second rotation animation is the content displayed in the second video playing area in the second user interface at the moment the electronic device detects the second operation.
With reference to the first aspect, in some implementations, the size of the first video playing area in the first rotation animation gradually increases, and in the second rotation animation the electronic device further displays the elements of the first user interface other than the first video playing area.
In some implementations, the duration of the second rotation animation may be set to 400 ms. In addition, a black mask layer may be applied to the content of the user interface other than the video playing area; the mask may run between 0 ms and 150 ms of the second rotation animation, with its opacity changing from 100% to 0%. That is, as the second rotation animation plays, the content of the user interface other than the video playing area becomes lighter and lighter.
With reference to the first aspect, in some implementations, the second operation includes: an operation of rotating the electronic device in a plane that is not parallel to the ground, or a click operation on a zoom-out control in the second video playing area.
In a second aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of the first aspect or any of the embodiments of the first aspect.
In a third aspect, embodiments of the present application provide a chip applied to an electronic device, the chip comprising one or more processors configured to invoke computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of the first aspect or any implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the method of the first aspect or any implementation of the first aspect.
Drawings
FIG. 1A is a schematic diagram of a user interface in a portrait state according to an embodiment of the present application;
FIG. 1B is a schematic diagram of a user interface in a landscape state according to an embodiment of the present application;
FIGS. 2A-2D are schematic diagrams of a user interface during landscape-portrait switching according to an embodiment of the present application;
FIGS. 3A-3D are schematic diagrams of a user interface during a switch from the portrait state to the landscape state according to an embodiment of the present application;
FIGS. 4A-4D are schematic diagrams of a user interface during a switch from the landscape state to the portrait state according to an embodiment of the present application;
FIG. 5 is a flowchart of a display method for switching between landscape and portrait screens according to an embodiment of the present application;
FIGS. 6A-6B are schematic diagrams of the change process of the landscape-portrait rotation animation in a video playing scene according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for determining whether an electronic device is in a target video playing scene according to an embodiment of the present application;
FIG. 8 is an OS interaction diagram according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an architecture of an electronic device according to an embodiment of the present application;
FIG. 10 is a block diagram of a software architecture of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate the three cases where A exists alone, A and B exist together, and B exists alone. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of embodiments of the present application, unless otherwise indicated, "a plurality of" means two or more.
The term "user interface" (UI) in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user; it converts between an internal form of information and a form acceptable to the user. A user interface is defined by source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and is finally presented as content that the user can recognize. A commonly used presentation form of the user interface is the graphical user interface (GUI), which refers to a user interface, related to computer operations, that is displayed in a graphical manner. A GUI may include visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on the display of the electronic device.
To meet user needs, an electronic device may be used in portrait or landscape orientation. The terms landscape and portrait refer to the display direction of the electronic device, that is, the display direction of the user interface displayed by the electronic device.
The display orientation of the electronic device is described below in connection with fig. 1A-1B.
The electronic device shown in fig. 1A is in a portrait state. As shown in fig. 1A, the screen of the electronic device has a rectangular shape, and four sides of the screen are referred to as a side, b side, c side and d side, respectively. Wherein, the a side and the c side are shorter two sides, and the b side and the d side are longer two sides. By way of example, the display orientation of the user interface 21 shown in FIG. 1A may refer to an orientation in which the a-side points to the c-side. The display direction of the user interface 21 is the first display direction shown in fig. 1A. It will be appreciated that when the electronic device is in the portrait state, the height (i.e., the length of the b-side or the d-side) of the display interface of the electronic device is greater than the width (i.e., the length of the a-side or the c-side) in the display direction of the electronic device.
The electronic device shown in FIG. 1B is in a landscape state. As shown in FIG. 1B, the screen of the electronic device is rectangular, and the four sides of the screen are referred to as the a-side, b-side, c-side, and d-side, respectively, where the a-side and c-side are the two shorter sides and the b-side and d-side are the two longer sides. By way of example, the display orientation of the user interface 22 shown in FIG. 1B may refer to the direction in which the b-side points to the d-side; this is the second display direction shown in FIG. 1B. It will be appreciated that when the electronic device is in the landscape state, the width of the display interface in the display direction (i.e., the length of the b-side or the d-side) is greater than its height (i.e., the length of the a-side or the c-side).
In the embodiments of the present application, the landscape and portrait cases are described only by way of example and are not limiting.
Based on the above description, the electronic device can also switch between landscape and portrait while the user uses it. Specifically, when the user rotates the electronic device, it may switch between landscape and portrait. If the display screen of the electronic device remains parallel to the ground throughout the rotation, the landscape-portrait switch may not occur.
In some implementations, an a-side may be referred to as a first side, a b-side may be referred to as a third side, a c-side may be referred to as a second side, and a d-side may be referred to as a fourth side.
The following describes a landscape-portrait switching manner with reference to FIGS. 2A-2D.
FIGS. 2A-2D illustrate a process in which an electronic device switches from displaying a user interface in portrait to displaying it in landscape.
As shown in FIGS. 2A-2D: FIG. 2A illustrates a user interface 31 of a video playing application with the electronic device in the portrait state; FIG. 2B illustrates a user interface 32 of the application after the electronic device has rotated 30 degrees from the portrait state in the rotation direction shown in FIG. 2B (i.e., counterclockwise); FIG. 2C illustrates a user interface 33 after the electronic device has rotated 60 degrees counterclockwise from the portrait state; and FIG. 2D illustrates a user interface 34 after the electronic device has rotated 90 degrees counterclockwise from the portrait state, i.e., the landscape state. The user interface 34 displays only the video playing area 341; that is, in the landscape state, the video playing area 341 is displayed full screen. The rotation angles above are merely examples and are not limiting.
The rotation of the electronic device is described with respect to the geodetic coordinate system, not the electronic device coordinate system. The geodetic coordinate system is fixed; by way of example, its X-axis may be defined as the direction parallel to the ground and its Y-axis as the direction perpendicular to the ground. The electronic device coordinate system is variable; by way of example, the shorter side of the electronic device screen may be taken as its X-axis and the longer side as its Y-axis. The embodiments of the present application do not limit the X-axis and Y-axis of the two coordinate systems, and take as an example the case where the plane of the electronic device screen is perpendicular to the ground in both the portrait state and the landscape state.
As shown in FIG. 2A, the electronic device is in the portrait state, and the geodetic coordinate system coincides with the electronic device coordinate system.
As shown in FIG. 2B, the geodetic coordinate system and the electronic device coordinate system are no longer consistent. In the geodetic coordinate system, the electronic device shown in FIG. 2B differs from that shown in FIG. 2A by 30 degrees, while the display direction of the user interface 32 coincides with that of the user interface 31. In the electronic device coordinate system, the user interface 32 can therefore be understood as the user interface 31 rotated 30 degrees in the direction opposite to the rotation direction shown in FIG. 2B (i.e., clockwise); the display direction of the user interface 32 differs from that of the user interface 31 by 30 degrees, and the elements in the user interface 32 are larger than the corresponding elements in the user interface 31.
As shown in FIG. 2C, the geodetic coordinate system is again inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in FIG. 2C differs from that shown in FIG. 2B by 30 degrees, while the display direction of the user interface 33 coincides with that of the user interface 32. In the electronic device coordinate system, the user interface 33 can be understood as the user interface 32 rotated 30 degrees in the direction opposite to the rotation direction shown in FIG. 2C (i.e., clockwise); the display direction of the user interface 33 differs from that of the user interface 32 by 30 degrees, and the elements in the user interface 33 are larger than the corresponding elements in the user interface 32.
As shown in FIG. 2D, the geodetic coordinate system is again inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in FIG. 2D differs from that shown in FIG. 2C by 30 degrees, while the display direction of the user interface 34 coincides with that of the user interface 33. In the electronic device coordinate system, the user interface 34 can be understood as the user interface 33 rotated 30 degrees in the direction opposite to the rotation direction shown in FIG. 2C (i.e., clockwise); the display direction of the user interface 34 differs from that of the user interface 33 by 30 degrees.
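The sequence walked through in FIGS. 2A-2D follows one simple rule: the interface keeps its geodetic orientation, so in device coordinates it appears rotated by the device's own rotation angle with the opposite sign. A minimal sketch of that relation:

```python
def ui_angle_in_device_frame(device_angle):
    """Angle of the user interface in device coordinates.

    While the device rotates counterclockwise by `device_angle` degrees
    in the geodetic (ground) frame, the interface keeps its geodetic
    orientation, so in device coordinates it appears rotated by the same
    amount in the opposite (clockwise, hence negative) direction.
    """
    return -device_angle

# The walkthrough of FIGS. 2A-2D: device at 0, 30, 60, and 90 degrees.
angles = [ui_angle_in_device_frame(a) for a in (0, 30, 60, 90)]
```

Each successive 30-degree counterclockwise turn of the device thus adds another 30 degrees of apparent clockwise rotation of the interface in the device frame.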
Here, rotation of the user interface means that all of the content displayed in the user interface rotates about the center point of the electronic device screen. While the electronic device rotates, elements in the user interface may be displayed incompletely: the part of an element beyond the boundary of the display screen is not displayed, and after the rotation, regions of the user interface that contain no elements are displayed in black.
From the above, it can be seen that while the electronic device rotates from the portrait state to the landscape state, the user interface displayed by the electronic device also changes dynamically, following the rotation of the device at every moment. However, in a video playing scene the user focuses on the video, not on the other elements. Setting the transition animation for the entire user interface may therefore suddenly divert the user's attention, while immersed in the video, to areas other than the video, resulting in a poor user experience.
To reduce these problems, an embodiment of the present application provides a display method for switching between landscape and portrait screens. In this method, the electronic device may set a rotation animation for the video playing area and no rotation animation for the elements of the user interface other than the video playing area. When the electronic device is in a target video playing scene, it can switch between landscape and portrait, and during the switch the user interface is displayed with the rotation animation applied to the video playing area.
By implementing the method provided in this application, the content displayed by the electronic device transitions smoothly during the landscape-portrait switch. The user can observe the rotation of the video playing area while the electronic device switches between landscape and portrait. Because the user interface has no rotation animation outside the video playing area, the user's attention is not drawn to those elements, and the user can remain immersed in the video. The user can therefore watch the video without interference from other elements, which improves the user experience.
The embodiments of the present application can be applied to scenes in which the user interface contains video, for example: user interfaces of system applications that play video, user interfaces of third-party applications that play video, and so on. The video may be one that is displayed non-full-screen in the portrait state, with a video playing area wider than it is tall, and displayed full screen (which can also be understood as maximized) in the landscape state.
The following describes schematic user interfaces in a video playing scene provided in the embodiments of the present application with reference to UI embodiments.
FIGS. 3A-3D illustrate a process in which an electronic device switches from displaying a user interface in portrait to displaying it in landscape.
As shown in FIGS. 3A-3D: FIG. 3A illustrates a user interface 41 of a video playing application with the electronic device in the portrait state; the user interface 41 may include a video playing area 411. FIG. 3B illustrates the user interface 42 after the electronic device has rotated 30 degrees from the portrait state in the rotation direction shown in FIG. 3B (i.e., counterclockwise); the user interface 42 may include a video playing area 421. FIG. 3C illustrates the user interface 43 after the electronic device has rotated 60 degrees counterclockwise from the portrait state; the user interface 43 may include a video playing area 431. FIG. 3D illustrates the user interface 44 after the electronic device has rotated 90 degrees counterclockwise from the portrait state, i.e., the landscape state. The user interface 44 displays only the video playing area 441; that is, in the landscape state, the video playing area 441 is displayed full screen. The rotation angles above are merely examples and are not limiting.
The coordinate conventions here are the same as those described above in connection with FIGS. 2A-2D: the rotation of the electronic device refers to the geodetic coordinate system, not the electronic device coordinate system.
As shown in FIG. 3A, the electronic device is in the portrait state, and the geodetic coordinate system coincides with the electronic device coordinate system.
As shown in FIG. 3B, the geodetic coordinate system is now inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in FIG. 3B differs from that shown in FIG. 3A by 30 degrees, while the display direction of the video playing area 421 coincides with that of the video playing area 411. In the electronic device coordinate system, the video playing area 421 can therefore be understood as the video playing area 411 rotated 30 degrees in the direction opposite to the rotation direction shown in FIG. 3B (i.e., clockwise); the display direction of the video playing area 421 differs from that of the video playing area 411 by 30 degrees, and the video playing area 421 is slightly larger than the video playing area 411.
In the electronic device coordinate system, the display direction of the elements in the user interface 42 other than the video playing area 421 coincides with the display direction of the user interface 41. Illustratively, the display direction of the video playing area 421 is based on the geodetic coordinate system: its horizontal direction may refer to the X direction of the geodetic coordinate system, and its vertical direction may refer to the Y direction of the geodetic coordinate system. The display direction of the other elements in the user interface 42 is based on the electronic device coordinate system: their horizontal direction may refer to the X direction of the electronic device coordinate system, and their vertical direction may refer to the Y direction of the electronic device coordinate system.
As shown in fig. 3C, the geodetic coordinate system is still inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in fig. 3C differs from the electronic device shown in fig. 3B by 30 degrees, and the display direction of the video playing area 431 coincides with that of the video playing area 421. In the electronic device coordinate system, the video playing area 431 in the user interface 43 can be understood as the video playing area 421 rotated by 30 degrees in the direction (i.e., clockwise) opposite to the rotation direction shown in fig. 3B; the display direction of the video playing area 431 differs from that of the video playing area 421 by 30 degrees, and the video playing area 431 is larger than the video playing area 421.
As shown in fig. 3D, the geodetic coordinate system is still inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in fig. 3D differs from the electronic device shown in fig. 3C by 30 degrees, and the display direction of the video playing area 441 coincides with that of the video playing area 431. In the electronic device coordinate system, the video playing area 441 in the user interface 44 can be understood as the video playing area 431 rotated by 30 degrees in the direction (i.e., clockwise) opposite to the rotation direction shown in fig. 3C; the display direction of the video playing area 441 differs from that of the video playing area 431 by 30 degrees.
The rotation of the display direction of the video playing area may mean that the video playing area rotates around its center point, and as the rotation angle increases, the video playing area grows larger, so that after the electronic device switches from the portrait state to the landscape state, the video playing area occupies the full screen or most of the screen. During the rotation of the electronic device, part of the content in the video playing area may exceed the boundary of the display screen; the exceeding part is not displayed, and the areas of the user interface without elements are displayed in black during the rotation.
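The geometry described above can be sketched with a minimal, hypothetical Python model (the 1.0 → 1.8 scale range is an assumed value, not from the source): as the device rotates counterclockwise, the video playing area counter-rotates by the same angle in device coordinates, so it stays upright relative to the ground, while its scale grows linearly until it fills the screen.

```python
def video_area_transform(device_angle_deg: float,
                         start_scale: float = 1.0,
                         end_scale: float = 1.8) -> tuple:
    """Hypothetical sketch: for a device rotated counterclockwise by
    device_angle_deg (0-90), return the video area's rotation angle in
    device coordinates (a clockwise counter-rotation) and its scale factor
    (linear interpolation from the portrait size toward full screen)."""
    area_angle_in_device_coords = -device_angle_deg     # clockwise counter-rotation
    t = max(0.0, min(device_angle_deg, 90.0)) / 90.0    # progress in [0, 1]
    scale = start_scale + (end_scale - start_scale) * t
    return area_angle_in_device_coords, scale

# At 30 degrees of device rotation (fig. 3B), the area has counter-rotated
# -30 degrees in device coordinates and is slightly larger than in fig. 3A.
angle, scale = video_area_transform(30)
```

Sampling this transform at 0, 30, 60, and 90 degrees reproduces the progression of figs. 3A-3D.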
Figs. 4A-4D illustrate an exemplary process by which the electronic device switches from displaying the user interface in landscape to displaying it in portrait.
As shown in figs. 4A-4D, fig. 4A illustrates a user interface 51 in a video playing application when the electronic device is in the landscape state, and the user interface 51 may include a video playing area 511. The user interface 51 displays only the video playing area 511; that is, in the landscape state, the video playing area 511 is displayed full screen. Fig. 4B illustrates the user interface 52 in the video playing application after the electronic device has rotated 30 degrees from the landscape state in the rotation direction (i.e., clockwise) shown in fig. 4B, and the user interface 52 may include a video playing area 521. Fig. 4C illustrates the user interface 53 in the video playing application after the electronic device has rotated 60 degrees clockwise from the landscape state, and the user interface 53 may include a video playing area 531. Fig. 4D illustrates the user interface 54 in the video playing application after the electronic device has rotated 90 degrees clockwise from the landscape state, i.e., in the portrait state. In the embodiment of the present application, the above rotation angles are merely exemplary and are not limiting.
The rotation of the electronic device is defined relative to the geodetic coordinate system, not the electronic device coordinate system. The geodetic coordinate system is fixed: by way of example, its X-axis may be defined as the direction parallel to the ground and its Y-axis as the direction perpendicular to the ground. The electronic device coordinate system is variable: by way of example, the shorter side of the electronic device screen may serve as its X-axis and the longer side as its Y-axis. The embodiment of the present application does not limit the choice of the X-axis and Y-axis of the two coordinate systems. The embodiment of the application takes as an example the case where the plane of the electronic device screen is perpendicular to the ground in both the portrait state and the landscape state.
As shown in fig. 4A, when the electronic device is in the landscape state, the geodetic coordinate system and the electronic device coordinate system are oriented as illustrated in fig. 4A.
As shown in fig. 4B, the geodetic coordinate system is now inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in fig. 4B differs from the electronic device shown in fig. 4A by 30 degrees, and the display direction of the video playing area 521 coincides with that of the video playing area 511. Thus, in the electronic device coordinate system, the video playing area 521 can be understood as the video playing area 511 rotated by 30 degrees in the direction (i.e., counterclockwise) opposite to the rotation direction shown in fig. 4B; the display direction of the video playing area 521 differs from that of the video playing area 511 by 30 degrees, and the video playing area 521 is smaller than the video playing area 511.
In the electronic device coordinate system, the display direction of the elements in the user interface 52 other than the video playing area 521 coincides with the display direction of the user interface 51. Illustratively, the display direction of the video playing area 521 is based on the geodetic coordinate system: its horizontal direction may refer to the X direction of the geodetic coordinate system, and its vertical direction may refer to the Y direction of the geodetic coordinate system. The display direction of the other elements in the user interface 52 is based on the electronic device coordinate system: their horizontal direction may refer to the Y direction of the electronic device coordinate system, and their vertical direction may refer to the X direction of the electronic device coordinate system.
As shown in fig. 4C, the geodetic coordinate system is still inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in fig. 4C differs from the electronic device shown in fig. 4B by 30 degrees, and the display direction of the video playing area 531 coincides with that of the video playing area 521. In the electronic device coordinate system, the video playing area 531 in the user interface 53 can be understood as the video playing area 521 rotated by 30 degrees in the direction (i.e., counterclockwise) opposite to the rotation direction shown in fig. 4B; the display direction of the video playing area 531 differs from that of the video playing area 521 by 30 degrees, and the video playing area 531 is smaller than the video playing area 521.
As shown in fig. 4D, the geodetic coordinate system is still inconsistent with the electronic device coordinate system. In the geodetic coordinate system, the electronic device shown in fig. 4D differs from the electronic device shown in fig. 4C by 30 degrees, and the display direction of the video playing area 541 coincides with that of the video playing area 531. In the electronic device coordinate system, the video playing area 541 in the user interface 54 can be understood as the video playing area 531 rotated by 30 degrees in the direction (i.e., counterclockwise) opposite to the rotation direction shown in fig. 4C; the display direction of the video playing area 541 differs from that of the video playing area 531 by 30 degrees, and the video playing area 541 is smaller than the video playing area 531.
The rotation of the display direction of the video playing area may mean that the video playing area rotates around its center point, and as the rotation angle increases, the video playing area shrinks, so that after the electronic device switches from the landscape state to the portrait state, the video playing area no longer occupies the full screen. During the rotation of the electronic device, part of the content in the video playing area may exceed the boundary of the display screen; the exceeding part is not displayed, and the areas of the user interface without elements are displayed in black during the rotation.
From the above, it can be seen that, while the electronic device rotates from the portrait state to the landscape state or from the landscape state to the portrait state, the video playing area displayed by the electronic device changes dynamically; that is, the size of the displayed video playing area changes, and it changes over the course of the rotation. In addition, relative to the user's viewing angle, the display direction of the video playing area does not change, which provides a better viewing experience: in a video playing scene, the user focuses on the video, not on the other elements. Setting the transition animation based on the video playing area lets the user observe the rotation of the video playing area relative to the screen while the electronic device switches between landscape and portrait, and rotating only the video playing area keeps the user immersed in the video. Therefore, the possibility that the user is distracted by other elements while watching the video is reduced, the user can stay immersed in the video, and the user experience is good.
Based on the user interface described above, the flow of the display method for switching between horizontal and vertical screens provided in the embodiments of the present application is described below.
Fig. 5 illustrates a flow of a display method for switching between horizontal and vertical screens according to an embodiment of the present application. Fig. 5 specifically illustrates a display method in the process of switching the electronic device from the vertical screen state to the horizontal screen state.
As shown in fig. 5, the method specifically includes:
s501, the electronic equipment is in a vertical screen state, and a first user interface is displayed in a first direction.
When the electronic equipment is in a vertical screen state, the height of an interface displayed by the electronic equipment is larger than the width. For example, the user interface of the electronic device in the portrait state may refer to fig. 1A and 3A.
The first direction may refer to a direction of a display interface of the electronic device. When the electronic device is in the portrait state, the first direction may refer to a direction in which a top narrow side of a screen of the electronic device points to a bottom narrow side of the screen. For example, the first direction may refer to the first display direction shown in fig. 1A.
The first user interface may be divided into two types: one means that the first user interface does not comprise a video playing area, and the other means that the first user interface comprises a video playing area.
S502, the electronic equipment detects user operation for switching the electronic equipment from the vertical screen state to the horizontal screen state.
In some implementations, the user operations described above may include the following two types: one is an operation in which a user rotates the electronic device; the other is a click operation, touch operation, voice instruction, or gesture instruction acting on a full-screen display control.

In the case where the user operation is an operation in which the user rotates the electronic device, in some implementations, the electronic device may acquire sensor data in real time in the power-on state. Specifically, the electronic device may set an interval (e.g., 5 ms) for acquiring sensor data, and may analyze and process the various sensor data it acquires. The descriptions of the various sensors may be referred to below in connection with fig. 9, and are not repeated here.
In some implementations, the electronic device may detect the operation of the user rotating the electronic device by analyzing the sensor data. Specifically, after acquiring the sensor data, the electronic device can determine whether it has rotated and by how much. The sensors may include, but are not limited to: an acceleration sensor and a gyro sensor. The acceleration sensor may acquire the magnitude of the acceleration of the electronic device in various directions (for example, the X, Y, and Z axes), from which the tilt and movement of the electronic device can be detected. The gyro sensor may acquire the magnitude of the angular velocity of the electronic device in various directions (for example, the X, Y, and Z axes), from which the rotation and tilt of the electronic device can be detected. The embodiment of the present application does not limit the acquisition of sensor data, and more sensor data may be used. For example, the electronic device may read the acceleration components along the X, Y, and Z axes, combine them into a single vector, and calculate the tilt angle of the electronic device about the Z axis using an inverse trigonometric function, thereby determining the degree of rotation of the electronic device.
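The tilt-angle computation can be illustrated with a small Python sketch. The axis convention and the 45-degree switching threshold are assumptions for illustration: gravity measured along the device X and Y axes is fed to the inverse trigonometric function atan2 to recover the rotation angle about the Z axis (the axis pointing out of the screen), which is then compared against the threshold.

```python
import math

def tilt_angle_deg(ax: float, ay: float) -> float:
    """Hypothetical sketch: estimate the device's rotation about the Z axis
    from accelerometer readings. Held upright, gravity acts entirely along
    the device Y axis; as the device rotates, gravity splits between the X
    and Y axes, and atan2 recovers the tilt angle in degrees."""
    return math.degrees(math.atan2(ax, ay))

def should_switch(angle_deg: float, threshold_deg: float = 45.0) -> bool:
    """Trigger the landscape-portrait switch once the rotation exceeds a
    set switching threshold (45 degrees is an assumed value)."""
    return abs(angle_deg) > threshold_deg

# Upright portrait: gravity entirely on Y, angle 0, no switch.
# Rotated 90 degrees: gravity entirely on X, angle 90, switch triggered.
```

In practice the accelerometer signal would be filtered and combined with gyroscope data before thresholding; the sketch shows only the inverse-trigonometric step the text describes.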
In some implementations, the electronic device may be placed on a plane parallel to the ground, placed perpendicular to the ground, or placed at an angle to the ground. For example, a mobile phone may be placed on a desktop or held by a user, and it may be rotated in any of these placements. The user rotating the electronic device may refer to: rotating the electronic device so that its screen turns in a clockwise or counterclockwise direction within a plane perpendicular to the ground, or within a plane at an angle to the ground. Specifically, when the electronic device rotates while placed perpendicular to the ground, the angle formed between the electronic device and the ground needs to be larger than a preset threshold (for example, 45 degrees). The user rotating the electronic device may further refer to: rotating the electronic device so that its screen turns in a clockwise or counterclockwise direction about an axis perpendicular to the screen, or about an axis at an angle to the screen.
In the embodiment of the present application, the description of the user rotating the electronic device is merely exemplary; more rotation situations that trigger the landscape-portrait switch may be included, which is not limited.
In some implementations, the electronic device may trigger a switch of landscape and portrait after detecting a user's operation to rotate the electronic device by analyzing the sensor data. That is, when the degree of rotation of the electronic device reaches the condition of the landscape screen switching (e.g., the degree of rotation exceeds a set switching threshold), the landscape screen switching may be triggered.
In the case where the user operation is a click operation, touch operation, voice instruction, or gesture instruction applied to the full-screen display control, the electronic device may trigger the landscape-portrait switch in response to the user operation.
In some implementations, the electronic device is typically provided with a screen-orientation lock function. Specifically, after the electronic device enables this function in response to a user operation, the display direction of the interface in the electronic device never changes no matter how the electronic device rotates. It is worth noting that, in the display method for switching between landscape and portrait provided in the embodiments of the present application, the electronic device has not enabled the screen-orientation lock function.
In some implementations, the user operation in S502 may be referred to as a first operation.
S503, the electronic device captures a screenshot of the first user interface, obtaining a screenshot of the first user interface.
In some implementations, the electronic device may also perform a freeze screen operation prior to screen capturing the first user interface.
In some implementations, after the electronic device detects the user operation in S502, the electronic device may execute the freeze-screen operation when the electronic device displays the first user interface. The freeze screen operation may refer to the electronic device pausing the drawing and refreshing of the interface and not responding to any touch operation. In the embodiment of the application, the time when the electronic device is in the frozen screen state (i.e., the time interval between the electronic device executing the frozen screen operation and the unfreezing screen operation) is extremely short, and is generally not perceived by a user.
In some implementations, the electronic device can capture a screenshot of the first user interface before the electronic device is rotated. After obtaining the screenshot, the electronic device may crop it to obtain a picture of the non-video-playing area of the first user interface; an example is the content other than the video playing area in the user interfaces of figs. 3A-3C. That is, the screenshot is used to fill the areas of the user interface other than the video playing area while the rotation animation is subsequently played.
In some implementations, after the electronic device captures the interface in the first direction, the electronic device may further obtain information about the SurfaceView in the first user interface from the layer information of the first user interface. The embodiment of the present application does not limit the order in which the electronic device acquires the screenshot and the layer information of the first user interface. It should be noted that the electronic device may crop the screenshot according to the position information of the SurfaceView to obtain the picture of the non-video-playing area of the first user interface.
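The cropping step can be sketched as follows, with the screenshot modeled as a 2D grid of pixels and the SurfaceView bounds given as top-left and bottom-right coordinates. This is a hypothetical simplification; a real implementation would operate on bitmap buffers.

```python
def mask_video_area(screenshot, bounds):
    """Hypothetical sketch: blank out the SurfaceView (video playing) region
    of a screenshot so that only the non-video content of the first user
    interface remains. `bounds` is ((left, top), (right, bottom))."""
    (left, top), (right, bottom) = bounds
    masked = [row[:] for row in screenshot]   # copy; leave the input intact
    for y in range(top, bottom):
        for x in range(left, right):
            masked[y][x] = None               # None marks removed pixels
    return masked

# A 4x4 "screenshot" with a 2x2 video area whose top-left corner is (1, 1).
shot = [["ui"] * 4 for _ in range(4)]
masked = mask_video_area(shot, ((1, 1), (3, 3)))
```

The remaining `"ui"` cells correspond to the picture of the non-video-playing area that fills the interface around the rotating video during the animation.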
A View may represent a control. SurfaceView, a kind of View, may be used to display video. The information about the SurfaceView in the first user interface is actually used in the subsequent process to determine whether the current scene is a video playing scene: because most applications that play video do so through a SurfaceView, the electronic device can determine whether the current scene is a video playing scene by acquiring the SurfaceView information.
Specifically, the electronic device may first obtain the composition layers in the first user interface; the composition layers are actually a set of layers, and a layer may include one or more Views. The electronic device then filters out the layers whose names carry the "SurfaceView" field and acquires each such layer's name and boundary information (such as the top-left and bottom-right position information of the rectangular layer). References to SurfaceView below may refer to a layer carrying the "SurfaceView" field, which, from the user's perspective, can also be understood as the video playing area displayed in the user interface.
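A minimal sketch of this filtering step, assuming the composition layers are available as a list of name/bounds records (the record format and layer names are hypothetical):

```python
def find_surfaceview_layers(composition_layers):
    """Hypothetical sketch: from the composition layer set of the first user
    interface, keep only the layers whose name carries the "SurfaceView"
    field, collecting each one's name and boundary information."""
    return [
        {"name": layer["name"], "bounds": layer["bounds"]}
        for layer in composition_layers
        if "SurfaceView" in layer["name"]
    ]

# Example layer set: a status bar and one video layer (names are made up).
layers = [
    {"name": "StatusBar", "bounds": ((0, 0), (1080, 80))},
    {"name": "SurfaceView - com.example.video", "bounds": ((0, 300), (1080, 900))},
]
videos = find_surfaceview_layers(layers)
```

If the returned list is empty, the first user interface contains no video playing area, which feeds into the scene check of S505.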
In some implementations, the information about the SurfaceView in the first user interface may include, but is not limited to: the content to be displayed by the SurfaceView in the first user interface, the number of SurfaceViews contained in the first user interface, the names of those SurfaceViews, and the height and width of the SurfaceView in the first user interface. For example, if the first user interface does not include a video playing area, the number of SurfaceViews contained in the first user interface is 0.
In one possible implementation, the electronic device may cache the SurfaceView information of a user interface the first time that interface undergoes a landscape-portrait switch while being displayed. On subsequent landscape-portrait switches of the interface, the SurfaceView information no longer needs to be derived from the layer information; it can be read directly from the cache, saving computing resources.
S504, the electronic equipment draws a second user interface.
The drawing described above merely means that the second user interface is generated; in S504, the electronic device does not yet display the second user interface.
The second user interface may refer to the interface displayed by the electronic device when the electronic device is in the landscape state. When the electronic device is in the landscape state, the height of the displayed interface is smaller than its width. For example, the user interface in the landscape state may refer to figs. 1B and 3D.
In some implementations, after the electronic device obtains the sensor data in S502, the electronic device may determine the display direction of the second user interface, which may be referred to as the second direction. The second direction may refer to the direction in which one wide side of the electronic device screen points to the other wide side. For example, the second direction may refer to the second display direction shown in fig. 1B.
In some implementations, the second direction may be divided into two cases. For example, the electronic device may have a front camera at the top of the screen in the portrait state. After acquiring the sensor data, the electronic device can determine its rotation direction, and from it the display direction of the user interface in the landscape state, i.e., the second direction. The display directions of the user interface in the landscape state include two types: in one, the front camera is at the left end of the screen when the user views the user interface in the display direction; in the other, the front camera is at the right end of the screen. The second direction affects the direction of the rotation animation in the target video playing scene and the direction in which the interface is drawn after the electronic device rotates, so the electronic device needs to determine the display direction of the second user interface, i.e., the second direction.
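The mapping from rotation direction to second direction can be sketched as below. The specific camera-side mapping is an assumption for illustration, since it depends on the device's hardware layout and drawing conventions:

```python
def second_direction(rotation: str) -> str:
    """Hypothetical sketch: with the front camera at the top of the screen
    in portrait, a counterclockwise rotation carries the camera toward the
    left end of the landscape interface, and a clockwise rotation toward
    the right end (this mapping is assumed, not taken from the source)."""
    if rotation == "counterclockwise":
        return "camera-left"
    if rotation == "clockwise":
        return "camera-right"
    raise ValueError(f"unknown rotation: {rotation}")
```

Whichever of the two cases is determined then fixes both the direction of the rotation animation and the orientation in which the second user interface is drawn.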
In some implementations, after the electronic device determines the second direction, the second user interface can be drawn. In particular, the second user interface may be a user interface in an application in the electronic device. Different applications may be provided with different drawing rules for the user interface displayed on the landscape and portrait screen. The rendering rules may be used to determine what is displayed by the second user interface. The drawing can be realized according to a default drawing rule of the system of the electronic equipment, and the default drawing rule of the system can be designed by a developer of the system according to the aesthetic degree of an interface and the look and feel of a user. The drawing may also be implemented according to a drawing rule set by an application program to which the second user interface belongs, where the drawing rule set by the application program is designed by a developer of the application program. When the application program to which the second user interface belongs is not provided with a drawing rule aiming at the horizontal screen display, the electronic equipment draws the second user interface according to the default drawing rule; when the application program to which the second user interface belongs sets a drawing rule for the horizontal screen display, the application program can send the drawing rule to a system of the electronic device, and the electronic device can draw the second user interface according to the drawing rule set by the application program.
In some implementations, the electronic device may draw the second user interface according to the drawing rules of the application program. Specifically, the electronic device may obtain the content of each control in the second user interface from the layer information of the first user interface, and may also obtain the size and position of each control in the second user interface according to the drawing rules.
When the first user interface does not include a video playing area, that is, when the number of SurfaceViews contained in the first user interface acquired by the electronic device is 0, the electronic device may draw the second user interface according to the content, size, and position of the controls obtained from the layer information of the first user interface, combined with drawing rules (for example, rules such as: in landscape, the font of characters in a control is unchanged, the width of the control is doubled, and the height is unchanged). Illustratively, the first user interface may refer to the user interface 21 described above with respect to fig. 1A, and the second user interface may refer to the user interface 22 described above with respect to fig. 1B.
When the first user interface includes a video playing area, that is, when the number of SurfaceViews contained in the first user interface acquired by the electronic device is not 0, the electronic device may draw the second user interface according to the content and size of the SurfaceView obtained from the layer information of the first user interface, combined with drawing rules (for example, rules such as: in landscape, the SurfaceView content is unchanged, the SurfaceView is centered, and its width and height increase). Illustratively, the first user interface may refer to the user interface 31 described above with respect to fig. 2A, and the second user interface may refer to the user interface 34 described above with respect to fig. 2D.
In the embodiment of the present application, the drawing rules are only described as examples, and further drawing rules may be included.
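As an illustration only, two of the example rules above (an ordinary control keeps its height and doubles its width; a SurfaceView is enlarged to fill the landscape screen) might be applied like this. The screen dimensions and the control record format are assumptions:

```python
def apply_landscape_rule(control: dict,
                         screen_w: int = 2340, screen_h: int = 1080) -> dict:
    """Hypothetical sketch of applying a drawing rule when generating the
    second user interface: a SurfaceView layer is enlarged to fill the
    landscape screen, while an ordinary control doubles its width and keeps
    its height (both rules are examples from the text; the screen size
    values are assumed)."""
    if "SurfaceView" in control["name"]:
        return {**control, "x": 0, "y": 0,
                "width": screen_w, "height": screen_h}   # fills the screen
    return {**control, "width": control["width"] * 2}    # width doubled

# Example portrait-state controls (names and sizes are made up).
btn = {"name": "LikeButton", "x": 40, "y": 1200, "width": 200, "height": 80}
video = {"name": "SurfaceView - player", "x": 0, "y": 200,
         "width": 1080, "height": 607}
```

An application with its own landscape drawing rules would supply a different mapping; the system's default rules would apply otherwise, as the text notes.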
In some implementations, the electronic device can obtain the information about the SurfaceView in the second user interface after drawing the second user interface.
The information about the SurfaceView in the second user interface may include, but is not limited to: the content to be displayed by the SurfaceView in the second user interface, the number of SurfaceViews contained in the second user interface, the names of those SurfaceViews, and the height and width of the SurfaceView in the second user interface. For example, if the interface in the second direction does not include a video playing area, the number of SurfaceViews contained in the interface in the second direction is 0.
S505, the electronic device determines, according to the first user interface and the second user interface, whether the electronic device is currently in the target video playing scene.
In some implementations, the electronic device may also perform a defrost operation before determining whether the current electronic device is in the target video playback scene.
That is, the electronic device may perform the unfreeze operation after acquiring the information about the SurfaceView in the second user interface, or after determining whether the electronic device is currently in the target video playing scene. The embodiment of the present application only requires that the unfreeze operation be performed before the rotation animation is played, and does not limit it otherwise.
In the embodiment of the present application, not every video that uses a SurfaceView qualifies for playing the landscape-portrait rotation animation of the target video playing scene. The electronic device can be understood as currently being in the target video playing scene when: the video playing area in the first user interface is not displayed full screen and its width is greater than its height; and the video playing area is displayed full screen in the second user interface. The electronic device needs to determine whether it is in the target video playing scene before playing the rotation animation; that is, it needs to determine whether the first user interface and the second user interface meet the above requirements.
Further, for the electronic device to currently be in the target video playing scene, the following must also be satisfied: the height of the video playing area in the first user interface is greater than a first height threshold; the width of the video playing area in the second user interface is greater than a width threshold; and the height of the video playing area in the second user interface is greater than a second height threshold. That is, the electronic device needs to determine whether it is in the target video playing scene according to the acquired SurfaceView information of the first user interface and of the second user interface. The details of this determination may be referred to in the description of fig. 7 below and are not explained here.
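The S505 check can be sketched as a predicate over the SurfaceView information of the two interfaces. All threshold and size values below are assumed for illustration; the source does not give concrete numbers:

```python
def is_target_video_scene(first_sv, second_sv,
                          first_h_min=200, second_w_min=1800,
                          second_h_min=900):
    """Hypothetical sketch of the target-scene determination: the scene
    qualifies when the portrait (first) interface has a video area that is
    wider than it is tall and taller than the first height threshold, and
    the landscape (second) interface has a video area exceeding the width
    threshold and the second height threshold. All thresholds are assumed
    values. A missing SurfaceView means there is no video playing area."""
    if first_sv is None or second_sv is None:
        return False
    portrait_ok = (first_sv["width"] > first_sv["height"]
                   and first_sv["height"] > first_h_min)
    landscape_ok = (second_sv["width"] > second_w_min
                    and second_sv["height"] > second_h_min)
    return portrait_ok and landscape_ok
```

When the predicate is true, S506-1 (playing the rotation animation) is taken; otherwise S506-2 is taken.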
In the case where the electronic device determines that it is currently in the target video playing scene, S506-1 may be performed. In the case where the electronic device determines that it is not currently in the target video playing scene, S506-2 may be performed. Specifically, when the user interface displayed by the electronic device does not include a video playing area, or the displayed user interface includes a video playing area but the video playing area does not meet the above requirements, the electronic device may determine that it is not currently in the target video playing scene. For example, the scenes shown in fig. 1A-1B and fig. 2A-2D are not the target video playing scene.
S506-1, the electronic equipment plays the rotary animation corresponding to the target video playing scene.
In some implementations, the duration for which the electronic device is rotated is greater than or equal to the duration for which the rotation animation is played. In one possible implementation, the electronic device is still rotating during the execution of S502-S505 described above, so that the duration of the rotation of the electronic device is greater than the duration for which the rotation animation is played.
In some implementations, the rotational animation corresponding to the target video playback scene may be provided by a system of the electronic device. In one possible implementation manner, the first frame of the rotation animation corresponding to the target video playing scene may include a first user interface displayed in a vertical screen state of the electronic device, and the last frame of the rotation animation corresponding to the target video playing scene may include a second user interface displayed in a horizontal screen state of the electronic device.
In another possible implementation manner, the first frame of the rotary animation corresponding to the target video playing scene does not include the first user interface displayed in the vertical screen state of the electronic device, and the last frame of the rotary animation corresponding to the target video playing scene does not include the second user interface displayed in the horizontal screen state of the electronic device.
In the embodiment of the application, the electronic device displays the second user interface after playing the rotation animation.
In some implementations, in S506-1, the rotation animation corresponding to the target video playing scene may include a rotation animation that switches from the portrait state to the landscape state. In addition to the rotation animation described in S506-1, the rotation animation corresponding to the target video playing scene may further include a rotation animation that switches from the landscape state to the portrait state.
The following describes a landscape-portrait screen rotation animation of a target video playback scene in conjunction with fig. 6A and 6B.
Fig. 6A is a schematic diagram illustrating a change process of a rotation animation (for short, a vertical-to-horizontal rotation animation) switched from a vertical screen state to a horizontal screen state in a target video playing scene provided by the present application.
In some implementations, the electronic device may use a black mask layer to fill with black the range between the video playing area displayed in full screen in the landscape state and the screen boundary. That is, the electronic device sets a black mask layer adjacent to the video playing area. Because the electronic device does not fill the range outside the video playing area with black in the portrait state, the black mask layer also changes during the landscape/portrait switching. This change can likewise be realized by the vertical-to-horizontal rotation animation.
Referring to fig. 6A, the vertical-to-horizontal rotation animation may be composed of two parts: a portion for the video playing area and a portion for the black mask layer. The displayed content of the vertical-to-horizontal rotation animation is composed of the content of the video playing area and the black mask layer. The vertical-to-horizontal rotation animation is set with a duration. Illustratively, the duration of the vertical-to-horizontal rotation animation for the video playing area is set to 400 ms, and the duration of the vertical-to-horizontal rotation animation for the black mask layer is set to 150 ms. In the embodiment of the present application, these durations are merely examples and are not limited.
As shown in fig. 6A, the vertical-to-horizontal rotation animation for the video playing area may include the following factors: size (Scale), rotation angle (Rotation), and position (Position). The Scale shown in fig. 6A changes from an initial size to a final size within 0-400 ms, specifically: the size of the video playing area in the portrait state gradually increases to the size of the video playing area in the landscape state within 0-400 ms. The Rotation shown in fig. 6A changes from 0 degrees to 90 degrees within 0-400 ms, with the rotation center at the center of the video playing area, specifically: the vertical-to-horizontal rotation animation rotates 90 degrees counterclockwise or clockwise in the plane of the screen about the center of the video playing area within 0-400 ms. The Position shown in fig. 6A changes from an initial position to a final position within 0-400 ms, specifically: at the moment the Rotation reaches 90 degrees, the position of the video playing area in the portrait state is switched to the position of the video playing area in the landscape state, that is, the position for maximized display.
The vertical-to-horizontal rotation animation for the black mask layer may also include the following factor: transparency (Alpha). The Alpha shown in fig. 6A changes from 0% to 100% within 0-150 ms, specifically: the black mask layer gradually changes from transparent to black within 0-150 ms. In this way, during the switching of the electronic device from the portrait state to the landscape state, the user can observe that the range of the user interface outside the video playing area gradually darkens and finally becomes black, which gives a good experience and reduces the possibility that the user is disturbed by other elements when watching the video. For example, the user interface of the electronic device switching from the portrait state to the landscape state according to the change process of the vertical-to-horizontal rotation animation shown in fig. 6A may refer to the user interfaces of fig. 3A to 3D described above.
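The parameter changes described for fig. 6A can be illustrated as time-based interpolation functions. The following is a minimal sketch, assuming linear interpolation (the actual system animation may use easing curves); the clamping behavior and method names are illustrative:

```java
// Sketch of the vertical-to-horizontal animation parameters, assuming linear
// interpolation over the stated durations (illustrative, not the system code).
final class VerticalToHorizontalAnim {
    static final long AREA_DURATION_MS = 400;  // video playing area: Scale + Rotation
    static final long MASK_DURATION_MS = 150;  // black mask layer: Alpha

    /** Rotation of the video playing area at time t (ms), in degrees: 0 -> 90. */
    static float rotationAt(long t) {
        return 90f * Math.min(t, AREA_DURATION_MS) / AREA_DURATION_MS;
    }

    /** Scale of the video playing area at time t, from the portrait size to the landscape size. */
    static float scaleAt(long t, float startScale, float endScale) {
        float f = (float) Math.min(t, AREA_DURATION_MS) / AREA_DURATION_MS;
        return startScale + (endScale - startScale) * f;
    }

    /** Alpha of the black mask layer at time t: 0% (transparent) -> 100% (black). */
    static float maskAlphaAt(long t) {
        return Math.min(t, MASK_DURATION_MS) / (float) MASK_DURATION_MS;
    }

    /** The Position switches to the landscape (maximized) position only when Rotation reaches 90 degrees. */
    static boolean useLandscapePosition(long t) {
        return rotationAt(t) >= 90f;
    }
}
```

The horizontal-to-vertical animation of fig. 6B would be the mirror image: the scale shrinks, the mask alpha runs from 100% to 0%, and the position switches back to the portrait position at 90 degrees.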
Fig. 6B is a schematic diagram illustrating the change process of the rotation animation switched from the landscape state to the portrait state (for short, a horizontal-to-vertical rotation animation) in the target video playing scene provided by the present application.
Referring to fig. 6B, the horizontal-to-vertical rotation animation may likewise be composed of two parts: a portion for the video playing area and a portion for the black mask layer. The displayed content of the horizontal-to-vertical rotation animation is composed of the content of the video playing area and the black mask layer. The horizontal-to-vertical rotation animation is set with a duration. Illustratively, the duration of the horizontal-to-vertical rotation animation for the video playing area is set to 400 ms, and the duration of the horizontal-to-vertical rotation animation for the black mask layer is set to 150 ms. In the embodiment of the present application, these durations are merely examples and are not limited.
As shown in fig. 6B, the horizontal-to-vertical rotation animation for the video playing area may include the following factors: size (Scale), rotation angle (Rotation), and position (Position). The Scale shown in fig. 6B changes from an initial size to a final size within 0-400 ms, specifically: the size of the video playing area in the landscape state gradually decreases to the size of the video playing area in the portrait state within 0-400 ms. The Rotation shown in fig. 6B changes from 0 degrees to 90 degrees within 0-400 ms, with the rotation center at the center of the video playing area, specifically: the horizontal-to-vertical rotation animation rotates 90 degrees counterclockwise or clockwise in the plane of the screen about the center of the video playing area within 0-400 ms. The Position shown in fig. 6B changes from an initial position to a final position within 0-400 ms, specifically: at the moment the Rotation reaches 90 degrees, the position of the video playing area in the landscape state, that is, the position of maximized display, is switched to the position of the video playing area in the portrait state.
The horizontal-to-vertical rotation animation for the black mask layer may also include the following factor: transparency (Alpha). The Alpha shown in fig. 6B changes from 100% to 0% within 0-150 ms, specifically: the black mask layer gradually changes from black to transparent within 0-150 ms. In this way, during the switching of the electronic device from the landscape state to the portrait state, the user can observe that the range of the user interface outside the video playing area gradually becomes transparent, which gives a good experience. For example, the user interface of the electronic device switching from the landscape state to the portrait state according to the change process of the horizontal-to-vertical rotation animation shown in fig. 6B may refer to the user interfaces of fig. 4A to 4D described above.
In the embodiment of the present application, other transparency values may also be set, for example 5%-95%, which is not limited.
In some implementations, the vertical-to-horizontal rotation animation shown in fig. 6A may be referred to as a first rotation animation, and the horizontal-to-vertical rotation animation shown in fig. 6B may be referred to as a second rotation animation.
Optionally, the electronic device may determine its initial state from the screenshot of the first user interface taken in S503, that is, whether the electronic device is switching from the portrait state to the landscape state or from the landscape state to the portrait state, so as to determine which of the two rotation animations corresponding to the target video playing scene is to be used.
S506-2, the electronic equipment plays a default rotation animation.
In some implementations, the default rotation animation described above may refer to the entire user interface being rotated 90 degrees clockwise or counterclockwise about the center of the user interface in the plane of the screen. For example, the user interface of the electronic device in the process of switching between landscape and portrait when not in the target video playing scene may refer to the user interfaces of fig. 2A to 2D, which are not described herein.
In some implementations, in the process of switching the electronic device from the portrait state to the landscape state, besides playing the vertical-to-horizontal rotation animation, the elements other than the video playing area need to remain as they were in the portrait-state interface until the last frame of the vertical-to-horizontal rotation animation, after which the user interface of the electronic device switches to the landscape-state interface. Similarly, in the process of switching the electronic device from the landscape state to the portrait state, besides playing the horizontal-to-vertical rotation animation, the elements other than the video playing area need to remain as they were in the landscape-state interface until the last frame of the horizontal-to-vertical rotation animation, after which the user interface of the electronic device switches to the portrait-state interface. In this way, during the landscape/portrait switching of the electronic device, the user observes the rotation change of only the video playing area, so that the user can stay immersed in the video. This reduces the possibility that the user is disturbed by other elements when watching the video, and the user experience is good.
S507, the electronic equipment displays a second user interface.
In some implementations, after the electronic device plays the rotational animation, a second user interface can be displayed. The second user interface may be a user interface that the electronic device displays with a width greater than a height when the electronic device is in a landscape state. For example, the user interface of the electronic device in the landscape state may refer to fig. 1B and fig. 4A.
In one possible implementation, the rotation animation is not played at the moment the electronic device starts rotating, because after the rotation triggers the landscape/portrait switching, time is still required for the determination described above. Thus, at the moment the electronic device plays the rotation animation, the electronic device may not yet be in a completely landscape state, that is, the long side of the electronic device may not be parallel to the ground plane.
The display direction of the second user interface may be a direction in which one side of the electronic device is directed to the other side. For example, the display orientation of the second user interface may refer to the second display orientation shown in fig. 1B.
S508, the electronic equipment detects user operation for switching the electronic equipment from the horizontal screen state to the vertical screen state.
In some implementations, the electronic device may also perform a freeze screen operation after detecting a user operation to switch the electronic device from a landscape screen state to a portrait screen state.
In some implementations, the user operations described above may include the following two types: one is an operation in which the user rotates the electronic device; the other is a click operation, touch operation, voice instruction, or gesture instruction acting on a reduced display control.

In the case where the user operation is an operation of rotating the electronic device, in some implementations, the electronic device may acquire sensor data in real time in the powered-on state. The electronic device may detect the operation of the user rotating the electronic device by analyzing the sensor data.
In some implementations, the electronic device may be placed on a plane parallel to the ground plane, placed perpendicular to the ground plane, or placed at an angle to the ground plane. For example, a mobile phone may be placed on a desktop or held by a user. The electronic device may be rotated in any of these placements. The user rotating the electronic device may refer to: with the electronic device in the portrait state or in the landscape state, the screen of the electronic device rotates clockwise or counterclockwise in a plane perpendicular to the ground, or clockwise or counterclockwise in a plane at an angle to the ground. Specifically, when the electronic device rotates while placed perpendicular to the ground, the angle formed between the electronic device and the ground needs to be greater than a preset threshold (for example, 45 degrees). The user rotating the electronic device may further refer to: with the electronic device in the landscape state, the screen of the electronic device rotates clockwise or counterclockwise in a plane perpendicular to the screen, or clockwise or counterclockwise in a plane at an angle to the screen.
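The tilt condition above (an angle with the ground greater than a preset threshold such as 45 degrees) can be checked from gravity-sensor components. The following is a hypothetical sketch, not the actual sensor-processing code of the electronic device; the class and method names are illustrative:

```java
// Hypothetical sketch: deciding from gravity components (m/s^2) whether the
// screen plane is tilted more than the preset threshold (45 degrees in the
// text's example) from the ground plane, so that a rotation gesture counts.
final class RotationGate {
    static final double MIN_TILT_DEG = 45.0; // preset threshold from the text's example

    /** Tilt of the screen plane from horizontal. gx, gy lie in the screen
     *  plane; gz is along the axis perpendicular to the screen. */
    static double tiltFromGroundDeg(double gx, double gy, double gz) {
        double inPlane = Math.hypot(gx, gy);           // gravity within the screen plane
        return Math.toDegrees(Math.atan2(inPlane, Math.abs(gz)));
    }

    static boolean rotationAccepted(double gx, double gy, double gz) {
        return tiltFromGroundDeg(gx, gy, gz) > MIN_TILT_DEG;
    }
}
```

A phone lying flat on a desk (gravity entirely along the screen normal) would yield a tilt of 0 degrees and be rejected, while a phone held upright would yield 90 degrees and be accepted.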
In the case where the user operation is a click operation, touch operation, voice instruction, or gesture instruction acting on the reduced display control, the electronic device may trigger the landscape/portrait switching in response to the user operation.
It is worth noting that, in the display method for switching between landscape and portrait provided by the embodiment of the present application, the electronic device has not enabled the screen-orientation lock function.
In some implementations, the user operation in S508 may be referred to as a second operation.
S509, the electronic device draws a third user interface.
The drawing described above merely means that the third user interface is generated; as in S504, the electronic device does not display the third user interface at this time. The third user interface may be a user interface displayed by the electronic device when the electronic device is in the portrait state. When the electronic device is in the portrait state, the width of the interface displayed by the electronic device is smaller than its height. For example, the user interface of the electronic device in the portrait state may refer to fig. 1A and fig. 4D.
In some implementations, in the event that the electronic device is not in the target video playback scene,
in one possible implementation, the electronic device has detected a user operation for switching the electronic device from the landscape state to the portrait state while the second user interface is displayed. At this time, the second user interface is also drawn according to the drawing rule of the application program. Therefore, after detecting the user operation for switching the electronic device from the horizontal screen state to the vertical screen state, the electronic device can acquire the elements and layout information in the third user interface according to the drawing rule of the application program, and draw the third user interface. In this case, the first user interface may be consistent with the third user interface.
In another possible implementation, while the second user interface is displayed, the electronic device detects the user operation for switching the electronic device from the landscape state to the portrait state after the second user interface has been updated. At this time, the updated second user interface is drawn based on the user's update operation, and the elements it displays may differ significantly from those of the first user interface. Therefore, after detecting the user operation for switching the electronic device from the landscape state to the portrait state, the electronic device can, according to the drawing rule of the application program, acquire the elements and layout information in the third user interface and draw the third user interface. In this case, the first user interface is inconsistent with the third user interface.
In some implementations, in the case where the electronic device is in a target video playback scene,
in one possible implementation, the electronic device may update due to the video playing area playing video or may not update due to the video playing area pausing playing video while the second user interface is displayed. But since the second user interface is a full screen display of the video playing area, the third user interface also needs to include elements other than the video playing area. Therefore, after detecting the user operation for switching the electronic device from the horizontal screen state to the vertical screen state, the electronic device can acquire the elements and layout information in the third user interface according to the drawing rules provided by the application program, and draw the third user interface in real time.
The relevant description for drawing may refer to the relevant description in S504, which is not described herein.
In some implementations, the electronic device may also obtain the information of the SurfaceView in the third user interface after drawing the third user interface.
S510, the electronic equipment judges whether the current electronic equipment is in the target video playing scene or not according to the second user interface and the third user interface.
In some implementations, the electronic device may also perform a defrost operation before determining whether the current electronic device is in the target video playback scene.
That is, the electronic device may perform the unfreezing operation after acquiring the information of the SurfaceView in the third user interface, or may perform the unfreezing operation after determining whether the current electronic device is in the target video playing scene. In the embodiment of the present application, it is only necessary to ensure that the screen-unfreezing operation is performed before the rotation animation is played, which is not limited here.
In the embodiment of the present application, not every video that uses a SurfaceView meets the requirement for playing the landscape/portrait rotation animation of the target video playing scene. The electronic device can be understood to be currently in the target video playing scene when: the video playing area in the third user interface is not displayed in full screen and its width is greater than its height; and the video playing area in the second user interface is displayed in full screen. The electronic device needs to determine whether it is in the target video playing scene before playing the rotation animation. That is, the electronic device needs to determine whether the above requirements are met both in the second user interface and in the third user interface.
Further, for the electronic device to be currently in the target video playing scene, the following must also be satisfied: the height of the video playing area in the third user interface is greater than the first height threshold; the width of the video playing area in the second user interface is greater than the width threshold; and the height of the video playing area in the second user interface is greater than the second height threshold. That is, the electronic device needs to determine whether it is in the target video playing scene according to the acquired information of the SurfaceView in the third user interface and the acquired information of the SurfaceView in the second user interface. The details of determining whether the electronic device is in the target video playing scene may be referred to in the following description of fig. 7, and will not be explained here.
In the case where the electronic device determines that it is currently in the target video playing scene, S511-1 may be performed. In the case where the electronic device determines that it is not currently in the target video playing scene, S511-2 may be performed. Specifically, when the user interface displayed by the electronic device does not include a video playing area, or the displayed user interface includes a video playing area but the video playing area does not meet the above requirements, the electronic device may determine that it is not currently in the target video playing scene. For example, the scenes shown in fig. 1A-1B and fig. 2A-2D are not the target video playing scene.
S511-1, the electronic equipment plays the rotary animation corresponding to the target video playing scene.
In some implementations, the duration for which the electronic device is rotated is greater than or equal to the duration for which the rotation animation is played. In one possible implementation, the electronic device is still rotating during the execution of S508-S510 described above, so that the duration of the rotation of the electronic device is greater than the duration for which the rotation animation is played.
In some implementations, the rotational animation corresponding to the target video playback scene may be provided by a system of the electronic device. In one possible implementation, the first frame of the rotational animation corresponding to the target video playing scene may include a second user interface displayed in a landscape state of the electronic device, and the last frame of the rotational animation corresponding to the target video playing scene may include a third user interface displayed in a portrait state of the electronic device.
In another possible implementation manner, the first frame of the rotary animation corresponding to the target video playing scene does not include the second user interface displayed by the electronic device in the landscape screen state, and the last frame of the rotary animation corresponding to the target video playing scene does not include the third user interface displayed by the electronic device in the portrait screen state.
In the embodiment of the application, the electronic device displays the third user interface after playing the rotation animation.
In some implementations, in S511-1, the rotational animation corresponding to the target video playback scene may further include a rotational animation that is switched from a landscape state to a portrait state.
The rotary animation for switching the horizontal screen state to the vertical screen state may be described with reference to fig. 6B, which is not described herein.
S511-2, the electronic device plays a default rotation animation.
In some implementations, the default rotation animation described above may refer to the entire user interface being rotated 90 degrees clockwise or counterclockwise about the center of the user interface in the plane of the screen. For example, the user interface of the electronic device in the process of switching between landscape and portrait when not in the target video playing scene may refer to the user interfaces of fig. 2A to 2D, which are not described herein.
In some implementations, in the process of switching the electronic device from the landscape state to the portrait state, besides playing the horizontal-to-vertical rotation animation, the elements other than the video playing area need to remain as they were in the landscape-state interface until the last frame of the horizontal-to-vertical rotation animation, after which the user interface of the electronic device switches to the portrait-state interface. In this way, during the landscape/portrait switching of the electronic device, the user observes the rotation change of only the video playing area, so that the user can stay immersed in the video. This reduces the possibility that the user is disturbed by other elements when watching the video, and the user experience is good.
How to determine whether the current electronic device is in the target video playing scene according to the first user interface and the second user interface in the above step S505 is specifically described below with reference to the method flowchart shown in fig. 7. Further, the electronic device may determine whether it is currently in the target video playing scene according to the information of the SurfaceView in the interface in the first direction and the information of the SurfaceView in the interface in the second direction.
Fig. 7 illustrates a method flow of the electronic device determining whether it is in a target video playback scene. The method flow is exemplarily introduced in the case that the electronic equipment is switched from the vertical screen state to the horizontal screen state.
As shown in fig. 7, the method specifically includes:
S701, i=1, num=N.
In some implementations, the electronic device may obtain the number num of SurfaceViews of the first user interface of the electronic device in the portrait state, where num has the value N. N is a non-negative integer. In addition, the electronic device may set a parameter i whose value is 1. Here, the parameter i is set in consideration of the case where the current scene contains multiple SurfaceViews. In this way, the logic of the method flow for determining whether the electronic device is in the target video playing scene can be better followed.
S702, i <= num?
In some implementations, step S702 represents the electronic device determining whether the number of SurfaceViews of the first user interface is greater than or equal to 1. That is, the electronic device may determine whether a SurfaceView exists in the first user interface.
If the electronic device determines that i is less than or equal to num, that is, that a SurfaceView exists in the first user interface, this indicates that the first user interface of the electronic device in the portrait state may include a video playing area, and a further determination needs to be made on the video playing area that may be included in the first user interface, so S703 is executed.
If the electronic device determines that i is greater than num, that is, that no SurfaceView exists in the first user interface, this indicates that the first user interface of the electronic device is not a video playing scene, and thus S708-2 is executed.
S703, acquiring information of the i-th SurfaceView (SurfaceView_i) of the first user interface.
In some implementations, after obtaining the information of one or more SurfaceViews in the first user interface, the electronic device may traverse the SurfaceViews to obtain the information of the i-th SurfaceView, that is, the information of SurfaceView_i. For example, when i has the value 1, step S703 indicates that the electronic device obtains the information of the 1st SurfaceView in the first user interface, that is, the information of SurfaceView_1. The information of SurfaceView_1 may include, but is not limited to: the width of the 1st SurfaceView traversed in the first user interface, the height of the 1st SurfaceView, the name of the 1st SurfaceView, and so on.
S704, width of SurfaceView_i > height, and height > Threshold?
Here, width represents the width of SurfaceView_i, height represents the height of SurfaceView_i, and Threshold represents the height threshold of the SurfaceView in the portrait state. For example, the electronic device may set the height threshold to 300 pixels. In the embodiment of the present application, the Threshold may take other values, which is not limited.
In some implementations, the target video playing scene in the portrait state in the embodiments of the present application needs to satisfy not only that the video playing area is displayed non-full-screen with a width greater than its height, but also that the height of the video playing area is greater than the height threshold. The height threshold ensures that the video playing area displayed in the first user interface of the electronic device in the portrait state is not too small, which would give the user a poor impression. The display method of landscape/portrait switching provided by the embodiment of the present application is mainly applied to scenes in which the video playing area in the landscape-state interface is displayed in full screen, and the aspect ratio of the video playing area in the landscape-state interface is approximately the same as that of the video playing area in the portrait-state interface. If the aspect ratios are inconsistent, the video will appear compressed or stretched after the landscape/portrait switching, giving a poor user experience. Therefore, corresponding requirements are placed on the width and height of the video playing area in the portrait-state interface. In addition, the aspect-ratio judgment adopted in the embodiment of the present application is intended to reduce misjudgment, and the proportional relationship of most target video playing areas in the embodiment of the present application should be reasonable.
From the above, the electronic device can determine whether the width of the video playing area in the vertical screen state is greater than its height by comparing width with height, and determine whether the height of the video playing area in the vertical screen state is greater than the height threshold by comparing height with Threshold.
When the electronic device determines that width is greater than height and height is greater than Threshold, that is, when the electronic device determines that SurfaceView_i in the interface in the vertical screen state meets the requirement of the embodiments of the present application on the target video playing area, this indicates that the first user interface of the electronic device is a target video playing scene. It is then further necessary to determine whether the second user interface of the electronic device in the horizontal screen state is a video playing scene, so S705 is executed.
The electronic device determines that SurfaceView_i in the first user interface in the vertical screen state does not meet the requirement of the embodiments of the present application on the video playing area in any of the following cases:
1. width is less than or equal to height, and height is less than or equal to Threshold;
2. width is less than or equal to height, and height is greater than Threshold;
3. width is greater than height, and height is less than or equal to Threshold.
Any of these cases indicates that SurfaceView_i of the first user interface of the electronic device in the vertical screen state does not satisfy the requirement of the target video playing scene, so S709 is executed. Here, the electronic device performs S709 to determine whether all SurfaceViews in the vertical screen state fail to meet the requirement of the target video playing scene, and then performs S708-2.
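The vertical-screen judgment in S704 and its three failing cases can be sketched as follows. This is an illustrative sketch only; the function name, parameter names, and the 300-pixel default are assumptions taken from the example in the text, not the original implementation.

```python
# Hypothetical sketch of the S704 portrait-state check: a SurfaceView
# qualifies as a target video playing area only when its width exceeds
# its height AND its height exceeds the height threshold.
PORTRAIT_HEIGHT_THRESHOLD = 300  # pixels; example value from the text

def is_portrait_target(width: int, height: int,
                       threshold: int = PORTRAIT_HEIGHT_THRESHOLD) -> bool:
    """Return True when SurfaceView_i meets the vertical-screen requirement."""
    return width > height and height > threshold

# The three failing cases enumerated above all return False:
# 1) width <= height and height <= threshold
# 2) width <= height and height >  threshold
# 3) width >  height and height <= threshold
```

Any False result corresponds to taking the S709 branch for the next SurfaceView.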
S705, does a SurfaceView with the same name as SurfaceView_i exist among the SurfaceViews acquired from the second user interface?
In some implementations, in S504 above, the electronic device has already acquired the SurfaceView information in the second user interface in the horizontal screen state.
Therefore, according to the names of the SurfaceViews contained in the SurfaceView information of the second user interface in the horizontal screen state, the electronic device can judge whether a SurfaceView with the same name as SurfaceView_i exists in the drawn second user interface. In this way, the electronic device can directly exclude scenes in which the second user interface in the horizontal screen state has no video playing area.
When the electronic device determines that a SurfaceView with the same name as SurfaceView_i exists in the horizontal screen state, that is, when the electronic device determines that the video playing area corresponding to SurfaceView_i is displayed on the interface in both the vertical screen state and the horizontal screen state, this indicates that the second user interface of the electronic device in the horizontal screen state includes the video playing area. It is then further necessary to judge the video playing area in the second user interface in the horizontal screen state, so S706 is executed.
When the electronic device determines that no SurfaceView with the same name as SurfaceView_i exists in the second user interface, that is, when the video playing area corresponding to SurfaceView_i is displayed in the first user interface in the vertical screen state but is not displayed in the second user interface in the horizontal screen state, this indicates that the interface of the electronic device in the horizontal screen state does not include the video playing area corresponding to SurfaceView_i, and therefore S709 is executed. The electronic device executes S709 here to determine whether all SurfaceViews in the horizontal screen state differ from the SurfaceViews in the vertical screen state. This is because, in general, if a SurfaceView exists on the interface in the horizontal screen state, a SurfaceView with the same name should also exist in the vertical screen state. The case in which all SurfaceViews in the horizontal screen state differ from those in the vertical screen state may include the case in which no SurfaceView exists in the horizontal screen state at all. Therefore, the electronic device performs S709 to determine whether none of the SurfaceViews in the second user interface in the horizontal screen state meets the requirement of the target video playing scene, and then performs S708-2.
S706, obtaining information of the SurfaceView with the same name as SurfaceView_i.
In some implementations, after determining that a SurfaceView with the same name as SurfaceView_i exists in the second user interface, the electronic device may acquire the information of that SurfaceView in the horizontal screen state according to the name of SurfaceView_i. For example, when i takes the value 1 and the name of SurfaceView_1 in the vertical screen state is aaa, step S706 indicates that the electronic device obtains the SurfaceView information named aaa in the interface in the horizontal screen state, that is, the information of the SurfaceView with the same name as SurfaceView_1. The information of the same-name SurfaceView_1 may include, but is not limited to: the width of the SurfaceView named aaa in the horizontal screen state, the height of the SurfaceView named aaa, and the like. In the embodiments of the present application, the SurfaceView in the horizontal screen state with the same name as SurfaceView_i may be referred to simply as the same-name SurfaceView_i.
S707, width_s > height_s, and width_s > SC_W × Ratio, and height_s > SC_H × Ratio?
Here, width_s represents the width of the same-name SurfaceView_i, height_s represents the height of the same-name SurfaceView_i, SC_W represents the screen width of the electronic device in the horizontal screen state, SC_H represents the screen height of the electronic device in the horizontal screen state, and Ratio represents the required ratio of the width of the video playing area to the full-screen width of the electronic device and of its height to the full-screen height. The product of SC_W and Ratio represents the width threshold for maximized display of the video playing area, and the product of SC_H and Ratio represents the height threshold for maximized display of the video playing area. By way of example, the electronic device may set Ratio to 0.9. In the embodiments of the present application, Ratio may take other values, which may be set according to the actual effect; this is not limited here. In some implementations, the above Ratio may be referred to as a first value.
In some implementations, the target video playing scene in the horizontal screen state mentioned in the embodiments of the present application requires not only that the width of the video playing area be greater than its height, but also that the video playing area be displayed in full screen. Full screen display (which may also be referred to as maximized display) may mean that the width of the video playing area is approximately equal to the screen width of the electronic device in the horizontal screen state and its height is approximately equal to the screen height of the electronic device. This is because the screen aspect ratios of electronic devices produced by different manufacturers may differ, and the screen aspect ratio of an electronic device may differ from the aspect ratio of the video playing area. If the width and the height of the video playing area were made exactly equal to those of the screen of the electronic device in the horizontal screen state, the video would appear compressed or stretched. Therefore, the electronic device sets corresponding requirements for the width and the height of the video playing area in the horizontal screen state; that is, the electronic device may set the proportional threshold based on the aspect ratio of the electronic device and the aspect ratio of the video playing area. This enables the electronic device to determine, based on the proportional threshold, whether there is a maximized video playing area in the horizontal screen state. In general, in the horizontal screen state a certain distance exists between the video playing area and the left and right edges of the screen, and the range between the video playing area and the screen edges may be filled with black, giving the user a good visual impression.
From the above, the electronic device may determine whether the width of the video playing area in the horizontal screen state is greater than its height by comparing width_s with height_s, determine whether the width of the video playing area in the horizontal screen state is greater than the width threshold for maximized display by comparing width_s with the product of SC_W and Ratio, and determine whether the height of the video playing area in the horizontal screen state is greater than the height threshold for maximized display by comparing height_s with the product of SC_H and Ratio.
When the electronic device determines that width_s is greater than height_s, width_s is greater than the product of SC_W and Ratio, and height_s is greater than the product of SC_H and Ratio, that is, when the electronic device determines that the same-name SurfaceView_i in the interface in the horizontal screen state meets the requirement of the embodiments of the present application on the video playing area, this indicates that the electronic device is in a target video playing scene in the horizontal screen state, so S708-1 is executed.
If the electronic device determines that width_s is less than or equal to height_s, and/or width_s is less than or equal to the product of SC_W and Ratio, and/or height_s is less than or equal to the product of SC_H and Ratio, that is, if the electronic device determines that the same-name SurfaceView_i in the interface in the horizontal screen state does not meet the requirement of the embodiments of the present application on the video playing area, this indicates that the same-name SurfaceView_i of the electronic device in the horizontal screen state does not belong to a target video playing scene, and thus S709 is executed. Here, the electronic device performs S709 to determine whether all SurfaceViews in the horizontal screen state with the same names as the SurfaceViews in the vertical screen state fail to meet the requirement of the target video playing scene, and then performs S708-2.
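The S707 horizontal-screen judgment can be sketched as follows. This is an illustrative sketch; the function and parameter names and the 0.9 default are assumptions taken from the example values in the text, not the original implementation.

```python
# Hypothetical sketch of the S707 landscape-state check: the same-name
# SurfaceView qualifies only when the video playing area is wider than it
# is tall AND is maximized, i.e. wider than SC_W * Ratio and taller than
# SC_H * Ratio.
RATIO = 0.9  # example value of the "first value" from the text

def is_landscape_target(width_s: int, height_s: int,
                        sc_w: int, sc_h: int, ratio: float = RATIO) -> bool:
    """Return True when the same-name SurfaceView is a maximized video area."""
    return (width_s > height_s
            and width_s > sc_w * ratio
            and height_s > sc_h * ratio)
```

For a 2400 × 1080 landscape screen with ratio 0.9, a 2340 × 1052 video area passes (2340 > 2160 and 1052 > 972), while a half-width 1200 × 1052 area fails the width threshold.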
S708-1, determining that the current electronic device is in the target video playing scene.
In some implementations, after the electronic device determines through S704 and S707 that the width and the height of the video playing area in both the vertical screen state and the horizontal screen state meet the requirements, it may determine that the electronic device is in the target video playing scene. It should be noted that once the electronic device, in executing S701 to S707, finds a SurfaceView in the vertical screen state and a same-name SurfaceView in the horizontal screen state that both meet the requirements, that is, determines that the electronic device is in the target video playing scene, the process ends; it is not necessary to continue executing S701 to S707 for all remaining SurfaceViews in the vertical screen state.
S708-2, determining that the electronic device is not in the target video playing scene.
In some implementations, when the electronic device determines through S704 that the width and the height of the video playing area in the vertical screen state do not meet the requirements, and/or determines through S707 that the width and the height of the video playing area in the horizontal screen state do not meet the requirements, it may determine that the electronic device is not in the target video playing scene.
S709, i++.
In some implementations, step S709 is used to traverse all the acquired SurfaceViews in the vertical screen state, checking whether the width and the height of each acquired SurfaceView satisfy the judgment of the target video playing scene, thereby ensuring the comprehensiveness of the judgment.
The above describes an exemplary process of determining whether the electronic device is in the target video playing scene while the electronic device switches from the vertical screen state to the horizontal screen state. In the embodiments of the present application, the process of determining whether the electronic device is in the target video playing scene while switching from the horizontal screen state to the vertical screen state may likewise refer to the flowchart shown in fig. 7, and is not described again here.
By the above method, it can be judged whether the electronic device is in the target video playing scene, so that the electronic device can subsequently play the rotation animation corresponding to the target video playing scene.
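The whole judgment flow of fig. 7 (traverse every vertical-screen SurfaceView, look up its same-name counterpart, and apply both size checks) can be sketched end to end. All names, data shapes, and threshold values here are illustrative assumptions, not the patent's implementation; SurfaceView info is modeled as plain dicts with "name", "width", and "height".

```python
# Minimal sketch of the fig. 7 flow: returns True as soon as one
# SurfaceView pair qualifies (S708-1), False after the full traversal
# finds none (S708-2).
def in_target_video_scene(portrait_views, landscape_views,
                          sc_w, sc_h, threshold=300, ratio=0.9):
    by_name = {v["name"]: v for v in landscape_views}
    for v in portrait_views:                      # S701/S709: traverse all
        if not (v["width"] > v["height"] and      # S704: portrait check
                v["height"] > threshold):
            continue
        same = by_name.get(v["name"])             # S705: same-name lookup
        if same is None:
            continue
        if (same["width"] > same["height"] and    # S707: landscape check
                same["width"] > sc_w * ratio and
                same["height"] > sc_h * ratio):
            return True                           # S708-1: scene confirmed
    return False                                  # S708-2: not in scene
```

Early return on the first qualifying pair mirrors the note that the process ends without traversing the remaining SurfaceViews.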
In one possible implementation, the electronic device may also enable video playback through TextureView. For example, online video playing through a web page in an electronic device may take the form of TextureView. The electronic device may further determine whether the current scene is a video playing scene by acquiring information of TextureView. In the embodiment of the present application, the display method for implementing the switching between the horizontal and vertical screens through the information of TextureView is similar to the method flow shown in fig. 5, and will not be described herein.
In one possible implementation, in addition to rotating the electronic device, related user operations may also trigger the switching of the horizontal and vertical screens and the playing of the rotation animation. Specifically, when the electronic device is in the target video playing scene, the electronic device may, in response to a click operation of the user on a full-screen display control in the video playing area of the first user interface, play the portrait-to-landscape rotation animation and then display the second user interface. The electronic device may also, in response to a click operation of the user on a zoom-out display control in the video playing area of the second user interface, play the landscape-to-portrait rotation animation and then display the first user interface.
Next, an Operating System (OS) interaction diagram of the electronic device corresponding to the method flow in fig. 5 is described with reference to fig. 8.
Sensor, windowManagerService (WMS for short), surfaceFlinger and system actions shown in fig. 8 are modules for managing interface display and rotation animation included in the electronic device, and for definition of these modules, reference may be made to the following detailed description at the software architecture of the electronic device, and specific roles of these modules are described in the following flow and are not repeated herein.
As shown in fig. 8, the OS interaction of the display method for switching between horizontal and vertical screens provided in the embodiment of the present application includes the following steps:
S801, the sensor of the electronic device collects sensor data.
Specifically, the sensor of the electronic device may collect sensor data including, but not limited to, any one or more of the following: acceleration data, gyroscope data, and the like. The sensor may include, but is not limited to, an acceleration sensor and a gyroscope sensor. Specifically, the acceleration sensor may acquire the magnitude of the acceleration of the electronic device in various directions (for example, the X, Y, and Z axes), and detect the inclination and movement of the electronic device from it. The gyroscope sensor may acquire the magnitude of the angular velocity of the electronic device in various directions (for example, the X, Y, and Z axes), thereby detecting the rotation and inclination of the electronic device. For a specific description of collecting sensor data, reference may be made to the related description of S501 above.
S802, the sensor of the electronic device sends the sensor data to a window management service WMS.
Specifically, the sensor of the electronic device may send acceleration data, gyroscope data, etc. to the WMS after collecting the sensor data.
S803, the WMS of the electronic device determines that the electronic device is rotating.
Specifically, the WMS of the electronic device may determine that the screen of the electronic device rotates with the electronic device according to the received sensor data. That is, the WMS of the electronic device may detect an operation of the user to rotate the electronic device by analyzing the sensor data. The WMS of the electronic device may also determine, based on the sensor data, a direction in which the screen of the electronic device rotates with the electronic device. For a specific description of determining that the electronic device is rotated, reference may be made to the above description of S502.
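One generic way such a rotation decision can be derived from accelerometer data is sketched below. This is a hedged illustration of the technique, not the WMS's actual implementation: gravity dominates the axis pointing "down", so comparing the magnitudes of the X and Y components gives a coarse portrait/landscape estimate.

```python
# Illustrative sketch: when the device is upright, gravity (~9.8 m/s^2)
# lies mostly along the Y axis; when it is rotated to landscape, gravity
# lies mostly along the X axis.
def coarse_orientation(ax: float, ay: float) -> str:
    """Return 'portrait' or 'landscape' from acceleration along X and Y."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would additionally debounce the signal and combine it with gyroscope data before committing to a screen rotation.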
S804, the WMS of the electronic device executes the screen freezing operation and screen capturing aiming at the first user interface.
After the WMS of the electronic equipment determines that the electronic equipment rotates, the switching of the horizontal screen and the vertical screen can be triggered.
Specifically, after the WMS of the electronic device determines that the electronic device rotates, the screen freezing operation may be performed with respect to the interface before rotation, and the screen capturing may be performed on the interface before rotation, to obtain a screenshot. The freeze screen operation may refer to the electronic device pausing the drawing and refreshing of the interface and not responding to any touch operation. For a specific description of S804, reference may be made to the description related to S503 above.
S805, the WMS of the electronic device sends, to the SurfaceFlinger, a request to acquire the information of the SurfaceView in the first user interface.
After the WMS of the electronic device captures the first user interface, it may obtain the information of the SurfaceViews in the first user interface stored in the SurfaceFlinger. The SurfaceFlinger of the electronic device may first obtain the composed layers (Composition layers) in the first user interface; Composition layers is effectively a collection of layers. It then filters out the layers whose names carry the field 'SurfaceView', and acquires the name of each such layer and its boundary information (such as the upper-left and lower-right position information of the rectangular layer).
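The name-based filtering step described above can be sketched as follows. The layer records and field names are assumptions for illustration; real SurfaceFlinger layers carry far more state than a name and a rectangle.

```python
# Illustrative sketch: keep only layers whose name carries the
# 'SurfaceView' field, returning each layer's name and rectangular
# bounds (upper-left / lower-right positions).
def filter_surfaceview_layers(composition_layers):
    result = []
    for layer in composition_layers:
        if "SurfaceView" in layer["name"]:
            result.append({"name": layer["name"], "bounds": layer["bounds"]})
    return result
```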
S806, the SurfaceFlinger of the electronic device sends SurfaceView information in the first user interface to the WMS.
Specifically, after receiving the request sent by the WMS, the SurfaceFlinger of the electronic device may send the information of the SurfaceViews in the first user interface to the WMS. The information of the SurfaceViews in the first user interface may include, but is not limited to: the content to be displayed by each SurfaceView in the first user interface, the number of SurfaceViews contained in the first user interface, the names of the SurfaceViews contained in the first user interface, the height of each SurfaceView in the first user interface, the width of each SurfaceView in the first user interface, and so on.
S807, the WMS of the electronic device draws a second user interface.
Specifically, the WMS of the electronic device may draw the second user interface in the second direction after the post-rotation second direction has been determined in step S703 above. For a specific description of drawing the interface in the second direction, reference may be made to the related description of S504 above.
S808, the WMS of the electronic device sends data requesting to acquire information of the SurfaceView in the second user interface to the SurfaceFlinger.
S809, the SurfaceFlinger of the electronic device sends SurfaceView information in the second user interface to the WMS.
Specifically, after receiving the request sent by the WMS, the SurfaceFlinger of the electronic device may send the information of the SurfaceViews in the second user interface to the WMS. The information of the SurfaceViews in the second user interface may include, but is not limited to: the content to be displayed by each SurfaceView in the second user interface, the number of SurfaceViews contained in the second user interface, the names of the SurfaceViews contained in the second user interface, the height of each SurfaceView in the second user interface, the width of each SurfaceView in the second user interface, and so on.
S810, the WMS of the electronic device performs the screen-unfreeze operation.
Specifically, the WMS of the electronic device may perform the screen-unfreeze operation after acquiring the information of the SurfaceView in the interface in the second direction.
S811, the WMS of the electronic device determines that the current electronic device is in a target video playing scene.
Specifically, the WMS of the electronic device determines that the current scene is the target video playing scene according to the acquired information of the SurfaceView in the first user interface and the acquired information of the SurfaceView in the second user interface. For a specific description of determining the target video play scene, reference may be made to the description related to the flowchart shown in fig. 7.
S812, the WMS of the electronic equipment sends data of the rotary animation corresponding to the target video playing scene to the system dynamic effect.
S813, the system dynamic effect of the electronic equipment plays the rotary animation corresponding to the target video playing scene.
Specifically, the electronic device may determine whether to play the portrait-to-landscape rotation animation or the landscape-to-portrait rotation animation of the target video playing scene. For a specific description of the rotation animation, reference may be made to the related description in S506-1 above.
In some implementations, the WMS of the electronic device may also determine that the current scene is not a video playing scene. The system effects may then employ a default rotational animation.
The following describes related structures of the electronic device provided in the embodiments of the present application.
Fig. 9 shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may be a device running a given operating system; this is not limited here. The electronic device 100 may be a cell phone, tablet, desktop computer, laptop, handheld computer, notebook, ultra-mobile personal computer (UMPC), netbook, cellular telephone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, vehicle-mounted device, smart home device, and/or smart city device. The electronic device 100 may also be a non-portable terminal device, such as a laptop computer with a touch-sensitive surface or touch panel, a desktop computer with a touch-sensitive surface or touch panel, and so forth. The embodiments of the present application do not particularly limit the specific type of the electronic device.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL).
The I2S interface may be used for audio communication.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals.
The UART interface is a universal serial data bus for asynchronous communications.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like.
The GPIO interface may be configured by software.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc.
The modem processor may include a modulator and a demodulator.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
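The DSP's conversion of the digital image signal into a standard RGB format can be sketched as follows. This is an illustrative sketch only: the coefficients below are the BT.601 full-range YUV-to-RGB ones, an assumption — the actual matrix a device's image pipeline uses depends on the sensor format and color standard.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0-255 each) to RGB.

    Illustrative only: real ISPs/DSPs pick the conversion matrix from
    the configured color standard (BT.601, BT.709, ...).
    """
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))  # keep results in 8-bit range
    return clamp(r), clamp(g), clamp(b)
```

For a neutral sample (u = v = 128) the chroma terms vanish and the RGB channels all equal the luma value.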
The digital signal processor is used to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
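The frequency-bin energy mentioned above can be illustrated with a direct single-bin DFT. This is a naive sketch for clarity; a real DSP would use an optimized FFT implementation.

```python
import math

def bin_energy(samples, k):
    """Energy of DFT bin k of a real-valued signal (naive O(n) per bin)."""
    n = len(samples)
    re = sum(s * math.cos(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return re * re + im * im  # squared magnitude of the bin
```

A pure cosine at bin k concentrates its energy in that bin, with (n/2)^2 energy for a unit-amplitude tone, and contributes essentially nothing to other bins.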
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
Divided according to operating principle, the flash memory may include NOR flash, NAND flash, 3D NAND flash, etc.; divided according to the potential order of the memory cells, it may include single-level cells (single-level cell, SLC), multi-level cells (multi-level cell, MLC), triple-level cells (triple-level cell, TLC), quad-level cells (quad-level cell, QLC), etc.; divided according to storage specification, it may include universal flash storage (universal flash storage, UFS), embedded multimedia cards (embedded multi media card, eMMC), etc.
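The practical difference between the cell types listed above is how many bits each cell stores, which scales raw capacity linearly. The sketch below is a simplified model that ignores ECC, spare area, and over-provisioning; the function name is illustrative.

```python
# Bits stored per cell for each NAND cell type (simplified model).
BITS_PER_CELL = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}

def raw_capacity_bits(num_cells, cell_type):
    """Raw capacity implied by the cell type, ignoring ECC and over-provisioning."""
    return num_cells * BITS_PER_CELL[cell_type]
```

The same physical cell count yields four times the raw capacity in QLC as in SLC, at the cost of endurance and write speed.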
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "voice transmitter", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance for which the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory game scenarios.
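The "distance to be compensated" computed from the shake angle can be approximated with a small-angle lens-shift model. This is a hypothetical simplification: real OIS systems integrate per-axis gyro rates over time and apply per-device calibration data.

```python
import math

def shake_compensation_mm(focal_length_mm, shake_angle_deg):
    """Approximate lateral image shift on the sensor caused by an angular shake.

    The OIS actuator would move the lens by this amount in the opposite
    direction. Simplified pinhole model; real systems use calibration data.
    """
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

With a 5 mm focal length, a 1-degree shake shifts the image by roughly 0.087 mm, which is the displacement the lens module must cancel.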
In some implementations, the gyro sensor 180B may be used to acquire gyro data and send the gyro data to the processor 110.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied in landscape/portrait screen switching, pedometer, and other applications.
In some implementations, the acceleration sensor 180E may be used to acquire acceleration data and send the acceleration data to the processor 110.
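A minimal sketch of how a processor might map raw acceleration data to a coarse screen orientation for landscape/portrait switching. The axis sign conventions and threshold here are assumptions; production implementations low-pass filter the signal and add hysteresis so the orientation does not flicker near the boundary.

```python
def orientation_from_accel(ax, ay, threshold=4.9):
    """Infer coarse orientation from accelerometer x/y components (m/s^2).

    When the device is held upright, gravity dominates the y axis; when
    held sideways, the x axis. Below the threshold the device is treated
    as lying flat. Axis conventions are illustrative assumptions.
    """
    if abs(ay) >= abs(ax) and abs(ay) > threshold:
        return "portrait" if ay > 0 else "portrait-reversed"
    if abs(ax) > threshold:
        return "landscape" if ax > 0 else "landscape-reversed"
    return "flat"
```

The processor 110 could run such a classification on each batch of sensor data and trigger the landscape/portrait switch only when the inferred orientation changes.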
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K, also referred to as a "touch device".
The keys 190 include a power-on key, a volume key, etc.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, or to indicate a message, a missed call, a notification, etc.
Based on the OS interaction diagram shown in fig. 8, a software architecture block diagram provided in the embodiment of the present application is described below.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 10 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, the Android runtime (Android Runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 10, the application package may include applications such as music, video, talk, bluetooth, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 10, the application framework layer may include a window management service (WindowManagerService, WMS), a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
Wherein the window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The window management service is a system-level window service that provides window-display services for each application. The WMS may be configured to detect a request from each application to switch the display direction and, in response to the request, initiate a task that schedules the associated modules to respond. An internal class of the WMS may also serve as a bridge through which the java layer interacts with the native layer, for the WMS to call. The WMS may also be used to perform screen-freezing and screen-unfreezing operations. The WMS may also be used to determine whether the current scene is a video playing scene. The WMS may also send a request to SurfaceFlinger to retrieve SurfaceView information. The WMS may also send a request to the system animation module to obtain the rotation animation of a video playing scene.
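The sequence described here — freeze the screen, query whether both interfaces contain a SurfaceView, fetch the rotation animation when the scene is a video scene, then unfreeze — can be modeled as a toy workflow. All class and callback names below are hypothetical, not the actual Android framework APIs.

```python
class WindowManagerServiceSketch:
    """Toy model of the WMS rotation workflow (names are hypothetical)."""

    def __init__(self, has_surfaceview, fetch_animation):
        self.has_surfaceview = has_surfaceview  # stands in for a SurfaceFlinger query
        self.fetch_animation = fetch_animation  # stands in for the system animation module
        self.frozen = False
        self.log = []

    def handle_rotation_request(self, ui_before, ui_after):
        self.frozen = True                       # freeze the screen first
        self.log.append("freeze")
        # video scene: both the old and the new interface contain a SurfaceView
        is_video = self.has_surfaceview(ui_before) and self.has_surfaceview(ui_after)
        if is_video:
            self.log.append(self.fetch_animation("video"))  # fetch rotation animation
        self.frozen = False                      # unfreeze after the switch
        self.log.append("unfreeze")
        return is_video
```

In this model a rotation of a non-video interface simply freezes and unfreezes the screen, while a video scene additionally fetches the dedicated rotation animation.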
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture. The view system also includes SurfaceFlinger, which may be used for the composition of layers and may also provide a storage function for various elements in the interface.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows applications to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of background-running applications, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
Most classes of the Framework layer are simply "intermediaries" through which applications use library files in the system library. Because upper-layer applications are written in the java language and need the most direct support of java interfaces, while the system library supports operation in another language (such as the C++ language), the Framework layer acts as an intermediary between the application layer and the system library: each module in the Framework layer does not truly implement a specific function, or implements only part of it, and leaves the main work to the core library to finish. For example, AudioSystem, AudioTrack, and AudioFlinger in the Framework layer also have corresponding classes in the system library; the difference is that the Framework layer is mostly written in the java language, while the system library is mostly written in the C++ language.
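The "intermediary" pattern described here can be sketched as a thin facade that delegates the real work to a stand-in for the native system library. The names below are illustrative only, not the real AudioTrack API.

```python
class _NativeAudio:
    """Stand-in for the C++ system-library implementation (hypothetical)."""

    def play(self, track):
        return "native-play:" + track


class AudioTrackFacade:
    """Framework-layer 'intermediary': exposes a simple interface to
    applications while delegating the actual work to the native layer."""

    def __init__(self):
        self._native = _NativeAudio()

    def play(self, track):
        # No real logic here: the facade forwards the call downward.
        return self._native.play(track)
```

An application calls the facade's simple interface; the facade itself implements nothing substantial and only forwards the call, mirroring the Framework-to-system-library division of labor described above.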
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing. The 2D graphics engine may include a system animation module that may be used to provide the rotation animation.
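The rotation animation this application centers on — the video playing area rotating 90 degrees about its own center while other elements keep their positions — reduces to a per-frame 2D rotation of the area's corners. This is a geometric sketch under the simplifying assumption that the area's size stays constant, although the description also allows the area to grow during the animation.

```python
import math

def video_corners_at(t, cx, cy, w, h):
    """Corner positions of the video area at animation progress t in [0, 1].

    The w x h rectangle rotates 90 * t degrees about its own center
    (cx, cy); elements outside the video area keep their coordinates.
    """
    angle = math.radians(90 * t)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    corners = [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    # Rotate each corner offset about the center, then translate back.
    return [(cx + x * cos_a - y * sin_a, cy + x * sin_a + y * cos_a)
            for x, y in corners]
```

Sampling t from 0 to 1 over the animation duration yields the gradual 90-degree turn; at t = 0 the corners are the original rectangle, and at t = 1 each corner has moved a quarter turn around the center.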
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver. The display driver may be configured to receive an instruction sent by the WMS to display the second user interface, and display the second user interface on the screen of the electronic device. The sensor driver may be used to collect sensor data in real time.
It should be understood that each step in the above-described method embodiments provided in the present application may be implemented by an integrated logic circuit of hardware in a processor or an instruction in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor or in a combination of hardware and software modules in a processor.
The application also provides an electronic device, which may include: memory and a processor. Wherein the memory is operable to store a computer program; the processor may be operative to invoke a computer program in the memory to cause the electronic device to perform the method of any of the embodiments described above.
The present application also provides a chip system including at least one processor for implementing the functions involved in the method performed by the electronic device in any of the above embodiments.
In one possible design, the system on a chip also includes a memory to hold program instructions and data, the memory being located either within the processor or external to the processor.
The chip system may be formed of a chip or may include a chip and other discrete devices.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, there may be one or more memories in the system-on-chip. The memory may be integrated with the processor or may be separate from the processor; embodiments of the present application are not limited in this respect. For example, the memory may be a non-transitory memory, such as a ROM, which may be integrated on the same chip as the processor or separately disposed on different chips; the type of the memory and the manner of disposing the memory and the processor are not specifically limited in the embodiments of the present application.
Illustratively, the system-on-chip may be a field programmable gate array (field programmable gate array, FPGA), an application specific integrated circuit (application specific integrated circuit, ASIC), a system on chip (system on chip, SoC), a central processing unit (central processor unit, CPU), a network processor (network processor, NP), a digital signal processing circuit (digital signal processor, DSP), a microcontroller (micro controller unit, MCU), a programmable logic device (programmable logic device, PLD), or another integrated chip.
The present application also provides a computer program product comprising: a computer program (which may also be referred to as code, or instructions), which when executed, causes a computer to perform the method performed by the electronic device in any of the embodiments described above.
The present application also provides a computer-readable storage medium storing a computer program (which may also be referred to as code, or instructions). The computer program, when executed, causes a computer to perform the method performed by the electronic device in any of the embodiments described above.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces, in whole or in part, a flow or function as described herein. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.
In summary, the foregoing description is only exemplary embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made according to the disclosure of the present invention should be included in the protection scope of the present invention.

Claims (20)

1. The display method for switching the horizontal screen and the vertical screen is characterized by comprising the following steps of:
the method comprises the steps that the electronic equipment displays a first user interface in a vertical screen state, wherein the first user interface comprises a first video playing area and a first element outside the first video playing area;
after the electronic equipment detects a first operation, a first rotation animation is played, in the first rotation animation, the first video playing area in the first user interface rotates by 90 degrees around the center of the first video playing area relative to the electronic equipment gradually, and the first element does not rotate relative to the electronic equipment;
The electronic equipment displays a second user interface in a horizontal screen state, the second user interface comprises a second video playing area, the first video playing area and the video played by the second video playing area are the same, the size of the second video playing area is larger than that of the first video playing area,
the display direction of the first user interface points to a second side from a first side of the electronic equipment, the display direction of the second user interface points to a fourth side from a third side of the electronic equipment, the first side is perpendicular to the third side and the fourth side, and the second side is perpendicular to the third side and the fourth side.
2. The method of claim 1, wherein prior to the electronic device playing the first rotational animation, the method further comprises:
the electronic equipment acquires a screenshot of the first user interface, wherein the screenshot is used for realizing that the first element does not rotate relative to the electronic equipment;
the electronic equipment draws the second user interface;
the electronic equipment determines that the electronic equipment is in a target video playing scene according to the first user interface and the second user interface, wherein the target video playing scene comprises: the first user interface and the second user interface each include a scene of a video playback area.
3. The method according to claim 2, wherein the electronic device renders the second user interface, in particular comprising:
and the electronic equipment draws the second user interface according to drawing rules provided by an application program to which the first user interface belongs.
4. A method according to claim 2 or 3, wherein the target video playing scene specifically comprises:
the width of the first video playing area in the first user interface is larger than the height, the width of the second video playing area in the second user interface is larger than the height, and the second video playing area occupies a scene with the proportion of the screen being larger than the first value.
5. A method according to claim 2 or 3, wherein the target video playing scene specifically comprises:
the width of the first video playing area in the first user interface is larger than the height, the height of the first video playing area in the first user interface is larger than a first height threshold, the width of the second video playing area in the second user interface is larger than the height, and the proportion of the second video playing area to the screen is larger than a scene with a first value.
6. The method according to any one of claims 2-5, further comprising:
the electronic equipment judges whether the first user interface contains a SurfaceView or not;
the electronic equipment judges whether the second user interface contains a SurfaceView according to the drawn second user interface;
and determining that the electronic equipment is in a target video playing scene under the condition that the first user interface and the second user interface contain the same surface view.
7. The method of any of claims 1-6, wherein the electronic device turns off a screen orientation lock function before the electronic device detects the first operation.
8. The method of any of claims 1-7, wherein the content displayed by the first video playback area in the first rotational animation is content displayed by the first video playback area in the first user interface when the electronic device detects the first operation.
9. The method of any of claims 1-8, wherein the first video play area in the first rotational animation gradually increases in size.
10. The method of any one of claims 1-9, wherein the first operation comprises: an operation in which the electronic device is rotated in a plane other than the plane where the ground is located, or a click operation acting on a full-screen play control in the first video playing area.
11. The method of any of claims 1-10, wherein after the electronic device detects the first operation, the method further comprises:
and the electronic equipment executes the frozen screen operation.
12. The method of claim 11, wherein after the playing the first rotational animation, the method further comprises:
and the electronic equipment executes the operation of thawing the screen.
13. The method of any of claims 1-12, wherein after the electronic device displays a second user interface in a landscape state, the method further comprises:
after the electronic device detects a second operation, playing a second rotation animation, in which the second video playing area in the second user interface is gradually rotated by 90 degrees around the center of the first video playing area relative to the electronic device, and the first element is not rotated relative to the electronic device;
The electronic device displays a third user interface in a vertical screen state, wherein the third user interface comprises the first video playing area, and elements except the first video playing area in the first user interface,
the display direction of the third user interface is directed by the first side of the electronic device to the second side.
14. The method of claim 13, wherein prior to the electronic device playing a second rotational animation in response to the second operation, the method further comprises:
and the electronic equipment draws the third user interface according to drawing rules provided by the application program to which the second user interface belongs.
15. The method of claim 13 or 14, wherein the content displayed by the second video playing area in the second rotating animation is content displayed by the second video playing area in the second user interface when the electronic device detects the second operation.
16. The method of any of claims 13-15, wherein the first video play area in the first rotational animation gradually increases in size, and wherein in the second rotational animation the electronic device further displays elements of the first user interface other than the first video play area.
17. The method of any of claims 12-16, wherein the second operation comprises: an operation in which the electronic device is rotated in a plane other than the plane where the ground is located, or a click operation acting on a zoom-out control in the second video playing area.
18. An electronic device comprising a memory, one or more processors; the memory is coupled with the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors invoke to cause the electronic device to perform the method of any of claims 1-17.
19. A chip for application to an electronic device, wherein the chip comprises one or more processors for invoking computer instructions to cause the electronic device to perform the method of any of claims 1-17.
20. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-17.
CN202310821107.1A 2023-07-06 2023-07-06 Display method for switching horizontal screen and vertical screen and related device Pending CN117687501A (en)

Publications (1)

Publication Number Publication Date
CN117687501A true CN117687501A (en) 2024-03-12



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination