WO2024001900A1 - Display method for folding screen, and associated device - Google Patents

Display method for folding screen, and associated device

Info

Publication number
WO2024001900A1
WO2024001900A1 (PCT/CN2023/101661, CN2023101661W)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
folding
angle
folding screen
display
Prior art date
Application number
PCT/CN2023/101661
Other languages
English (en)
Chinese (zh)
Inventor
魏昊霖
姜顺吉
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2024001900A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484: for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845: for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/147: using display panels

Definitions

  • the present application relates to the field of electronic technology, and in particular, to a display method for a folding screen and related equipment.
  • when the folding screen is in a fully unfolded state, the terminal device displays a two-dimensional picture according to the existing implementation, and the user can see the complete picture from a perspective looking down on the folding screen.
  • when the folding screen is in a half-folded state, the existing implementation displays two-dimensional images on the two screens respectively. Since there is an angle between the different screens, the display effect of the displayed image is poor.
  • Embodiments of the present application provide a display method for a folding screen, so that users can see images with richer content when the folding screen is in a semi-folded state, thereby improving the user's interactive experience.
  • the present application provides a display method for a folding screen, which is applied to an electronic device including a folding screen.
  • the method includes: displaying a first object on the folding screen when the folding angle of the folding screen is a first angle; and, upon detecting that the folding angle of the folding screen changes from the first angle to a second angle, displaying a second object according to the first object; wherein the first object is an image of a two-dimensional object, and the second object is an image of the three-dimensional object corresponding to the two-dimensional object.
  • in this way, the content on the display screen is transformed and replaced (for example, the original two-dimensional display content is replaced with three-dimensional display content), so that the user can see pictures with richer content when the folding screen is in a half-folded state, improving the user's interactive experience.
  • the second object can be displayed without the first object being displayed on the folding screen.
  • the first object is a user interface (UI), a photo, an icon, or a string.
  • the second object may contain semantic information of the first object.
  • the first object is a string with specific semantic information
  • the second object may be a three-dimensional object with specific semantics that the first object has.
  • the second object may contain image information of the first object.
  • the folding screen is an inward-folding screen and includes a first screen and a second screen; the method further includes: enabling the three-dimensional display function of the first object based on detecting that the folding screen satisfies a first preset condition; the first preset condition includes at least one of the following: the folding screen is in a half-folded state; or the angle between one of the first screen and the second screen and the horizontal plane is less than an angle threshold; or the change in the posture of the folding screen within a preset time is less than a threshold; or one of the first screen and the second screen is in close contact with a target table.
  • the folding screen is an inward-folding screen and includes a first screen and a second screen; the method further includes: enabling the three-dimensional display function of the first object based on detecting that the folding screen satisfies a second preset condition; the second preset condition includes at least one of the following: the folding screen is in a half-folded state; or the change in the posture of the folding screen within a preset time is less than a threshold; or a side of the folding screen perpendicular to the folding line is in close contact with a target table; or the angle between the side of the folding screen perpendicular to the folding line and the horizontal plane is less than an angle threshold.
  • the folding screen is an outward-folding screen and includes a first screen and a second screen; the method further includes: enabling the three-dimensional display function of the first object based on detecting that the folding screen satisfies a third preset condition; the third preset condition includes at least one of the following: the folding screen is in a half-folded state; or one of the first screen and the second screen is in close contact with a target table; or the angle between one of the first screen and the second screen and the horizontal plane is less than an angle threshold; or the change in the posture of the folding screen within a preset time is less than a threshold.
  • the folding screen is an outward-folding screen and includes a first screen and a second screen; the method further includes: enabling the three-dimensional display function of the first object based on detecting that the folding screen satisfies a fourth preset condition; the fourth preset condition includes at least one of the following: the folding screen is in a half-folded state; or the change in the posture of the folding screen within a preset time is less than a threshold; or the two sides of the folding screen away from the central folding line are in close contact with a target table; or the angle between the two sides of the folding screen away from the central folding line and the horizontal plane is less than an angle threshold.
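As an illustrative sketch (not part of the patent disclosure), the "at least one of the following" structure of the preset conditions can be expressed as a simple any-of check. All field names and threshold values below are hypothetical:

```python
# Hypothetical sketch of a preset condition for enabling the 3D display
# function: any one of the listed sub-conditions is sufficient.
from dataclasses import dataclass

@dataclass
class ScreenState:
    half_folded: bool        # fold angle within the half-folded range
    screen_tilt_deg: float   # smaller angle of screen A/B to the horizontal
    posture_change: float    # pose change over the preset time window
    on_table: bool           # one screen in close contact with a table

def should_enable_3d(s: ScreenState,
                     tilt_threshold: float = 10.0,
                     posture_threshold: float = 0.05) -> bool:
    """Return True if any sub-condition of the preset condition is met."""
    return (s.half_folded
            or s.screen_tilt_deg < tilt_threshold
            or s.posture_change < posture_threshold
            or s.on_table)
```

The disjunctive form mirrors the claim wording: detecting any single sub-condition enables the three-dimensional display function.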
  • the second object is an image of a three-dimensional object in the target perspective mapped to a plane where the folding screen is located.
  • the target perspective is related to the second angle.
  • the mapping relationship between each screen angle and the viewing angle of the three-dimensional object can be maintained.
  • the mapping relationship may be a mapping relationship between discrete data, or a mapping relationship between continuous data (for example, expressed through a functional relationship); after obtaining the screen angle, the target perspective corresponding to the second angle can be determined based on the mapping relationship between the screen angle and the viewing angle of the three-dimensional object.
  • the mapping relationship is a mapping relationship between discrete data
  • the target perspective corresponding to the second angle can be determined based on the mapping relationship between the screen angle (including the second angle) and the viewing perspective of the three-dimensional object.
  • the mapping relationship is a mapping relationship between continuous data
  • the second angle can be used as the independent variable of the mapping relationship to determine the target perspective (the dependent variable) corresponding to the second angle.
  • the mapping relationship can indicate that the viewing angle of the three-dimensional object changes as the screen angle changes. From the perspective of the display effect, as the folding angle changes, the picture of the second object is in a dynamically changing state (while the folding screen still meets the enabling conditions for the three-dimensional display function of the first object).
  • the second object may be an image of a three-dimensional object of the first object under the target perspective, and as the folding angle of the folding screen changes, the target perspective also changes accordingly.
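As a non-authoritative sketch of the mapping relationship described above, a discrete table of (fold angle, viewing angle) pairs can be linearly interpolated so that the viewing angle varies continuously with the fold angle. The anchor values below are hypothetical, not taken from the patent:

```python
# Illustrative fold-angle-to-viewing-angle mapping. A fully unfolded screen
# (180°) maps to a top-down view (90° elevation); smaller fold angles map
# to lower, more horizontal viewing angles. Values are hypothetical.
_ANGLE_MAP = [(60.0, 10.0), (90.0, 30.0), (120.0, 50.0), (180.0, 90.0)]

def target_view_angle(fold_angle: float) -> float:
    """Return the viewing elevation (degrees) for the given fold angle."""
    pts = _ANGLE_MAP
    if fold_angle <= pts[0][0]:
        return pts[0][1]
    if fold_angle >= pts[-1][0]:
        return pts[-1][1]
    # linear interpolation between neighbouring anchor points
    for (a0, v0), (a1, v1) in zip(pts, pts[1:]):
        if a0 <= fold_angle <= a1:
            t = (fold_angle - a0) / (a1 - a0)
            return v0 + t * (v1 - v0)
    return pts[-1][1]
```

With such a mapping, re-rendering the three-dimensional object each time the fold angle changes yields the dynamically changing picture the claims describe.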
  • the method further includes: during the process in which the folding angle of the folding screen changes from the first angle to the second angle, displaying multi-frame images obtained by mapping the three-dimensional object under the changing viewing angle onto the plane where the folding screen is located; or, while the folding angle of the folding screen changes from the first angle to the second angle, displaying the first object on the folding screen until the folding angle of the folding screen changes to the second angle, and then displaying the second object.
  • the first angle and the second angle are less than or equal to 180 degrees; when the second angle is less than the first angle, the target perspective continuously changes toward a horizontal viewing angle; when the second angle is greater than the first angle, the target perspective continuously changes toward a vertical viewing angle.
  • displaying the second object includes: displaying the second object at a display position matching the first object on the first screen and the second screen.
  • before displaying the second object, the method further includes: determining the target screen viewing angle according to the second angle; and, based on the pose relationship between the target screen viewing angle and the folding screen, stretching and deforming the image of the three-dimensional object so as to map the three-dimensional object onto the plane of the folding screen to obtain the second object.
  • the difference between the target screen viewing angle and the center line of the second angle is within a preset range.
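The constraint above (target screen viewing angle close to the center line of the fold angle) can be sketched as a simple check. This is an illustrative interpretation under the assumption that the center line is the bisector of the fold angle, with a hypothetical tolerance value:

```python
# Hypothetical sketch: the center line (angle bisector) of a fold angle
# sits at half the fold angle measured from either screen. The target
# viewing angle is accepted when it deviates from this bisector by no
# more than a preset tolerance.

def is_view_angle_valid(fold_angle: float, view_angle: float,
                        tolerance: float = 10.0) -> bool:
    """Check the view angle against the bisector of the fold angle."""
    bisector = fold_angle / 2.0  # angle of the center line from one screen
    return abs(view_angle - bisector) <= tolerance
```

Keeping the viewing angle near the bisector gives both screens a comparable share of the rendered three-dimensional object.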
  • the folding screen includes a first screen and a second screen
  • the second object includes a first sub-object and a second sub-object
  • the image includes a first image corresponding to the first sub-object and a second image corresponding to the second sub-object; mapping the image onto the folding screen by stretching and deforming the image includes: stretching and deforming the first image according to the positional relationship between the target perspective and the first screen so as to map the first image onto the first screen; and stretching and deforming the second image according to the positional relationship between the target perspective and the second screen so as to map the second image onto the second screen.
  • the folding screen includes a first screen and a second screen; displaying the first object includes: displaying the first object on the first screen and displaying a target picture on the second screen; displaying the second object includes: displaying the second object on the first screen and displaying the target picture on the second screen.
  • the target picture can be all or part of the content displayed on the second screen. Unlike the first object, the target picture on the second screen still follows two-dimensional display logic before and after folding.
  • when the folding angle of the folding screen is the first angle, the folding screen is in a fully unfolded state.
  • this application provides a folding screen display device, which is applied to an electronic device including a folding screen.
  • the device includes:
  • a display module, configured to display the first object on the folding screen when the folding angle of the folding screen is the first angle, and to display the second object according to the first object when the folding angle of the folding screen changes from the first angle to the second angle; wherein the first object is an image of a two-dimensional object, and the second object is an image of the three-dimensional object corresponding to the two-dimensional object.
  • the first object is a user interface (UI), a photo, an icon, or a string.
  • the second object contains semantic information or image information of the first object.
  • the folding screen is an inward folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • An enabling module configured to enable the three-dimensional display function of the first object based on detecting that the folding screen satisfies a first preset condition; the first preset condition includes at least one of the following:
  • the foldable screen is in a half-folded state
  • the angle between one of the first screen and the second screen and the horizontal plane is less than the angle threshold; or,
  • the change in the posture of the folding screen within the preset time is less than the threshold; or,
  • one of the first screen and the second screen is in close contact with the target table.
  • the folding screen is an inward folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • An enabling module configured to enable the three-dimensional display function of the first object based on detecting that the folding screen meets a second preset condition; the second preset condition includes at least one of the following:
  • the foldable screen is in a half-folded state
  • the change in the posture of the folding screen within the preset time is less than the threshold; or,
  • the side of the folding screen that is perpendicular to the folding line is in close contact with the target table; or,
  • the angle between the side perpendicular to the folding line on the folding screen and the horizontal plane is less than the angle threshold.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • An enabling module configured to enable the three-dimensional display function of the first object based on detecting that the folding screen satisfies a third preset condition;
  • the third preset condition includes at least one of the following:
  • the foldable screen is in a half-folded state
  • one of the first screen and the second screen is in close contact with the target table; or,
  • the angle between one of the first screen and the second screen and the horizontal plane is less than the angle threshold; or,
  • the change in the posture of the folding screen within the preset time is less than the threshold.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • An enabling module configured to enable the three-dimensional display function of the first object based on detecting that the folding screen satisfies a fourth preset condition;
  • the fourth preset condition includes at least one of the following:
  • the foldable screen is in a half-folded state
  • the change in the posture of the folding screen within the preset time is less than the threshold; or,
  • the two sides of the folding screen away from the central folding line are in close contact with the target table; or,
  • the angle between the two sides away from the central fold line on the folding screen and the horizontal plane is less than the angle threshold.
  • the second object is an image of a three-dimensional object in the target perspective mapped to a plane where the folding screen is located.
  • the target perspective is related to the second angle.
  • the display module is also used to:
  • during the process in which the folding angle of the folding screen changes from the first angle to the second angle, display multi-frame images obtained by mapping the three-dimensional object under the changing viewing angle onto the plane where the folding screen is located; or,
  • while the folding angle of the folding screen changes from the first angle to the second angle, display the first object on the folding screen until the folding angle changes to the second angle, and then display the second object.
  • the first angle and the second angle are less than or equal to 180 degrees;
  • when the second angle is smaller than the first angle, the target perspective continuously changes toward the horizontal viewing angle;
  • when the second angle is greater than the first angle, the target perspective continuously changes toward the vertical viewing angle.
  • the display module is specifically used for:
  • the second object is displayed at a display position matching the first object on the first screen and the second screen.
  • the device further includes:
  • the viewing angle determination module is used to determine the target viewing angle corresponding to the second angle according to the second angle and the mapping relationship between the screen angle of the folding screen and the viewing angle of the three-dimensional object before displaying the second object.
  • the display module is also used to:
  • the image of the three-dimensional object under the target viewing angle is stretched and deformed to map the three-dimensional object onto the plane of the folding screen to obtain the second object.
  • the difference between the target screen viewing angle and the center line of the second angle is within a preset range.
  • the folding screen includes a first screen and a second screen
  • the second object includes a first sub-object and a second sub-object
  • the image includes a first image corresponding to the first sub-object and a second image corresponding to the second sub-object.
  • a display module, specifically configured to: stretch and deform the first image according to the positional relationship between the target perspective and the first screen so as to map the first image onto the first screen; and stretch and deform the second image according to the positional relationship between the target perspective and the second screen so as to map the second image onto the second screen.
  • the folding screen includes a first screen and a second screen
  • a display module, specifically configured to:
  • display the second object on the first screen, and display the target picture on the second screen.
  • when the folding angle of the folding screen is the first angle, the folding screen is in a fully unfolded state.
  • the present application provides an electronic device, including: a folding screen, which is divided into a first screen and a second screen by a folding line when folded; one or more processors; one or more memories; and a sensor;
  • the sensor is used to collect data that enables the one or more processors to detect the angle between the first screen and the second screen;
  • the one or more memories are used to store computer program code, and the computer program code includes computer instructions; when the computer instructions are run on the processor, the electronic device is caused to perform the steps of the above first aspect and any one of its possible implementations.
  • the present application provides a computer storage medium that includes computer instructions.
  • when the computer instructions are run on an electronic device or a server, the steps of the first aspect and any one of its possible implementations are executed.
  • this application provides a computer program product.
  • when the computer program product is run on an electronic device or a server, the steps of the first aspect and any one of its possible implementations are executed.
  • the present application provides a chip system, which includes a processor to support an execution device or a training device in implementing the functions involved in the above aspects, for example, sending or processing the data and/or information involved in the above methods.
  • the chip system also includes a memory, which is used to save the program instructions and data necessary for the execution device or the training device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • Figure 1 is a schematic diagram of the product form of an electronic device provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of the product form of another electronic device provided by an embodiment of the present application.
  • Figure 3 is a schematic flowchart of a placement posture determination process provided by an embodiment of the present application.
  • Figure 4A is a schematic diagram of the electronic device provided by the embodiment of the present application in a first placement posture
  • FIG. 4B is a schematic diagram of the electronic device provided by the embodiment of the present application in a second placement posture
  • Figure 4C is a schematic diagram of the electronic device provided by the embodiment of the present application in a third placement posture
  • Figure 4D is a schematic diagram of the electronic device provided by the embodiment of the present application in a fourth placement posture
  • Figure 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
  • Figure 6A is a schematic diagram of a principle for calculating the angle α between screen A and screen B provided by an embodiment of the present application;
  • Figure 6B is a schematic diagram of an example of a geographical coordinate system provided by an embodiment of the present application.
  • Figure 6C is a schematic diagram of the software architecture of an electronic device provided by an embodiment of the present application.
  • Figure 6D is a schematic diagram of the positional relationship between an A screen or a B screen and a horizontal plane provided by an embodiment of the present application;
  • Figure 7 is a perspective view provided by an embodiment of the present application.
  • Figure 8 is a perspective view provided by an embodiment of the present application.
  • FIGS 9-24A are schematic interface diagrams provided by embodiments of the present application.
  • Figure 24B is a schematic flowchart of a folding screen display method provided by an embodiment of the present application.
  • FIGS. 25-37 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 38 is a schematic structural diagram of a folding screen display device provided by an embodiment of the present application.
  • Figure 39 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • Embodiments of the present application provide a display method for a folding screen, which method can be applied to electronic devices with folding screens.
  • the folding screen can be folded to form at least two screens.
  • a folding screen can be folded along a folding edge or a folding axis to form two screens, such as screen A and screen B.
  • the folding methods of the folding screen on the electronic device can be divided into two categories.
  • One type is a folding screen that folds outwards (referred to as an outward-folding folding screen, or an outward-folding screen)
  • the other type is a folding screen that folds inward (referred to as an inward-folding folding screen, or an inward-folding screen).
  • taking the case in which the foldable screen can be folded to form a first screen and a second screen as an example: after the outward-folding screen is folded, the first screen and the second screen face away from each other; after the inward-folding screen is folded, the first screen and the second screen face each other.
  • the first screen may be called screen A
  • the second screen may be called screen B.
  • FIG. 1 shows a schematic diagram of a product form of an electronic device 100 with an outward-folding screen provided by an embodiment of the present application.
  • (a) in Figure 1 is a schematic diagram of the fully unfolded form of the outward-folding folding screen.
  • the outward-folding screen can be folded along the folding edge in the directions 11a and 11b shown in (a) in Figure 1 to form, in the half-folded state shown in (b) in Figure 1, screen A (i.e., the first screen) and screen B (i.e., the second screen).
  • the outward-folding screen can continue to be folded along the folding edge in the directions 12a and 12b shown in (b) of Figure 1 to form the outward-folding screen in the fully folded state shown in (c) of Figure 1.
  • in the fully folded state, screen A (i.e., the first screen) and screen B (i.e., the second screen) both face outward and remain visible to the user.
  • when the folding screen is in a fully folded state or a half-folded state, the electronic device 100 can display interface content on screen A (i.e., the first screen) or screen B (i.e., the second screen).
  • when the folding screen is in a fully unfolded state, the electronic device 100 can display interface content on screen A (i.e., the first screen) and screen B (i.e., the second screen).
  • FIG. 2 shows a schematic diagram of a product form of an electronic device 100 with an inward-folding screen provided by an embodiment of the present application.
  • (a) in Figure 2 is a schematic diagram of the shape of the inward-folding folding screen when it is fully unfolded.
  • the inward-folding folding screen can be folded along the folding edge in the directions 21a and 21b shown in (a) in Figure 2 to form screens A and B in the semi-folded form shown in (b) in Figure 2 .
  • after the inward-folding screen is folded along the folding edge, screen A and screen B are as shown in (b) of Figure 2.
  • the inward-folding screen can continue to be folded along the folding edge in the directions 22a and 22b shown in (b) of Figure 2 to form the fully folded inward-folding screen shown in (c) of Figure 2.
  • screen A and screen B face each other and are invisible to the user.
  • the value range of the angle α between screen A and screen B of the folding screen (including an inward-folding screen and an outward-folding screen) of the electronic device 100 is [0°, 180°].
  • if α ∈ [0°, P1], the electronic device 100 can determine that the folding screen is in a fully folded form; if α ∈ (P1, P2), the electronic device 100 can determine that the folding screen is in a half-folded form; if α ∈ [P2, 180°], the electronic device 100 can determine that the folding screen is in a fully unfolded form.
  • P1 and P2 can be preset angle thresholds.
  • P1 and P2 may be determined based on the usage habits of a large number of users using the folding screen; or, P1 and P2 may be set by the user in the electronic device 100 .
  • the value range of the preset angle threshold P1 may be (0, 30°)
  • the value range of the preset angle threshold P2 may be (150°, 180°).
  • the preset angle threshold P1 may be 5°, 10°, 15°, 20°, etc.
  • the preset angle threshold P2 can be 155°, 160°, 165° or 170°, etc.
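The threshold logic above can be expressed as a short sketch. This is illustrative only; the concrete values of P1 and P2 below are hypothetical examples chosen from the ranges stated above, not values fixed by the patent:

```python
# Illustrative classification of the fold form from the angle α between
# screen A and screen B. P1 is drawn from (0°, 30°) and P2 from
# (150°, 180°), as described in the text; 15° and 165° are hypothetical.
P1 = 15.0
P2 = 165.0

def fold_state(alpha: float) -> str:
    """Classify the folding screen form from the inter-screen angle."""
    if 0.0 <= alpha <= P1:
        return "fully_folded"
    if P1 < alpha < P2:
        return "half_folded"
    if P2 <= alpha <= 180.0:
        return "fully_unfolded"
    raise ValueError("angle out of range [0°, 180°]")
```

The three return values correspond to the fully folded, half-folded, and fully unfolded forms shown in Figures 1 and 2.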
  • the at least two screens formed after the folding screen (including an inward-folding screen and an outward-folding screen) in the embodiment of the present application is folded may be multiple independent screens, or may be a complete screen of an integrated structure that is merely folded to form at least two parts.
  • the folding screen may be a flexible folding screen, and the flexible folding screen includes a folding edge made of flexible material. Part or all of the flexible folding screen is made of flexible materials.
  • the at least two screens formed after the flexible folding screen is folded are a complete screen with an integrated structure, but are folded to form at least two parts.
  • the above-mentioned folding screen may be a multi-screen folding screen.
  • the multi-screen folding screen may include multiple (two or more) screens.
  • the multiple screens are multiple individual display screens. These multiple screens can be connected in turn through folding shafts. Each screen can rotate around the folding axis connected to it to realize the folding of multi-screen folding screens.
  • FIG. 1 and FIG. 2 take the folding screen as a flexible folding screen as an example to illustrate the folding screen in the embodiment of the present application. Moreover, in the subsequent embodiments of the present application, the method provided by the embodiments of the present application will also be explained by taking the folding screen as a flexible folding screen as an example.
  • the electronic device 100 in the embodiment of the present application can be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) device, or a virtual reality (VR) device; the embodiments of the present application impose no special restrictions on the specific type of the electronic device.
  • FIG. 3 is a flow chart of the electronic device 100 identifying the stand mode of the folding screen in an embodiment of the present application.
  • the process for the electronic device 100 to identify the stand mode of the folding screen can be as follows:
  • the electronic device 100 can receive the user's folding operation. In response to the folding operation, the electronic device 100 can calculate the folding angle α of the folding screen (i.e., the angle between the first screen and the second screen).
  • when α ≤ P1, the electronic device 100 may determine that the folding screen is in a fully folded form (for example, as shown in (c) in Figure 1 or (c) in Figure 2 above).
  • when α ≥ P2, the electronic device 100 may determine that the folding screen is in an unfolded form (for example, as shown in (a) in Figure 1 or (a) in Figure 2 above).
  • when P1 < α < P2, the electronic device 100 may determine that the folding screen is in a half-folded form (for example, as shown in (b) in Figure 1 or (b) in Figure 2 above), where 0° < P1 < P2 < 180°.
  • the process by which the electronic device 100 calculates the folding angle ⁇ of the folding screen may refer to subsequent embodiments, and will not be described again here.
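The angle-based classification described above can be sketched as follows. The threshold names P1 and P2 follow the text; the function name and the default threshold values are illustrative assumptions, not part of the original disclosure.

```python
def classify_fold_state(alpha, p1=15.0, p2=165.0):
    """Classify the fold state from the folding angle alpha (degrees).

    p1 and p2 correspond to the preset angle thresholds P1 and P2,
    with 0 < p1 < p2 < 180; the defaults here are illustrative values
    within the ranges (0, 30) and (150, 180) given in the text.
    """
    if alpha <= p1:
        return "fully_folded"   # e.g. (c) in Figure 1 / Figure 2
    if alpha >= p2:
        return "unfolded"       # e.g. (a) in Figure 1 / Figure 2
    return "half_folded"        # e.g. (b) in Figure 1 / Figure 2
```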
  • the electronic device 100 can also determine whether the folding mode of the folding screen is outward folding (as shown in (b) in Figure 1 above) or inward folding (as shown in (b) in Figure 2 above).
  • the electronic device 100 can determine whether the electronic device 100 is placed in a dual-screen landscape standing form or a single-screen horizontal display form.
  • when the electronic device 100 is folded outward into a half-folded form and placed with the dual screens standing in landscape, the electronic device 100 can determine that it is in the two-person operation stand mode.
  • the electronic device 100 can determine that the electronic device 100 is in the viewing stand mode.
  • the electronic device 100 can determine whether the placement form of the electronic device 100 is a dual-screen vertical display or a single-screen horizontal display.
  • the electronic device 100 may determine that the electronic device 100 is in the reading stand mode.
  • the electronic device 100 can determine that the electronic device 100 is in the computer stand mode.
  • the electronic device 100 can determine that the electronic device 100 is in a dual-screen landscape standing display form.
  • side a is the outer edge parallel to the folding line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is folded into screen A (i.e., the first screen) and screen B (i.e., the second screen).
  • the folding angle α between screen A and screen B is in (P1, P2), where 0° < P1 < P2 < 180°.
  • side a and side b on the electronic device 100 are in contact with the real object, where the angle formed by side a and side b with the horizontal plane is within [0°, P3]; optionally, 0° < P3 ≤ 30°.
  • the electronic device 100 can determine that the electronic device 100 is in a single-screen horizontal display form, where 0° < P3 ≤ 30°, side a is the outer edge parallel to the fold line on the A-screen side of the electronic device 100, and side b is the outer edge parallel to the fold line on the B-screen side.
  • the folding screen of the electronic device 100 is folded into screen A (i.e., the first screen) and screen B (i.e., the second screen).
  • the folding angle α between screen A and screen B is in (P1, P2), where 0° < P1 < P2 < 180°.
  • the display surface of screen B on the electronic device 100 is in contact with the real object, where the angle between side a (i.e., the first side) and side b (i.e., the second side) and the horizontal plane is within [0°, P3]; optionally, 0° < P3 ≤ 30°.
  • the electronic device 100 can determine that the electronic device 100 is in a dual-screen vertical screen standing configuration.
  • where 60° ≤ P4 < 90°, side a is the outer edge parallel to the fold line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is folded into screen A (ie, the first screen) and screen B (ie, the second screen).
  • the folding angle α between screen A and screen B is in (P1, P2), where 0° < P1 < P2 < 180°.
  • the angle between side a (i.e., the first side) and side b (i.e., the second side) of the electronic device 100 and the horizontal plane is within [P4, 90°]; optionally, 60° ≤ P4 < 90°.
  • the electronic device 100 can determine that the electronic device 100 is in a single-screen horizontal placement form.
  • side a is the outer edge parallel to the folding line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is divided into screen A (ie, the first screen) and screen B (ie, the second screen).
  • the folding angle α between screen A and screen B is in (P1, P2), where 0° < P1 < P2 < 180°.
  • the back side of screen B (i.e., the second screen) on the electronic device 100 is in contact with the real object, where the angle between side a and side b and the horizontal plane is within [0°, P3]; optionally, 0° < P3 ≤ 30°.
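The placement-form decision for a half-folded device can be sketched in simplified form from the angle between sides a/b and the horizontal plane. The thresholds P3 and P4 follow the text; the function name and the boolean flag describing which part rests on the support are illustrative assumptions.

```python
def classify_placement(beta, edges_on_support, p3=30.0, p4=60.0):
    """Classify how a half-folded device is placed.

    beta: angle (degrees) between sides a/b (the outer edges parallel
          to the fold line) and the horizontal plane.
    edges_on_support: True if sides a and b rest on the surface,
          False if a screen (or its back) lies flat on the surface.
    p3 and p4 correspond to the thresholds P3 (0 < P3 <= 30) and
    P4 (60 <= P4 < 90) from the text.
    """
    if beta <= p3:
        # Edges near horizontal: standing on both edges vs. one
        # screen lying on the surface.
        if edges_on_support:
            return "dual_screen_landscape_stand"
        return "single_screen_horizontal"
    if beta >= p4:
        # Edges near vertical: book-like portrait stand.
        return "dual_screen_portrait_stand"
    return "undetermined"
```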
  • the field of view presented to the user by the folding screen will become larger.
  • the screen is regarded as a window, and the scenery observed through the window is regarded as the display image of the screen.
  • the half-folded state offers a larger viewing angle, so more scenery can be seen through the window, that is, images with a larger field of view can be observed. Reflected in the screen display content, this means that the screen in the half-folded state can provide larger and more three-dimensional display content than the fully unfolded state.
  • Figures 7 and 8 only show the principle of the inner folding screen. The same applies to the outer folding screen. The similarities will not be repeated here.
  • the folding screen when the folding screen is in a fully unfolded state, the user can see the distortion-free view by looking down at the folding screen.
  • the displayed picture changes with the folding state (for the fold line as the short side, see the left side of Figure 9; for the fold line as the long side, see the left side of Figure 11). When the folding screen is in a half-folded state and the user observes the folding screen, the picture displayed on the folding screen will have a certain distortion (for the fold line as the short side, see the right side of Figure 9; for the fold line as the long side, see the right side of Figure 11).
  • the best viewing position for the user to view the display screen is located on the center line (angle bisector) of the folding angle formed by the folding screen.
  • the folding screen provides the user with a larger observation field of view when the folding screen is in a half-folded state.
  • the content in the display screen is transformed and replaced (for example, the original two-dimensional display content is replaced with a three-dimensional display content), so that users can still see distortion-free and richer content when the folding screen is in a half-folded state.
  • the best viewing position can be understood as the position with the best viewing effect when the user watches the folding screen mobile phone screen.
  • the best viewing position can be unique. For example, when the foldable-screen mobile phone is fully unfolded, the viewing angle at which the user's line of sight is perpendicular to the plane of the screen is the best viewing angle.
  • the best viewing position is located on a straight line that passes through the center of the screen and is perpendicular to the plane of the screen; the specific position varies with screen size. When the folding angle of a folding-screen device changes, the best viewing position for the user of the folding-screen phone changes accordingly.
  • Figure 13 shows the change in the best viewing position from the fully unfolded state to the half-folded state.
  • the best viewing position can be determined according to preset rules. For example, taking the plane of one split screen of the folding-screen mobile phone as the horizontal plane, there is a mapping relationship between each folding angle and its best viewing position, and the coordinates of the best viewing position can be calculated from this mapping. Alternatively, the coordinates of the best viewing position corresponding to each folding angle can be stored in advance.
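The mapping-based approach above can be sketched as follows. The sample table values, coordinate frame, and linear interpolation are illustrative assumptions; the text only requires that a per-angle mapping (or precomputed coordinates) exists.

```python
# Hypothetical precomputed mapping: folding angle (degrees) -> best
# viewing position (x, z) in a frame where one split screen lies on
# the horizontal plane. The coordinates are illustrative placeholders.
BEST_VIEW = {
    180.0: (0.0, 30.0),
    150.0: (2.0, 26.0),
    120.0: (4.0, 22.0),
    90.0:  (6.0, 18.0),
}

def best_viewing_position(alpha):
    """Return the best viewing position for folding angle alpha by
    linearly interpolating between the two nearest stored angles."""
    angles = sorted(BEST_VIEW)
    if alpha <= angles[0]:
        return BEST_VIEW[angles[0]]
    if alpha >= angles[-1]:
        return BEST_VIEW[angles[-1]]
    for lo, hi in zip(angles, angles[1:]):
        if lo <= alpha <= hi:
            t = (alpha - lo) / (hi - lo)
            (x0, z0), (x1, z1) = BEST_VIEW[lo], BEST_VIEW[hi]
            return (x0 + t * (x1 - x0), z0 + t * (z1 - z0))
```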
  • when the folding-screen mobile phone is not placed on a flat surface, its attitude only needs to be determined in combination with sensors such as the gyroscope; the best viewing position for the user can then be determined according to the same principle.
  • FIG. 5 shows a schematic structural diagram of the electronic device 100.
  • the following uses the electronic device 100 as an example to describe the embodiment in detail. It should be understood that the electronic device 100 shown in FIG. 5 is only an example; the electronic device 100 may have more or fewer components than shown in FIG. 5, two or more components may be combined, or a different component configuration may be used.
  • the various components shown in Figure 5 may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled with the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to implement the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100 .
  • the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, display screen 194, wireless communication module 160, audio module 170, sensor module 180, etc.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt an interface connection manner different from those in the above embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, etc.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 is the above-mentioned outward-folding folding screen or inward-folding folding screen.
  • Display 194 includes a display panel.
  • the display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving music, video, and other files in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and at least one application program required by a function (such as a sound playback function or an image playback function).
  • the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal output, and is also used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called the "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device 100 can listen to music, or listen to hands-free calls, through the speaker 170A.
  • the receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
  • the microphone 170C, also called the "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the headphone interface 170D is used to connect wired headphones.
  • the headphone interface 170D may be a USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction to create a new short message is executed.
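The two-branch dispatch described above can be sketched as follows; the threshold value and the returned instruction names are illustrative assumptions, not values from the source.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed value)

def dispatch_touch_on_sms_icon(intensity):
    """Return the instruction executed for a touch on the short-message
    application icon, depending on the touch operation intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    # intensity greater than or equal to the first pressure threshold
    return "create_new_short_message"
```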
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • the gyro sensor 180B may be used to determine the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes).
  • the gyro sensor 180B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 180B detects the angle at which the electronic device 100 shakes, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shake of the electronic device 100 through reverse movement to achieve anti-shake.
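The compensation step described here can be approximated with a small-angle optical model; the focal length, the function name, and the model itself are assumptions for illustration, not details from the source.

```python
import math

def lens_compensation_mm(shake_angle_deg, focal_length_mm=26.0):
    """Approximate lens shift needed to offset a rotational shake: an angular
    shake of theta displaces the image by roughly f * tan(theta), so the lens
    module moves by that distance in the opposite direction (simplified
    model; the 26 mm focal length is an assumed example)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```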
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the display screen 194 of the electronic device 100 can be folded to form multiple screens.
  • Gyroscope sensors 180B may be provided in multiple screens for measuring the orientation of the corresponding screen (ie, the direction vector of the orientation).
  • the electronic device 100 can determine the angle between adjacent screens (for example, the angle between screen A and screen B) based on the orientation angle change of each screen measured by the gyro sensor 180B.
  • the folding screen (such as the above-mentioned display screen 194) of the electronic device 100 can be folded to form multiple screens.
  • Each screen may include a gyroscope sensor (such as the above-mentioned gyroscope 180B) for measuring the orientation of the corresponding screen (ie, the direction vector of the orientation).
  • the display screen 194 of the electronic device 100 can be folded to form an A screen (i.e., the first screen) and a B screen (i.e., the second screen).
  • the A screen and the B screen each include a gyro sensor 180B, used to measure the orientation of screen A and the orientation of screen B respectively.
  • the electronic device 100 can determine the angle between adjacent screens and the relationship between each screen and the horizontal plane based on the measured angle change in the orientation of each screen.
  • the foldable screen of the electronic device 100 can be folded to form screen A and screen B as shown in FIG. 6A.
  • the A screen is provided with a gyro sensor A
  • the B screen is provided with a gyro sensor B.
  • the gyro sensor A measures the orientation of screen A (ie, the direction vector of the orientation)
  • the gyro sensor B measures the orientation of screen B (ie, the direction vector of the orientation)
  • the principle by which the electronic device 100 calculates the angle α between screen A and screen B based on the orientation of screen A and the orientation of screen B will be explained below.
  • the coordinate system of the gyroscope sensor is a geographical coordinate system.
  • the origin O of the geographical coordinate system is located at the point where the carrier (ie, the device containing the gyroscope sensor, such as the electronic device 100) is located, the X-axis points to the east (E) along the local latitude line, the Y-axis points to the north (N) along the local meridian, and the Z-axis points upward along the local geographical vertical, forming a right-handed rectangular coordinate system with the X-axis and the Y-axis.
  • the plane formed by the X-axis and the Y-axis is the local horizontal plane
  • the plane formed by the Y-axis and the Z-axis is the local meridian plane. Therefore, it can be understood that the coordinate system of the gyro sensor is: taking the gyro sensor as the origin O, pointing east along the local latitude as the X axis, pointing north along the local meridian as the Y axis, and pointing upward along the local geographical vertical line (i.e. The opposite direction of the geographical vertical) is the Z-axis.
  • the electronic device 100 uses the gyro sensor provided in each screen to measure the direction vector of each screen in the coordinate system of the gyro sensor provided therein. For example, referring to the side view of the electronic device 100 shown in FIG. 6A, the electronic device 100 measures that the direction vector of screen A in the coordinate system of gyro sensor A is vector z1, and the direction vector of screen B in the coordinate system of gyro sensor B is vector z2.
  • Then, the electronic device 100 can calculate the angle α between vector z1 and vector z2 using the following formula (1): α = arccos((z1·z2)/(|z1|·|z2|)).
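The dot-product relation that formula (1) presumably takes can be sketched as a short numerical check; the function name and the example vectors are illustrative, not from the source.

```python
import math

def angle_between_deg(z1, z2):
    """Angle (degrees) between two direction vectors:
    alpha = arccos of the normalized dot product."""
    dot = sum(a * b for a, b in zip(z1, z2))
    n1 = math.sqrt(sum(a * a for a in z1))
    n2 = math.sqrt(sum(b * b for b in z2))
    # clamp to [-1, 1] to guard against floating-point drift before arccos
    cos_alpha = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_alpha))

# e.g. screens whose orientation vectors are perpendicular -> alpha = 90 degrees
```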
  • one or more other sensors may be used to measure the angle ⁇ between screen A and screen B.
  • an acceleration sensor can be installed in each screen of the folding screen.
  • the electronic device 100 (such as the processor 110) can use an acceleration sensor to measure the motion acceleration of each screen when it is rotated, and then calculate the rotation angle of one screen relative to the other based on the measured motion acceleration, that is, the angle α between screen A and screen B.
  • the above-mentioned gyro sensor can be a virtual gyro sensor formed by the cooperation of multiple other sensors.
  • the virtual gyro sensor can be used to calculate the angle between adjacent screens of the folding screen, that is, the angle α between screen A and screen B.
  • the electronic device 100 can also measure the angle ⁇ between side a on screen A (or side b on screen B) and the horizontal plane through the above-mentioned gyroscope 180B.
  • for side a and side b, reference may be made to the embodiment shown in FIGS. 4A to 4D above, which will not be described again here.
  • the foldable screen of the electronic device 100 can be folded (outwardly or inwardly) to form A screen and B screen.
  • the electronic device 100 uses the gyro sensor installed in each screen to measure, in the coordinate system of that gyro sensor, the direction vector of edge a on screen A (or edge b on screen B), where side a and side b are parallel.
  • FIG. 6D is a schematic diagram of the position of screen A and the horizontal plane on the electronic device 100 .
  • the electronic device 100 can measure that the direction vector of edge a of screen A in the coordinate system of gyro sensor A is vector z3, and that the direction vector of the normal of the horizontal plane in the coordinate system of gyro sensor A is vector z4. Therefore, the electronic device 100 can calculate the angle β between vector z3 and vector z4 using the following formula (2): β = arccos((z3·z4)/(|z3|·|z4|)).
  • the electronic device 100 can determine the angle ⁇ between the side a and the horizontal plane based on the measured direction vector of the side a of the electronic device 100 and the normal vector of the horizontal plane. Since side a and side b of the electronic device 100 are parallel, the angle between side a and the horizontal plane and the angle between side b and the horizontal plane are equal to ⁇ . In the same way, the electronic device 100 can determine the angle ⁇ between the side b and the horizontal plane based on the measured direction vector of the side b of the electronic device 100 and the normal vector of the horizontal plane.
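A minimal sketch of this edge-to-horizontal computation follows. It assumes, as the coordinate-system description above suggests, that z4 is the upward normal of the horizontal plane, so the angle between edge a and the plane itself is the complement of the angle between edge a and z4; the function names and the default normal are illustrative.

```python
import math

def vector_angle_deg(u, v):
    """Angle (degrees) between two vectors via the normalized dot product."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def edge_to_horizontal_deg(z3, z4=(0.0, 0.0, 1.0)):
    """Angle beta between edge a (direction vector z3) and the horizontal
    plane whose upward normal is z4 (assumed to be the Z axis of the gyro
    sensor's coordinate system)."""
    return abs(90.0 - vector_angle_deg(z3, z4))

# an edge lying in the horizontal plane -> beta = 0 degrees
```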
  • Air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may utilize the magnetic sensor 180D to detect opening and closing of the flip holster.
  • the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking on flip-open based on the detected opening and closing state of the leather case or the flip cover.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices and be used in horizontal and vertical screen switching, pedometer and other applications. It should be noted that in this embodiment of the present application, the display screen 194 of the electronic device 100 can be folded to form multiple screens. An acceleration sensor 180E may be included in each screen for measuring the orientation of the corresponding screen (ie, the direction vector of the orientation).
  • Distance sensor 180F for measuring distance.
  • Electronic device 100 can measure distance via infrared or laser. In some embodiments, in shooting scenarios, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outwardly through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect when the user holds the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen. It should be noted that, in the embodiment of the present application, the proximity light sensor 180G can be disposed on the A screen, the B screen, side a, side b, the back of the A screen, the back of the B screen, and other positions of the electronic device 100.
  • the proximity light sensor 180G can be used to detect whether the A screen, the B screen, side a, side b, the back of the A screen, and the back of the B screen of the electronic device 100 are in contact with a physical object.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touching.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • Temperature sensor 180J is used to detect temperature.
  • the electronic device 100 utilizes the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In some other embodiments, when the temperature is lower than another threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • Touch sensor 180K also called “touch panel”.
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • Bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human body's vocal part.
  • the bone conduction sensor 180M can also contact the human body's pulse and receive blood pressure beating signals.
  • the bone conduction sensor 180M can also be provided in an earphone and combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibrating bone obtained by the bone conduction sensor 180M to implement the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M to implement the heart rate detection function.
  • the buttons 190 include a power button, a volume button, etc.
  • Key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • touch operations for different applications can correspond to different vibration feedback effects.
  • the motor 191 can also respond to different vibration feedback effects for touch operations in different areas of the display screen 194 .
  • Different application scenarios such as time reminders, receiving information, alarm clocks, games, etc.
  • the touch vibration feedback effect can also be customized.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • This embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 6C is a software structure block diagram of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers. From top to bottom, they are the application layer (referred to as the application layer), the application framework layer (referred to as the framework layer), the kernel layer (also referred to as the driver layer), and the hardware platform.
  • the application layer can include a series of application packages. As shown in Figure 7, the application layer can include multiple application packages such as system applications and third-party applications.
  • the application package can be applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and desktop launcher.
  • the framework layer provides application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the framework layer can include an event generation module, a pattern recognition module, a data calculation module, a data reporting module, a window manager (window manager service, WMS), an activity manager (activity manager service, AMS), etc.
  • the framework layer may also include a content provider, a view system, a phone manager, a resource manager, a notification manager, etc. (not shown in the figure).
  • the window manager WMS is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the activity manager AMS is responsible for managing Activity and is responsible for the startup, switching, and scheduling of each component in the system, as well as the management and scheduling of applications.
  • the driver layer (not shown in the figure) is the layer between the hardware platform and the framework layer.
  • the kernel layer can include display drivers, input/output device drivers (for example, keyboard, touch screen, headphones, speakers, microphones, etc.), camera drivers, audio drivers, sensor drivers, etc.
  • Hardware platforms can include gyroscope sensors, proximity light sensors, displays, and more.
  • sensors included in the hardware platform reference may be made to the schematic hardware structure diagram of the electronic device 100 shown in FIG. 5 .
  • the user performs input operations on the electronic device 100 (such as folding operations on the folding screen), and the hardware platform can report sensor data collected by sensors such as gyroscope sensors and proximity light sensors to the data reporting module of the framework layer.
  • the data reporting module reports the sensor data to the data calculation module.
  • the data calculation module can calculate, based on the sensor data, the folding angle α of the folding screen of the electronic device 100, the folding direction, the angle β between side a (or side b) and the horizontal plane, physical-object detection results, etc., and report them to the pattern recognition module.
  • the pattern recognition module can determine the current folding shape of the folding screen (such as the folding angle of the folding screen) and the bracket mode based on the folding angle ⁇ , folding direction, physical detection results, etc. reported by the data calculation module.
  • for the manner in which the pattern recognition module identifies the bracket mode of the folding screen, reference can be made to the embodiment shown in FIG. 3 above, which will not be described again here.
  • the event generation module can report bracket mode events to the application layer.
  • System applications or third-party applications in the application layer can call the start-Activity interface and, through the activity manager AMS, set the application's bracket mode (for example, two-person operation bracket mode, viewing bracket mode, reading bracket mode or computer bracket mode, etc.), as well as the position and size of the application's display object in the bracket mode.
  • the window manager WMS of the framework layer draws the object according to the settings of the AMS, and then sends the object data to the display driver of the kernel layer (not shown in the figure); the display driver displays the corresponding application interface on the folding screen.
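The reporting chain described above (sensor data → data calculation → pattern recognition → event generation → application layer) can be sketched as a toy pipeline. The module functions, mode names, and thresholds below are illustrative assumptions, not values from the source.

```python
def recognize_bracket_mode(fold_angle, edge_to_horizontal):
    """Toy stand-in for the pattern recognition module: classify the folding
    form from the fold angle and the angle of side a/b to the horizontal
    plane (all thresholds assumed)."""
    if fold_angle >= 170.0:
        return "fully_unfolded"
    if 60.0 <= fold_angle <= 130.0 and edge_to_horizontal < 10.0:
        return "computer_bracket_mode"
    return "half_folded"

def handle_sensor_report(fold_angle, edge_to_horizontal):
    """Toy stand-in for the event generation module reporting an event
    upward to the application layer."""
    mode = recognize_bracket_mode(fold_angle, edge_to_horizontal)
    return {"event": "bracket_mode_changed", "mode": mode}
```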
  • Embodiments of the present application provide a display method for a folding screen, which is applied to an electronic device with a folding screen.
  • the folding screen can be folded to include a first screen and a second screen.
  • the display content of the screen is transformed and replaced (for example, the original two-dimensional display content is replaced with three-dimensional display content), so that users can still see distortion-free and richer content when the folding screen is in a semi-folded state, which improves the user's interactive experience.
  • In one possible implementation, the user can fold the folding screen while holding it; the electronic device can receive the user's folding operation and, in response to it, calculate the folding angle α of the folding screen (that is, the angle between the first screen and the second screen).
  • when the electronic device determines that the folding angle α of the folding screen falls within (P1, P2), the electronic device can determine that the folding screen is in a semi-folded form (for example, as shown in (b) in Figure 1 or (b) in Figure 2), where 0° < P1 < P2 < 180°.
  • P1 can be 1 degree, 2 degrees, 3 degrees, 5 degrees, 7 degrees, 10 degrees, 15 degrees, 20 degrees, 30 degrees, 60 degrees, 90 degrees, etc.
  • P2 can be 175 degrees, 170 degrees, 165 degrees, 150 degrees, 140 degrees, 120 degrees, 100 degrees, 90 degrees, etc.
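The threshold check described above can be sketched as follows; the particular values of P1 and P2 are just two of the candidate values listed above.

```python
P1 = 10.0   # lower threshold in degrees (one of the candidate values above)
P2 = 170.0  # upper threshold in degrees (one of the candidate values above)

def is_semi_folded(alpha, p1=P1, p2=P2):
    """True when the folding angle alpha lies strictly inside (P1, P2)."""
    return p1 < alpha < p2
```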
  • when the folding angle of the folding screen is the first angle, the folding screen can display the first object, and when the folding angle is the second angle, the folding screen can display the second object.
  • the second angle may be an angle within the folding angle range that triggers the three-dimensional display function of the first object, such as a folding angle when the folding screen is in a half-folded state.
  • when the user holds the folding screen, the first object may be displayed on the folding screen.
  • when the electronic device detects that the folding screen meets the preset conditions, the electronic device may enable the three-dimensional display function of the first object (and then display the second object).
  • The first object, the preset conditions, and the three-dimensional display function are introduced below respectively:
  • the visual interface may include multiple display elements, and different display elements may be displayed on the same or different layers.
  • the display elements can be divided into three-dimensional display elements and flat display elements according to whether they can be rendered and displayed in three dimensions (for example, viewed from different viewing angles).
  • the stereoscopic display element may be the first object.
  • planar display elements, that is, objects that are displayed flat regardless of viewing angle.
  • Flat display elements do not support independent rendering and can only be adjusted overall following the display interface, and this adjustment does not involve three-dimensional changes.
  • the first object is an image of a two-dimensional object.
  • the first object can be a user interface (UI), a photo, an icon, or a string, which will be introduced separately below.
  • the UI may be an application interface, for example, it may be a UI for an audio application, a UI for a video application, a UI for a text application, a UI for a chat application, or a game application.
  • when the folding screen is in a fully unfolded state, the user can see a distortion-free UI by looking down at the folding screen.
  • when the folding screen is in a semi-folded state, the UI in the display will be distorted from the user's perspective (for example, refer to Figure 14), where the so-called distortion is relative to the display seen when the folding screen is in a fully unfolded state.
  • the content of the display screen can be adjusted so that the UI seen from the user's perspective does not appear distorted, and because in the half-folded state, the viewing angle range of the display screen becomes larger, Therefore, the UI can be converted into a three-dimensional display mode (see, for example, FIG. 15 ).
  • the icon can be a pattern on the UI, for example, a prompt pattern on the UI interface, an image on the UI, an expression (for example, a static or dynamic expression), or a displayed image (for example, statistical diagrams such as bar charts, line charts, etc.).
  • when the folding screen is in a fully unfolded state, the user can see a distortion-free UI by looking down at the folding screen.
  • when the folding screen is in a semi-folded state, due to the change in the folding angle between the first screen and the second screen, the display will appear distorted from the user's perspective. Therefore, in this case, the content of the display can be adjusted so that no distortion occurs from the user's perspective.
  • when the folding screen is an outward-folding screen, the viewing angle range of the display becomes larger in the half-folded state, so richer and more three-dimensional content can be displayed.
  • the string can be a string on the UI, for example, a piece of text or a link on the UI interface.
  • the preset state may be that the folding angle of the folding screen is within a preset range, for example, it may be the angle range corresponding to the half-folded state introduced above.
  • a second object can be displayed on the folding screen according to the second angle, where the second object can be an image of a three-dimensional object of the first object.
  • the second object may contain semantic information of the first object.
  • the first object is a string with specific semantic information
  • the second object can be a three-dimensional object with specific semantics that the first object has.
  • the first object is the string "A" and the second object is an image of the three-dimensional object "A" at a certain viewing angle, as shown, for example, in Figure 16.
  • the second object may contain image information of the first object.
  • the first object is expression A
  • the second object is an image of the three-dimensional object of expression A at a certain perspective.
  • the first object may be a histogram, which may represent the numerical characteristics of each data item
  • the second object may be a three-dimensional histogram, as shown in FIG. 17 for details.
  • the second object may be displayed at a position that matches the display position of the first object on the folding screen.
  • matching can be understood as: the difference between the display position of the first object on the folding screen and the display position of the second object on the folding screen can be less than a threshold, and the threshold can be a deviation of 1 pixel, 2 pixels, 3 pixels, 4 pixels, 5 pixels, etc.
  • the display position of the first object before the folding screen is folded may be different from that after folding.
  • “matching” can be understood as: the second object can be displayed at the target position, where the target position is near the display position the first object would occupy on the folding screen if its three-dimensional display function were not enabled (for example, the position difference may be less than a threshold).
  • the folded screen still maintains a dual-screen display mode (both the first screen and the second screen are displayed).
  • when the folding screen device changes from the fully unfolded state to the semi-folded state, the display page will change, and the two split screens will each display part of the content of the original display interface.
  • for example, the playback controls are displayed on one split screen, and the lyrics/video is displayed on the other split screen.
  • the second object may be displayed near the original display position of the first object after splitting the screen (for example, the position difference may be less than a threshold).
  • the folded screen maintains a single-screen display mode (one of the first screen and the second screen is displayed, for example, in an outward-folding folding screen, or in a part of an inward-folding folding screen).
  • when the folding screen device switches from the fully unfolded state to the semi-folded state and folds to a certain angle, the display page will change, and one of the two split screens will display all or part of the content of the original display interface.
  • the second object may be displayed near the original display position of the first object after splitting the screen (for example, the position difference may be less than a threshold).
  • the second object is an image of a two-dimensional object or a three-dimensional object at a target perspective. How to determine the target perspective and how to display the second object on the folding screen will be described in subsequent embodiments.
  • when it is detected that the enabling conditions for the three-dimensional display function of the first object are met, the second object can be displayed on the folding screen, and as the folding angle changes, the picture of the second object remains in a static state (while the folding screen still meets the enabling conditions for the three-dimensional display function of the first object).
  • alternatively, when it is detected that the enabling conditions for the three-dimensional display function of the first object are met, the second object can be displayed on the folding screen, and as the folding angle changes, the picture of the second object is in a dynamically changing state (while the folding screen still meets the enabling conditions for the three-dimensional display function of the first object).
  • the second object may be an image of a three-dimensional object of the first object under the target perspective, and as the folding angle of the folding screen changes, the target perspective also changes accordingly.
  • when it is detected that the enabling conditions for the three-dimensional display function of the first object are met, the second object can be displayed on the folding screen; for example, the first object can change abruptly into the second object, or gradually transition into the second object.
  • during the gradual transition, as the folding angle changes, the target perspective also changes accordingly.
  • when the three-dimensional display function is not enabled on the folding screen, the first object can be an image observed from a top-down perspective (that is, looking vertically downward). As the folding angle becomes smaller (the folding angle is 180° when fully unfolded, so a decreasing folding angle can be understood as folding toward the fully folded state), the target viewing angle can gradually change toward the horizontal viewing angle; a dynamic effect can thus be presented, increasing the user's perception of the three-dimensional display of the displayed content. Similarly, as the folding angle increases (an increasing folding angle can be understood as folding toward the fully unfolded state), the target viewing angle can gradually change back to the vertically downward viewing angle, likewise presenting a dynamic effect and increasing the user's perception of the three-dimensional display of the displayed content.
  • Figures 18 to 21 illustrate how the target viewing angle changes as the folding angle of the folding screen changes.
  • Figure 18 shows the display interface when the folding screen is in a fully unfolded state.
  • Figure 19 shows the display interface when the folding angle of the folding screen is angle 1.
  • Figure 20 is the display interface when the folding angle of the folding screen is angle 2.
  • Figure 21 is the display interface when the folding angle of the folding screen is angle 3, where angle 3 is smaller than angle 1, and angle 1 is smaller than angle 2.
  • the folding screen is an inward-folding screen, and the folding screen includes a first screen and a second screen; the three-dimensional display function of the first object can be enabled based on detecting that the folding screen satisfies a first preset condition.
  • when the gyroscope detects that the device posture changes from tilted to level and the posture information of the device no longer changes, the device is triggered to enter the swing mode.
  • for devices with cameras and depth sensors, while condition 1) is met, detecting that the amount of light entering the camera is reduced and that the depth sensor is close to the table can be used as auxiliary conditions to improve the accuracy of the determination.
  • for devices with vibration sensors and sound sensors, while condition 1) is satisfied, the vibration and impact sound produced when the mobile phone is placed on the table can also be detected as auxiliary judgment conditions.
  • the moving speed of the device can also be detected to enter the trigger state of the swing mode, so that the swing mode can be entered quickly.
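The trigger conditions above (posture changes from tilted to level, then stops changing) can be sketched as a simple predicate. This is an illustrative sketch only: the function name, the tilt-sample input format, and the threshold values are assumptions, not part of the patent disclosure.

```python
# Illustrative sketch: deciding when to enter the "swing mode" described above,
# based on recent gyroscope-derived tilt samples. All names, thresholds, and
# the sample format are hypothetical.

LEVEL_TILT_DEG = 5.0   # posture counts as "level" below this tilt
STABLE_EPS_DEG = 1.0   # posture counts as "unchanged" within this band

def should_enter_swing_mode(tilt_samples):
    """tilt_samples: recent device tilt angles in degrees, oldest first.

    Enter swing mode when the posture went from tilted to level and then
    stopped changing (the device was set down and is now at rest).
    """
    if len(tilt_samples) < 3:
        return False
    was_tilted = tilt_samples[0] > LEVEL_TILT_DEG
    now_level = tilt_samples[-1] <= LEVEL_TILT_DEG
    # "posture information no longer changes": last few samples nearly equal
    tail = tilt_samples[-3:]
    stable = max(tail) - min(tail) <= STABLE_EPS_DEG
    return was_tilted and now_level and stable

print(should_enter_swing_mode([40.0, 20.0, 4.0, 3.8, 3.9]))   # True
print(should_enter_swing_mode([40.0, 30.0, 20.0, 10.0, 4.0])) # still moving -> False
```

The auxiliary conditions (camera light level, depth sensor, vibration, movement speed) would simply be additional conjuncts in the same predicate.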
  • the three-dimensional display function for the first object can be enabled when it is detected that the folding screen is in computer stand mode, where the folding screen can be determined to be in computer stand mode when one or more of the following conditions are detected: the folding screen is in a semi-folded state, one of the first screen and the second screen is in close contact with the target table, the angle between that screen and the horizontal plane is less than the angle threshold, and the change in the posture of the folding screen within the preset time is less than the threshold.
  • the angle threshold can be an angle in [0, P3], where 0° ≤ P3 ≤ 30°.
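The stand-mode determination above can be sketched as a small condition check. The sketch is illustrative only: the specific threshold values (P1, P2, P3), the posture-stability bound, and the input fields are assumptions, not the patent's implementation.

```python
# Hedged sketch of the "computer stand mode" determination: semi-folded state,
# one screen lying near-horizontal on the table, and a stable posture. All
# threshold values below are assumed for illustration.

P1, P2 = 30.0, 150.0   # assumed semi-folded folding-angle range (P1, P2)
P3 = 15.0              # assumed angle threshold in [0, P3]
POSE_EPS = 2.0         # assumed max posture change over the preset time window

def in_computer_stand_mode(fold_angle, lower_screen_tilt, pose_change,
                           lower_screen_on_table=True):
    semi_folded = P1 < fold_angle < P2             # half-folded state
    screen_flat = 0.0 <= lower_screen_tilt <= P3   # one screen near horizontal
    posture_stable = pose_change < POSE_EPS        # device not moving
    return semi_folded and lower_screen_on_table and screen_flat and posture_stable

print(in_computer_stand_mode(100.0, 3.0, 0.5))   # True
print(in_computer_stand_mode(170.0, 3.0, 0.5))   # nearly unfolded -> False
```

The other stand modes described below (two-person operation stand, viewing stand, reading stand) differ only in which sides or screens must touch the table and in the angle ranges used.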
  • the electronic device 100 can determine that the electronic device 100 is in a single-screen horizontal placement display form.
  • side a is the outer edge parallel to the folding line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is divided into screen A (ie, the first screen) and screen B (ie, the second screen).
  • the folding angle between screen A and screen B ∈ (P1, P2), where 0° ≤ P1 < P2 < 180°.
  • the back side of screen B (i.e., the second screen) of the electronic device 100 is in contact with the real object, where the angle ∈ [0, P3]; optionally, 0° ≤ P3 ≤ 30°.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the three-dimensional display function of the first object can be enabled based on detecting that the folding screen satisfies a fourth preset condition; the fourth preset condition includes at least one of the following: the folding screen is in a half-folded state; or, the change in the posture of the folding screen within the preset time is less than a threshold; or, the two sides of the folding screen away from the central folding line are in close contact with the target table; or, the angle between the two sides of the folding screen away from the central folding line and the horizontal plane is less than the angle threshold.
  • when it is detected that the folding screen is in the two-person operation stand mode, the three-dimensional display function for the first object can be enabled, where the folding screen is determined to be in the two-person operation stand mode when one or more of the following conditions are detected: the folding screen is in a semi-folded state, the two sides of the folding screen away from the central folding line are in close contact with the target table, and the change in the posture of the folding screen within the preset time is less than a threshold.
  • the angle threshold can be an angle in [0, P3], where 0° ≤ P3 ≤ 30°.
  • the electronic device 100 can determine that the electronic device 100 is in a dual-screen horizontal standing display form.
  • side a is the outer edge parallel to the folding line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is divided into A screen (ie, the first screen) and B screen (ie, the second screen).
  • the folding angle between screen A and screen B ∈ (P1, P2), where 0° ≤ P1 < P2 < 180°.
  • side a and side b of the electronic device 100 are in contact with the real object, where the angle formed by side a and side b with the horizontal plane ∈ [0, P3]; optionally, 0° ≤ P3 ≤ 30°.
  • Figures 22 to 24A illustrate how the target viewing angle changes as the folding angle of the folding screen changes.
  • Figure 22 shows the display interface when the folding angle of the folding screen is angle 1.
  • Figure 23 shows the display interface when the folding angle of the folding screen is angle 2.
  • Figure 24A shows the display interface when the folding angle of the folding screen is angle 3, where angle 2 is smaller than angle 1, and angle 1 is smaller than angle 3.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the three-dimensional display function of the first object can be enabled based on detecting that the folding screen satisfies a third preset condition; the third preset condition includes at least one of the following: the folding screen is in a semi-folded state; or, one of the first screen and the second screen is close to the target table; or, the angle between the screen and the horizontal plane is less than the angle threshold; or, the change in the posture of the folding screen within the preset time is less than the threshold.
  • when it is detected that the folding screen is in the viewing stand mode, the three-dimensional display function for the first object can be enabled, where the folding screen is determined to be in the viewing stand mode when one or more of the following conditions are detected: the folding screen is in a semi-folded state, one of the first screen and the second screen is close to the target table, the angle between one of the first screen and the second screen and the horizontal plane is less than the angle threshold, and the change in the posture of the folding screen within the preset time is less than the threshold.
  • the angle threshold can be an angle in [0, P3], where 0° ≤ P3 ≤ 30°.
  • the electronic device 100 can determine that the electronic device 100 is in a single-screen horizontal display form, where 0° ≤ P3 ≤ 30°, side a is the outer edge parallel to the folding line on the screen A side of the electronic device 100, and side b is the outer edge parallel to the folding line on the screen B side of the electronic device 100.
  • the folding screen of the electronic device 100 is folded out into screen A (ie, the first screen) and screen B (ie, the second screen).
  • the folding angle between screen A and screen B ∈ (P1, P2), where 0° ≤ P1 < P2 < 180°.
  • the display surface of screen B of the electronic device 100 is in contact with the real object, where the angle between side a (i.e., the first side) and side b (i.e., the second side) and the horizontal plane ∈ [0, P3]; optionally, 0° ≤ P3 ≤ 30°.
  • reading stand mode, i.e., the second preset condition
  • the folding screen is an inward-folding screen, and the folding screen includes a first screen and a second screen; the three-dimensional display function of the first object can be enabled based on detecting that the folding screen satisfies a second preset condition; the second preset condition includes at least one of the following: the folding screen is in a semi-folded state; or, the change in the posture of the folding screen within the preset time is less than a threshold; or, the side of the folding screen that is perpendicular to the folding line is in close contact with the target table; or, the angle between the side of the folding screen that is perpendicular to the folding line and the horizontal plane is less than the angle threshold.
  • the electronic device 100 can determine that the electronic device 100 is in a dual-screen vertical standing display form, where 60° ≤ P4 ≤ 90°, and side a is the outer edge parallel to the folding line on screen A of the electronic device 100
  • side b is the outer edge parallel to the folding line on screen B of the electronic device 100.
  • the folding screen of the electronic device 100 is folded into screen A (ie, the first screen) and screen B (ie, the second screen).
  • the folding angle between screen A and screen B ∈ (P1, P2), where 0° ≤ P1 < P2 < 180°.
  • the angle between side a (i.e., the first side) and side b (i.e., the second side) of the electronic device 100 and the horizontal plane ∈ [P4, 90°]; optionally, 60° ≤ P4 ≤ 90°.
  • the above describes the display method of the folding screen in the embodiment of the present application from multiple application scenarios.
  • the display method of the folding screen in the embodiment of the present application is introduced in detail from the perspective of an algorithm. Refer to Figure 24B.
  • the display method of the folding screen includes: 2401. When the folding angle of the folding screen is a first angle, display a first object on the folding screen.
  • the folding screen when the folding angle of the folding screen is the first angle, the folding screen can display the first object, and when the folding angle of the folding screen is the second angle, the folding screen can display the second object.
  • the second angle may be an angle within the folding angle range that triggers the three-dimensional display function of the first object, such as a folding angle when the folding screen is in a half-folded state.
  • 2402. In response to detecting that the folding angle of the folding screen changes from the first angle to the second angle, display a second object according to the first object; wherein the first object is an image of a two-dimensional object, and the second object is an image of a three-dimensional object corresponding to the two-dimensional object.
  • the second object is an image of a three-dimensional object corresponding to the two-dimensional object, and the three-dimensional object needs to have information contained in the first object (for example, semantic information and/or image information).
  • in order to give the second object a three-dimensional display effect, the second object can be an image of the three-dimensional object of the first object at the target perspective (the first object and the second object being different); this is equivalent to the user, when looking at the folding screen from the best viewing angle of the folding screen, seeing the image of the three-dimensional object of the first object from the target viewing angle.
  • it is necessary to obtain the image content of the three-dimensional object of the first object at the target perspective, and the image content needs to be projected onto the folding screen for display through a certain image transformation (such as stretching and deformation processing).
  • the reason why the image content cannot be directly displayed on the folding screen is that the best viewing angle of the folding screen in a semi-folded state is not perpendicular to the screen. If the image content were directly displayed on the folding screen, the second object seen by the user would be distorted (because near objects appear large and far objects appear small), and when the display area of the second object spans two screens, this distortion would be more obvious, causing further distortion between the content displayed on the upper and lower screens.
  • FIG. 25 shows a schematic diagram of directly displaying the image content of a stereoscopic object at a target perspective on a fully unfolded folding screen
  • FIG. 26 shows a schematic diagram of the folding screen in a half-folded state.
  • Figure 27 shows a schematic representation of the image content of a stereoscopic object at the target viewing angle displayed directly on a folding screen (distortion exists).
  • Figure 27 shows a schematic representation of the image content of a stereoscopic object at the target viewing angle displayed after image transformation in the half-folded state (no distortion exists).
  • the target perspective needs to be determined. Next, how to determine the target perspective is described.
  • the target viewing angle corresponding to the second angle is determined through the mapping relationship between the screen angle of the folding screen and the viewing angle of the three-dimensional object.
  • the mapping relationship between each screen angle and the viewing angle of the three-dimensional object can be maintained.
  • the mapping relationship can be a mapping relationship between discrete data, or a mapping relationship between continuous data (for example, expressed through a functional relationship); after obtaining the screen angle, the target perspective corresponding to the second angle can be determined based on the mapping relationship between the screen angle and the viewing angle of the three-dimensional object.
  • the mapping relationship is a mapping relationship between discrete data
  • the target perspective corresponding to the second angle may be determined based on the mapping relationship between the screen angle (including the second angle) and the three-dimensional object viewing perspective.
  • the mapping relationship is a mapping relationship between continuous data
  • the second angle can be used as the independent variable of the mapping relationship to determine the target perspective (the dependent variable) corresponding to the second angle.
  • the mapping relationship can indicate that the viewing angle of the three-dimensional object changes with the change of the screen angle. From the perspective of the display effect, as the folding angle changes, the picture of the second object is in a dynamically changing state (while the folding screen still meets the enabling conditions for the three-dimensional display function of the first object).
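The screen-angle to viewing-angle mapping can be sketched, for illustration, as a small table of discrete sample points with linear interpolation between them (which also behaves like the continuous, functional form of the mapping). The sample values are assumptions: 180° (fully unfolded) maps to a top-down view (90° elevation), and smaller folding angles map toward a horizontal view (0°).

```python
# Illustrative mapping from folding angle to target viewing elevation.
# The table entries are assumed values, not the patent's actual mapping.

ANGLE_MAP = [  # (folding angle in degrees, viewing elevation above horizontal)
    (60.0, 0.0),
    (120.0, 45.0),
    (180.0, 90.0),
]

def target_view_angle(fold_angle):
    """Look up / interpolate the target viewing angle for a folding angle."""
    pts = sorted(ANGLE_MAP)
    if fold_angle <= pts[0][0]:
        return pts[0][1]
    if fold_angle >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= fold_angle <= x1:
            t = (fold_angle - x0) / (x1 - x0)   # linear interpolation
            return y0 + t * (y1 - y0)

print(target_view_angle(180.0))  # 90.0: top-down view when fully unfolded
print(target_view_angle(90.0))   # 22.5: tilting toward horizontal as it folds
```

A purely discrete variant would simply pick the nearest table entry instead of interpolating.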
  • the second object may be an image of a three-dimensional object of the first object under the target perspective, and as the folding angle of the folding screen changes, the target perspective also changes accordingly.
  • when the three-dimensional display function is not enabled on the folding screen, the first object can be an image observed from a top-down perspective (that is, looking vertically downward). As the folding angle becomes smaller (the folding angle is 180° when fully unfolded, so a decreasing folding angle can be understood as folding toward the fully folded state), the target viewing angle can gradually change toward the horizontal viewing angle; a dynamic effect can thus be presented, increasing the user's perception of the three-dimensional display of the displayed content. Similarly, as the folding angle increases (an increasing folding angle can be understood as folding toward the fully unfolded state), the target viewing angle can gradually change back to the vertically downward viewing angle, likewise presenting a dynamic effect and increasing the user's perception of the three-dimensional display of the displayed content.
  • the mapping relationship may also involve other intermediate variables.
  • for example, the target screen viewing angle can be determined according to the second angle, and then the target perspective corresponding to the target screen viewing angle can be determined through the mapping relationship between the screen viewing angle and the three-dimensional object viewing angle.
  • an image of the three-dimensional object (the three-dimensional object corresponding to the two-dimensional object) in the target perspective can be obtained according to the target perspective.
  • the mapping relationship between each pixel point of the first object and the second object can be determined according to the target perspective.
  • the above mapping relationship can be called the overall rendering parameter.
  • the overall rendering parameter is a set of parameters for overall rendering of the screen display page. It is used to simulate, when viewing the pre-folded screen from the best viewing position after folding (that is, the target screen viewing angle mentioned above), the position change of each pixel on the display page compared with the original display page.
  • the currently displayed page a is viewed as a card.
  • (a) in Figure 28 shows the state of the folding screen mobile phone before folding (fully unfolded and laid flat on the table).
  • the overall rendering parameters are used to adjust page a in (b) of Figure 28 to the shape of page a on the screen in (a) of Figure 28 as seen from the best viewing angle in Figure 4B.
  • page a in the shape of (a) in FIG. 28 is displayed on the screen in (b) in FIG. 28 , refer to (c) in FIG. 28 .
  • specifically, the foldable screen phone can use three-dimensional modeling to simulate making one of the edges of the foldable screen phone before and after folding overlap, and then obtain the shape of page a before folding as displayed on the screen in (b) of Figure 28 (i.e., the dark gray card in (c) of Figure 28); then, based on the shape of page a in (b) of Figure 28 itself (i.e., the light gray card in (c) of Figure 28), a set of parameters is obtained, which indicates the distortion from the shape of page a itself in (b) of Figure 28 to the shape of page a before folding on the screen in (b) of Figure 28.
  • the obtained parameter group is the overall rendering parameter.
  • the shape of the light gray card in (c) of FIG. 28 can be adjusted to the shape of the dark gray card in (c) of FIG. 28 .
  • the overall rendering parameters indicate the mapping relationship between each pixel on the light gray card and the corresponding pixel on the dark gray card.
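The per-pixel mapping character of the overall rendering parameters can be sketched as a warp function that sends each flat-page pixel to its adjusted position. The perspective model below (rows farther from the viewer shrink toward the center line) is an assumption chosen only to illustrate the idea of a pixel-to-pixel mapping; it is not the patent's actual formula.

```python
import math

# Illustrative sketch of "overall rendering parameters" as a per-pixel mapping:
# for a page viewed at an oblique elevation angle, rows farther from the viewer
# are foreshortened and pulled together, which is one simple way to turn the
# "light gray card" shape into the "dark gray card" shape. The perspective
# model (scale by 1 / (1 + depth * k)) is an assumption for illustration.

def overall_render_map(x, y, width, height, elevation_deg, strength=0.5):
    """Map a pixel (x, y) of the flat page to its warped position.

    y = 0 is the near edge; larger y recedes from the viewer.
    """
    depth = y / max(height - 1, 1)                        # 0 (near) .. 1 (far)
    k = strength * math.cos(math.radians(elevation_deg))  # more tilt -> more warp
    scale = 1.0 / (1.0 + depth * k)                       # perspective shrink
    cx = (width - 1) / 2.0
    new_x = cx + (x - cx) * scale   # rows narrow toward the center line
    new_y = y * scale               # rows also pack together with depth
    return new_x, new_y

# At a top-down view (90° elevation) the mapping is the identity:
print(overall_render_map(10, 20, 100, 100, 90.0))   # (10.0, 20.0)
# At an oblique view the far rows shrink toward the center:
print(overall_render_map(90, 99, 100, 100, 30.0))
```

A real implementation would store such a mapping (or its inverse) as the parameter set and resample the page image through it.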
  • the three-dimensional presentation angle of the stereoscopic display element can be determined based on the best viewing position after folding (that is, the target screen viewing angle), and the stereoscopic display image (that is, the second object) is determined according to the three-dimensional presentation angle.
  • the three-dimensional presentation angle of the three-dimensional display element must first be determined, that is, the side of the three-dimensional display element that is viewed from the best viewing angle after folding.
  • the stereoscopic display image is a three-dimensional perspective image of a stereoscopic display element viewed from the best viewing position after folding; it is itself a two-dimensional image.
  • for the stereoscopic display element, it is only necessary to determine an orientation in its three-dimensional model that matches the best viewing position after folding.
  • the image of the stereoscopic display element viewed from this position is the final displayed stereoscopic display image.
  • for example, view page a in (a) of FIG. 28 from the best viewing angle in (b) of FIG. 28.
  • page a includes display element x, display element y, and display element z, where x is a three-dimensional display element, and y and z are flat display elements.
  • the display element y and the display element z only undergo shape distortion, while the display element x not only has its shape distorted but also changes from a top view to a three-dimensional perspective; more content of the display element x can be seen, and part of page a will be blocked.
  • the stereoscopic display image of the stereoscopic display element x is a cylindrical image observed from a stereoscopic perspective.
  • the rendering result is that after distorted rendering of page a in (b) in Figure 28, the distorted page a in (c) in Figure 28 is obtained. As shown in Figure 30.
  • since the viewing angles are the same, it is only necessary to superimpose the stereoscopic display image on the position of the display element on the distorted and rendered page.
  • Figure 31 shows the process of obtaining the initial rendering image by corresponding the positions of the stereoscopic display image and the stereoscopic display elements. Since the entire page and a single stereoscopic display element are observed from the same perspective, for the stereoscopic display element x, the stereoscopic display image and the display element x on the overall page (assuming that the display element x is a flat display element in this state) are at least partially at the same position, and the pixels at these positions can be used as reference positions, for example, the bottom of the cylinder in Figure 31. Therefore, the correspondence of the stereoscopic display image can be established according to the pixel points at the bottom of the cylinder, and superposition of the stereoscopic display image or replacement of the pixel points can be performed to obtain the initial rendering image on the right (that is, the image of the three-dimensional object from the target perspective).
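The superposition-at-a-reference-position step can be sketched as pasting the stereoscopic display image into the rendered page, anchored at the shared reference pixels (e.g. the bottom of the cylinder). Images are plain nested lists here and the anchor convention (sprite grows upward from its bottom row) is an assumption for illustration.

```python
# Sketch of the overlay step: the stereoscopic display image ("sprite") is
# pasted onto the distortion-rendered page at a shared reference position,
# replacing the pixels there.

def overlay(page, sprite, anchor_row, anchor_col):
    """Replace page pixels with sprite pixels; (anchor_row, anchor_col) is the
    page position of the sprite's bottom-left corner (its reference pixels)."""
    h = len(sprite)
    top = anchor_row - (h - 1)        # sprite grows upward from its bottom row
    out = [row[:] for row in page]    # leave the input page untouched
    for r, srow in enumerate(sprite):
        for c, v in enumerate(srow):
            out[top + r][anchor_col + c] = v
    return out

page = [[0] * 5 for _ in range(4)]
sprite = [[7, 7], [7, 7]]             # 2x2 "cylinder" image
result = overlay(page, sprite, anchor_row=3, anchor_col=1)
for row in result:
    print(row)
# [0, 0, 0, 0, 0]
# [0, 0, 0, 0, 0]
# [0, 7, 7, 0, 0]
# [0, 7, 7, 0, 0]
```

Alpha blending instead of replacement would be the "superposition" variant mentioned above.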
  • the image of the three-dimensional object from the target perspective can be obtained based on the pose relationship between the target screen viewing perspective and the folding screen.
  • a stretching and deformation process is performed to map the three-dimensional object onto the plane of the folding screen to obtain the second object.
  • the relative posture relationship between the folding screen mobile phone after folding and the folding screen before folding can be determined, and the secondary rendering parameters of the initial rendering image are determined based on the best viewing angle after folding and the relative position relationship before and after folding.
  • when users view the superimposed image from the best viewing angle after folding, it has a better display effect; therefore, as long as the relative position relationship before and after folding is determined, and the superimposed image is then projected onto the folded screen according to the best viewing angle after folding, the image on the folded screen can be obtained.
  • the split-screen rendering parameters are used to render the superimposed image as an image projected onto the folded screen.
  • a reference object can be determined.
  • the split-screen junction line can be used as a reference to make the split-screen junction lines of the screens of the folding screen mobile phone before and after folding overlap, that is, it is assumed that the split-screen junction line does not move. In this way, the two split-screen images can be processed separately.
  • the folding screen mobile phone before folding is in a fully unfolded state
  • the folding screen mobile phone after folding is in a semi-folded state. Make the split-screen boundary lines of the folding screen phone before folding and the folding screen phone after folding overlap, as shown in Figure 32.
  • C1 is the screen of the folding screen mobile phone after folding
  • D1 is the screen of the folding screen mobile phone before folding. Since the user has a better visual effect when viewing the superimposed image on screen C1 from the best viewing position after folding, it is only necessary to ensure that the image on screen D1 is projected onto screen C1 to obtain a superimposed image on screen C1; the user will then have a better visual effect when viewing the image on screen C1. Based on this projection relationship, the corresponding projected pixel point of each pixel point on the folded screen can be determined, and the secondary rendering parameters can be obtained from the positions of each pixel point and its projected pixel point on the screen.
  • alternatively, the top edge or bottom edge of the screen can be used as the determined reference object, the projection pixel point of each pixel point can then be determined according to the projection principle, and finally the secondary rendering parameters can be obtained.
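The projection from the pre-fold screen D1 onto the post-fold screen C1 can be sketched in a 2D cross-section: the split-screen junction is pinned at the origin, the pre-fold (flat) upper screen lies along the x-axis, and the post-fold upper screen is the same segment rotated up by the fold angle. Each flat-screen pixel is projected along the ray from the best viewing point through that pixel onto the folded screen. The viewer position and angles below are illustrative assumptions.

```python
import math

# 2D cross-section sketch of the D1 -> C1 projection: find where the ray from
# the viewer through a flat-screen pixel meets the rotated (folded) screen.

def project_to_folded(d, fold_deg, viewer=(-20.0, 15.0)):
    """d: distance of a pixel from the junction on the flat screen (x-axis).

    Returns t: distance along the folded screen where the ray from the viewer
    through (d, 0) meets the folded screen, or None if the ray is parallel.
    """
    vx, vy = viewer
    c = math.cos(math.radians(fold_deg))
    s = math.sin(math.radians(fold_deg))
    dx, dy = d - vx, 0.0 - vy            # ray direction from viewer to pixel
    denom = dx * s - dy * c
    if abs(denom) < 1e-12:
        return None                      # ray parallel to the folded screen
    u = (vy * c - vx * s) / denom        # ray parameter at the intersection
    # Intersection point is t * (c, s); recover t from whichever axis is safe.
    return (vy + u * dy) / s if abs(s) > 1e-12 else (vx + u * dx) / c

print(round(project_to_folded(0.0, 60.0), 6))   # 0.0 (the junction is fixed)
print(round(project_to_folded(10.0, 60.0), 3))  # 4.48
```

Collecting this mapping for every pixel gives a parameter set in the spirit of the secondary rendering parameters described above.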
  • the image of the upper split screen and the image of the lower split screen are processed separately to obtain the parameters of the upper split screen and the parameters of the lower split screen; processing the two split screens separately can improve the rendering effect.
  • the folding screen includes the first screen and the second screen
  • the second object includes A first sub-object and a second sub-object
  • the image includes a first image corresponding to the first sub-object and a second image corresponding to the second sub-object
  • according to the position relationship between the target viewing angle and the first screen, the first image can be stretched and deformed to map the first image onto the first screen; according to the position relationship between the target viewing angle and the second screen, the second image is stretched and deformed to map the second image onto the second screen.
  • the primary rendering image can be processed according to the secondary rendering parameters, and the processed image can be displayed on the folded screen.
  • the initial rendering image can be processed.
  • the resulting image is the image in which the superimposed image is back-projected on the screen of the folded folding screen mobile phone.
  • the image is displayed on the screen of the folding screen mobile phone.
  • the three-dimensional rendering display function can be rendered in real time, or it can be rendered only once after the user completes the folding operation.
  • Real-time rendering means that as long as a change in angle is detected, the screen display content will be adjusted accordingly. This is beneficial to scenes where users need to watch continuous animations, such as three-dimensional rendering of lock screen wallpapers. However, in some cases, users may not pay much attention to the changes during the folding operation, but focus on the displayed content after the folding operation. In this case, single rendering can be applied to reduce computing power requirements. For example, three-dimensional display of charts.
  • some parameters can be used to determine that the user has completed the operation.
  • a series of parameters that refer to the completion of the user's operation can define a preset mode.
  • the display content on the screen includes a histogram, and if certain conditions are met, the histogram can be displayed in a stereoscopic perspective.
  • when users switch from a flat histogram to a three-dimensional histogram display, they can first bend the screen and then place the folding screen phone on a flat surface for easier observation. A series of parameters corresponding to the state of placing the folding screen mobile phone on a flat surface can then define a preset mode, for example called the table setting mode.
  • when the folding screen mobile phone enters the table setting mode, the folding screen mobile phone adjusts the histogram from a planar viewing angle to a stereoscopic viewing angle for display.
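The "single rendering" option described above (render once when the device settles into a preset mode, rather than on every angle change) can be sketched as a small state machine. The mode test, bounds, and class shape are placeholders, not the patent's implementation.

```python
# Sketch of single rendering: render exactly once on entry into the (assumed)
# "table setting mode" instead of re-rendering on every sensor event.

class StereoRenderer:
    def __init__(self):
        self.renders = 0
        self.in_mode = False

    def on_angle_event(self, fold_angle, at_rest):
        """Called for each sensor update; renders only on mode entry."""
        now_in_mode = at_rest and 30.0 < fold_angle < 150.0  # assumed bounds
        if now_in_mode and not self.in_mode:
            self.render(fold_angle)          # one render per mode entry
        self.in_mode = now_in_mode

    def render(self, fold_angle):
        self.renders += 1                    # placeholder for the real work

r = StereoRenderer()
for angle, rest in [(180, False), (120, False), (100, True), (100, True), (95, True)]:
    r.on_angle_event(angle, rest)
print(r.renders)  # 1: rendered once when the phone settled, not on every event
```

Real-time rendering would instead call `render` on every event whose angle differs from the last rendered one.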
  • when folded to a certain angle, the display page changes, and the two split screens respectively display part of the content of the original display interface. For example, in the music/video playback interface, when folded to a certain angle, the playback controls are displayed on one split screen, and the lyrics/video picture is displayed on the other split screen.
  • the current visualization page is displayed in split-screen according to the existing split-screen logic, and then the display interfaces on the two split-screens are rendered separately after the split-screen.
  • the display content on the two folded split screens has been obtained by the split screen logic, and there is no need to simulate the best viewing position of the folded screen to determine the initial rendering.
  • that is, after determining the split-screen pages to be displayed on the two split screens, the post-processing of the split-screen display only needs to determine the three-dimensional presentation angle of the stereoscopic display element based on the best viewing position after folding, determine the stereoscopic display image according to the three-dimensional presentation angle, and correspond the positions of the stereoscopic display image and the stereoscopic display elements.
  • since the split-screen display page can be directly determined according to the existing split-screen logic, there is no need to determine the split-screen rendering image.
  • the two split-screen display pages can be directly determined according to the existing logic. It is only necessary to determine the best viewing position after folding, and then process the stereoscopic display elements according to the changed best viewing position.
  • the process of processing the stereoscopic display elements makes the stereoscopic display image correspond to the position of the stereoscopic display elements; after determining the pixels that can be used as reference positions, this can be achieved by image overlay or replacement of pixels.
  • if the image on one side of the stereoscopic display element is what is seen from the best viewing angle after folding, then the image on that side is the stereoscopic display image to be presented on the final split screen; it then suffices to align the stereoscopic display image with the original position of the stereoscopic display element (its position during flat display).
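The overlay/replacement step described above can be sketched minimally in code. This is an illustrative sketch only, not the application's implementation: the function name and the use of the element's top-left pixel as the reference position are assumptions.

```python
# Hypothetical sketch: paste the stereoscopic display image of an element back
# over the element's original (flat-display) position on a split-screen page,
# using the top-left pixel as the reference position, via pixel replacement.
def overlay_element(page, element_img, ref_row, ref_col):
    """Return a copy of `page` with `element_img` anchored at (ref_row, ref_col)."""
    result = [row[:] for row in page]            # copy the split-screen page
    for r, img_row in enumerate(element_img):
        for c, px in enumerate(img_row):
            rr, cc = ref_row + r, ref_col + c
            if 0 <= rr < len(result) and 0 <= cc < len(result[0]):
                result[rr][cc] = px              # replace the original pixel
    return result

page = [[0] * 4 for _ in range(4)]               # a 4x4 flat split-screen page
cube = [[7, 7], [7, 7]]                          # a 2x2 stereoscopic image
out = overlay_element(page, cube, 1, 1)          # anchor at reference (1, 1)
```

An alternative, as the text notes, is image overlay (alpha-blending the element image over the page) instead of outright pixel replacement.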
  • Figure 36 shows the music playback interface.
  • the original display interface will be divided into two split-screen interfaces for display according to the split-screen logic.
  • the two split-screen interfaces are the playback control interface and the lyrics interface. It is then determined whether the two split-screen interfaces contain stereoscopic display elements; here, several controls in the playback control interface are stereoscopic display elements. According to the changed viewing angle, the stereoscopic display image of each playback control (each being a stereoscopic display element) is determined, and the originally flat-rendered controls are then replaced with the stereoscopic images so that they have a three-dimensional display effect, as shown in the picture on the right.
  • the cross-split-screen display part is rendered according to the split-screen rendering parameters and displayed on the other split screen.
  • the split-screen rendering parameters can be determined, the cross-split-screen display part is processed according to the split-screen rendering parameters, and the processed image is then rendered and displayed on the other split screen. That is, the x01 part in Figure 37 is processed with the split-screen rendering parameters, and the resulting image is displayed on the upper split screen.
  • the specific display position is also determined according to the split-screen rendering parameters.
  • (a) in Figure 33 is a fully unfolded mobile phone screen, which displays a document page in a planar manner.
  • (b) in Figure 33 and (c) in Figure 33 are the screens after keeping the lower half of the mobile phone stationary and bending the upper half of the screen. The displayed page does not change before and after bending, and both show the same document page as (a) in Figure 33; (b) in Figure 33 shows the display effect of the prior art, and (c) in Figure 33 shows the display effect after three-dimensional rendering according to this solution.
  • (b) in Figure 33 and (c) in Figure 33 have the same bending angle.
  • taking the plane of the lower half screen as the plane of the x-axis and y-axis, a spatial rectangular coordinate system is constructed.
  • the coordinates of the best viewing position of the screen are (0,0,75).
  • the coordinates of the best viewing position of the screen are (50, 50, 50)
  • the mobile phone's processor simulates a viewing angle: it views the document page and the columnar distribution chart of screen (a) in Figure 33 from the coordinates (50, 50, 50), and obtains the initial rendering image from this viewing angle.
  • what is seen from the coordinates (50, 50, 50) is a three-dimensional image.
  • What is finally presented in the initial rendering image is a three-dimensional image, and the presentation angle of the columnar distribution chart has changed.
  • the screen in (c) of Figure 33 overlaps the split-screen boundary line of the screen in (a) of Figure 33; all the pixels on the initial rendering image of the screen in (a) of Figure 33 are then connected to the best viewing position (50, 50, 50). The intersection point of each connecting line with the screen in (c) of Figure 33 is a projection pixel, and all the projection pixels together constitute the document page displayed on (c) in Figure 33.
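The projection just described is a line-plane intersection: each pixel of the initial rendering is connected to the best viewing position, and the point where that line crosses the bent half-screen's plane is the projected pixel. The sketch below illustrates this geometry under assumed values; the 45° fold plane z = y - 60 and the sample pixel are illustrative choices, not taken from the application.

```python
# Geometric sketch of the projection step: intersect the line through the best
# viewing position V and a rendered pixel P with the plane of the bent screen.
def project_to_folded_plane(P, V, plane_point, normal):
    """Intersect line X = V + t*(P - V) with plane dot(X - plane_point, normal) = 0."""
    d = [p - v for p, v in zip(P, V)]                       # line direction
    denom = sum(n * di for n, di in zip(normal, d))
    if abs(denom) < 1e-9:
        return None                                         # line parallel to plane
    num = sum(n * (q - v) for n, q, v in zip(normal, plane_point, V))
    t = num / denom
    return tuple(v + t * di for v, di in zip(V, d))

V = (50.0, 50.0, 50.0)            # best viewing position after folding (as above)
P = (20.0, 80.0, 0.0)             # an assumed pixel of the initial rendering (z = 0)
# assume the bent upper half lies in the plane z = y - 60 (a 45 degree fold at y = 60)
proj = project_to_folded_plane(P, V, plane_point=(0.0, 60.0, 0.0),
                               normal=(0.0, -1.0, 1.0))     # gives (27.5, 72.5, 12.5)
```

Repeating this for every pixel of the initial rendering yields the full set of projection pixels that make up the page displayed on the bent half.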
  • the current display interface may only include three-dimensional display elements.
  • the lock screen interface may be regarded as a display interface displaying only a single stereoscopic display element. At this time, since there is only one stereoscopic display element in the entire page, it is only necessary to determine the viewing angle of the three-dimensional model according to step 112 and then perform the subsequent steps.
  • Figure 33 shows an example of the lock screen interface in two bending directions of the screen, inward and outward.
  • An embodiment of the present application provides a display method for a folding screen, which is applied to an electronic device including a folding screen.
  • the method includes: when the folding angle of the folding screen is a first angle, displaying a first object on the folding screen;
  • detecting that the folding angle of the folding screen changes from the first angle to a second angle, and displaying a second object according to the first object when the first object is not displayed on the folding screen;
  • the first object is an image of a two-dimensional object
  • the second object is an image of a three-dimensional object corresponding to the two-dimensional object.
  • when the folding screen is in a half-folded state, the content in the display screen is transformed and replaced (for example, the original two-dimensional display content is replaced with three-dimensional display content), so that the user can view richer content when the folding screen is in a half-folded state, improving the user's interaction experience.
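The claimed flow above — show a two-dimensional first object at the first angle, and a three-dimensional second object derived from it once the second angle is detected — can be sketched minimally. The class, method names and string-based "objects" here are purely illustrative assumptions, not the application's implementation:

```python
# Hedged sketch of the claimed display flow on a fold-angle change.
class FoldingScreenDisplay:
    def __init__(self, first_angle, second_angle):
        self.first_angle = first_angle
        self.second_angle = second_angle
        self.shown = None                       # what the folding screen displays

    def on_fold_angle(self, angle, first_object, to_3d):
        if angle == self.first_angle:
            self.shown = first_object           # 2-D image of the object
        elif angle == self.second_angle:
            self.shown = to_3d(first_object)    # 3-D image derived from the 2-D one
        return self.shown

disp = FoldingScreenDisplay(first_angle=180, second_angle=100)
disp.on_fold_angle(180, "icon_2d", to_3d=lambda o: o.replace("2d", "3d"))
disp.on_fold_angle(100, "icon_2d", to_3d=lambda o: o.replace("2d", "3d"))
```

In the claims the replacement happens "when the first object is not displayed on the folding screen"; this sketch collapses that check into the angle branch for brevity.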
  • the present application also provides a folding screen display device.
  • the image display device may be a terminal device.
  • Figure 38 is a schematic structural diagram of a folding screen display device provided by an embodiment of the present application. As shown in Figure 38, the folding screen display device 3800 includes:
  • the display module 3801 is configured to display the first object on the folding screen when the folding angle of the folding screen is the first angle;
  • and to display, when the folding angle of the folding screen changes from the first angle to the second angle and the first object is not displayed on the folding screen, a second object according to the first object; wherein the first object is an image of a two-dimensional object, and the second object is an image of a three-dimensional object corresponding to the two-dimensional object.
  • for a specific description of the display module 3801, reference may be made to the description of step 2401 in the above embodiment, which will not be repeated here.
  • the first object is a user interface UI, an icon or a string.
  • the second object contains semantic information or image information of the first object.
  • the folding screen is an inward folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • the enabling module 3802 is configured to enable the three-dimensional display function of the first object based on detecting that the folding screen meets a first preset condition; the first preset condition includes at least one of the following:
  • the folding screen is in a semi-folded state; or,
  • the angle between one of the first screen and the second screen and the horizontal plane is less than the angle threshold; or,
  • the change in the posture of the folding screen within the preset time is less than a threshold; or,
  • One of the first screen and the second screen is in close contact with the target table.
  • the folding screen is an inward folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • the enabling module 3802 is configured to enable the three-dimensional display function of the first object based on detecting that the folding screen meets a second preset condition; the second preset condition includes at least one of the following:
  • the folding screen is in a semi-folded state; or,
  • the change in the posture of the folding screen within the preset time is less than a threshold; or,
  • the side of the folding screen perpendicular to the folding line is in close contact with the target table; or,
  • the angle between the side of the folding screen that is perpendicular to the folding line and the horizontal plane is less than the angle threshold.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • the enabling module 3802 is configured to enable the three-dimensional display function of the first object based on detecting that the folding screen meets a third preset condition; the third preset condition includes at least one of the following:
  • the folding screen is in a semi-folded state; or,
  • One of the first screen and the second screen is in close contact with the target table; or,
  • the angle between one of the first screen and the second screen and the horizontal plane is less than the angle threshold; or,
  • the change in the posture of the folding screen within the preset time is less than the threshold.
  • the folding screen is an outward-folding screen, and the folding screen includes a first screen and a second screen; the device further includes:
  • the enabling module 3802 is configured to enable the three-dimensional display function of the first object based on detecting that the folding screen meets a fourth preset condition; the fourth preset condition includes at least one of the following:
  • the folding screen is in a semi-folded state; or,
  • the change in the posture of the folding screen within the preset time is less than a threshold; or,
  • the two sides of the folding screen away from the central folding line are in close contact with the target table; or,
  • the angle between the two sides of the folding screen away from the central folding line and the horizontal plane is less than the angle threshold.
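Each preset condition above is a disjunction ("at least one of"), so enabling the three-dimensional display function reduces to an any-of check. The sketch below illustrates the fourth preset condition; the parameter names and threshold values are illustrative assumptions, not taken from the application:

```python
# Hedged sketch of the fourth preset condition (outward-folding screen): the
# function enables 3-D display if at least one sub-condition holds.
def meets_fourth_condition(state, posture_delta, edge_on_table, edge_angle_deg,
                           posture_threshold=5.0, angle_threshold_deg=10.0):
    """Return True if any of the listed sub-conditions is satisfied."""
    return (state == "half-folded"                    # screen is semi-folded
            or posture_delta < posture_threshold      # posture stable over preset time
            or edge_on_table                          # sides away from fold line on table
            or edge_angle_deg < angle_threshold_deg)  # those sides near horizontal

enable = meets_fourth_condition("half-folded", posture_delta=20.0,
                                edge_on_table=False, edge_angle_deg=45.0)
```

The first, second and third preset conditions would follow the same any-of pattern with their respective sub-conditions.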
  • the second object is an image of the three-dimensional object in a target perspective mapped to the folding screen.
  • the target viewing angle is related to the second angle.
  • the display module 3801 is also used to:
  • display multiple frames of images in which the three-dimensional object, viewed at continuously changing viewing angles, is mapped to the folding screen.
  • the first angle and the second angle are acute angles of the folding screen
  • the continuous change includes: changing to a horizontal viewing angle
  • the continuous change includes: changing to a vertical viewing angle.
  • the display module 3801 is specifically used to:
  • the second object is displayed at a display position matching the first object on the first screen and the second screen.
  • the device further includes:
  • the viewing angle determination module 3803 is configured to determine, before the second object is displayed, the target viewing angle corresponding to the second angle according to the mapping relationship between the screen angle of the folding screen and the viewing angle of the three-dimensional object.
  • the display module 3801 is also used to:
  • the image of the three-dimensional object under the target viewing angle is stretched and deformed to map the three-dimensional object onto the plane of the folding screen, so as to obtain the second object.
  • the difference between the target viewing angle and the center line of the second angle is within a preset range.
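Since the target viewing angle is constrained to lie near the center line (bisector) of the second angle, one simple mapping from screen angle to viewing angle is the half-angle. The linear form and the tolerance value below are assumptions for illustration; the application only requires that the difference stay within a preset range:

```python
# Illustrative mapping from the fold (screen) angle to a target viewing angle
# on the bisector between the two half screens; values are assumptions.
def target_view_angle(fold_angle_deg):
    """Viewing direction measured from the lower half screen, in degrees."""
    return fold_angle_deg / 2.0              # the center line bisects the fold angle

def within_preset_range(view_deg, fold_angle_deg, tolerance_deg=5.0):
    """Check the claimed constraint: difference from the center line is bounded."""
    return abs(view_deg - fold_angle_deg / 2.0) <= tolerance_deg

v = target_view_angle(100.0)                 # e.g. a second angle of 100 degrees
ok = within_preset_range(v, 100.0)
```

A real device could instead store the mapping as a lookup table per fold angle, as long as each stored viewing angle satisfies the same bisector constraint.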
  • the folding screen includes a first screen and a second screen
  • the second object includes a first sub-object and a second sub-object
  • the image includes a first image corresponding to the first sub-object, and a second image corresponding to the second sub-object; the display module is specifically used to:
  • the second image is stretched and deformed to map the second image onto the second screen.
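The "stretch and deform" mapping of a sub-image onto a half screen can be illustrated with a toy nearest-neighbour resize. A real implementation would apply a perspective warp derived from the screen geometry; this sketch only rescales rows and columns, and the sizes are arbitrary assumptions:

```python
# Toy nearest-neighbour stretch of a sub-image to fit a half screen's area.
def stretch(img, out_h, out_w):
    """Resize `img` (list of pixel rows) to out_h x out_w by nearest neighbour."""
    in_h, in_w = len(img), len(img[0])
    return [[img[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)]
            for r in range(out_h)]

second_image = [[1, 2], [3, 4]]          # 2x2 sub-image for the second screen
mapped = stretch(second_image, 4, 4)     # stretched to the screen's 4x4 area
```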
  • the folding screen when the folding angle of the folding screen is the first angle, the folding screen is in a fully unfolded state.
  • the terminal device may be a folding screen display device in Figure 38. Please refer to Figure 39.
  • Figure 39 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • the terminal device 3900 may specifically be a virtual reality VR device, a mobile phone, a tablet, a laptop, a smart wearable device, etc., and is not limited here.
  • the terminal device 3900 includes: a receiver 3901, a transmitter 3902, a processor 3903 and a memory 3904 (the number of processors 3903 in the terminal device 3900 can be one or more; one processor is taken as an example in Figure 39), where the processor 3903 may include an application processor 39031 and a communication processor 39032.
  • the receiver 3901, the transmitter 3902, the processor 3903, and the memory 3904 may be connected through a bus or other means.
  • Memory 3904 may include read-only memory and random access memory and provides instructions and data to processor 3903. A portion of memory 3904 may also include non-volatile random access memory (NVRAM).
  • the memory 3904 stores operating instructions executable by the processor, executable modules or data structures, or a subset thereof, or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations.
  • the processor 3903 controls the operation of the terminal device.
  • various components of the terminal equipment are coupled together through a bus system.
  • the bus system may also include a power bus, a control bus, a status signal bus, etc.
  • for clarity, the various buses are all referred to as the bus system in the figure.
  • the methods disclosed in the above embodiments of the present application can be applied to the processor 3903 or implemented by the processor 3903.
  • the processor 3903 may be an integrated circuit chip with signal processing capabilities. During the implementation process, each step of the above method can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 3903.
  • the above-mentioned processor 3903 can be a general-purpose processor, a digital signal processor (DSP), a microprocessor or a microcontroller, and can further include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, or discrete hardware components.
  • the processor 3903 can implement or execute each method, step and logical block diagram disclosed in the embodiment of this application.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory 3904.
  • the processor 3903 reads the information in the memory 3904 and completes the steps of the above method in combination with its hardware.
  • the processor 3903 can read the information in the memory 3904 and, in combination with its hardware, complete the data-processing-related steps of steps 301 to 304 and of steps 3001 to 3004 in the above embodiments.
  • the receiver 3901 can be used to receive input numeric or character information, and generate signal input related to relevant settings and function control of the terminal device.
  • the transmitter 3902 can be used to output numeric or character information through the first interface; the transmitter 3902 can also be used to send instructions to the disk group through the first interface to modify the data in the disk group; the transmitter 3902 can also include a display device such as a display screen.
  • An embodiment of the present application also provides a computer program product that, when run on a computer, causes the computer to perform the steps of the folding screen display method described in the embodiment corresponding to FIG. 24B in the above embodiment.
  • Embodiments of the present application also provide a computer-readable storage medium that stores a program for signal processing; when the program is run on a computer, it causes the computer to perform the steps of the folding screen display method described in the previous embodiment.
  • the image display device provided by the embodiment of the present application may be a chip.
  • the chip may include: a processing unit and a communication unit.
  • the processing unit may be, for example, a processor.
  • the communication unit may be, for example, an input/output interface, a pin or a circuit, etc.
  • the processing unit can execute the computer execution instructions stored in the storage unit, so that the chip in the execution device executes the data processing method described in the above embodiment, or so that the chip in the training device executes the data processing method described in the above embodiment.
  • the storage unit is a storage unit within the chip, such as a register, cache, etc.
  • the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM), etc.
  • the device embodiments described above are only illustrative.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the connection relationship between modules indicates that there are communication connections between them, which can be specifically implemented as one or more communication buses or signal lines.
  • the present application can be implemented by software plus the necessary general-purpose hardware; of course, it can also be implemented by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, etc. In general, any function performed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to implement the same function can be diverse, such as analog circuits, digital circuits or dedicated circuits. However, for this application, a software implementation is in most cases the better choice. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product.
  • the computer software product is stored in a readable storage medium, such as a computer floppy disk, USB flash drive, removable hard disk, ROM, RAM, magnetic disk or optical disk, and includes several instructions to cause a computer device (which can be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments of this application.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server or data center to another website, computer, server or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrated with one or more available media.
  • the available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Provided is a display method for a folding screen, applied to an electronic device comprising a folding screen. The method comprises: when the folding angle of a folding screen is a first angle, displaying a first object on the folding screen; and upon detecting that the folding angle of the folding screen changes from the first angle to a second angle, displaying a second object, the first object being an image of a two-dimensional object and the second object being an image of a three-dimensional object corresponding to the two-dimensional object. By means of the present application, a user can see images with richer content when a folding screen is in a half-folded state, improving the user's interaction experience.
PCT/CN2023/101661 2022-06-30 2023-06-21 Procédé d'affichage pour écran pliant, et dispositif associé WO2024001900A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210764059.2A CN117369756A (zh) 2022-06-30 2022-06-30 一种折叠屏的显示方法以及相关设备
CN202210764059.2 2022-06-30

Publications (1)

Publication Number Publication Date
WO2024001900A1 true WO2024001900A1 (fr) 2024-01-04

Family

ID=89383269

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/101661 WO2024001900A1 (fr) 2022-06-30 2023-06-21 Procédé d'affichage pour écran pliant, et dispositif associé

Country Status (2)

Country Link
CN (1) CN117369756A (fr)
WO (1) WO2024001900A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180252931A1 (en) * 2017-03-02 2018-09-06 SK Commercial Construction, Inc. Enhanced transparent display screen for mobile device and methods of operation
CN110347311A (zh) * 2019-07-02 2019-10-18 网易(杭州)网络有限公司 三维虚拟对象显示方法与装置、存储介质、电子设备
CN111338737A (zh) * 2020-02-28 2020-06-26 华为技术有限公司 内容呈现方法、装置、终端设备及计算机可读存储介质
CN114003321A (zh) * 2020-07-28 2022-02-01 华为技术有限公司 一种显示方法及电子设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180252931A1 (en) * 2017-03-02 2018-09-06 SK Commercial Construction, Inc. Enhanced transparent display screen for mobile device and methods of operation
CN110347311A (zh) * 2019-07-02 2019-10-18 网易(杭州)网络有限公司 三维虚拟对象显示方法与装置、存储介质、电子设备
CN111338737A (zh) * 2020-02-28 2020-06-26 华为技术有限公司 内容呈现方法、装置、终端设备及计算机可读存储介质
CN114003321A (zh) * 2020-07-28 2022-02-01 华为技术有限公司 一种显示方法及电子设备

Also Published As

Publication number Publication date
CN117369756A (zh) 2024-01-09

Similar Documents

Publication Publication Date Title
EP4084450B1 (fr) Procédé d'affichage pour écran pliable et appareil associé
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
CN110597510B (zh) 一种界面的动态布局方法及设备
WO2021104008A1 (fr) Procédé d'affichage d'un écran pliable et appareil associé
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
CN114115587A (zh) 一种控制屏幕显示的方法和电子设备
WO2020253758A1 (fr) Procédé de disposition d'interface utilisateur et dispositif électronique
WO2021052279A1 (fr) Procédé d'affichage sur écran pliable, et dispositif électronique
WO2021036585A1 (fr) Procédé d'affichage sur écran souple, et dispositif électronique
CN111190681A (zh) 显示界面适配方法、显示界面适配设计方法和电子设备
WO2023103951A1 (fr) Procédé d'affichage pour écran pliable et appareil associé
WO2021082564A1 (fr) Procédé d'invite d'opération et dispositif électronique
CN114115769A (zh) 一种显示方法及电子设备
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
WO2021208723A1 (fr) Procédé et appareil d'affichage plein écran, et dispositif électronique
WO2022095744A1 (fr) Procédé de commande d'affichage vr, dispositif électronique et support de stockage lisible par ordinateur
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2021179829A1 (fr) Procédé et dispositif d'interaction homme-machine
EP4181494A1 (fr) Procédé d'affichage et appareil associé
CN113610943B (zh) 图标圆角化的处理方法及装置
WO2024001900A1 (fr) Procédé d'affichage pour écran pliant, et dispositif associé
CN115480849A (zh) 用户界面布局方法及相关设备
CN111982037B (zh) 一种测量高度的方法和电子设备
WO2024017090A1 (fr) Procédé d'affichage d'informations et dispositif électronique
WO2022111593A1 (fr) Appareil et procédé d'affichage d'interface graphique utilisateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23830079

Country of ref document: EP

Kind code of ref document: A1