WO2022166386A1 - Image display method, electronic device, and storage medium (图像显示方法、电子设备及存储介质) - Google Patents


Info

Publication number
WO2022166386A1
WO2022166386A1 (PCT/CN2021/136372)
Authority
WO
WIPO (PCT)
Prior art keywords
image
displayed
electronic device
sub
display
Prior art date
Application number
PCT/CN2021/136372
Other languages
English (en)
French (fr)
Inventor
刘志远 (Liu Zhiyuan)
马明刚 (Ma Minggang)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to EP21924379.7A (published as EP4280055A1)
Publication of WO2022166386A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G5/005Adapting incoming signals to the display format of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0492Change of orientation of the displayed image, e.g. upside-down, mirrored
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller

Definitions

  • the embodiments of the present application relate to the field of computer technologies, and in particular, to an image display method, an electronic device, and a storage medium.
  • With the development of application programs (APPs), many APPs (e.g., long-screenshot capture software, long-screenshot splicing software, etc.) can generate long strip images.
  • A long strip image can be a long vertical image. Because such images can display coherent information more completely, users often capture content such as web pages and chat records as long strip images.
  • A TV is usually an electronic device with a landscape screen. Because a picture is usually displayed in the center of the screen, the user needs to scroll up and down repeatedly to finish browsing a long vertical image, which makes the operation cumbersome.
  • Embodiments of the present application provide an image display method, an electronic device, and a storage medium, so as to provide a way of adaptively displaying pictures.
  • an embodiment of the present application provides an image display method, which is applied to an electronic device, including:
  • the to-be-displayed image is acquired; specifically, it may be acquired from an album of the electronic device, or acquired while browsing a web page on the electronic device.
  • the source of the image to be displayed is not particularly limited in this embodiment of the present application.
  • the image to be displayed is identified, and the display type of the image to be displayed is determined; specifically, the display type may include a long horizontal image and a long vertical image.
  • the screen state of the electronic device is acquired, and it is determined whether the screen state of the electronic device matches the display type of the image to be displayed; specifically, the screen state may include a horizontal screen and a vertical screen.
  • the horizontal screen state of the electronic device may match the long horizontal image, that is, the long horizontal image can be displayed normally on the electronic device in the horizontal screen state.
  • the vertical screen state of the electronic device can be matched with the long vertical image, that is, the long vertical image can be displayed normally on the electronic device in the vertical screen state.
  • the display area is determined according to the matching result; if the screen state of the electronic device does not match the display type of the image to be displayed, the to-be-displayed image is segmented to obtain multiple sub-images, and the multiple sub-images are displayed in the display area.
  • the image to be displayed is divided and displayed in sections, which improves the efficiency of image display and the user's viewing experience.
  • identifying the image to be displayed and determining the display type of the image to be displayed include:
  • if the aspect ratio of the image to be displayed is greater than or equal to the preset first threshold, the display type of the image to be displayed is determined to be a long vertical image; if the aspect ratio is less than or equal to the preset second threshold, the display type is determined to be a long horizontal image.
  • the type of the image to be displayed can be effectively identified.
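As a concrete illustration, the threshold comparison described above can be sketched as follows. The function name, the choice of height/width as the aspect ratio, and the default threshold values are assumptions for illustration only; the embodiments merely require that the thresholds be determined by the screen resolution of the electronic device.

```python
def classify_display_type(width, height, t1=3.0, t2=1.0 / 3.0):
    """Classify an image by its aspect ratio, taken here as height/width.

    ratio >= t1 -> long vertical image; ratio <= t2 -> long horizontal
    image; anything in between is treated as a normal image. The default
    thresholds t1 and t2 are illustrative placeholders.
    """
    ratio = height / width
    if ratio >= t1:
        return "long_vertical"
    if ratio <= t2:
        return "long_horizontal"
    return "normal"
```

For example, under these assumed thresholds a 500 x 3000 screenshot of a chat record would be classified as a long vertical image.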
  • the screen state of the electronic device includes a horizontal screen and a vertical screen
  • the display area determined according to the matching result includes:
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed; the screen of the electronic device is determined to be the first display area, and the first display area is divided to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long vertical image, it is determined that the screen state of the electronic device matches the display type of the image to be displayed, and the screen of the electronic device is determined to be the second display area.
  • the display area is determined according to the matching result between the screen state of the electronic device and the image type, which can effectively divide the display area, thereby improving the display efficiency of the image to be displayed.
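The matching rule and the resulting choice of display area can be sketched as follows. The string labels and the returned dictionary shape are illustrative assumptions, not an API defined by the embodiments.

```python
def screen_matches_image(screen_state, display_type):
    """A landscape screen matches a long horizontal image, and a
    portrait screen matches a long vertical image."""
    return (screen_state, display_type) in {
        ("landscape", "long_horizontal"),
        ("portrait", "long_vertical"),
    }


def choose_display_area(screen_state, display_type, num_sub_areas):
    """On a mismatch, the screen becomes the first display area, split
    into sub-display areas; on a match, the whole screen is used as the
    second display area."""
    if screen_matches_image(screen_state, display_type):
        return {"area": "second", "sub_areas": 1}
    return {"area": "first", "sub_areas": num_sub_areas}
```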
  • in one possible implementation, if the screen state of the electronic device matches the display type of the image to be displayed, the image to be displayed is displayed in the second display area.
  • dividing the first display area to obtain a plurality of sub-display areas includes:
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, the width of the screen of the electronic device and the width of the image to be displayed are obtained, and the first display area is divided based on the ratio of the screen width to the image width to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, the height of the screen of the electronic device and the height of the image to be displayed are obtained, and the first display area is divided based on the ratio of the screen height to the image height to obtain a plurality of sub-display areas.
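A minimal sketch of the ratio-based split: for a landscape screen and a long vertical image the extents passed in are widths, and for a portrait screen and a long horizontal image they are heights. Flooring the ratio and enforcing a minimum of one sub-area are assumptions; the embodiments only state that the division is based on the ratio.

```python
import math


def count_sub_areas(screen_extent, image_extent):
    """Number of sub-display areas from the ratio of the screen extent
    to the image extent (widths or heights, depending on orientation)."""
    return max(1, math.floor(screen_extent / image_extent))
```

For instance, a 1920-pixel-wide landscape screen showing a 500-pixel-wide long vertical image would, under this assumption, be divided into three side-by-side sub-display areas.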
  • displaying multiple sub-images in the display area includes:
  • a plurality of sub-images are displayed in a plurality of sub-display areas.
  • segmenting the to-be-displayed image to obtain multiple sub-images includes:
  • the to-be-displayed image is evenly divided based on the total number of sub-display areas to obtain multiple sub-images of the same size.
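The even division can be sketched as follows for a long vertical image, returning pixel ranges along the image height; letting the last sub-image absorb any remainder is an implementation assumption not stated in the embodiments.

```python
def split_evenly(image_height, num_sub_areas):
    """Evenly divide a long vertical image into num_sub_areas sub-images,
    returned as (top, bottom) pixel ranges; the last range absorbs any
    remainder so that the whole image is covered."""
    step = image_height // num_sub_areas
    ranges = [(i * step, (i + 1) * step) for i in range(num_sub_areas)]
    ranges[-1] = (ranges[-1][0], image_height)
    return ranges
```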
  • segmenting the to-be-displayed image to obtain multiple sub-images includes:
  • the image to be displayed is divided unevenly to obtain multiple sub-images of different sizes.
  • the method further includes:
  • the to-be-displayed image is re-segmented, and multiple sub-images obtained after the re-segmentation are displayed in multiple sub-display areas.
  • the first operation may be to zoom the image to be displayed.
  • the user can dynamically adjust the screen state of the electronic device, so that the images to be displayed can be dynamically displayed separately, and the flexibility of displaying the images to be displayed can be improved.
  • the method further includes:
  • in response to the user's second operation, the screen of the electronic device is determined to be the first display area, and the first display area is divided to obtain a plurality of sub-display areas; specifically, the second operation may be adjusting the screen state of the electronic device, for example, from the landscape state to the portrait state, or from the portrait state to the landscape state.
  • the to-be-displayed image is divided to obtain multiple sub-images, and the multiple sub-images are displayed in the multiple sub-display areas.
  • the user can dynamically adjust the screen state of the electronic device, so that the to-be-displayed image can be dynamically divided and displayed, thereby improving the flexibility of the to-be-displayed image display.
  • an image display device, applied to an electronic device, including:
  • an acquisition module for acquiring the image to be displayed
  • an identification module used for identifying the image to be displayed and determining the display type of the image to be displayed
  • a determination module used for acquiring the screen state of the electronic device, and judging whether the screen state of the electronic device matches the display type of the image to be displayed; determining the display area according to the matching result;
  • the first display module is configured to divide the to-be-displayed image to obtain multiple sub-images and display the multiple sub-images in the display area if the screen state of the electronic device does not match the display type of the to-be-displayed image.
  • the above-mentioned identification module includes:
  • an acquisition unit for acquiring the aspect ratio of the image to be displayed
  • a comparison unit configured to compare the aspect ratio of the image to be displayed with a preset first threshold and a preset second threshold, wherein the preset first threshold and the second threshold are determined by the screen resolution of the electronic device;
  • the identification unit is configured to determine that the display type of the image to be displayed is a long vertical image if the aspect ratio of the image to be displayed is greater than or equal to the preset first threshold, and to determine that the display type is a long horizontal image if the aspect ratio is less than or equal to the preset second threshold.
  • the screen state of the electronic device includes a horizontal screen and a vertical screen
  • the above-mentioned determining module is further configured to: if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, determine that the screen state of the electronic device does not match the display type of the image to be displayed, determine the screen of the electronic device to be the first display area, and divide the first display area to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long vertical image, determine that the screen state of the electronic device matches the display type of the image to be displayed, and determine the screen of the electronic device to be the second display area.
  • the above-mentioned device further includes:
  • the second display module is configured to display the to-be-displayed image in the second display area if the screen state of the electronic device matches the display type of the to-be-displayed image.
  • the above-mentioned determining module is further configured to: if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, obtain the width of the screen of the electronic device and the width of the image to be displayed, and divide the first display area based on the ratio of the screen width to the image width to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, obtain the height of the screen of the electronic device and the height of the image to be displayed, and divide the first display area based on the ratio of the screen height to the image height to obtain a plurality of sub-display areas.
  • the above-mentioned first display module is further configured to display multiple sub-images in multiple sub-display areas.
  • the above determining module is further configured to divide the image to be displayed evenly based on the total number of sub-display areas to obtain multiple sub-images of the same size.
  • the above-mentioned determining module is further configured to perform uneven segmentation of the image to be displayed to obtain multiple sub-images of different sizes.
  • the above-mentioned device further includes:
  • the third display module is configured to re-segment the to-be-displayed image in response to the user's first operation, and display multiple sub-images obtained after the re-segmentation in multiple sub-display areas.
  • the above-mentioned device further includes:
  • the fourth display module is configured to, in response to the user's second operation, determine the screen of the electronic device to be the first display area and divide the first display area to obtain a plurality of sub-display areas; and to divide the image to be displayed to obtain a plurality of sub-images and display the plurality of sub-images in the plurality of sub-display areas.
  • an electronic device including:
  • a memory, where the memory is configured to store computer program code, and the computer program code includes instructions; when the electronic device reads the instructions from the memory, the electronic device is caused to perform the following steps:
  • the to-be-displayed image is divided to obtain multiple sub-images, and the multiple sub-images are displayed in the display area.
  • the step of causing the above-mentioned electronic device to perform the identification of the to-be-displayed image and to determine the display type of the to-be-displayed image includes:
  • if the aspect ratio of the image to be displayed is greater than or equal to the preset first threshold, it is determined that the display type of the image to be displayed is a long vertical image;
  • if the aspect ratio of the image to be displayed is less than or equal to the preset second threshold, it is determined that the display type of the image to be displayed is a long horizontal image.
  • the screen states of the above-mentioned electronic device include landscape and portrait, and when the above-mentioned instructions are executed by the electronic device, the step of causing the electronic device to determine the display area according to the matching result includes:
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed; the screen of the electronic device is determined to be the first display area, and the first display area is divided to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long vertical image, it is determined that the screen state of the electronic device matches the display type of the image to be displayed, and the screen of the electronic device is determined to be the second display area.
  • when the above-mentioned instructions are executed by the electronic device, the electronic device further performs the following step:
  • the image to be displayed is displayed in the second display area.
  • the step of causing the above-mentioned electronic device to divide the first display area to obtain a plurality of sub-display areas includes:
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, the width of the screen of the electronic device and the width of the image to be displayed are obtained, and the first display area is divided based on the ratio of the screen width to the image width to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, the height of the screen of the electronic device and the height of the image to be displayed are obtained, and the first display area is divided based on the ratio of the screen height to the image height to obtain a plurality of sub-display areas.
  • the step of causing the above-mentioned electronic device to display multiple sub-images in the display area includes:
  • a plurality of sub-images are displayed in a plurality of sub-display areas.
  • the step of causing the above-mentioned electronic device to execute the segmentation of the to-be-displayed image to obtain a plurality of sub-images includes:
  • the to-be-displayed image is evenly divided based on the total number of sub-display areas to obtain multiple sub-images of the same size.
  • the step of causing the above-mentioned electronic device to execute the segmentation of the to-be-displayed image to obtain a plurality of sub-images includes:
  • the image to be displayed is divided unevenly to obtain multiple sub-images of different sizes.
  • the to-be-displayed image is re-segmented, and multiple sub-images obtained after the re-segmentation are displayed in multiple sub-display areas.
  • the screen of the electronic device is the first display area, and the first display area is divided to obtain a plurality of sub-display areas;
  • the to-be-displayed image is divided to obtain multiple sub-images, and the multiple sub-images are displayed in the multiple sub-display areas.
  • an embodiment of the present application provides a computer-readable storage medium storing a computer program that, when run on a computer, causes the computer to execute the method described in the first aspect.
  • an embodiment of the present application provides a computer program, which is used to execute the method described in the first aspect when the computer program is executed by a computer.
  • the program in the fifth aspect may be stored, in whole or in part, on a storage medium packaged with the processor, or in whole or in part in a memory not packaged with the processor.
  • FIG. 1 is a schematic flowchart of an embodiment of an image display method provided by the present application.
  • FIG. 2 is a schematic diagram of a display effect of a horizontal screen and a long-vertical image provided by an embodiment of the present application;
  • FIG. 3 is a schematic diagram of a display effect of a vertical screen and a long horizontal image according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of segmentation of a display area of a horizontal screen according to an embodiment of the present application
  • FIG. 5 is a schematic diagram of segmentation of a display area of a vertical screen according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an embodiment of image segmentation of a long-vertical image provided by the present application.
  • FIG. 7 is a schematic diagram of image segmentation of a long horizontal image provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a display effect of an embodiment of a user operation display provided by the present application.
  • FIG. 9 is a schematic diagram of a display effect of another embodiment of a user operation display provided by the present application.
  • FIG. 10 is a schematic flowchart of another embodiment of the image display method provided by the application.
  • FIG. 11 is a schematic diagram of another embodiment of image segmentation of a long-vertical image provided by this application.
  • FIG. 12 is a schematic diagram of a display effect of yet another embodiment of a user operation display provided by the present application.
  • FIG. 13-15 are schematic diagrams of display effects of another embodiment of the user operation display provided by the present application.
  • FIG. 16 is a schematic structural diagram of an image display device provided by an embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • first and second are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • plural means two or more.
  • the display area of the picture is usually determined based on parameters such as the zoom ratio of the picture and the display position of the picture. Then, bitmap information is obtained through technologies such as region decoding, and finally the image is rendered and displayed in the display area.
  • an embodiment of the present application proposes an image display method, which is applied to an electronic device.
  • the electronic device may be an electronic device with an image display function.
  • the specific form of the electronic device is not particularly limited in the embodiments of the present application.
  • FIG. 1 is a schematic flowchart of an embodiment of an image display method provided by this application, including:
  • Step 101 Acquire an image to be displayed, identify the image to be displayed, and determine a display type of the image to be displayed.
  • the to-be-displayed image may be an image selected by a user in an album of the electronic device, and the to-be-displayed image may also be an image in a web page selected by the user when the user browses a web page on the electronic device.
  • the present application does not specifically limit the source of the above image to be displayed.
  • the display type can be long vertical or long horizontal.
  • the preset first aspect-ratio threshold I1 may be obtained.
  • the above screen resolution and the value of I1 are only illustrative, and do not constitute a limitation on the embodiments of the present application. In some embodiments, the screen resolution and I1 may also take other values.
  • the aspect ratio of the image to be displayed can be calculated and compared with the preset first aspect-ratio threshold I1; if the aspect ratio of the image to be displayed is greater than or equal to I1, it can be determined that the image to be displayed is a long vertical image.
  • the image to be displayed 210 in the display interface 200 of the electronic device includes a visible area 211 (for example, the solid gray area) and an invisible area 212 (for example, the hatched area);
  • the visible area 211 is visible in the interface 200, while the invisible area 212 is not visible in the interface 200.
  • a preset second aspect-ratio threshold I2 can also be obtained.
  • the preset second aspect-ratio threshold I2 may also be determined according to the screen resolution of the electronic device.
  • the above screen resolution and the value of I2 are only illustrative, and do not constitute a limitation on the embodiments of the present application. In some embodiments, the screen resolution and I2 may also take other values.
  • the aspect ratio of the image to be displayed can also be compared with the preset second aspect-ratio threshold I2; if the aspect ratio of the image to be displayed is less than or equal to I2, it can be determined that the image to be displayed is a long horizontal image.
  • Step 102 Determine the screen state of the electronic device, so as to determine whether the screen state of the electronic device matches the display type of the image to be displayed.
  • the screen state of the electronic device may include a landscape screen or a portrait screen.
  • the screen state of the above electronic device can be determined by the angle between the direction of the screen and the direction of gravity.
  • In the initial setting of an electronic device (for example, a mobile phone), the mobile phone is usually in the vertical screen state, and the orientation of the screen is at a 0-degree angle to the direction of gravity. When the mobile phone is in the horizontal screen state, the orientation of the screen is at a 90-degree angle to the direction of gravity (for example, horizontal to the right) or a -90-degree angle (for example, horizontal to the left). Therefore, the screen state of the electronic device can be determined by the angle between the screen direction and the direction of gravity. It can be understood that the screen state of the electronic device can also be determined in other ways, and the above method of determining the screen state does not constitute a limitation on the embodiments of the present application.
  • the screen state of the electronic device is landscape, and the image to be displayed is a long-vertical image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed.
  • the screen state of the electronic device is a vertical screen, and the image to be displayed is a long-horizontal image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed.
  • the screen state of the electronic device is landscape and the image to be displayed is a long landscape, it is determined that the screen state of the electronic device matches the display type of the image to be displayed.
  • the screen state of the electronic device is a vertical screen and the image to be displayed is a long vertical image, it is determined that the screen state of the electronic device matches the display type of the image to be displayed.
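The screen-state determination and the four matching rules above can be sketched as follows; the function names and the exact-angle check are illustrative simplifications (a real device would use tilt thresholds rather than exact angles):

```python
def screen_state_from_angle(angle_deg):
    """Derive the screen state from the angle between the screen
    direction and the direction of gravity: 0 degrees is treated as
    portrait, +/-90 degrees as landscape (a simplified sketch)."""
    return "landscape" if abs(angle_deg) == 90 else "portrait"

def state_matches_display_type(screen_state, display_type):
    # A landscape screen matches a long-horizontal image and a portrait
    # screen matches a long-vertical image; any other pairing mismatches.
    return ((screen_state == "landscape" and display_type == "long-horizontal")
            or (screen_state == "portrait" and display_type == "long-vertical"))
```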
  • Step 103 Determine the display area based on the screen state of the electronic device and the display type of the image to be displayed.
  • the first display area can be determined according to the height or width of the image to be displayed and the height or width of the screen of the electronic device.
  • the electronic device is taken as a horizontal screen, and the image to be displayed is a long-vertical image as an example for description.
  • the electronic device is in a horizontal screen state, the display screen of the electronic device displays an interface 400 , and the width of the interface 400 is w 1 .
  • the to-be-displayed image 410 is a long vertical image, and the width of the to-be-displayed image 410 is w 2 .
  • the value of w 1 /w 2 may be calculated, and the first display area may be determined according to the value of w 1 /w 2 .
  • the above-mentioned interface 400 can be equally divided into two sub-display areas, for example, a first sub-display area 401 (for example, the left bold box) and a second sub-display area 402 (for example, the right bold box), wherein the widths of the first sub-display area 401 and the second sub-display area 402 are equal.
  • the division of the interface 400 and the number of sub-display areas are only exemplary descriptions, and do not constitute limitations to the embodiments of the present application.
  • the electronic device is taken as a vertical screen, and the image to be displayed is a long horizontal image as an example for description.
  • the electronic device is in a vertical screen state, the display screen of the electronic device displays an interface 500 , and the height of the interface 500 is h 1 .
  • the to-be-displayed image 510 is a long horizontal image, and the height of the to-be-displayed image 510 is h 2 .
  • the value of h 1 /h 2 may be calculated, and the first display area may be determined according to the value of h 1 /h 2 .
  • the above-mentioned interface 500 can be equally divided into two sub-display areas, for example, a first sub-display area 501 (for example, the upper bold box) and a second sub-display area 502 (for example, the lower bold box), wherein the heights of the first sub-display area 501 and the second sub-display area 502 are equal.
  • the division of the interface 500 and the number of sub-display areas are only exemplary descriptions, and do not constitute limitations to the embodiments of the present application.
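One possible reading of how the first display area is divided is to take the integer part of the ratio w 1 /w 2 (landscape screen, long-vertical image) or h 1 /h 2 (portrait screen, long-horizontal image) as the number of equal sub-display areas. The floor-based rule below is an assumption consistent with the two-area examples of FIG. 4 and FIG. 5, not a formula stated by the patent:

```python
import math

def sub_display_area_count(interface_extent, image_extent):
    """Number of equal sub-display areas that fit side by side.

    interface_extent is w1 (landscape screen, long-vertical image) or
    h1 (portrait screen, long-horizontal image); image_extent is the
    matching w2 or h2 of the image to be displayed. At least one area
    is always returned, so an unsplittable interface stays whole.
    """
    return max(1, math.floor(interface_extent / image_extent))
```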
  • the entire screen of the electronic device can be determined as the second display area, and the to-be-displayed image can be directly displayed in the second display area.
  • Step 104 if the screen state of the electronic device does not match the display type of the image to be displayed, the image to be displayed is evenly divided based on the first display area to obtain a plurality of sub-images.
  • the to-be-displayed image may be divided according to the total number of sub-display areas in the first display area, thereby obtaining multiple sub-images.
  • the to-be-displayed image may be evenly divided according to the total number of sub-display areas.
  • the image to be displayed may be evenly divided into n blocks, and each block may be each corresponding sub-image, where n is the total number of sub-display areas.
  • an electronic device displays an interface 600 , the width of the interface 600 is w 1 , and the interface 600 is evenly divided into two sub-display areas on the left and right sides, wherein the above-mentioned two sub-display areas are the first sub-display area 601 and the second sub-display area respectively Area 602 is displayed.
  • the to-be-displayed image 610 may be equally divided into two blocks (eg, the first sub-image 611 and the second sub-image 612 ) in the longitudinal direction.
  • the width of the first sub-image 611 is w 2 and the height is h/2
  • the width of the second sub-image 612 is w 2 and the height is h/2.
  • the above-mentioned first sub-image 611 may be displayed in the first sub-display area 601
  • the above-mentioned second sub-image 612 may be displayed in the second sub-display area 602 .
  • an electronic device displays an interface 700 , the height of the interface 700 is h 1 , and the interface 700 is equally divided into two sub-display areas up and down, wherein the above-mentioned two sub-display areas are the first sub-display area 701 and the second sub-display area respectively Area 702 is displayed.
  • the to-be-displayed image 710 can be equally divided into two horizontal blocks (for example, the first sub-image 711 and the second sub-image 712 ).
  • the height of the first sub-image 711 is h 2 and the width is w/2
  • the height of the second sub-image 712 is h 2 and the width is w/2.
  • the above-mentioned first sub-image 711 may be displayed in the first sub-display area 701
  • the above-mentioned second sub-image 712 may be displayed in the second sub-display area 702 .
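The even division described for FIG. 6 and FIG. 7 can be sketched as a crop-box computation; rounding to integer pixel boundaries is an implementation choice of this sketch:

```python
def split_evenly(img_w, img_h, n, display_type):
    """Crop boxes (left, top, right, bottom) for n equal sub-images.

    A long-vertical image is cut along its height (as in FIG. 6) and a
    long-horizontal image along its width (as in FIG. 7), where n is
    the total number of sub-display areas.
    """
    boxes = []
    for i in range(n):
        if display_type == "long-vertical":
            step = img_h / n  # each block keeps full width, 1/n height
            boxes.append((0, round(i * step), img_w, round((i + 1) * step)))
        else:  # long-horizontal: full height, 1/n width per block
            step = img_w / n
            boxes.append((round(i * step), 0, round((i + 1) * step), img_h))
    return boxes
```

Each box can then be handed to a region decoder or an image-cropping API to produce the corresponding sub-image.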
  • In some cases, the width or height of the to-be-displayed image is not enough to allow the interface displayed by the electronic device to be divided into display areas. For example, if w 1 /w 2 < 2, where w 1 is the width of the interface displayed by the electronic device and w 2 is the width of the image to be displayed, the interface cannot simultaneously display two images with the same width as the image to be displayed side by side, so there is no need to divide the display area of the interface. Therefore, in some embodiments, the user can also zoom the image to be displayed, for example, through a multi-finger gesture.
  • In this way, the width of the image to be displayed is reduced.
  • the interface displayed by the electronic device can then be divided into display areas, and the image to be displayed can be divided into sub-images, so that the sub-images can be displayed side by side in the above display areas, thereby improving the display efficiency of images.
  • the electronic device displays an interface 800 , and the width of the interface 800 is w 1 and the height is h 1 .
  • the image to be displayed 810 has a width w 2 and a height h 2 , where w 1 /w 2 < 2.
  • the user can zoom out the to-be-displayed image 810 , thereby obtaining the to-be-displayed image 820 .
  • the width of the to-be-displayed image 820 is w 3 and the height is h 3 , and at this time, w 3 < w 2 .
  • the interface 800 can be divided, so that the first sub-display area 801 and the second sub-display area 802 can be obtained, wherein the first sub-display area 801 and the second sub-display area 802 are equal in width.
  • image segmentation can be performed on the to-be-displayed image 820, whereby a first sub-image 821 and a second sub-image 822 can be obtained, wherein the height and width of the first sub-image 821 and the second sub-image 822 are equal.
  • the first sub-image 821 has a height of h 3 /2 and a width of w 3 ;
  • the second sub-image 822 has a height of h 3 /2 and a width of w 3 .
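The zoom-then-split condition in the FIG. 8 example can be sketched as a simple check; the function name and the scale parameter are illustrative (the patent only describes the user shrinking the image via a multi-finger gesture):

```python
def can_split_after_zoom(w1, w2, scale):
    """After the user zooms the image by `scale` (values below 1 shrink
    it), check whether two copies of the zoomed width w3 = w2 * scale
    now fit side by side in the interface of width w1, as in FIG. 8."""
    w3 = w2 * scale
    return w1 / w3 >= 2
```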
  • Step 105 Display the sub-image in the sub-display area of the first display area.
  • the sub-display area of the first display area for example, the above-mentioned first sub-display area and the second sub-display area
  • the divided images to be displayed for example, the above-mentioned first sub-image and second sub-image
  • the above-mentioned first sub-image and the second sub-image may be displayed in the above-mentioned first sub-display area and the second sub-display area
  • the first sub-image may be displayed in the first sub-display area
  • the second sub-image is displayed in the second sub-display area.
  • a region decoder can be used to decode the first and second sub-images, so that the bitmap corresponding to each sub-image can be obtained. Then, each of the above bitmaps can be rendered and displayed in the corresponding sub-display area.
  • the image to be displayed before division can also be decoded as a whole, thereby obtaining bitmap data.
  • the above-mentioned bitmap data may be divided according to the total number of sub-display areas, thereby obtaining a first sub-bitmap and a second sub-bitmap, wherein the first sub-bitmap may be a bitmap of the above-mentioned first sub-image data, the second sub-bitmap may be the bitmap data of the second sub-image.
  • the first sub-bitmap may be rendered and displayed in the first sub-display area, and the second sub-bitmap may be rendered and displayed in the second sub-display area.
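The whole-image-decode path above (decode once, then divide the bitmap data by the total number of sub-display areas) can be sketched as a row-wise split; representing the bitmap as a list of pixel rows is an assumption of this sketch:

```python
def split_bitmap_rows(bitmap_rows, n):
    """Divide whole-image bitmap data (modeled as a list of pixel rows)
    into n sub-bitmaps, one per sub-display area. Any leftover rows
    from a height not divisible by n are dropped for simplicity."""
    rows_per_part = len(bitmap_rows) // n
    return [bitmap_rows[i * rows_per_part:(i + 1) * rows_per_part]
            for i in range(n)]
```

With n = 2 this yields the first sub-bitmap and the second sub-bitmap described above, each ready to be rendered into its sub-display area.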
  • Step 106 in response to the user's operation, update and display the to-be-displayed image.
  • the user may further operate the electronic device, so that the to-be-displayed image is updated and displayed.
  • the above operations may include performing gesture sliding on the screen of the electronic device, transposing the screen of the electronic device, and the like.
  • the user may also perform a sliding operation on the image to be displayed, so as to scroll through the image to be displayed.
  • the user can browse the to-be-displayed image through a sliding gesture. In response to the user's sliding operation, the electronic device can obtain the display position of the sub-images, re-segment the to-be-displayed image based on that display position, and decode, render, and display the re-segmented sub-images.
  • the electronic device displays an interface 900 , the height of the interface 900 is h 1 , and the interface 900 includes a first sub-display area 901 and a second sub-display area 902 .
  • the image to be displayed includes a first sub-image 911 and a second sub-image 912 .
  • the first sub-display area 901 displays the first sub-image 911
  • the second sub-display area 902 displays the second sub-image 912 .
  • the first sub-image 911 includes an invisible area 9111 and a visible area 9112
  • the second sub-image 912 includes a visible area 9121 and an invisible area 9122 .
  • the user can slide (eg, slide down) on the first sub-image 911 to browse the to-be-displayed image.
  • the height of the invisible area 9111 is continuously reduced, and the height of the visible area 9112 remains unchanged.
  • the electronic device can re-segment the to-be-displayed image as the sub-image slides.
  • the image to be displayed may be re-segmented, thereby obtaining the first sub-image 913 and the second sub-image 914 .
  • the first sub-image 913 no longer includes the invisible area 9111, that is, the height of the first sub-image 913 is equal to the height of the first sub-display area 901, for example, the height of the first sub-image 913 is h 1 .
  • the second sub-image 914 includes a visible area 9141 and an invisible area 9142. The heights of the visible area 9141 and the visible area 9121 are the same, and the height of the invisible area 9142 is greater than that of the invisible area 9122.
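The scroll-driven re-segmentation walked through for FIG. 9 can be sketched as recomputing the two crop ranges from the scroll position. The names and the exact windowing are a hypothetical reading of the figure, not the patent's literal algorithm:

```python
def resegment_on_scroll(img_h, view_h, scroll_offset):
    """Recompute the two sub-image ranges (in image pixels) after the
    user scrolls. The first sub-display area shows the view_h pixels
    starting at scroll_offset, and the second shows the next view_h
    pixels; both ranges are clamped to the image height img_h."""
    first = (scroll_offset, min(scroll_offset + view_h, img_h))
    second = (first[1], min(first[1] + view_h, img_h))
    return first, second
```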
  • the user can transpose the electronic device.
  • Taking the case where the electronic device is placed in landscape orientation as an example, if the image to be displayed is a long horizontal image, the electronic device may determine to display the above image to be displayed in the second display area. The user may then place the electronic device vertically, putting it into the vertical screen state. Since the vertical screen state does not match the long horizontal image, the long horizontal image can be divided and displayed. For the specific process of split display, reference may be made to steps 103 to 105, which will not be repeated here.
  • FIG. 10 is a schematic flowchart of another embodiment of the image display method provided by the application, including:
  • Step 201 Acquire an image to be displayed, identify the image to be displayed, and determine a display type of the image to be displayed.
  • the display type may be a long vertical image or a long horizontal image.
  • the preset first aspect-ratio threshold I 1 may be obtained.
  • the above screen resolution and the values of I 1 are only illustrative, and do not constitute a limitation on the embodiments of the present application. In some embodiments, the above screen resolution and I 1 may also be other values.
  • the aspect ratio of the image to be displayed can be calculated and compared with the preset first aspect-ratio threshold I 1 . If the aspect ratio of the image to be displayed is greater than or equal to the preset first threshold I 1 , it can be determined that the to-be-displayed image is a long-vertical image.
  • a preset second threshold value I 2 of the aspect ratio can also be obtained.
  • the preset second threshold value I 2 of the aspect ratio may also be determined according to the resolution of the electronic device.
  • the above screen resolution and the values of I 2 are only illustrative, and do not constitute a limitation to the embodiments of the present application. In some embodiments, the above screen resolution and I 2 may also be other values.
  • the aspect ratio of the image to be displayed can also be compared with the preset second threshold value I 2 of the aspect ratio. If the aspect ratio of the to-be-displayed image is less than or equal to the second preset aspect ratio threshold I 2 , it can be determined that the to-be-displayed image is a long-horizontal image.
  • Step 202 Determine the screen state of the electronic device, so as to determine whether the screen state of the electronic device matches the display type of the image to be displayed.
  • the screen state of the electronic device may include a landscape screen or a portrait screen.
  • the screen state of the above electronic device may be determined by the angle between the direction of the screen and the direction of gravity.
  • In the initial setting of an electronic device (for example, a mobile phone), the mobile phone is usually in the vertical screen state, and the orientation of the screen is at a 0-degree angle to the direction of gravity. When the mobile phone is in the horizontal screen state, the orientation of the screen is at a 90-degree angle to the direction of gravity (for example, horizontal to the right) or a -90-degree angle (for example, horizontal to the left). Therefore, the screen state of the electronic device can be determined by the angle between the screen direction and the direction of gravity. It can be understood that the screen state of the electronic device can also be determined in other ways, and the above method of determining the screen state does not constitute a limitation on the embodiments of the present application.
  • the screen state of the electronic device is landscape, and the image to be displayed is a long-vertical image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed.
  • the screen state of the electronic device is a portrait screen, and the image to be displayed is a long horizontal image, it is determined that the screen state of the electronic device does not match the display type of the image to be displayed.
  • the screen state of the electronic device is landscape and the image to be displayed is a long landscape, it is determined that the screen state of the electronic device matches the display type of the image to be displayed.
  • the screen state of the electronic device is a vertical screen and the image to be displayed is a long vertical image, it is determined that the screen state of the electronic device matches the display type of the image to be displayed.
  • Step 203 Determine the first display area based on the screen state of the electronic device and the display type of the image to be displayed.
  • the first display area can be determined according to the height or width of the image to be displayed and the height or width of the screen of the electronic device.
  • the entire screen of the electronic device can be determined as the second display area, and the to-be-displayed image can be directly displayed in the second display area.
  • Step 204 if the screen state of the electronic device does not match the display type of the image to be displayed, the image to be displayed is unevenly divided based on the first display area to obtain multiple sub-images.
  • the to-be-displayed image may be divided unevenly.
  • the image to be displayed is usually formed by splicing multiple images, and when multiple images are spliced, there will be obvious splicing traces at the edges of the splices. Therefore, in a specific implementation, the above image to be displayed can be scanned line by line from top to bottom, and an edge detection algorithm can be used to detect the edges in the image to be displayed, so that the splicing points between the spliced images can be obtained. The image to be displayed is then divided according to the above splicing points, so that multiple spliced images can be obtained. Next, the multiple spliced images obtained by segmentation through the edge detection algorithm can be used as sub-images.
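The line-by-line scan for splicing points can be sketched with a simple row-difference detector. The metric and threshold below are illustrative stand-ins; the patent does not specify which edge detection algorithm is used:

```python
def find_splice_rows(gray_rows, threshold=40):
    """Scan a grayscale image (list of pixel rows) top to bottom and
    flag rows where the mean absolute difference from the previous row
    jumps, as a stand-in for the edge-detection step that locates
    splicing points between stitched images."""
    splice_rows = []
    for y in range(1, len(gray_rows)):
        prev_row, cur_row = gray_rows[y - 1], gray_rows[y]
        diff = sum(abs(a - b) for a, b in zip(prev_row, cur_row)) / len(cur_row)
        if diff > threshold:
            splice_rows.append(y)  # candidate splicing point
    return splice_rows
```

Cutting the image at the returned row indices yields the unevenly sized sub-images of step 204.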
  • Step 205 Display the sub-image in the sub-display area in the first display area.
  • each sub-image may be displayed in the corresponding sub-display area.
  • the images to be displayed may be displayed in the sub-display area according to the sequence of division of the images to be displayed.
  • the to-be-displayed image 1110 is formed by splicing a first sub-image 1111 , a second sub-image 1112 , a third sub-image 1113 and a fourth sub-image 1114 .
  • a first sub-image 1111 , a second sub-image 1112 , a third sub-image 1113 and a fourth sub-image 1114 can be obtained respectively.
  • the electronic device displays an interface 1100 , and the interface 1100 includes a first sub-display area 1101 and a second sub-display area 1102 .
  • the first sub-image 1111 can be displayed in the first sub-display area 1101
  • the second sub-image 1112 can be displayed in the second sub-display area 1102
  • the third sub-image 1113 and the fourth sub-image 1114 are arranged after the second sub-image 1112, waiting to be displayed.
  • FIG. 11 exemplarily shows a scene in which the electronic device is a landscape screen and the image to be displayed is a long vertical image.
  • the above method of unevenly dividing and displaying the image to be displayed is also applicable to the scene where the electronic device is a vertical screen and the image to be displayed is a long horizontal image.
  • Step 206 in response to the user's operation, update and display the to-be-displayed image.
  • the user may further operate the electronic device, so that the to-be-displayed image is updated and displayed.
  • the above operations may include clicking, sliding, transposing the screen of the electronic device, and the like on the screen of the electronic device.
  • the user may perform a sliding operation on the interface of the electronic device to browse the sub-images not displayed in the display area.
  • the user may swipe (eg, swipe left, swipe right, swipe up, or swipe down) on the interface of the electronic device.
  • the interface of the electronic device displays a new image.
  • the to-be-displayed image 1210 includes a first sub-image 1211 , a second sub-image 1212 , a third sub-image 1213 and a fourth sub-image 1214 .
  • the electronic device displays an interface 1200 , and the interface 1200 includes a first sub-display area 1201 and a second sub-display area 1202 .
  • the first sub-image 1211 is displayed in the first sub-display area 1201
  • the second sub-image 1212 is displayed in the second sub-display area 1202
  • the third sub-image 1213 and the fourth sub-image 1214 are respectively arranged after the second sub-image 1212 , waiting to be displayed.
  • the user can slide the interface 1200 to the left.
  • the first sub-image 1211 is moved out of the interface 1200, the second sub-image 1212 is displayed in the first sub-display area 1201, the third sub-image 1213 is displayed in the second sub-display area 1202, and the fourth sub-image 1214 is arranged after the third sub-image 1213, waiting to be displayed.
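The queue behaviour in the FIG. 12 walkthrough can be sketched as a sliding window over the ordered sub-images. That each left swipe advances the window by exactly one sub-image is an assumption matching the walkthrough, not a stated rule:

```python
def visible_after_swipes(sub_images, n_areas, swipes_left):
    """Which sub-images occupy the n_areas sub-display areas after the
    user performs swipes_left left swipes, with the remaining sub-images
    queued behind the window waiting to be displayed."""
    last_start = max(0, len(sub_images) - n_areas)
    start = min(swipes_left, last_start)  # clamp at the final window
    return sub_images[start:start + n_areas]
```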
  • the user may also perform a click operation on the interface of the electronic device, so as to browse the sub-images displayed in the sub-display area.
  • the user may click on any sub-image displayed on the interface of the electronic device.
  • the electronic device may determine the sub-image selected by the user, and the sub-image selected by the user may be focused.
  • the user can perform an operation on the selected sub-image.
  • the user can click on the selected sub-image, or the user can also perform a zoom-in operation on the selected sub-image through a multi-finger gesture.
  • the electronic device may display the sub-image selected by the user in an enlarged manner (for example, the enlarged display may be a full-screen display), thereby enabling the user to browse the details of the sub-image.
  • the electronic device displays an interface 1300
  • the interface 1300 includes a first sub-display area 1301 and a second sub-display area 1302, wherein the first sub-display area 1301 displays the first sub-image 1311, and the second sub-display area 1302 displays the second sub-image 1312.
  • the user may operate the second sub-image 1312 (eg, click the second sub-image 1312).
  • the second sub-image 1312 is focused, thereby obtaining the interface 1400 shown in FIG. 14 .
  • As shown in FIG. 14,
  • the interface 1400 includes a first sub-display area 1301 and a second sub-display area 1302 , wherein the first sub-display area 1301 displays the first sub-image 1311 , and the second sub-display area 1302 displays the second sub-image 1312 , the second sub-image 1312 is in a focused state.
  • the user may operate (eg, click or slide with multiple fingers) on the second sub-image 1312 in the focused state, so as to zoom in and display the above-mentioned second sub-image 1312 .
  • the electronic device may zoom in and display the second sub-image 1312 to obtain the interface 1500 shown in FIG. 15 .
  • the above-mentioned second sub-image 1312 can be displayed in full screen according to the screen size of the electronic device, so that the resolution of the second sub-image 1312 can be adapted to the screen size of the electronic device, thereby improving the user's viewing experience .
  • the interface 1500 includes the enlarged second sub-image 1312. Since the enlarged second sub-image 1312 may not be completely displayed on the screen of the electronic device, the enlarged second sub-image 1312 may include an invisible area 13121 and a visible area 13122. The user can zoom (for example, zoom out or zoom in) or swipe (for example, slide up or down) to further browse the above invisible area 13121 and visible area 13122.
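One reading of "adapt the resolution of the sub-image to the screen size" is to scale the focused sub-image to fill the screen width, with any overflow beyond the screen height becoming the invisible area the user scrolls to reveal. This fill-width rule is an interpretation, not the patent's stated formula:

```python
def fill_width_scale(img_w, screen_w):
    """Scale factor for the enlarged full-screen display: the sub-image
    fills the screen width; whatever then extends beyond the screen
    height becomes the invisible area (13121 in FIG. 15)."""
    return screen_w / img_w
```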
  • the user can transpose the electronic device.
  • Taking the case where the electronic device is placed in landscape orientation as an example, if the image to be displayed is a long horizontal image, the electronic device may determine to display the above image to be displayed in the second display area. The user may then place the electronic device vertically, putting it into the vertical screen state. Since the vertical screen state does not match the long horizontal image, the long horizontal image can be divided and displayed. For the specific process of segmented display, reference may be made to steps 203 to 205, which will not be repeated here.
  • FIG. 16 is a schematic structural diagram of an embodiment of an image display device of the present application.
  • the above-mentioned image display device 1600 may include: an acquisition module 1610, an identification module 1620, a determination module 1630, and a first display module 1640; wherein,
  • an acquisition module 1610 configured to acquire an image to be displayed
  • the identification module 1620 is used to identify the image to be displayed and determine the display type of the image to be displayed;
  • the determination module 1630 is used to obtain the screen state of the electronic device, and determine whether the screen state of the electronic device matches the display type of the image to be displayed; determine the display area according to the matching result;
  • the first display module 1640 is configured to divide the to-be-displayed image to obtain multiple sub-images and display the multiple sub-images in the display area if the screen state of the electronic device does not match the display type of the to-be-displayed image.
  • the above-mentioned identification module 1620 includes: an acquisition unit 1621, a comparison unit 1622 and an identification unit 1623; wherein,
  • Obtaining unit 1621 used to obtain the aspect ratio of the image to be displayed
  • a comparison unit 1622 configured to compare the aspect ratio of the image to be displayed with a preset first threshold and a preset second threshold, wherein the preset first threshold and the second threshold are determined by the screen resolution of the electronic device;
  • the identification unit 1623 is configured to: if the aspect ratio of the image to be displayed is greater than or equal to the preset first threshold, determine that the display type of the image to be displayed is a long vertical image; if the aspect ratio of the image to be displayed is less than or equal to the preset second threshold, determine that the display type of the image to be displayed is a long horizontal image.
  • the screen state of the electronic device includes a horizontal screen and a vertical screen
  • the determining module 1630 is further configured to: if the screen state of the electronic device is a horizontal screen and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is a vertical screen and the display type of the image to be displayed is a long horizontal image, determine that the screen state of the electronic device does not match the display type of the image to be displayed, determine that the screen of the electronic device is the first display area, and divide the first display area to obtain a plurality of sub-display areas;
  • if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is a vertical screen and the display type of the image to be displayed is a long vertical image, determine that the screen state of the electronic device matches the display type of the image to be displayed, and determine that the screen of the electronic device is the second display area.
  • the above-mentioned apparatus 1600 further includes: a second display module 1650; wherein,
  • the second display module 1650 is configured to display the image to be displayed in the second display area if the screen state of the electronic device matches the display type of the image to be displayed.
  • the above-mentioned determining module 1630 is further configured to: if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, obtain the width of the screen of the electronic device and the width of the image to be displayed, and divide the first display area based on the ratio of the width of the screen of the electronic device to the width of the image to be displayed, thereby obtaining a plurality of sub-display areas;
  • if the screen state of the electronic device is a portrait screen and the display type of the image to be displayed is a long horizontal image, obtain the height of the screen of the electronic device and the height of the image to be displayed, and divide the first display area based on the ratio of the height of the screen of the electronic device to the height of the image to be displayed, thereby obtaining a plurality of sub-display areas.
  • the above-mentioned first display module 1640 is further configured to display multiple sub-images in multiple sub-display areas.
  • the above-mentioned determining module 1630 is further configured to divide the image to be displayed evenly based on the total number of sub-display areas to obtain multiple sub-images of the same size.
  • the above-mentioned determining module 1630 is further configured to perform uneven segmentation on the to-be-displayed image to obtain multiple sub-images of different sizes.
  • the above-mentioned apparatus 1600 further includes: a third display module 1660; wherein,
  • the third display module 1660 is configured to re-segment the to-be-displayed image in response to the user's first operation, and display multiple sub-images obtained after the re-segmentation in multiple sub-display areas.
  • the above-mentioned apparatus further includes: a fourth display module 1670; wherein,
  • the fourth display module 1670 is configured to, in response to the second operation of the user, determine the screen of the electronic device as the first display area, divide the first display area to obtain multiple sub-display areas, divide the image to be displayed to obtain multiple sub-images, and display the multiple sub-images in the multiple sub-display areas.
  • the image display apparatus provided by the embodiment shown in FIG. 16 can be used to implement the technical solutions of the method embodiments shown in FIG. 1 to FIG. 15 of the present application; for the implementation principles and technical effects, reference may be made to the relevant descriptions in the method embodiments.
  • each module of the image display apparatus shown in FIG. 16 is only a division of logical functions, and may be fully or partially integrated into a physical entity in actual implementation, or may be physically separated.
  • these modules can all be implemented in the form of software calling through processing elements; they can also all be implemented in hardware; some modules can also be implemented in the form of software calling through processing elements, and some modules can be implemented in hardware.
  • the detection module may be a separately established processing element, or may be integrated in a certain chip of the electronic device.
  • the implementation of other modules is similar.
  • all or part of these modules can be integrated together, and can also be implemented independently.
  • each step of the above-mentioned method, or each of the above-mentioned modules, can be completed by an integrated logic circuit of hardware in the processing element or by instructions in the form of software.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASICs), or one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs), etc.
  • these modules can be integrated together and implemented in the form of a system-on-a-chip (SoC).
  • FIG. 17 exemplarily shows a schematic structural diagram of the electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or use a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the execution of the application sharing method provided by this embodiment of the present application may be completed by the processor 110 controlling or calling other components, for example, calling the processing program of this embodiment of the present application stored in the internal memory 121 , or calling, through the external memory interface 120 , the processing program of this embodiment of the present application stored in a third-party device, and controlling the wireless communication module 160 to perform data communication with other electronic devices, so as to realize application sharing among multiple electronic devices and improve user experience.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100 , including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may display a user interface through the display screen 194 .
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the electronic device 100 may receive a user's operation through the touch sensor 180K, for example, an operation such as a single click, a double click, or a slide.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate a charging state or a change in battery level, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the above-mentioned electronic device 100 includes corresponding hardware structures and/or software modules for executing each function.
  • the embodiments of the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods for each particular application to implement the described functions, but such implementation should not be considered beyond the scope of the embodiments of the present application.
  • the electronic device 100 can be divided into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that, the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
  • Each functional unit in each of the embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • a computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: flash memory, a removable hard disk, read-only memory, random access memory, a magnetic disk, an optical disc, and other media that can store program code.


Abstract

Embodiments of the present application provide an image display method, an electronic device, and a storage medium, relating to the field of computer technology. The method includes: acquiring an image to be displayed; recognizing the image to be displayed to determine a display type of the image to be displayed; acquiring a screen state of the electronic device, and determining whether the screen state of the electronic device matches the display type of the image to be displayed; determining a display area according to the matching result; and, if the screen state of the electronic device does not match the display type of the image to be displayed, segmenting the image to be displayed to obtain a plurality of sub-images, and displaying the plurality of sub-images in the display area. The method provided by the embodiments of the present application can improve the efficiency with which the image to be displayed is presented on the electronic device and improve the user's viewing experience.

Description

Image display method, electronic device, and storage medium
This application claims priority to the Chinese patent application No. 202110179798.0, entitled "Image display method, electronic device, and storage medium", filed with the China National Intellectual Property Administration on February 7, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to an image display method, an electronic device, and a storage medium.
Background
With the rapid development of computer technology, more and more applications (APPs) have emerged. Many current APPs (for example, long-screenshot software and long-screenshot stitching software) can produce long strip images; for example, a long strip image may be a vertical strip image. Because such long strip images can present a continuous piece of information more completely, users often capture long strip images of content such as web pages and chat records.
However, browsing such long strip images on an electronic device gives the user a poor experience. For example, a television is typically a landscape-screen electronic device, and images are usually displayed centered; when browsing a long strip image on a television, the user therefore has to repeatedly scroll up and down to view the whole image, which is cumbersome.
Summary
Embodiments of the present application provide an image display method, an electronic device, and a storage medium, so as to provide a way of displaying images adaptively.
In a first aspect, an embodiment of the present application provides an image display method, applied to an electronic device, including:
acquiring an image to be displayed; specifically, the image to be displayed may be obtained from an album of the electronic device, or obtained while browsing a web page on the electronic device; the embodiments of the present application place no particular limitation on the source of the image to be displayed;
recognizing the image to be displayed to determine a display type of the image to be displayed; specifically, the display type may include a long horizontal image and a long vertical image;
acquiring a screen state of the electronic device, and determining whether the screen state of the electronic device matches the display type of the image to be displayed; specifically, the screen state may include landscape and portrait, where the landscape state of the electronic device matches a long horizontal image, that is, a long horizontal image can be displayed normally on an electronic device in the landscape state, and the portrait state of the electronic device matches a long vertical image, that is, a long vertical image can be displayed normally on an electronic device in the portrait state;
determining a display area according to the matching result; and, if the screen state of the electronic device does not match the display type of the image to be displayed, segmenting the image to be displayed to obtain a plurality of sub-images, and displaying the plurality of sub-images in the display area.
In the embodiments of the present application, the display type of the image to be displayed and the screen state of the electronic device are determined, and when the image type does not match the screen state, the image to be displayed is segmented for display, which can improve the efficiency of image display and the user's viewing experience.
In one possible implementation, recognizing the image to be displayed to determine the display type of the image to be displayed includes:
acquiring the height-to-width ratio of the image to be displayed; comparing the height-to-width ratio of the image to be displayed with a preset first threshold and a preset second threshold, where the preset first threshold and second threshold are determined by the screen resolution of the electronic device;
if the height-to-width ratio of the image to be displayed is greater than or equal to the preset first threshold, determining that the display type of the image to be displayed is a long vertical image; and if the height-to-width ratio of the image to be displayed is less than or equal to the preset second threshold, determining that the display type of the image to be displayed is a long horizontal image.
In the embodiments of the present application, comparing the height-to-width ratio of the image to be displayed with preset thresholds to determine the image type makes it possible to identify the type of the image to be displayed effectively.
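The classification described above can be sketched as follows. This is a minimal, hypothetical illustration: the threshold values and the type labels are assumed names, since the embodiment only states that the two thresholds are derived from the screen resolution of the electronic device.

```python
def classify_display_type(img_w: int, img_h: int,
                          tall_threshold: float = 3.0,
                          wide_threshold: float = 1.0 / 3.0) -> str:
    """Classify an image as long-vertical, long-horizontal, or normal
    by its height-to-width ratio (thresholds here are placeholders)."""
    aspect = img_h / img_w
    if aspect >= tall_threshold:    # first preset threshold
        return "long-vertical"
    if aspect <= wide_threshold:    # second preset threshold
        return "long-horizontal"
    return "normal"
```

For example, a 1080×10800 screenshot of a chat record would be classified as a long vertical image, while a 10800×1080 panorama would be classified as a long horizontal image.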
In one possible implementation, the screen state of the electronic device includes landscape and portrait, and determining the display area according to the matching result includes:
if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, determining that the screen state of the electronic device does not match the display type of the image to be displayed, determining the screen of the electronic device as a first display area, and dividing the first display area to obtain a plurality of sub-display areas;
if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long vertical image, determining that the screen state of the electronic device matches the display type of the image to be displayed, and determining the screen of the electronic device as a second display area.
In the embodiments of the present application, the display area is determined according to the result of matching the screen state of the electronic device against the image type, which allows the display area to be divided effectively and thus improves the display efficiency of the image to be displayed.
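The four cases enumerated above reduce to a simple predicate. The state and type labels below are assumed names used only for illustration:

```python
def matches(screen_state: str, display_type: str) -> bool:
    """True when the image can be shown as-is (second display area);
    False triggers the split-into-sub-areas path (first display area)."""
    return ((screen_state == "landscape" and display_type == "long-horizontal")
            or (screen_state == "portrait" and display_type == "long-vertical"))
```

A mismatch (for example, a landscape screen showing a long vertical image) is what triggers the segmentation and side-by-side display of sub-images.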
To further improve the display efficiency of the image to be displayed, one possible implementation includes:
if the screen state of the electronic device matches the display type of the image to be displayed, displaying the image to be displayed in the second display area.
To divide the display area effectively, in one possible implementation, dividing the first display area to obtain a plurality of sub-display areas includes:
if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, acquiring the width of the screen of the electronic device and the width of the image to be displayed, and dividing the first display area based on the ratio of the width of the screen of the electronic device to the width of the image to be displayed to obtain a plurality of sub-display areas;
if the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, acquiring the height of the screen of the electronic device and the height of the image to be displayed, and dividing the first display area based on the ratio of the height of the screen of the electronic device to the height of the image to be displayed to obtain a plurality of sub-display areas.
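The area-division step above can be sketched as follows. The embodiment only states that the division is "based on the ratio" of the screen dimension to the image dimension; taking the floor of that ratio is one plausible reading, not the definitive implementation:

```python
def count_sub_areas(screen_dim: int, image_dim: int) -> int:
    """Number of sub-display areas: screen width vs. image width for a
    long vertical image on a landscape screen, or screen height vs.
    image height for a long horizontal image on a portrait screen.
    At least one area is always kept."""
    return max(1, screen_dim // image_dim)
```

For example, a 1920-pixel-wide landscape screen showing a 480-pixel-wide long vertical image could be divided into four side-by-side sub-display areas.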
To improve the display efficiency of the image to be displayed, in one possible implementation, displaying the plurality of sub-images in the display area includes:
displaying the plurality of sub-images in the plurality of sub-display areas.
To segment the image to be displayed effectively, in one possible implementation, segmenting the image to be displayed to obtain a plurality of sub-images includes:
evenly segmenting the image to be displayed based on the total number of sub-display areas to obtain a plurality of sub-images of the same size.
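The even segmentation step can be sketched as follows. This is a minimal illustration that returns pixel ranges along the long dimension; a real implementation would crop pixel data, and the handling of the remainder (absorbed into the last strip here) is an assumption:

```python
def split_evenly(image_length: int, n_areas: int) -> list[tuple[int, int]]:
    """Cut the long dimension of the image into n_areas equal strips,
    returned as (start, end) pixel ranges; the last strip absorbs any
    remainder so the whole image is covered."""
    strip = image_length // n_areas
    return [(i * strip, (i + 1) * strip if i < n_areas - 1 else image_length)
            for i in range(n_areas)]
```

A 1200-pixel-tall long vertical image split across four sub-display areas yields four 300-pixel strips displayed side by side.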
To segment the image to be displayed effectively, in one possible implementation, segmenting the image to be displayed to obtain a plurality of sub-images includes:
unevenly segmenting the image to be displayed to obtain a plurality of sub-images of different sizes.
In one possible implementation, after displaying the plurality of sub-images in the display area, the method further includes:
in response to a first operation of the user, re-segmenting the image to be displayed, and displaying the plurality of sub-images obtained after the re-segmentation in the plurality of sub-display areas; specifically, the first operation may be zooming the image to be displayed.
In the embodiments of the present application, the user can dynamically adjust the display, so that the image to be displayed can be displayed in dynamically divided portions, improving the flexibility of displaying the image.
In one possible implementation, after displaying the image to be displayed in the second display area, the method further includes:
in response to a second operation of the user, determining the screen of the electronic device as the first display area, and dividing the first display area to obtain a plurality of sub-display areas; specifically, the second operation may be adjusting the screen state of the electronic device, for example, rotating the electronic device from the landscape orientation to the portrait orientation, or from the portrait orientation to the landscape orientation; and
segmenting the image to be displayed to obtain a plurality of sub-images, and displaying the plurality of sub-images in the plurality of sub-display areas.
In the embodiments of the present application, the user can dynamically adjust the screen state of the electronic device, so that the image to be displayed can be dynamically segmented before display, thereby improving the flexibility of displaying the image.
In a second aspect, an embodiment of the present application provides an image display apparatus, applied to an electronic device, including:
an acquisition module, configured to acquire an image to be displayed;
a recognition module, configured to recognize the image to be displayed and determine a display type of the image to be displayed;
a determining module, configured to acquire a screen state of the electronic device, determine whether the screen state of the electronic device matches the display type of the image to be displayed, and determine a display area according to the matching result; and
a first display module, configured to: if the screen state of the electronic device does not match the display type of the image to be displayed, segment the image to be displayed to obtain a plurality of sub-images, and display the plurality of sub-images in the display area.
In one possible implementation, the recognition module includes:
an acquisition unit, configured to acquire the height-to-width ratio of the image to be displayed;
a comparison unit, configured to compare the height-to-width ratio of the image to be displayed with a preset first threshold and a preset second threshold, where the preset first threshold and second threshold are determined by the screen resolution of the electronic device; and
a recognition unit, configured to determine that the display type of the image to be displayed is a long vertical image if the height-to-width ratio of the image to be displayed is greater than or equal to the preset first threshold, and determine that the display type of the image to be displayed is a long horizontal image if the height-to-width ratio of the image to be displayed is less than or equal to the preset second threshold.
In one possible implementation, the screen state of the electronic device includes landscape and portrait, and the determining module is further configured to: if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, determine that the screen state of the electronic device does not match the display type of the image to be displayed, determine the screen of the electronic device as a first display area, and divide the first display area to obtain a plurality of sub-display areas;
if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long horizontal image, or the screen state of the electronic device is portrait and the display type of the image to be displayed is a long vertical image, determine that the screen state of the electronic device matches the display type of the image to be displayed, and determine the screen of the electronic device as a second display area.
In one possible implementation, the apparatus further includes:
a second display module, configured to display the image to be displayed in the second display area if the screen state of the electronic device matches the display type of the image to be displayed.
In one possible implementation, the determining module is further configured to: if the screen state of the electronic device is landscape and the display type of the image to be displayed is a long vertical image, acquire the width of the screen of the electronic device and the width of the image to be displayed, and divide the first display area based on the ratio of the width of the screen of the electronic device to the width of the image to be displayed to obtain a plurality of sub-display areas;
if the screen state of the electronic device is portrait and the display type of the image to be displayed is a long horizontal image, acquire the height of the screen of the electronic device and the height of the image to be displayed, and divide the first display area based on the ratio of the height of the screen of the electronic device to the height of the image to be displayed to obtain a plurality of sub-display areas.
In one possible implementation, the first display module is further configured to display the plurality of sub-images in the plurality of sub-display areas.
In one possible implementation, the determining module is further configured to evenly segment the image to be displayed based on the total number of sub-display areas to obtain a plurality of sub-images of the same size.
In one possible implementation, the determining module is further configured to unevenly segment the image to be displayed to obtain a plurality of sub-images of different sizes.
其中一种可能的实现方式中,上述装置还包括:
第三显示模块,用于响应于用户的第一操作,对待显示图像进行重分割,并将重分割后得到的多个子图像在多个子显示区域中显示。
其中一种可能的实现方式中,上述装置还包括:
第四显示模块,用于响应于用户的第二操作,确定电子设备的屏幕为第一显示区域,并对第一显示区域进行分割,得到多个子显示区域;对待显示图像进行分割,得到多个子图像,并在多个子显示区域显示多个子图像。
第三方面,本申请实施例提供一种电子设备,包括:
存储器，上述存储器用于存储计算机程序代码，上述计算机程序代码包括指令，当上述电子设备从上述存储器中读取上述指令时，使得上述电子设备执行以下步骤：
获取待显示图像;
对待显示图像进行识别,确定待显示图像的显示类型;
获取电子设备的屏幕状态,判断电子设备的屏幕状态与待显示图像的显示类型是否匹配;
根据匹配结果确定显示区域;
若电子设备的屏幕状态与待显示图像的显示类型不匹配,则对待显示图像进行分割,得到多个子图像,并在显示区域显示多个子图像。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行对待显示图像进行识别,确定待显示图像的显示类型的步骤包括:
获取待显示图像的高宽比;
将待显示图像的高宽比与预设第一阈值及预设第二阈值进行比较,其中,预设第一阈值及第二阈值由电子设备的屏幕分辨率确定;
若待显示图像的高宽比大于或等于预设第一阈值,则确定待显示图像的显示类型为长竖图;
若待显示图像的高宽比小于或等于预设第二阈值，则确定待显示图像的显示类型为长横图。
其中一种可能的实现方式中,上述电子设备的屏幕状态包括横屏及竖屏,上述指令被上述电子设备执行时,使得上述电子设备执行根据匹配结果确定显示区域的步骤包括:
若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长竖图;或电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长横图;则确定电子设备的屏幕状态与待显示图像的显示类型不匹配,确定电子设备的屏幕为第一显示区域,并对第一显示区域进行分割,得到多个子显示区域;
若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长横图;或电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长竖图;则确定电子设备的屏幕状态与待显示图像的显示类型匹配,并确定电子设备的屏幕为第二显示区域。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备还执行以下步骤:
若电子设备的屏幕状态与待显示图像的显示类型匹配,则在第二显示区域显示待显示图像。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行对第一显示区域进行分割,得到多个子显示区域的步骤包括:
若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长竖图,则获取电子设备屏幕的宽度及待显示图像的宽度,基于电子设备屏幕的宽度与待显示图像的宽度的比值,对第一显示区域进行分割,得到多个子显示区域;
若电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长横图,则获取电子设备屏幕的高度及待显示图像的高度,基于电子设备屏幕的高度与待显示图像的高度的比值,对第一显示区域进行分割,得到多个子显示区域。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行在显示区域显示多个子图像的步骤包括:
在多个子显示区域显示多个子图像。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行对待显示图像进行分割,得到多个子图像的步骤包括:
基于子显示区域的总数对待显示图像进行均匀分割,得到同尺寸的多个子图像。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行对待显示图像进行分割,得到多个子图像的步骤包括:
对待显示图像进行不均匀分割,得到不同尺寸的多个子图像。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行在显示区域显示多个子图像的步骤之后,还执行以下步骤:
响应于用户的第一操作,对待显示图像进行重分割,并将重分割后得到的多个子图像在多个子显示区域中显示。
其中一种可能的实现方式中,上述指令被上述电子设备执行时,使得上述电子设备执行在第二显示区域显示待显示图像的步骤之后,还执行以下步骤:
响应于用户的第二操作，确定电子设备的屏幕为第一显示区域，并对第一显示区域进行分割，得到多个子显示区域；
对待显示图像进行分割,得到多个子图像,并在多个子显示区域显示多个子图像。
第四方面,本申请实施例提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行如第一方面所述的方法。
第五方面,本申请实施例提供一种计算机程序,当上述计算机程序被计算机执行时,用于执行第一方面所述的方法。
在一种可能的设计中，第五方面中的程序可以全部或者部分存储在与处理器封装在一起的存储介质上，也可以部分或者全部存储在不与处理器封装在一起的存储器上。
附图说明
图1为本申请提供的图像显示方法一个实施例的流程示意图;
图2为本申请实施例提供的横屏屏幕及长竖图的显示效果示意图;
图3为本申请实施例提供的竖屏屏幕及长横图的显示效果示意图;
图4为本申请实施例提供的横屏屏幕的显示区域分割示意图;
图5为本申请实施例提供的竖屏屏幕的显示区域分割示意图;
图6为本申请提供的长竖图的图像分割一个实施例的示意图;
图7为本申请实施例提供的长横图的图像分割示意图;
图8为本申请提供的用户操作显示一个实施例的显示效果示意图;
图9为本申请提供的用户操作显示另一个实施例的显示效果示意图;
图10为本申请提供的图像显示方法另一个实施例的流程示意图;
图11为本申请提供的长竖图的图像分割另一个实施例的示意图;
图12为本申请提供的用户操作显示再一个实施例的显示效果示意图;
图13-图15为本申请提供的用户操作显示再一个实施例的显示效果示意图;
图16为本申请实施例提供的图像显示装置的结构示意图;
图17为本申请实施例提供的电子设备的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
目前,图片在电子设备中显示时,通常都是基于图片的缩放比、图片的显示位置等参数,得到图片的显示区域。然后通过区域解码等技术得到位图信息,最后在显示区域上完成图像渲染并进行显示。
然而，当竖状的长条图在横屏的电子设备中进行显示时，由于该长条图的高和宽的比例较大，导致图像的显示区域仅占整张图片的一小部分，且用户需要不断地滑动该图片，来浏览整张图片，给用户带来较差的体验。
基于上述问题,本申请实施例提出了一种图像显示方法,应用于电子设备。该电子设备可以是具有图像显示功能的电子设备。本申请实施例对电子设备的具体形式不作特殊限定。
现结合图1-图15对本申请实施例提供的图像显示方法进行说明。
图1为本申请提供的图像显示方法一个实施例的流程示意图,包括:
步骤101,获取待显示图像,对该待显示图像进行识别,确定该待显示图像的显示类型。
具体地,该待显示图像可以是上述电子设备的相册中的用户选取的图像,该待显示图像也可以是用户在上述电子设备上进行浏览网页时,用户选取的网页中的图像。本申请对上述待显示图像的来源不作特殊限定。该显示类型可以是长竖图或长横图。
接着，可以获取预设高宽比第一阈值I1。在具体实现时，该预设高宽比第一阈值I1可以根据电子设备的分辨率确定。示例性的，若电子设备的屏幕分辨率为3840*2160时，可以将上述高宽比第一阈值I1设定为3<=I1<=5中的任一数值。可以理解的是，I1可以在上述取值范围内任意取值。上述屏幕分辨率及I1的取值仅是示例性说明，并不构成对本申请实施例的限定，在一些实施例中，上述屏幕分辨率和I1也可以是其他数值。
然后，可以计算上述待显示图像的高宽比，并可以将上述待显示图像的高宽比与预设高宽比第一阈值I1进行比较，若上述待显示图像的高宽比大于或等于预设高宽比第一阈值I1，则可以确定该待显示图像为长竖图。
现结合图2进行说明，如图2所示，电子设备的显示屏显示界面200，待显示图像210包括可见区域211(例如，纯灰色区域)及不可见区域212(例如，斜线区域)，可见区域211在界面200中可见，不可见区域212在界面200中不可见，该待显示图像210的高度为h及宽度为w。若h/w>=I1，则待显示图像210为长竖图。
可选地，还可以获取预设高宽比第二阈值I2。在具体实现时，该预设高宽比第二阈值I2也可以根据电子设备的分辨率确定。示例性的，若电子设备的屏幕分辨率为3840*2160时，可以将上述高宽比第二阈值I2设定为0.2<=I2<=0.3中的任一数值。可以理解的是，I2可以在上述取值范围内任意取值。上述屏幕分辨率及I2的取值仅是示例性说明，并不构成对本申请实施例的限定，在一些实施例中，上述屏幕分辨率和I2也可以是其他数值。同样地，也可以将上述待显示图像的高宽比与预设高宽比第二阈值I2进行比较。若上述待显示图像的高宽比小于或等于预设高宽比第二阈值I2，则可以确定该待显示图像为长横图。
现结合图3进行说明，如图3所示，电子设备的显示屏显示界面300，待显示图像310包括可见区域311及不可见区域312，该待显示图像310的高度为h及宽度为w。若h/w<=I2，则待显示图像310为长横图。
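上述基于高宽比阈值判定显示类型的逻辑，可以用如下Python片段示意(仅为示意性草图，函数名与阈值取值均为本文之外的示例性假设，并非本申请的实际实现)：

```python
def classify_display_type(height, width, i1=4.0, i2=0.25):
    """根据高宽比判定显示类型。
    i1、i2 对应文中由屏幕分辨率确定的预设第一、第二阈值，此处取值仅为示例。"""
    ratio = height / width  # 高宽比
    if ratio >= i1:
        return "long_vertical"    # 长竖图
    if ratio <= i2:
        return "long_horizontal"  # 长横图
    return "normal"               # 普通图像，按常规方式显示
```

例如，高4000、宽500的图像高宽比为8，会被判定为长竖图。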
步骤102,对电子设备的屏幕状态进行判断,确定电子设备的屏幕状态是否与待显示图像的显示类型匹配。
具体地，该电子设备的屏幕状态可以包括横屏或竖屏。在具体实现时，上述电子设备的屏幕状态可以通过屏幕的方向与重力方向的夹角确定。示例性的，电子设备(例如，手机)的初始设置可以是竖屏，也就是说，根据用户通常的使用习惯和手机的尺寸，手机通常是以竖屏的形态体现，此时，屏幕的方向与重力方向成0度角。当用户使用一些应用(例如，观看视频)时，为了观看方便，用户通常可以将手机横置，此时，手机处于横屏状态，屏幕的方向与重力方向成90度角(例如，向右横置)或-90度角(例如，向左横置)。因此，通过屏幕方向与重力方向的夹角可以确定电子设备的屏幕状态，可以理解的是，也可以通过其他方式确定电子设备的屏幕状态，上述确定电子设备的屏幕状态的方式并不构成对本申请实施例的限定。
此时,可以根据电子设备的屏幕状态判断电子设备的屏幕状态是否与待显示图像的显示类型匹配。
若电子设备的屏幕状态为横屏,且待显示图像为长竖图,则确定电子设备的屏幕状态与待显示图像的显示类型不匹配。
若电子设备的屏幕状态为竖屏,且待显示图像为长横图,则确定电子设备的屏幕状态与待显示图像的显示类型不匹配。
若电子设备的屏幕状态为横屏,且待显示图像为长横图,则确定电子设备的屏幕状态与待显示图像的显示类型匹配。
若电子设备的屏幕状态为竖屏,且待显示图像为长竖图,则确定电子设备的屏幕状态与待显示图像的显示类型匹配。
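上述四种匹配规则可以归纳为如下示意性Python片段(草图，标识符均为示例性假设)：

```python
def screen_matches_image(screen_state, display_type):
    """横屏+长竖图、竖屏+长横图视为不匹配，其余组合视为匹配。"""
    mismatch = {("landscape", "long_vertical"),
                ("portrait", "long_horizontal")}
    return (screen_state, display_type) not in mismatch
```

不匹配时走分割显示流程，匹配时直接在整个屏幕(第二显示区域)显示。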
步骤103,基于电子设备的屏幕状态与待显示图像的显示类型确定显示区域。
具体地,若电子设备的屏幕状态与待显示图像的显示类型不匹配,则可以根据待显示图像的高度或宽度以及电子设备的屏幕的高度或宽度确定第一显示区域。
以电子设备为横屏，且待显示图像为长竖图为例进行说明。如图4所示，电子设备为横屏状态，该电子设备的显示屏显示界面400，该界面400的宽度为w1。待显示图像410为长竖图，该待显示图像410的宽度为w2。此时，可以计算w1/w2的数值，并可以根据w1/w2的数值确定第一显示区域。示例性的，
若w1/w2>=2，则可以将上述界面400平均分割成两个子显示区域，例如，第一子显示区域401(例如，左侧粗体方框)及第二子显示区域402(例如，右侧粗体方框)，其中，第一子显示区域401及第二子显示区域402的宽度相等。可以理解的是，也可以将上述界面400分割成三个或更多的子显示区域，例如，若w1/w2>=3，则可以将上述界面400平均分割成三个子显示区域，上述对界面400的分割以及子显示区域的数目仅是示例性说明，并不构成对本申请实施例的限定。
接着，以电子设备为竖屏，且待显示图像为长横图为例进行说明。如图5所示，电子设备为竖屏状态，该电子设备的显示屏显示界面500，该界面500的高度为h1。待显示图像510为长横图，该待显示图像510的高度为h2。此时，可以计算h1/h2的数值，并可以根据h1/h2的数值确定第一显示区域。示例性的，
若h1/h2>=2，则可以将上述界面500平均分割成两个子显示区域，例如，第一子显示区域501(例如，上侧粗体方框)及第二子显示区域502(例如，下侧粗体方框)，其中，第一子显示区域501及第二子显示区域502的高度相等。可以理解的是，也可以将上述界面500分割成三个或更多的子显示区域，例如，若h1/h2>=3，则可以将上述界面500平均分割成三个子显示区域，上述对界面500的分割以及子显示区域的数目仅是示例性说明，并不构成对本申请实施例的限定。
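上述按屏幕尺寸与图像尺寸之比确定子显示区域数目的过程，可以用如下草图示意(以横屏+长竖图按宽度比值分列为例；函数接口与返回格式均为示例性假设)：

```python
def split_display_region(screen_w, screen_h, image_w):
    """按 screen_w // image_w 的整数比值将屏幕分为 n 个等宽子显示区域。
    比值小于 2 时屏幕无法并列两列，不分割，返回空列表；
    否则返回各子显示区域的 (x, y, w, h)。"""
    n = screen_w // image_w
    if n < 2:
        return []
    col_w = screen_w // n
    return [(i * col_w, 0, col_w, screen_h) for i in range(n)]
```

竖屏+长横图的情形对称，按高度比值分行即可。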
若电子设备的屏幕状态与待显示图像的显示类型匹配,此时可以确定上述电子设备的整个屏幕为第二显示区域,并可以在上述第二显示区域中直接显示该待显示图像。
步骤104,若电子设备的屏幕状态与待显示图像的显示类型不匹配,则基于第一显示区域对待显示图像进行平均分割,得到多张子图像。
具体地，当确定第一显示区域后，可以根据第一显示区域中子显示区域的总数对待显示图像进行分割，由此可以得到多张子图像。在具体实现时，可以根据子显示区域的总数对待显示图像进行均匀分割。示例性的，可以将待显示图像平均分割成n块，每块对应一张子图像，其中，n为子显示区域的总数。
现以图4所示的左右两个子显示区域为例进行说明。如图6所示，电子设备显示界面600，界面600的宽度为w1，界面600左右平均分割成两个子显示区域，其中，上述两个子显示区域分别为第一子显示区域601及第二子显示区域602。待显示图像610的宽度为w2以及高度为h，且w1/w2>=2。此时，可以将待显示图像610纵向平均分割成两块(例如，第一子图像611及第二子图像612)。其中，第一子图像611的宽度为w2以及高度为h/2，第二子图像612的宽度为w2以及高度为h/2。接着，可以将上述第一子图像611显示在第一子显示区域601中，并可以将上述第二子图像612显示在第二子显示区域602中。
接着，以图5所示的上下两个子显示区域为例进行说明。如图7所示，电子设备显示界面700，界面700的高度为h1，界面700上下平均分割成两个子显示区域，其中，上述两个子显示区域分别为第一子显示区域701及第二子显示区域702。待显示图像710的宽度为w以及高度为h2，且h1/h2>=2。此时，可以将待显示图像710横向平均分割成两块(例如，第一子图像711及第二子图像712)。其中，第一子图像711的高度为h2以及宽度为w/2，第二子图像712的高度为h2以及宽度为w/2。接着，可以将上述第一子图像711显示在第一子显示区域701中，并可以将上述第二子图像712显示在第二子显示区域702中。
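对应地，待显示图像的均匀分割可以示意如下(草图：以长竖图纵向平均分割为例，返回各子图像在原图中的矩形区域；接口为示例性假设)：

```python
def split_image_evenly(image_w, image_h, n):
    """将长竖图纵向平均分割为 n 张同尺寸子图像。
    返回各子图像在原图坐标系中的 (x, y, w, h)；此处假设 image_h 可被 n 整除。"""
    sub_h = image_h // n
    return [(0, i * sub_h, image_w, sub_h) for i in range(n)]
```

得到的各矩形区域可交给区域解码器分别解码、渲染，也可整体解码后按矩形切分位图。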
可选地，在某些场景中，待显示图像的宽度或高度不足以使得对电子设备显示的界面进行显示区域的分割。例如，若w1/w2<2，由于电子设备显示的界面无法同时左右并列显示与待显示图像相同宽度的两幅图像，此时，无需对电子设备显示的界面进行显示区域的分割，其中，w1为电子设备显示的界面的宽度，w2为待显示图像的宽度。因此，在一些实施例中，用户也可以对待显示图像的尺寸进行缩放，例如，用户可以通过多指手势对待显示图像进行缩放。响应于用户的缩放操作(例如，缩小)，待显示图像的宽度缩小，当待显示图像的宽度缩小到一定数值时，例如，若界面宽度w1与缩小后的待显示图像的宽度的比值大于或等于2，此时，可以对电子设备显示的界面进行显示区域的分割，并可以对待显示图像进行图像分割，由此可以使得待显示图像可以在上述显示区域中并列显示，进而可以提高图像的显示效率。
现结合图8进行说明，如图8所示，电子设备显示界面800，界面800的宽度为w1以及高度为h1。待显示图像810的宽度为w2以及高度为h2。其中，w1/w2<2。此时，用户可以对待显示图像810进行缩小，由此可以得到待显示图像820，该待显示图像820的宽度为w3以及高度为h3，此时，w3<w2。若w1/w3>=2，则可以对界面800进行分割，由此可以得到第一子显示区域801及第二子显示区域802，其中，第一子显示区域801及第二子显示区域802的宽度相等。接着，可以对待显示图像820进行图像分割，由此可以得到第一子图像821及第二子图像822，其中，第一子图像821及第二子图像822的高度及宽度相等，示例性的，第一子图像821的高度为h3/2以及宽度为w3；第二子图像822的高度为h3/2以及宽度为w3。
步骤105,在第一显示区域的子显示区域显示上述子图像。
具体地,当获取到第一显示区域的子显示区域(例如,上述第一子显示区域及第二子显示区域)以及分割后的待显示图像(例如,上述第一子图像及第二子图像)后,可以将上述第一子图像及第二子图像显示在上述第一子显示区域及第二子显示区域中,示例性的,可以将第一子图像显示在第一子显示区域中,将第二子图像显示在第二子显示区域中。在具体实现时,可以在得到上述第一子图像及第二子图像后,使用区域解码器对上述第一子图像和第二子图像进行解码,由此可以得到与各个子图像对应的位图数据,接着,可以对上述每个位图数据在对应的子显示区域内进行渲染并进行显示。
可选地,当获取到子显示区域后,也可以对分割前的待显示图像进行整体解码,由此可以得到位图数据。接着,可以根据子显示区域的总数对上述位图数据进行分割,由此可以得到第一子位图及第二子位图,其中,第一子位图可以是上述第一子图像的位图数据,第二子位图可以是上述第二子图像的位图数据。然后,可以在第一子显示区域内对第一子位图进行渲染并进行显示,在第二子显示区域内对第二子位图进行渲染并进行显示。
需要说明的是,上述第一子显示区域、第二子显示区域、第一子图像及第二子图像仅为示例性说明,并不构成对本申请实施例的限定,在一些实施例中,可以将电子设备显示的界面分割成3个或更多的子显示区域,也可以将待显示图像分割成三个或更多的子图像。
步骤106,响应于用户的操作,对待显示图像进行更新显示。
具体地,当在第一显示区域或第二显示区域上显示待显示图像后,用户还可以对电子设备进行操作,以使得待显示图像进行更新显示。其中,上述操作可以包括在电子设备的屏幕上进行手势滑动、对电子设备的屏幕进行转置等。
若上述电子设备确定第一显示区域显示待显示图像,则用户还可以对待显示图像进行滑动操作,以便对待显示图像进行滑动浏览。示例性的,用户可以通过滑动手势对待显示图像进行浏览,响应于用户的滑动操作,电子设备可以获取子图像的显示位置,可以基于上述子图像的显示位置对待显示图像进行重新分割,对重新分割后得到的子图像进行解码及渲染等操作,并在渲染后进行显示。
现结合图9进行说明，如图9所示，电子设备显示界面900，界面900的高度为h1，界面900包括第一子显示区域901及第二子显示区域902。待显示图像包括第一子图像911及第二子图像912。其中，第一子显示区域901显示第一子图像911，第二子显示区域902显示第二子图像912。第一子图像911包括不可见区域9111及可见区域9112，第二子图像912包括可见区域9121及不可见区域9122。此时，用户可以在第一子图像911上进行滑动(例如，下滑)，以便对待显示图像进行浏览。随着用户对第一子图像911的不断下滑，不可见区域9111的高度不断缩小，可见区域9112的高度不变，电子设备可以随着上述子图像的滑动对待显示图像进行重分割。示例性的，当用户浏览到待显示图像的顶部时，可以对待显示图像进行重分割，由此可以得到第一子图像913及第二子图像914。其中，第一子图像913中不再包含不可见区域9111，也就是说，第一子图像913的高度与第一子显示区域901的高度相等，例如，第一子图像913的高度为h1。第二子图像914包括可见区域9141及不可见区域9142，可见区域9141与可见区域9121的高度相同，不可见区域9142的高度大于不可见区域9122的高度，其中，第二子图像914的高度为h3，h3>h1，且h3+h1=2*h2。
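上述随滑动重分割的思路可以简化为如下草图(仅示意第一列从滑动偏移处取满一屏、其余内容顺延到第二列的情形；函数与参数含义均为示例性假设)：

```python
def resplit_on_scroll(image_h, view_h, offset):
    """offset 为第一子图像顶部在原图中的位置。
    返回两个子图像在原图中的 (top, bottom) 区间；第二段可能高于一屏，
    超出部分即为第二列中的不可见区域。"""
    first_bottom = min(offset + view_h, image_h)
    return (offset, first_bottom), (first_bottom, image_h)
```

例如用户滑到图像顶部(offset=0)时，第一列恰好取满一屏，其余全部归入第二段。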
若上述电子设备确定第二显示区域显示待显示图像,则用户可以对电子设备进行转置,例如,用户可以将横屏状态的电子设备竖置为竖屏状态,或者用户可以将竖屏状态的电子设备横置为横屏状态。以电子设备为横屏为例进行说明,若待显示图像为长横图,则电子设备可以确定在第二显示区域显示上述待显示图像。此时,用户可以将上述电子设备竖置为竖屏状态。由于竖屏状态与长横图不匹配,因此,可以对该长横图进行分割并进行显示。具体分割显示的过程可以参考步骤103-步骤105,在此不再赘述。
上文通过图1-图9以对待显示图像进行平均分割为例进行了说明,下文通过图10-15以对待显示图像进行不平均分割为例进行说明。
图10为本申请提供的图像显示方法另一个实施例的流程示意图,包括:
步骤201,获取待显示图像,对该待显示图像进行识别,确定该待显示图像的显示类型。
具体地,该显示类型可以是长竖图或长横图。
接着，可以获取预设高宽比第一阈值I1。在具体实现时，该预设高宽比第一阈值I1可以根据电子设备的分辨率确定。示例性的，若电子设备的屏幕分辨率为3840*2160时，可以将上述高宽比第一阈值I1设定为3<=I1<=5中的任一数值。可以理解的是，I1可以在上述取值范围内任意取值。上述屏幕分辨率及I1的取值仅是示例性说明，并不构成对本申请实施例的限定，在一些实施例中，上述屏幕分辨率和I1也可以是其他数值。
然后，可以计算上述待显示图像的高宽比，并可以将上述待显示图像的高宽比与预设高宽比第一阈值I1进行比较，若上述待显示图像的高宽比大于或等于预设高宽比第一阈值I1，则可以确定该待显示图像为长竖图。
可选地，还可以获取预设高宽比第二阈值I2。在具体实现时，该预设高宽比第二阈值I2也可以根据电子设备的分辨率确定。示例性的，若电子设备的屏幕分辨率为3840*2160时，可以将上述高宽比第二阈值I2设定为0.2<=I2<=0.3中的任一数值。可以理解的是，I2可以在上述取值范围内任意取值。上述屏幕分辨率及I2的取值仅是示例性说明，并不构成对本申请实施例的限定，在一些实施例中，上述屏幕分辨率和I2也可以是其他数值。同样地，也可以将上述待显示图像的高宽比与预设高宽比第二阈值I2进行比较。若上述待显示图像的高宽比小于或等于预设高宽比第二阈值I2，则可以确定该待显示图像为长横图。
步骤202,对电子设备的屏幕状态进行判断,确定电子设备的屏幕状态是否与待显示图像的显示类型匹配。
具体地,该电子设备的屏幕状态可以包括横屏或竖屏。在具体实现时,上述电子设备的屏幕状态可以通过屏幕的方向与重力方向的夹角确定。示例性的,电子设备(例如,手机)的初始设置可以是竖屏,也就是说,根据用户通常的使用习惯和手机的尺寸,手机通常是以竖屏的形态体现,此时,屏幕的方向与重力方向成0度角。当用户使用一些应用(例如,观看视频)时,为了观看方便,用户通常可以将手机横置,此时,手机处于横屏状态,屏幕的方向与重力方向成90度角(例如,向右横置)或-90度角(例如,向左横置)。因此,通过屏幕方向与重力方向的夹角可以确定电子设备的屏幕状态,可以理解的是,也可以通过其他方式确定电子设备的屏幕状态,上述确定电子设备的屏幕状态的方式并不构成对本申请实施例的限定。
此时,可以根据电子设备的屏幕状态判断电子设备的屏幕状态是否与待显示图像的显示类型匹配。
若电子设备的屏幕状态为横屏,且待显示图像为长竖图,则确定电子设备的屏幕状态与待显示图像的显示类型不匹配。
若电子设备的屏幕状态为竖屏,且待显示图像为长横图,则确定电子设备的屏幕状态与待显示图像的显示类型不匹配。
若电子设备的屏幕状态为横屏,且待显示图像为长横图,则确定电子设备的屏幕状态与待显示图像的显示类型匹配。
若电子设备的屏幕状态为竖屏,且待显示图像为长竖图,则确定电子设备的屏幕状态与待显示图像的显示类型匹配。
步骤203,基于电子设备的屏幕状态与待显示图像的显示类型确定第一显示区域。
具体地,若电子设备的屏幕状态与待显示图像的显示类型不匹配,则可以根据待显示图像的高度或宽度以及电子设备的屏幕的高度或宽度确定第一显示区域。
若电子设备的屏幕状态与待显示图像的显示类型匹配,此时可以确定上述电子设备的整个屏幕为第二显示区域,并可以在上述第二显示区域中直接显示该待显示图像。
步骤204,若电子设备的屏幕状态与待显示图像的显示类型不匹配,则基于第一显示区域对待显示图像进行不平均分割,得到多张子图像。
具体地,当对上述电子设备的界面进行第一显示区域的分割后,可以对待显示图像进行不平均分割。由于在实际应用中,待显示图像通常由多张图像拼接而成,而多张图像在拼接时,拼接的边缘处会存在明显的拼接痕迹。因此,在具体实现时,可以对上述待显示图像从上至下,逐行进行扫描,并采用边缘检测算法对待显示图像进行边缘检测,由此可以得到拼接图像之间的拼接点,根据上述拼接点进行分割,由此可以得到多张拼接的图像。接着,可以将上述通过边缘检测算法分割得到的多张拼接图像作为子图像。
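基于拼接痕迹的逐行扫描可以用一个极简的行差分草图示意(以相邻行亮度均值的跳变近似拼接边缘，真实实现中可替换为任意边缘检测算法；阈值与数据格式均为示例性假设)：

```python
def find_stitch_rows(row_means, threshold):
    """逐行扫描：返回相邻行亮度均值跳变超过阈值的行号，视为拼接点。"""
    return [i for i in range(1, len(row_means))
            if abs(row_means[i] - row_means[i - 1]) > threshold]

def split_by_stitch_points(points, total_h):
    """按拼接点把 [0, total_h) 划分为若干段，每段对应一张拼接子图像。"""
    bounds = [0] + points + [total_h]
    return list(zip(bounds[:-1], bounds[1:]))
```

各段高度随内容而异，这正是"不平均分割，得到不同尺寸的多个子图像"的含义。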
步骤205,在第一显示区域中的子显示区域显示上述子图像。
具体地,当获取到上述子图像后,可以将各个子图像在对应的子显示区域中显示。在具体实现时,可以按照待显示图像的分割的先后顺序在子显示区域中显示。
现结合图11进行说明，如图11所示，待显示图像1110由第一子图像1111、第二子图像1112、第三子图像1113及第四子图像1114拼接而成。使用边缘检测算法对待显示图像1110进行分割之后，可以分别得到第一子图像1111、第二子图像1112、第三子图像1113及第四子图像1114。电子设备显示界面1100，界面1100包括第一子显示区域1101及第二子显示区域1102。此时，可以将第一子图像1111显示在第一子显示区域1101中，将第二子图像1112显示在第二子显示区域1102中，第三子图像1113及第四子图像1114分别排在第二子图像1112之后，等待显示。
可以理解的是,上述图11示例性的示出了电子设备为横屏、待显示图像为长竖图的场景。上述对待显示图像进行不平均分割以及显示的方式也同样适用于电子设备为竖屏、待显示图像为长横图的场景。
步骤206,响应于用户的操作,对待显示图像进行更新显示。
具体地,当在第一显示区域或第二显示区域上显示待显示图像后,用户还可以对电子设备进行操作,以使得待显示图像进行更新显示。其中,上述操作可以包括在电子设备的屏幕上进行点击、滑动、对电子设备的屏幕进行转置等。
若上述电子设备确定第一显示区域显示待显示图像,则用户可以在电子设备的界面上进行滑动操作,以便浏览上述未在显示区域中显示的子图像。示例性的,用户可以在电子设备的界面上进行滑动(例如,左滑动、右滑动、上滑动或下滑动)。响应于用户的滑动操作,电子设备的界面显示新的图像。
现结合图12进行说明,如图12所示,待显示图像1210包括第一子图像1211、第二子图像1212、第三子图像1213及第四子图像1214。电子设备显示界面1200,界面1200包括第一子显示区域1201及第二子显示区域1202。第一子图像1211显示在第一子显示区域1201中,第二子图像1212显示在第二子显示区域1202中,第三子图像1213及第四子图像1214分别排在第二子图像1212之后,等待显示。此时,用户可以向左滑动界面1200。响应于用户的滑动操作,第一子图像1211移出界面1200之外,第二子图像1212显示在第一子显示区域1201中,第三子图像1213显示在第二子显示区域1202中,第四子图像1214排在第三子图像1213之后,等待显示。
可选地,用户还可以在电子设备的界面上进行点击操作,以便浏览在子显示区域中显示的子图像。示例性的,用户可以点击电子设备的界面上显示的任一子图像。响应于用户的操作,电子设备可以确定用户选取的子图像,上述用户选取的子图像可以获焦。接着,用户可以在选取的子图像上进行操作,示例性的,用户可以点击上述选取的子图像,或者用户也可以通过多指手势对上述选取的子图像进行放大操作。响应于用户的操作,电子设备可以将上述用户选取的子图像进行放大显示(例如,该放大显示可以是全屏显示),由此可以使得用户浏览子图像的细节。
现结合图13-图15进行说明，如图13所示，电子设备显示界面1300，界面1300包括第一子显示区域1301及第二子显示区域1302，其中，第一子显示区域1301显示第一子图像1311，第二子显示区域1302显示第二子图像1312。此时，用户可以对第二子图像1312进行操作(例如，点击第二子图像1312)。响应于用户的点击操作，第二子图像1312获焦，由此可以得到如图14所示的界面1400。如图14所示，界面1400包括第一子显示区域1301及第二子显示区域1302，其中，第一子显示区域1301显示第一子图像1311，第二子显示区域1302显示第二子图像1312，第二子图像1312处于获焦状态。接着，用户可以对处于获焦状态的第二子图像1312进行操作(例如，点击或者多指滑动)，以便对上述第二子图像1312进行放大显示。响应于用户的操作，电子设备可以对上述第二子图像1312进行放大显示，得到如图15所示的界面1500。在具体实现时，上述第二子图像1312可以根据电子设备的屏幕尺寸进行全屏显示，由此可以使得第二子图像1312的分辨率可以适配电子设备的屏幕尺寸，进而可以提高用户的观看体验。如图15所示，界面1500包括放大后的第二子图像1312。由于放大后的第二子图像1312可能不能完全在电子设备的屏幕中显示，因此，上述放大后的第二子图像1312可以包括不可见区域13121及可见区域13122，用户可以通过缩放(例如，缩小或放大)或滑动(例如，上滑或下滑)等操作进一步浏览上述不可见区域13121及可见区域13122。
若上述电子设备确定第二显示区域显示待显示图像,则用户可以对电子设备进行转置,例如,用户可以将横屏状态的电子设备竖置为竖屏状态,或者用户可以将竖屏状态的电子设备横置为横屏状态。以电子设备为横屏为例进行说明,若待显示图像为长横图,则电子设备可以确定在第二显示区域显示上述待显示图像。此时,用户可以将上述电子设备竖置为竖屏状态。由于竖屏状态与长横图不匹配,因此,可以对该长横图进行分割并进行显示。具体分割显示的过程可以参考步骤203-步骤205,在此不再赘述。
图16为本申请图像显示装置一个实施例的结构示意图,如图16所示,上述图像显示装置1600可以包括:获取模块1610、识别模块1620、确定模块1630及第一显示模块1640;其中,
获取模块1610,用于获取待显示图像;
识别模块1620,用于对待显示图像进行识别,确定待显示图像的显示类型;
确定模块1630,用于获取电子设备的屏幕状态,判断电子设备的屏幕状态与待显示图像的显示类型是否匹配;根据匹配结果确定显示区域;
第一显示模块1640,用于若电子设备的屏幕状态与待显示图像的显示类型不匹配,则对待显示图像进行分割,得到多个子图像,并在显示区域显示多个子图像。
其中一种可能的实现方式中,上述识别模块1620包括:获取单元1621、比较单元1622及识别单元1623;其中,
获取单元1621,用于获取待显示图像的高宽比;
比较单元1622,用于将待显示图像的高宽比与预设第一阈值及预设第二阈值进行比较,其中,预设第一阈值及第二阈值由电子设备的屏幕分辨率确定;
识别单元1623,用于若待显示图像的高宽比大于或等于预设第一阈值,则确定待显示图像的显示类型为长竖图;若待显示图像的高宽比小于或等于预设第二阈值,则确定待显示图像的显示类型为长横图。
其中一种可能的实现方式中,上述电子设备的屏幕状态包括横屏及竖屏,上述确定模块1630还用于若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长竖图;或电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长横图;则确定电子设备的屏幕状态与待显示图像的显示类型不匹配,确定电子设备的屏幕为第一显示区域,并对第一显示区域进行分割,得到多个子显示区域;
若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长横图;或电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长竖图;则确定电子设备的屏幕状态与待显示图像的显示类型匹配,并确定电子设备的屏幕为第二显示区域。
其中一种可能的实现方式中,上述装置1600还包括:第二显示模块1650;其中,
第二显示模块1650,用于若电子设备的屏幕状态与待显示图像的显示类型匹配,则在第二显示区域显示待显示图像。
其中一种可能的实现方式中,上述确定模块1630还用于若电子设备的屏幕状态为横屏,且待显示图像的显示类型为长竖图,则获取电子设备屏幕的宽度及待显示图像的宽度,基于电子设备屏幕的宽度与待显示图像的宽度的比值,对第一显示区域进行分割,得到多个子显示区域;
若电子设备的屏幕状态为竖屏,且待显示图像的显示类型为长横图,则获取电子设备屏幕的高度及待显示图像的高度,基于电子设备屏幕的高度与待显示图像的高度的比值,对第一显示区域进行分割,得到多个子显示区域。
其中一种可能的实现方式中,上述第一显示模块1640还用于在多个子显示区域显示多个子图像。
其中一种可能的实现方式中,上述确定模块1630还用于基于子显示区域的总数对待显示图像进行均匀分割,得到同尺寸的多个子图像。
其中一种可能的实现方式中,上述确定模块1630还用于对待显示图像进行不均匀分割,得到不同尺寸的多个子图像。
其中一种可能的实现方式中,上述装置1600还包括:第三显示模块1660;其中,
第三显示模块1660,用于响应于用户的第一操作,对待显示图像进行重分割,并将重分割后得到的多个子图像在多个子显示区域中显示。
其中一种可能的实现方式中,上述装置还包括:第四显示模块1670;其中,
第四显示模块1670,用于响应于用户的第二操作,确定电子设备的屏幕为第一显示区域,并对第一显示区域进行分割,得到多个子显示区域;对待显示图像进行分割,得到多个子图像,并在多个子显示区域显示多个子图像。
图16所示实施例提供的图像显示装置可用于执行本申请图1-图15所示方法实施例的技术方案,其实现原理和技术效果可以进一步参考方法实施例中的相关描述。
应理解以上图16所示的图像显示装置的各个模块的划分仅仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。且这些模块可以全部以软件通过处理元件调用的形式实现;也可以全部以硬件的形式实现;还可以部分模块以软件通过处理元件调用的形式实现,部分模块通过硬件的形式实现。例如,检测模块可以为单独设立的处理元件,也可以集成在电子设备的某一个芯片中实现。其它模块的实现与之类似。此外这些模块全部或部分可以集成在一起,也可以独立实现。在实现过程中,上述方法的各步骤或以上各个模块可以通过处理器元件中的硬件的集成逻辑电路或者软件形式的指令完成。
例如，以上这些模块可以是被配置成实施以上方法的一个或多个集成电路，例如：一个或多个专用集成电路(Application Specific Integrated Circuit；以下简称：ASIC)，或，一个或多个数字信号处理器(Digital Signal Processor；以下简称：DSP)，或，一个或者多个现场可编程门阵列(Field Programmable Gate Array；以下简称：FPGA)等。再如，这些模块可以集成在一起，以片上系统(System-On-a-Chip；以下简称：SOC)的形式实现。
图17示例性的示出了电子设备100的结构示意图。
电子设备100可以包括处理器110，外部存储器接口120，内部存储器121，通用串行总线(universal serial bus,USB)接口130，充电管理模块140，电源管理模块141，电池142，天线1，天线2，移动通信模块150，无线通信模块160，音频模块170，扬声器170A，受话器170B，麦克风170C，耳机接口170D，传感器模块180，按键190，马达191，指示器192，摄像头193，显示屏194，以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A，陀螺仪传感器180B，气压传感器180C，磁传感器180D，加速度传感器180E，距离传感器180F，接近光传感器180G，指纹传感器180H，温度传感器180J，触摸传感器180K，环境光传感器180L，骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元，例如：处理器110可以包括应用处理器(application processor,AP)，调制解调处理器，图形处理器(graphics processing unit,GPU)，图像信号处理器(image signal processor,ISP)，控制器，视频编解码器，数字信号处理器(digital signal processor,DSP)，基带处理器，和/或神经网络处理器(neural-network processing unit,NPU)等。其中，不同的处理单元可以是独立的器件，也可以集成在一个或多个处理器中。其中，控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号，产生操作控制信号，完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
本申请实施例提供的图像显示方法的执行可以由处理器110来控制或调用其他部件来完成，比如调用内部存储器121中存储的本申请实施例的处理程序，或者通过外部存储器接口120调用第三方设备中存储的本申请实施例的处理程序，以实现本申请实施例提供的图像显示方法，提升用户的体验。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry  processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线，包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中，处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K，充电器，闪光灯，摄像头193等。例如：处理器110可以通过I2C接口耦合触摸传感器180K，使处理器110与触摸传感器180K通过I2C总线接口通信，实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其它电子设备,例如AR设备等。
可以理解的是，本申请实施例示意的各模块间的接口连接关系，只是示意性说明，并不构成对电子设备100的结构限定。在本申请另一些实施例中，电子设备100也可以采用上述实施例中不同的接口连接方式，或多种接口连接方式的组合。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其它功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中，电子设备100的天线1和移动通信模块150耦合，天线2和无线通信模块160耦合，使得电子设备100可以通过无线通信技术与网络以及其它设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)，通用分组无线服务(general packet radio service,GPRS)，码分多址接入(code division multiple access,CDMA)，宽带码分多址(wideband code division multiple access,WCDMA)，时分码分多址(time-division code division multiple access,TD-SCDMA)，长期演进(long term evolution,LTE)，BT，GNSS，WLAN，NFC，FM，和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS)，全球导航卫星系统(global navigation satellite system,GLONASS)，北斗卫星导航系统(beidou navigation satellite system,BDS)，准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)，有机发光二极管(organic light-emitting diode,OLED)，有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED)，柔性发光二极管(flex light-emitting diode,FLED)，Miniled，MicroLed，Micro-oLed，量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中，电子设备100可以包括1个或N个显示屏194，N为大于1的正整数。
本申请实施例中,电子设备100可以通过显示屏194显示用户界面。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其它数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
本申请实施例中,电子设备100可以通过触摸传感器180K接收用户的操作,例如,单击、双击或滑动等操作。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195，或从SIM卡接口195拔出，实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口，N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡，Micro SIM卡，SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同，也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互，实现通话以及数据通信等功能。在一些实施例中，电子设备100采用eSIM，即：嵌入式SIM卡。eSIM卡可以嵌在电子设备100中，不能和电子设备100分离。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
可以理解的是,上述电子设备100为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例可以根据上述方法示例对上述电子设备100进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述，仅为本申请的具体实施方式，但本申请的保护范围并不局限于此，任何在本申请揭露的技术范围内的变化或替换，都应涵盖在本申请的保护范围之内。因此，本申请的保护范围应以所述权利要求的保护范围为准。

Claims (22)

  1. 一种图像显示方法,应用于电子设备,其特征在于,所述方法包括:
    获取待显示图像;
    对所述待显示图像进行识别,确定所述待显示图像的显示类型;
    获取所述电子设备的屏幕状态,判断所述电子设备的屏幕状态与所述待显示图像的显示类型是否匹配;
    根据匹配结果确定显示区域;
    若所述电子设备的屏幕状态与所述待显示图像的显示类型不匹配,则对所述待显示图像进行分割,得到多个子图像,并在所述显示区域显示多个所述子图像。
  2. 根据权利要求1所述的方法，其特征在于，所述对所述待显示图像进行识别，确定所述待显示图像的显示类型包括：
    获取所述待显示图像的高宽比;
    将所述待显示图像的高宽比与预设第一阈值及预设第二阈值进行比较,其中,所述预设第一阈值及所述第二阈值由所述电子设备的屏幕分辨率确定;
    若所述待显示图像的高宽比大于或等于所述预设第一阈值,则确定所述待显示图像的显示类型为长竖图;
    若所述待显示图像的高宽比小于或等于所述预设第二阈值,则确定所述待显示图像的显示类型为长横图。
  3. 根据权利要求2所述的方法,其特征在于,所述电子设备的屏幕状态包括横屏及竖屏,所述根据匹配结果确定显示区域包括:
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长竖图;或所述电子设备的屏幕状态为竖屏,且所述待显示图像的显示类型为长横图;则确定所述电子设备的屏幕状态与所述待显示图像的显示类型不匹配,确定所述电子设备的屏幕为第一显示区域,并对所述第一显示区域进行分割,得到多个子显示区域;
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长横图;或所述电子设备的屏幕状态为竖屏,且所述待显示图像的显示类型为长竖图;则确定所述电子设备的屏幕状态与所述待显示图像的显示类型匹配,并确定所述电子设备的屏幕为第二显示区域。
  4. 根据权利要求3所述的方法,其特征在于,所述方法还包括:
    若所述电子设备的屏幕状态与所述待显示图像的显示类型匹配,则在所述第二显示区域显示所述待显示图像。
  5. 根据权利要求3所述的方法,其特征在于,所述对所述第一显示区域进行分割,得到多个子显示区域包括:
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长竖图,则获取所述电子设备屏幕的宽度及所述待显示图像的宽度,基于所述电子设备屏幕的宽度与所述待显示图像的宽度的比值,对所述第一显示区域进行分割,得到多个子显示区域;
    若所述电子设备的屏幕状态为竖屏，且所述待显示图像的显示类型为长横图，则获取所述电子设备屏幕的高度及所述待显示图像的高度，基于所述电子设备屏幕的高度与所述待显示图像的高度的比值，对所述第一显示区域进行分割，得到多个子显示区域。
  6. 根据权利要求3所述的方法,其特征在于,所述在所述显示区域显示多个所述子图像包括:
    在多个所述子显示区域显示多个所述子图像。
  7. 根据权利要求3所述的方法,其特征在于,所述对所述待显示图像进行分割,得到多个子图像包括:
    基于所述子显示区域的总数对所述待显示图像进行均匀分割,得到同尺寸的多个子图像。
  8. 根据权利要求1所述的方法,其特征在于,所述对所述待显示图像进行分割,得到多个子图像包括:
    对所述待显示图像进行不均匀分割,得到不同尺寸的多个子图像。
  9. 根据权利要求3所述的方法,其特征在于,所述在所述显示区域显示多个所述子图像之后,所述方法还包括:
    响应于用户的第一操作,对所述待显示图像进行重分割,并将重分割后得到的多个子图像在多个所述子显示区域中显示。
  10. 根据权利要求4所述的方法,其特征在于,所述在所述第二显示区域显示所述待显示图像之后,所述方法还包括:
    响应于用户的第二操作,确定所述电子设备的屏幕为第一显示区域,并对所述第一显示区域进行分割,得到多个子显示区域;
    对所述待显示图像进行分割,得到多个子图像,并在多个所述子显示区域显示多个所述子图像。
  11. 一种电子设备，其特征在于，包括：存储器，所述存储器用于存储计算机程序代码，所述计算机程序代码包括指令，当所述电子设备从所述存储器中读取所述指令时，使得所述电子设备执行以下步骤：
    获取待显示图像;
    对所述待显示图像进行识别,确定所述待显示图像的显示类型;
    获取所述电子设备的屏幕状态,判断所述电子设备的屏幕状态与所述待显示图像的显示类型是否匹配;
    根据匹配结果确定显示区域;
    若所述电子设备的屏幕状态与所述待显示图像的显示类型不匹配,则对所述待显示图像进行分割,得到多个子图像,并在所述显示区域显示多个所述子图像。
  12. 根据权利要求11所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行对所述待显示图像进行识别,确定所述待显示图像的显示类型的步骤包括:
    获取所述待显示图像的高宽比;
    将所述待显示图像的高宽比与预设第一阈值及预设第二阈值进行比较,其中,所述预设第一阈值及所述第二阈值由所述电子设备的屏幕分辨率确定;
    若所述待显示图像的高宽比大于或等于所述预设第一阈值，则确定所述待显示图像的显示类型为长竖图；
    若所述待显示图像的高宽比小于或等于所述预设第二阈值,则确定所述待显示图像的显示类型为长横图。
  13. 根据权利要求12所述的电子设备,所述电子设备的屏幕状态包括横屏及竖屏,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行根据匹配结果确定显示区域的步骤包括:
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长竖图;或所述电子设备的屏幕状态为竖屏,且所述待显示图像的显示类型为长横图;则确定所述电子设备的屏幕状态与所述待显示图像的显示类型不匹配,确定所述电子设备的屏幕为第一显示区域,并对所述第一显示区域进行分割,得到多个子显示区域;
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长横图;或所述电子设备的屏幕状态为竖屏,且所述待显示图像的显示类型为长竖图;则确定所述电子设备的屏幕状态与所述待显示图像的显示类型匹配,并确定所述电子设备的屏幕为第二显示区域。
  14. 根据权利要求13所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备还执行以下步骤:
    若所述电子设备的屏幕状态与所述待显示图像的显示类型匹配,则在所述第二显示区域显示所述待显示图像。
  15. 根据权利要求13所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行对所述第一显示区域进行分割,得到多个子显示区域的步骤包括:
    若所述电子设备的屏幕状态为横屏,且所述待显示图像的显示类型为长竖图,则获取所述电子设备屏幕的宽度及所述待显示图像的宽度,基于所述电子设备屏幕的宽度与所述待显示图像的宽度的比值,对所述第一显示区域进行分割,得到多个子显示区域;
    若所述电子设备的屏幕状态为竖屏,且所述待显示图像的显示类型为长横图,则获取所述电子设备屏幕的高度及所述待显示图像的高度,基于所述电子设备屏幕的高度与所述待显示图像的高度的比值,对所述第一显示区域进行分割,得到多个子显示区域。
  16. 根据权利要求13所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行在所述显示区域显示多个所述子图像的步骤包括:
    在多个所述子显示区域显示多个所述子图像。
  17. 根据权利要求13所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行对所述待显示图像进行分割,得到多个子图像的步骤包括:
    基于所述子显示区域的总数对所述待显示图像进行均匀分割,得到同尺寸的多个子图像。
  18. 根据权利要求11所述的电子设备，其特征在于，所述指令被所述电子设备执行时，使得所述电子设备执行对所述待显示图像进行分割，得到多个子图像的步骤包括：
    对所述待显示图像进行不均匀分割,得到不同尺寸的多个子图像。
  19. 根据权利要求13所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行在所述显示区域显示多个所述子图像的步骤之后,还执行以下步骤:
    响应于用户的第一操作,对所述待显示图像进行重分割,并将重分割后得到的多个子图像在多个所述子显示区域中显示。
  20. 根据权利要求14所述的电子设备,其特征在于,所述指令被所述电子设备执行时,使得所述电子设备执行在所述第二显示区域显示所述待显示图像的步骤之后,还执行以下步骤:
    响应于用户的第二操作,确定所述电子设备的屏幕为第一显示区域,并对所述第一显示区域进行分割,得到多个子显示区域;
    对所述待显示图像进行分割,得到多个子图像,并在多个所述子显示区域显示多个所述子图像。
  21. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在所述电子设备上运行时,使得所述电子设备执行如权利要求1-10中任一项所述的方法。
  22. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-10中任一项所述的方法。
PCT/CN2021/136372 2021-02-07 2021-12-08 图像显示方法、电子设备及存储介质 WO2022166386A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21924379.7A EP4280055A1 (en) 2021-02-07 2021-12-08 Image display method, electronic device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110179798.0A CN114911546A (zh) 2021-02-07 2021-02-07 图像显示方法、电子设备及存储介质
CN202110179798.0 2021-02-07

Publications (1)

Publication Number Publication Date
WO2022166386A1 true WO2022166386A1 (zh) 2022-08-11


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014073521A1 (ja) * 2012-11-09 2014-05-15 富士フイルム株式会社 画像表示装置及び方法並びにプログラム
CN105868235A (zh) * 2015-12-08 2016-08-17 乐视移动智能信息技术(北京)有限公司 智能终端的图片预览方法及装置
CN107608606A (zh) * 2017-10-18 2018-01-19 维沃移动通信有限公司 一种图片显示方法、移动终端及计算机可读存储介质
CN109126131A (zh) * 2018-07-09 2019-01-04 网易(杭州)网络有限公司 游戏画面显示方法、存储介质及终端
CN111176526A (zh) * 2019-12-30 2020-05-19 维沃移动通信有限公司 图片显示方法和电子设备
CN111586237A (zh) * 2020-04-30 2020-08-25 维沃移动通信有限公司 一种图像显示方法及电子设备

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9449254B1 (en) * 2015-08-04 2016-09-20 Adobe Systems Incorporated Adaptive environment targeting


Also Published As

Publication number Publication date
EP4280055A1 (en) 2023-11-22
CN114911546A (zh) 2022-08-16

