CN111669462B - Method and related device for displaying image - Google Patents


Info

Publication number
CN111669462B
Authority
CN
China
Prior art keywords
image
horizontal
electronic device
vertical screen
screen state
Prior art date
Legal status
Active
Application number
CN202010480991.3A
Other languages
Chinese (zh)
Other versions
CN111669462A (en)
Inventor
张帆
崔晓颖
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010480991.3A
Publication of CN111669462A
Application granted
Publication of CN111669462B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Abstract

Disclosed is a method of displaying an image, including: an electronic device displays a preview interface, where the preview interface includes a first image collected by a first camera, and the first image is displayed in a first area of a display screen; a first user operation is detected in the preview interface, a first image is shot through the first camera, and a second image is shot through a second camera; a first horizontal and vertical screen state of the electronic device is determined according to the second image, and the first image and the first horizontal and vertical screen state are stored in association; a second user operation for viewing the first image is detected, a second horizontal and vertical screen state of the electronic device is determined, and the first image is displayed in a second area of the display screen; a first rotation angle of the first image displayed in the second area of the display screen, relative to the first image displayed in the first area, is determined by the first horizontal and vertical screen state and the second horizontal and vertical screen state, and the size ratio of the second area is the same as that of the first area. Therefore, the horizontal and vertical screen states of the electronic device can be accurately identified.

Description

Method and related device for displaying image
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a method for displaying an image and a related device.
Background
With rising living standards, the photographing function of the smart phone has been widely accepted and loved by ordinary users. Compared with a professional camera, a smart phone is lighter and simpler to operate. When taking pictures with a smart phone, a user can easily change camera positions and angles to meet different shooting requirements.
At present, most mobile devices have a built-in gravity acceleration sensor, which is used to detect the horizontal and vertical screen states of the mobile phone so that the display direction of a picture shot by the user can be determined. However, due to technical limitations, the sensor sometimes cannot accurately capture changes in the horizontal and vertical screen states of the mobile phone. In that case, the picture displayed by the electronic device is inconsistent with the preview image that the user saw on the shooting interface. For example, when a user holds the smart phone parallel, or nearly parallel, to the ground plane to take an overhead video, the built-in sensor of the smart phone cannot accurately identify the horizontal and vertical screen states of the smart phone, which causes trouble for the photographer. Of course, the above problem is not limited to overhead shooting: in any scene there is a certain probability that the built-in sensor of the mobile phone cannot accurately identify its horizontal and vertical screen states.
In summary, in the prior art, the mobile phone cannot always accurately identify the horizontal and vertical screen states of the electronic device.
Disclosure of Invention
The embodiment of the application provides a method for displaying an image and a related device, which can accurately identify the horizontal and vertical screen states of electronic equipment.
In a first aspect, the present application provides a method of displaying an image, including: an electronic device displays a preview interface, where the preview interface includes a first image collected by a first camera, and the first image is displayed in a first area of a display screen; a first user operation is detected in the preview interface, and in response to the first user operation, a first image is shot through the first camera and a second image is shot through a second camera, where the second image includes a human face; a first horizontal and vertical screen state of the electronic device is determined according to the second image, and the first image and the first horizontal and vertical screen state are stored in association; a second user operation for viewing the first image is detected, and in response to the second user operation, a second horizontal and vertical screen state of the electronic device is determined, and the first image is displayed in a second area of the display screen; a first rotation angle of the first image displayed in the second area of the display screen, relative to the first image displayed in the first area, is determined by the first horizontal and vertical screen state and the second horizontal and vertical screen state, and the size ratio of the second area is the same as that of the first area.
By implementing the embodiments of the application, when the electronic device shoots the first image through the first camera, the second camera can shoot the second image, and the second image is used to determine the first horizontal and vertical screen state of the electronic device at the time the first image was shot. When the user views the first image while the electronic device is in the second horizontal and vertical screen state, the electronic device determines the rotation angle of the displayed first image according to the first horizontal and vertical screen state and the second horizontal and vertical screen state. The method of displaying an image provided by the embodiments of the application is not limited by the performance limitations of a conventional sensor: the horizontal and vertical screen state that matches the user's behavioral intention is obtained by detecting the user's behavior. Therefore, the horizontal and vertical screen states of the electronic device can be accurately identified, and the user's shooting experience is improved.
In one implementation, the determining of the second horizontal and vertical screen state of the electronic device includes: acquiring a third image through the second camera, where the third image includes a human face; and determining the second horizontal and vertical screen state of the electronic device according to the third image.
In one implementation manner, the electronic device represents the first horizontal and vertical screen state through a first parameter: when the first parameter takes a first value, it represents the upward vertical screen; a second value represents the left horizontal screen; a third value represents the downward vertical screen; and a fourth value represents the right horizontal screen. The storing of the first image and the first horizontal and vertical screen state in association includes: storing the first image and the first parameter in association.
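For illustration only, the association between the first image and the first parameter can be sketched as follows in Python. The concrete values of the four states and the names used here (FirstParameter, StoredPhoto) are assumptions for the sketch, since the description does not fix them.

```python
from dataclasses import dataclass
from enum import IntEnum

class FirstParameter(IntEnum):
    """Hypothetical encoding of the first parameter; the description only requires
    four distinct values, so the concrete numbers are assumptions."""
    PORTRAIT_UP = 0       # first value: upward vertical screen
    LANDSCAPE_LEFT = 1    # second value: left horizontal screen
    PORTRAIT_DOWN = 2     # third value: downward vertical screen
    LANDSCAPE_RIGHT = 3   # fourth value: right horizontal screen

@dataclass
class StoredPhoto:
    """The first image stored in association with its first parameter."""
    image_path: str
    first_parameter: FirstParameter

photo = StoredPhoto("IMG_0001.jpg", FirstParameter.LANDSCAPE_LEFT)
```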
In one implementation, the determining of the first horizontal and vertical screen state of the electronic device according to the second image includes: when the included angle between the screen of the electronic device and the horizontal plane is smaller than a preset angle, determining the first horizontal and vertical screen state of the electronic device according to the second image.
In one implementation, the determining of the first horizontal and vertical screen state of the electronic device according to the second image includes: determining the first horizontal and vertical screen state according to a first included angle between the standard reference line of the second image and the face reference line corresponding to the second image.
In one implementation manner, when the second image includes a plurality of faces, the face reference line corresponding to the second image is the face reference line of the face with the largest area among the plurality of faces in the second image.
In one implementation, the determining of the first horizontal and vertical screen state according to the first included angle between the standard reference line of the second image and the face reference line corresponding to the second image includes: when the first included angle is in a first range, determining that the first horizontal and vertical screen state is the upward vertical screen; when the first included angle is in a second range, determining that the first horizontal and vertical screen state is the left horizontal screen; when the first included angle is in a third range, determining that the first horizontal and vertical screen state is the downward vertical screen; and when the first included angle is in a fourth range, determining that the first horizontal and vertical screen state is the right horizontal screen.
In one implementation, the standard reference line of the second image is oriented along a side of the second image, pointing from the bottom of the second image to the top of the second image; the orientation of the face reference line corresponding to the second image is perpendicular to the line connecting the two eyes of the face in the second image and points from the chin to the top of the head; the first range is greater than -45° and less than or equal to 45°, the second range is greater than -135° and less than or equal to -45°, the third range is greater than 135° and less than or equal to 225°, and the fourth range is greater than 45° and less than or equal to 135°.
In one implementation, the first parameter is the second rotation angle. When the second horizontal and vertical screen state is the upward vertical screen, the first rotation angle is equal to the first parameter; when the second horizontal and vertical screen state is the left horizontal screen, the first rotation angle is equal to the first parameter plus 90 degrees; when the second horizontal and vertical screen state is the downward vertical screen, the first rotation angle is equal to the first parameter plus 180 degrees; when the second horizontal and vertical screen state is the right horizontal screen, the first rotation angle is equal to the first parameter plus -90 degrees.
In one implementation, when the first horizontal and vertical screen state is the upward vertical screen and the second horizontal and vertical screen state is the upward vertical screen, the first rotation angle is equal to 0 degrees; when the first horizontal and vertical screen state is the left horizontal screen and the second horizontal and vertical screen state is the upward vertical screen, the first rotation angle is equal to -90 degrees; when the first horizontal and vertical screen state is the downward vertical screen and the second horizontal and vertical screen state is the upward vertical screen, the first rotation angle is equal to 180 degrees; when the first horizontal and vertical screen state is the right horizontal screen and the second horizontal and vertical screen state is the upward vertical screen, the first rotation angle is equal to 90 degrees.
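The two implementations above can be combined into a small lookup: the rotation when viewing in the upward vertical screen state depends only on the first state, and a per-state offset is added for the other viewing states. The following Python sketch is one possible reading of those rules, not the patented implementation; the string labels for the four states are assumptions.

```python
def first_rotation_angle(first_state: str, second_state: str) -> int:
    """Rotation angle (degrees, negative = counterclockwise as in the description)
    applied to the stored first image when it is viewed again."""
    # Rotation needed when the image is viewed in the upward vertical screen state.
    base = {"up": 0, "left": -90, "down": 180, "right": 90}
    # Extra rotation added for the second (viewing) horizontal/vertical screen state.
    offset = {"up": 0, "left": 90, "down": 180, "right": -90}
    angle = (base[first_state] + offset[second_state]) % 360
    return angle - 360 if angle > 180 else angle   # normalise to (-180, 180]

# Example from the description: shot in the left horizontal screen state and
# viewed in the upward vertical screen state -> -90 (90 degrees counterclockwise).
assert first_rotation_angle("left", "up") == -90
```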
In one implementation, the first camera is a rear camera, and the second camera is a front camera.
In a second aspect, the present application provides an electronic device comprising: the system comprises a first camera, a second camera, a display screen, one or more processors, and one or more memories; the one or more memories are for storing computer program code, the computer program code including computer instructions; the computer instructions, when executed on the processor, cause the electronic device to perform the method for displaying images in any one of the possible implementations of the above aspect.
In a third aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions that, when executed on an electronic device, cause the electronic device to perform the method for displaying an image in any one of the possible implementations of the foregoing aspect.
In a fourth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the method for displaying an image in any one of the possible implementations of the foregoing aspect.
Drawings
FIG. 1 is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 2 is a schematic view of a shooting interface provided in an embodiment of the present application;
FIG. 3 is a schematic view of an album interface according to an embodiment of the present application;
fig. 4A to 4D are schematic diagrams illustrating a horizontal and vertical screen status according to an embodiment of the present disclosure;
fig. 5 is a schematic coordinate system diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 6 is another schematic diagram of horizontal and vertical screen states provided in an embodiment of the present application;
fig. 7 to 9 are schematic diagrams of user interfaces for shooting and displaying according to an embodiment of the present disclosure;
FIG. 10 is a schematic view of a standard reference line provided in an embodiment of the present application;
fig. 11A and fig. 11B are schematic diagrams of a face reference line provided in an embodiment of the present application;
fig. 12 is a schematic view of a reference image provided in an embodiment of the present application;
FIGS. 13 and 14 are schematic views of a first included angle provided by an embodiment of the present application;
fig. 15 is a flowchart illustrating a method for displaying an image according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of a software system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" in the text only describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, "a plurality" means two or more in the description of the embodiments of the present application.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature, and in the description of embodiments of the application, unless stated otherwise, "plurality" means two or more.
Some exemplary user interfaces (UIs) provided by embodiments of the present application are described below. The term "user interface" in the description, claims, and drawings of the present application is a media interface for interaction and information exchange between an application or operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may include interface elements such as icons, windows, and controls displayed in the display screen of the electronic device, where a control may include visual interface elements such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, and a widget.
FIG. 1 illustrates an exemplary user interface 10 on an electronic device 100 for exposing applications installed by the electronic device 100.
The user interface 10 may include: status bar 101, navigation bar 102, calendar indicator 103, weather indicator 104, tray 105 with common application icons, and other application icons. Wherein:
status bar 101 may include: one or more signal strength indicators 101A for mobile communication signals (which may also be referred to as cellular signals), operator name (e.g., "china mobile") 101B, one or more signal strength indicators 101C for wireless fidelity (Wi-Fi) signals, battery status indicator 101D, and time indicator 101E.
The navigation bar 102 may include: a return key 102A, a home screen key 102B, a multitasking key 102C, and other system navigation keys. When it is detected that the user clicks the return key 102A, the electronic device 100 may display the page previous to the current page. When it is detected that the user clicks the home screen key 102B, the electronic device 100 may display the home interface. When it is detected that the user clicks the multitasking key 102C, the electronic device 100 may display the tasks recently opened by the user. The navigation keys may also have other names, which is not limited in this application. Not limited to virtual keys, each navigation key in the navigation bar 102 may also be implemented as a physical key.
Calendar indicator 103 may be used to indicate the current time, e.g., date, day of week, time division information, etc.
The weather indicator 104 may be used to indicate a weather type, such as cloudy turning sunny or light rain, and may also be used to indicate information such as temperature.
The tray 105 with the common application icons may show: phone icon 105A, contact icon 105B, short message icon 105C, camera icon 105D.
Other application icons may be, for example: an icon 106 for mutual communication, an icon 107 for a gallery, an icon 108 for music, an icon 109 for an application, an icon 110 for a mailbox, an icon 111 for cloud sharing, an icon 112 for a memo, and an icon 113 for settings. The user interface 10 may also include a page indicator 114. Other application icons may be distributed across multiple pages, and the page indicator 114 may be used to indicate which page of applications the user is currently viewing. The user may slide left or right in the area of the other application icons to view the application icons on other pages.
In some embodiments, the user interface 10 exemplarily shown in FIG. 1 may be a Home screen (Home Screen).
In other embodiments, the electronic device 100 may further include a front camera, which may also be referred to as a secondary camera and is mainly located above the screen of the electronic device 100. The front camera may be used for taking selfies, making video calls, and the like.
It is understood that fig. 1 only illustrates the user interface on the electronic device 100 by way of example, and should not be construed as limiting the embodiments of the present application.
Illustratively, as shown in fig. 2, the user may click the camera icon 105D on the user interface 10; the electronic device 100 detects the user operation, and in response to the user operation, displays the shooting interface 11 of the camera.
The user interface 11 may include: navigation bar 201, shooting control 202, photo album control 203, camera switching control 204, shooting mode 205, image preview area 206, flash icon 207, setting icon 208. Wherein:
the photographing control 202 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may capture image data using a camera and display the image.
The album control 203 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may display the most recently saved photo in the album. The album control 203 may be presented as a thumbnail of the most recently taken photo.
The camera switching control 204 is used to switch cameras. The electronic device 100 may detect a touch operation (e.g., a click operation on the camera switching control 204) acting on the camera switching control 204, and in response to the touch operation, the electronic device 100 may switch the camera. For example, the camera used by the electronic apparatus 100 for shooting is switched from the front camera to the rear camera, or the camera used by the electronic apparatus 100 for shooting is switched from the rear camera to the front camera.
The photographing mode 205 may include: night mode 205A, professional mode 205B, photo mode 205C, video mode 205D, portrait mode 205E, and the like. The night view mode 205A, the professional mode 205B, and the portrait mode 205E are all photographing modes optimized for a specific scene. Any of the shooting modes 205 described above may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may display a shooting interface in the shooting mode.
It can be appreciated that if the current shooting mode is the photographing mode, the shooting control 202 can be used to take a photograph; if the current shooting mode is the video recording mode, the shooting control 202 can be used to start or stop video recording.
The image preview area 206 may be used to display images captured by a front or rear camera (i.e., a camera currently used for shooting) of the electronic device 100.
It can be understood that, if the camera currently used for shooting by the electronic device 100 is a front camera, the image preview area 206 is used for displaying an image collected by the front camera of the electronic device 100; if the camera currently used by the electronic device 100 for shooting is a rear camera, the image preview area 206 is used for displaying an image collected by the rear camera of the electronic device 100.
The flash icon 207 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may display setting options for the flash.
The setting icon 208 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic apparatus 100 may display a setting interface of the camera.
Illustratively, as shown in fig. 3, the user may click the album control 203 on the user interface 11; the electronic device 100 detects the user operation, and in response to the user operation, displays the most recently taken photo on the album interface 12.
The album interface 12 may include: status bar 301, navigation bar 302, image display area 303, return key 304, property control 305, album control 306, share icon 307, edit icon 308, delete icon 309, and more icons 310. Wherein:
the image display area 303 may be used to display the latest shot photo displayed by the electronic device 100 and may also be used to display other images in the album of the electronic device 100.
The return key 304 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic apparatus 100 may display a page previous to the current page.
The property control 305 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may display the file properties of the image displayed in the image display area 303. The file properties of the image may include creation time, dimensions, file size, storage path, and the like.
In some embodiments of the present application, the file properties of the image may further include a rotation parameter. The rotation parameter represents a second rotation angle: the angle by which the image displayed in the image display area 303 on the display screen needs to be rotated, relative to the direction of the image displayed in the image preview area 206 during shooting, when the user views the image in the image display area 303 while holding the device in the upward vertical screen state.
In some embodiments of the present application, the file attribute of the image may include a direction parameter, where the direction parameter is used to represent a horizontal and vertical screen state of the electronic device 100 when the electronic device 100 takes the image.
It will be appreciated that the file property of the image may comprise only one of the rotation parameter and the orientation parameter.
The image and the rotation parameter/the direction parameter are stored in association, and the rotation parameter/the direction parameter may be stored in a file attribute of the image.
The album control 306 may receive a user operation (e.g., a touch operation) and in response to detecting the user operation, the electronic device 100 may display more images in the album of the electronic device 100.
The sharing icon 307 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic apparatus 100 may display a sharing interface of the image displayed in the image display area 303.
The edit icon 308 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic apparatus 100 may display an edit interface of the image displayed in the image display area 303.
The delete icon 309 may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may delete the image displayed in the image display area 303.
The more icon 310 may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 may display one or more function options for the image displayed in the image display area 303, for example, set as wallpaper, move, rename, and the like.
In the embodiment of the present application, in the user interface shown in fig. 2, the current shooting mode is the photographing mode. The shooting control 202 may receive a user operation (e.g., a touch operation); in response to detecting the user operation, the electronic device 100 takes a photograph and may display the album interface 12, where the image display area 303 of the album interface 12 is used to display the photograph. Alternatively, the shooting control 202 may receive a user operation (e.g., a touch operation); in response to the user operation, the electronic device 100 takes and saves a photograph and may continue to display the shooting interface 11, and the user may view the taken photograph by clicking the album control 203 on the shooting interface 11. This is not particularly limited in the embodiments of the present application.
It is understood that fig. 1 to 3 only illustrate the user interface on the electronic device 100 by way of example, and should not be construed as limiting the embodiments of the present application.
The image data acquired by the camera application of the electronic device 100 comes from the image sensor of the camera. The user may rotate the electronic device 100 to any angle while taking a picture. The electronic device displays an image acquired by the rear camera in the image preview area of the shooting interface; after the electronic device 100 receives the user's shooting operation, the electronic device 100 stores the image in response to the shooting operation. To ensure that, when the user views the picture, the direction of the image displayed by the electronic device 100 is consistent with the direction of the image the user saw in the preview image area when taking the picture, the electronic device 100 generally displays the image on the display screen according to the horizontal and vertical screen state of the electronic device 100 when the picture was taken. In this way, no matter how the electronic device 100 was rotated when the picture was taken, the image displayed on the display screen is consistent with the preview image the user saw in the preview image area. The horizontal and vertical screen states of the electronic device 100 may include four states: the upward vertical screen, the right horizontal screen, the downward vertical screen, and the left horizontal screen.
For example, fig. 4A to 4D are schematic diagrams of shooting by the electronic device 100 according to an embodiment of the present application, in which the electronic device shoots in the upward vertical screen, left horizontal screen, downward vertical screen, and right horizontal screen states, respectively. As shown in fig. 4A to 4D, when the electronic device 100 takes a picture, the first row of the image data collected by the camera is the row of image data closest to the top of the electronic device 100, and the first column of the image data is the column of image data closest to the left side of the electronic device 100. The top, bottom, left side, and right side of the electronic device 100 are fixed; that is, they are defined with the electronic device 100 in the upward vertical screen state and do not change when the electronic device 100 is placed vertically downward or horizontally. For example, if a camera is disposed at the top of the electronic device 100, then no matter whether the electronic device is placed vertically or horizontally, the top of the electronic device 100 is the end with the camera.
As shown in fig. 4A, the electronic device 100 takes a picture in the upward vertical screen state. When the user views the picture in the upward vertical screen state, the first row of image data is located at the top of the picture and the first column of image data is located at the left of the picture.
As shown in fig. 4B, the electronic device 100 takes a picture in the left horizontal screen state. To ensure that the picture displayed by the electronic device 100 is consistent with the image the user saw in the preview image area when taking the picture, when the user views the picture in the upward vertical screen state, the first row of image data of the picture is located at the left of the image displayed by the electronic device 100, and the first column of image data of the picture is located at the bottom of the image displayed by the electronic device 100. At this time, the image displayed on the display screen of the electronic device 100 is rotated by -90° relative to the image displayed in the preview image area, i.e., by 90° counterclockwise.
As shown in fig. 4C, the electronic device 100 takes a picture in the downward vertical screen state. To ensure that the picture displayed by the electronic device 100 is consistent with the image the user saw in the preview image area when taking the picture, when the user views the picture in the upward vertical screen state, the first row of image data of the picture is located at the bottom of the image displayed by the electronic device 100, and the first column of image data of the picture is located at the right of the image displayed by the electronic device 100. At this time, the image displayed on the display screen of the electronic device 100 is rotated by 180° relative to the image displayed in the preview image area.
As shown in fig. 4D, the electronic device 100 takes a picture in the right horizontal screen state. To ensure that the picture displayed by the electronic device 100 is consistent with the image the user saw in the preview image area when taking the picture, when the user views the picture in the upward vertical screen state, the first row of image data of the picture is located at the right of the image displayed by the electronic device 100, and the first column of image data of the picture is located at the top of the image displayed by the electronic device 100. At this time, the image displayed on the display screen of the electronic device 100 is rotated by 90° relative to the image displayed in the preview image area.
In summary, when the user views the captured image, the electronic device 100 determines, according to the horizontal and vertical screen state in which the image was captured, the first rotation angle required for the image displayed on the display screen relative to the image displayed in the preview image area. The electronic device 100 then determines the display area of the image on the display screen according to the first rotation angle, draws the image, and displays the drawn image in the display area.
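As an illustration of the drawing step, the following sketch rotates a saved photo by a given first rotation angle using Pillow (an assumption; the description does not name a specific image library). Pillow's rotate() is counterclockwise for positive angles, whereas the description treats negative angles as counterclockwise, hence the sign flip.

```python
from PIL import Image

def render_rotated(path: str, first_rotation_angle: int) -> Image.Image:
    """Draw the stored photo rotated by the first rotation angle before display."""
    img = Image.open(path)
    # Sign flip: the description's -90 (counterclockwise) becomes Pillow's +90.
    return img.rotate(-first_rotation_angle, expand=True)
```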
As for how to determine the horizontal and vertical screen states of the electronic device 100, a currently common technical solution is to use a three-axis gravity acceleration sensor to determine the gravitational acceleration of the electronic device 100 along the X axis, the Y axis, and the Z axis, and to determine whether the electronic device 100 is in a horizontal or vertical screen state according to the gravitational acceleration in the three directions. Alternatively, an orientation sensor is used to determine the included angles between the gravity direction and the X axis, the Y axis, and the Z axis of the electronic device 100, and the horizontal or vertical screen state of the electronic device 100 is determined according to those included angles.
Illustratively, as shown in fig. 5, the coordinate system of the electronic device 100 is defined as follows: the X axis is parallel to the short-side direction of the screen of the electronic device 100 and points from the left side of the screen to the right side of the screen; the Y axis is parallel to the long-side direction of the screen and points from the lower side of the screen to the upper side of the screen; the Z axis is perpendicular to the plane formed by the X axis and the Y axis, that is, perpendicular to the plane of the screen. When the electronic device 100 is placed horizontally with the screen facing upward, the direction of the Z axis is opposite to the direction of gravity, and the acceleration values on the X axis, the Y axis, and the Z axis are 0, 0, and -9.8, respectively (the gravitational acceleration is 9.8 m/s²). When the electronic device 100 is turned over, the gravitational acceleration values on the X axis, the Y axis, and the Z axis change accordingly.
For example, as shown in fig. 4A to 4D, when the electronic device 100 is in the upward vertical screen state, the gravitational acceleration values on the X axis, the Y axis, and the Z axis are 0, -9.8, and 0, respectively; in the downward vertical screen state, they are 0, 9.8, and 0, respectively. When the electronic device 100 is in the left horizontal screen state, the gravitational acceleration values on the X axis, the Y axis, and the Z axis are -9.8, 0, and 0, respectively; in the right horizontal screen state, they are 9.8, 0, and 0, respectively. It can be understood that, when the user actually holds the electronic device 100, its posture is not exactly the same as in fig. 4A to 4D; the electronic device 100 may determine its horizontal or vertical screen state according to whether the three-axis gravitational acceleration values fall within the preset ranges corresponding to the horizontal screen or vertical screen states.
As shown in fig. 4A to 4D, when the electronic device 100 is in the upward vertical screen state, the included angles between the gravitational acceleration direction and the X axis, the Y axis, and the Z axis are 90°, -180°, and 90°, respectively; in the downward vertical screen state, the included angles between the gravitational acceleration direction and the X axis, the Y axis, and the Z axis are -90°, 180°, and 90°, respectively. When the electronic device 100 is in the left horizontal screen state, the included angles between the gravitational acceleration direction and the X axis, the Y axis, and the Z axis are -180°, -90°, and 90°, respectively; in the right horizontal screen state, the included angles are 180°, 90°, and 90°, respectively. It can be understood that, when the user actually holds the electronic device 100, its posture is not exactly the same as in fig. 4A to 4D; the electronic device 100 may determine its horizontal or vertical screen state according to whether the included angles between the gravitational acceleration direction and the three axes fall within the preset angle ranges corresponding to the horizontal screen or vertical screen states.
Illustratively, as shown in FIG. 6, the gravitational acceleration direction is at a 90° angle to the Z axis. When the included angle between the gravitational acceleration direction and the Y axis is between -135° and -180° or between 135° and 180°, the electronic device 100 is in the upward vertical screen state; when the included angle between the gravitational acceleration direction and the Y axis is between -45° and -135°, the electronic device 100 is in the left horizontal screen state; when the included angle between the gravitational acceleration direction and the Y axis is between -45° and 45°, the electronic device 100 is in the downward vertical screen state; when the included angle between the gravitational acceleration direction and the Y axis is between 45° and 135°, the electronic device 100 is in the right horizontal screen state.
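A compact sketch of this conventional, sensor-based decision is given below, mapping the signed angle between the gravity direction and the Y axis to a state according to the ranges just described. The half-open boundaries and the string labels are assumptions.

```python
def state_from_y_axis_angle(angle_deg: float) -> str:
    """Map the signed angle between the gravity direction and the Y axis
    to a horizontal/vertical screen state."""
    a = (angle_deg + 180) % 360 - 180          # normalise to [-180, 180)
    if a > 135 or a <= -135:
        return "portrait_up"                   # upward vertical screen
    if -135 < a <= -45:
        return "landscape_left"                # left horizontal screen
    if -45 < a <= 45:
        return "portrait_down"                 # downward vertical screen
    return "landscape_right"                   # right horizontal screen
```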
In addition to the gravitational acceleration sensor, the electronic device 100 may acquire the gravitational acceleration on three axes of the electronic device 100 through a magnetic sensor and an acceleration sensor.
When the electronic device 100 is placed horizontally (or nearly horizontally), the direction of the gravitational acceleration g coincides or nearly coincides with the Z axis of the aforementioned coordinate system, and the components of the gravitational acceleration on the X axis and the Y axis are 0 (or close to 0). In this situation, the electronic device 100 cannot obtain effective gravitational acceleration components on the X axis and the Y axis, and therefore cannot accurately determine its horizontal and vertical screen state. As a result, when the user views the taken picture, the direction of the picture displayed by the electronic device 100 may not coincide with the direction of the preview image the user saw when taking the picture, which degrades the user experience.
One of the more common shooting scenarios in which the above problem occurs is when the user holds the electronic device 100 parallel, or nearly parallel, to the ground plane to shoot. In fact, beyond this shooting scene, under the limitations of the prior art, the gravity acceleration sensor built into the electronic device 100 may, with a certain probability, fail to accurately identify the horizontal and vertical screen states of the electronic device 100 in any scene, which causes trouble in use and degrades the user experience.
To solve the above problem, an embodiment of the present application provides a method of displaying an image, which can accurately obtain the first horizontal and vertical screen state of the electronic device 100 when a picture is taken. When the user views the photo, the first rotation angle required for the photo displayed on the display screen, relative to the image displayed in the preview image area when the photo was taken, can then be determined.
When a user wants to take a picture with the rear camera of the electronic device 100, the user may open a shooting interface through a photographing application (e.g., a camera application as shown in fig. 2), and the shooting interface may display a first image captured by the rear camera. At this time, the electronic apparatus 100 is in the first landscape state (e.g., the left landscape state as shown in fig. 7).
Illustratively, as shown in fig. 7, the user holds the electronic device 100 across the screen to the left, and the image preview area 206 of the capture interface of the electronic device 100 is used to display the image captured by the rear camera. The current shooting mode of the electronic apparatus 100 shown in fig. 7 is photographing.
In addition to opening the capture interface via the capture application, the user may also open the capture interface in other manners. For example, the user may open the shooting interface through a shooting function in other applications, such as instant messaging software, art designing software, and the like.
In some embodiments of the present application, the electronic apparatus 100 receives a shooting operation of a user, and in response to detecting the shooting operation, the electronic apparatus 100 shoots a first image using the rear camera and captures a first reference image using the front camera. The first reference image is used for determining a first horizontal and vertical screen state of the electronic device, and the first horizontal and vertical screen state is a horizontal and vertical screen state of the electronic device 100 during shooting. After the electronic device 100 determines the first horizontal and vertical screen states, the first image and the first parameter of the first image are saved, and the first parameter may be used to represent the horizontal and vertical screen states of the electronic device 100 when a photo is taken.
Illustratively, as shown in fig. 8, the shooting operation may be the user clicking the shooting control 202 on the shooting interface 11. As shown in fig. 8, the front camera captures the first reference image in response to the user's shooting operation. The electronic device 100 may determine, from the first reference image, that the first horizontal and vertical screen state is the left horizontal screen.
In some embodiments of the present application, the display screen is configured with a touch panel, and is configured to receive a shooting operation of a user, and in response to the shooting operation, the electronic device 100 takes a picture with the rear camera. The shooting operation may be a touch operation in which a user touches the display screen with a hand, an elbow, a stylus, or the like. For example, single click, double click, long press, drawing a particular shape (e.g., a Z-shape, rectangle, circle, etc.) through a knuckle, and so on. In addition to touch operations, there may be other ways to implement the above user operations in specific implementations. For example, a user can start a rear camera for shooting by pressing a key; or, the rear camera is started to shoot by detecting voice input by a user; or, the rear camera is opened through a specific gesture to shoot. This is not particularly limited in the embodiments of the present application.
In some embodiments of the present application, the electronic device 100 receives a viewing operation of the user, and in response to the detected viewing operation, the electronic device 100 acquires a second reference image by using the front camera and determines the second horizontal and vertical screen state of the electronic device 100 according to the second reference image. Then, the first rotation angle required for displaying the first image is determined according to the first parameter and the second horizontal and vertical screen state, and the first image is displayed according to the first rotation angle.
Illustratively, as shown in fig. 9, after taking a picture in the left horizontal screen state as shown in fig. 8, the user wants to view the taken picture. The user holds the electronic device in the upward vertical screen state and clicks the album control 203 on the shooting interface 11. In response to the above viewing operation, the electronic device 100 determines, according to the reference image collected by the front camera, that the second horizontal and vertical screen state of the electronic device 100 is the upward vertical screen, and determines, according to the first parameter of the picture and the second horizontal and vertical screen state, that the first rotation angle required for the picture is -90°. The electronic device 100 determines the display area 304 of the picture on the display screen according to the first rotation angle, draws the picture, and then displays the picture in the display area 304 of the album interface 12.
In addition to viewing the photos by clicking the album control 203 on the shooting interface 11, the user may view the photos in other ways, which are not particularly limited herein. For example, the user may also view the photos through the gallery application corresponding to the gallery icon 107 shown in fig. 1, and the user may also invoke the gallery through other applications such as instant messaging software and shopping software to view the photos in the gallery.
In some embodiments of the present application, the first parameter may be a direction parameter or a rotation parameter. For example, the direction parameter may be the parameter "Orientation" in the exchangeable image file format (Exif) information of a photograph.
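As a concrete illustration of persisting such a parameter in the file attributes, the sketch below reads and writes the standard Exif Orientation tag (0x0112) with Pillow. This is only an assumption about how the association could be stored; how the four horizontal and vertical screen states map onto the Exif Orientation values 1, 3, 6, and 8 depends on the device and is not specified by the description.

```python
from PIL import Image

ORIENTATION_TAG = 0x0112   # standard Exif "Orientation" tag

def read_orientation(path: str) -> int:
    exif = Image.open(path).getexif()
    return exif.get(ORIENTATION_TAG, 1)        # 1 means "no rotation" in Exif

def write_orientation(src: str, dst: str, orientation: int) -> None:
    img = Image.open(src)
    exif = img.getexif()
    exif[ORIENTATION_TAG] = orientation        # e.g. 6 or 8 for a landscape shot
    img.save(dst, exif=exif.tobytes())
```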
The solution provided by the embodiments of the application is not limited by the performance defects of the built-in sensor of the electronic device 100; the horizontal and vertical screen state that matches the user's behavioral intention is obtained by detecting the user's behavior. Therefore, the solution provided by the present application can reduce the possibility of an inaccurate horizontal and vertical screen state and improve the user's shooting experience.
How to determine the landscape and portrait screen states of the electronic apparatus 100 in the embodiment of the present application is described below.
In the embodiment of the application, the horizontal and vertical screen states when the electronic device 100 shoots can be determined according to the first reference image, and the horizontal and vertical screen states when the electronic device 100 receives the viewing operation can be determined according to the second reference image.
In some embodiments of the present application, the electronic device 100 may determine the horizontal and vertical screen states of the electronic device 100 according to a first included angle, where the first included angle is the included angle between the standard reference line of the reference image and the face reference line in the reference image. The reference image is the first reference image or the second reference image.
In some embodiments of the present application, the standard reference line may be oriented along the side of the first column of the reference image toward the top of the reference image (i.e., the side of the first row of the reference image), as shown in FIG. 10 for example.
In some embodiments of the present application, the face reference line may be determined according to one or more of the positions of the eyes, the nose, and the mouth of the human face. For example, the face reference line passes through at least two of the midpoint of the two eyes, the midpoint of the nose, and the midpoint of the mouth of the face.
For example, as shown in fig. 11A, the orientation of the face reference line may point from the chin toward the top of the head, along a line perpendicular to the line connecting the two eyes. For example, the face reference line is a straight line perpendicular to the line connecting the two eyes of the face.
For example, as shown in fig. 11B, the orientation of the face reference line may point from the right eye to the left eye, along the line connecting the two eyes. For example, the face reference line may also be a straight line parallel to the line connecting the two eyes.
It should be noted that the orientations of the standard reference line and the face reference line above are exemplary orientations provided in the embodiments of the present application and should not be construed as limiting the embodiments of the present application.
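To make the first included angle concrete, the sketch below approximates the chin-to-vertex face reference line of fig. 11A by the direction from the mouth midpoint to the eye midpoint, and measures its signed angle against the standard reference line (bottom to top of the reference image). Both the approximation and the sign convention are assumptions, not the exact construction in the description.

```python
import math

def first_included_angle(left_eye, right_eye, mouth) -> float:
    """Signed angle (degrees) between the standard reference line and the face
    reference line, from (x, y) pixel coordinates with y increasing downwards."""
    eye_mid = ((left_eye[0] + right_eye[0]) / 2, (left_eye[1] + right_eye[1]) / 2)
    dx, dy = eye_mid[0] - mouth[0], eye_mid[1] - mouth[1]
    face_angle = math.degrees(math.atan2(-dy, dx))   # y flipped so "up" is positive
    standard_angle = 90.0                            # bottom -> top of the image
    angle = face_angle - standard_angle
    return (angle + 180) % 360 - 180                 # normalise to [-180, 180)

# An upright face (eyes above the mouth in the image) gives an angle near 0,
# which corresponds to the upward vertical screen state.
print(first_included_angle((40, 50), (60, 50), (50, 80)))   # ~0.0
```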
How to determine the landscape and portrait screen states of the electronic apparatus 100 based on the reference image in the embodiment of the present application is described below.
(1) First, it is determined whether the reference image contains a human face.
In some embodiments of the present application, the electronic device 100 performs face recognition on the reference image, and determines whether the reference image includes a face. When the reference image contains a human face, determining a human face reference line of the human face as a human face reference line of the reference image; when the reference image does not contain a human face, the electronic device 100 determines the horizontal and vertical screen states of the electronic device 100 in a conventional manner.
In some embodiments of the present application, a reference image is input into a trained face detection model, and classification information output by the face detection model is obtained, where the classification information is a processing result of a neural network and is used to indicate whether the reference image includes a face.
Specifically, the electronic device 100 extracts face features from the reference image through the face detection model and matches the extracted face features with the face features of a face template to obtain a correlation value between the face image and the face template. A classifier then determines, according to the correlation value, whether a face exists.
In some embodiments of the present application, the electronic device 100 may determine whether a face is included in the reference image using the Viola-Jones face detection algorithm. The Viola-Jones face detection algorithm uses Haar features to describe common characteristics of faces; for example, in a frontal face image, the eyes are darker than the cheeks, the lips are darker than the surrounding areas, and the nose is lighter than the cheeks.
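As one concrete possibility (an assumption, not a detector mandated by the description), OpenCV ships a pre-trained Haar cascade that implements Viola-Jones-style detection; a minimal sketch assuming opencv-python is installed:

```python
import cv2

def detect_faces(image_path: str):
    """Return (x, y, w, h) rectangles of detected frontal faces; empty if none."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```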
In addition to the above face recognition scheme, it may also be determined whether the reference image includes a face through other schemes, which is not specifically limited in this embodiment of the application.
In some embodiments of the present application, the electronic device 100 performs face recognition on the reference image and determines whether the reference image includes the face of a preset user. When the reference image contains the face of the preset user, the face reference line of the preset user in the reference image is determined as the face reference line of the reference image; when the reference image does not contain the face of the preset user, the electronic device 100 determines the horizontal and vertical screen states of the electronic device 100 in a conventional manner.
It is understood that the electronic device 100 stores therein the face image of the preset user.
Specifically, when the electronic device 100 recognizes that the reference image includes one or more faces, the electronic device 100 may match the one or more face images with the face image of the preset user. When the matching degree between the face features of a first face image, which is one of the one or more face images, and the face image of the preset user exceeds a certain threshold, it is determined that the reference image includes the face of the preset user.
(2) And when the reference image contains a human face, determining a human face reference line of the reference image.
In some embodiments of the present application, when the electronic device 100 recognizes that the reference image includes one or more faces, the electronic device 100 determines the face reference line of the face with the largest area among the one or more faces as the face reference line of the reference image.
Illustratively, as shown in fig. 12, the reference image includes the face of a user 1 and the face of a user 2. The face area of user 2 in the reference image is larger than the face area of user 1, so the face reference line of user 2 is determined as the face reference line of the reference image.
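Selecting the reference face can be expressed directly on the detector output sketched earlier; a minimal sketch, assuming faces are given as (x, y, w, h) rectangles:

```python
def largest_face(faces):
    """Pick the face with the largest area as the one whose reference line is
    used for the whole reference image; return None if no face was detected."""
    if len(faces) == 0:
        return None
    return max(faces, key=lambda f: f[2] * f[3])   # area = width * height
```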
In some embodiments of the present application, a reference image is input into a trained face detection model, and classification information and key-point position information output by the face detection model are obtained, where the classification information and the key-point position information are processing results of a neural network. The key-point position information may be used to indicate the positions of the eyes, the nose, and the mouth in the reference image.
In some embodiments of the present application, a reference image is input into a trained face detection model, and classification information and key-line position information output by the face detection model are obtained, where the classification information and the key-line position information are processing results of a neural network, and the key-line position information is used to indicate the position of the face reference line in the reference image.
The face detection model may be a convolutional neural network (CNN), a multi-task cascaded convolutional neural network (MTCNN), a deep convolutional neural network (DCNN), a region-based convolutional neural network (RCNN), a generative adversarial network (GAN), a super-resolution generative adversarial network (SRGAN), or the like. This is not particularly limited herein.
In some embodiments of the present application, the face detection model may include all or part of the MTCNN model. MTCNN includes four sub-models: a face classification model, a face frame regression model, a face key point positioning model, and a face key line positioning model. Each sub-model is a CNN model. The face classification model outputs two values, namely the probability of a face and the probability of a non-face. The face detection model can judge whether the reference image contains a face according to the output values of the face classification model. The face frame regression model outputs a four-dimensional vector including the top-left corner coordinates and the width and height of the face region. The face key point positioning model outputs a ten-dimensional vector, namely the coordinates of the left eye, the right eye, the nose, the left mouth corner, and the right mouth corner. The face detection model can determine the coordinates of the midpoint of the eyes, the midpoint of the nose, and/or the midpoint of the mouth of the face in the reference image according to the output values of the face key point positioning model. The face key line positioning model outputs a four-dimensional vector, namely the coordinates of any two different points on the face reference line.
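For illustration, the third-party `mtcnn` Python package (an assumption; the description does not name a specific implementation) returns five facial key points per face, from which the eye midpoint and mouth midpoint that anchor a face reference line can be derived:

```python
import cv2
from mtcnn import MTCNN   # third-party package, assumed available

detector = MTCNN()
image = cv2.cvtColor(cv2.imread("reference.jpg"), cv2.COLOR_BGR2RGB)
results = detector.detect_faces(image)
if results:
    kp = results[0]["keypoints"]   # left_eye, right_eye, nose, mouth_left, mouth_right
    eye_mid = tuple((a + b) / 2 for a, b in zip(kp["left_eye"], kp["right_eye"]))
    mouth_mid = tuple((a + b) / 2 for a, b in zip(kp["mouth_left"], kp["mouth_right"]))
    # eye_mid and mouth_mid anchor the face reference line (chin -> top of head direction).
```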
The method for determining the face reference line is not limited to the above determination method, and other methods may also be used to determine the face reference line in the embodiment of the present application, which is not limited specifically here.
(3) Finally, the electronic device 100 determines the horizontal and vertical screen states of the electronic device 100 according to the human face reference line in the reference image.
How to determine the horizontal and vertical screen states of the electronic device 100 is described below with respect to two types of human face reference lines shown in fig. 11A and 11B, respectively.
In some embodiments of the present application, the orientation of the face reference line may point from the chin to the top of the head along the perpendicular of the binocular connecting line. When the first included angle is between -45° and 45°, the horizontal and vertical screen state of the electronic device 100 is vertical screen up. When the first included angle is between 45° and 135°, the horizontal and vertical screen state of the electronic device 100 is right landscape. When the first included angle is between 135° and 180° or between -135° (225°) and -180° (180°), the horizontal and vertical screen state of the electronic device 100 is vertical screen down. When the first included angle is between -45° (315°) and -135° (225°), the horizontal and vertical screen state of the electronic device 100 is left landscape.
Illustratively, as shown in fig. 13, the first included angle is equal to -88°. Since the first included angle is between -45° and -135°, the electronic device 100 determines that its horizontal and vertical screen state when acquiring the reference image is left landscape.
In some embodiments of the present application, the orientation of the face reference line may point from the right eye to the left eye along the binocular connecting line. When the first included angle is between 45° and 135°, the horizontal and vertical screen state of the electronic device 100 is vertical screen up. When the first included angle is between 135° and 180° or between -135° (225°) and -180° (180°), the horizontal and vertical screen state of the electronic device 100 is right landscape. When the first included angle is between -45° (315°) and -135° (225°), the horizontal and vertical screen state of the electronic device 100 is vertical screen down. When the first included angle is between -45° and 45°, the horizontal and vertical screen state of the electronic device 100 is left landscape.
Illustratively, as shown in fig. 14, the first included angle is equal to 2°. Since the first included angle is between -45° and 45°, the electronic device 100 determines that its horizontal and vertical screen state when acquiring the reference image is left landscape.
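The classification just described can be summarized in a short sketch. The angle ranges follow the text (and Tables 1 and 2 below); the state names and the normalisation helper are assumptions for illustration.

```python
def _normalize(angle: float) -> float:
    """Map any angle into (-45, 315] so the four ranges below cover it once."""
    while angle <= -45:
        angle += 360
    while angle > 315:
        angle -= 360
    return angle

def state_from_angle(first_angle: float, reference_line_along_eyes: bool = False) -> str:
    """First horizontal and vertical screen state from the first included angle.

    reference_line_along_eyes=False -> fig. 11A convention (chin to vertex);
    reference_line_along_eyes=True  -> fig. 11B convention (along the eye line).
    """
    if reference_line_along_eyes:
        first_angle -= 90        # Table 2 is Table 1 shifted by 90 degrees
    a = _normalize(first_angle)
    if -45 < a <= 45:
        return "portrait_up"     # vertical screen up
    if 45 < a <= 135:
        return "landscape_right" # right landscape
    if 135 < a <= 225:
        return "portrait_down"   # vertical screen down
    return "landscape_left"      # left landscape (-135 to -45 degrees)
```

For the examples of fig. 13 and fig. 14, state_from_angle(-88) and state_from_angle(2, reference_line_along_eyes=True) both yield left landscape.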
How to determine the first parameter of the image according to the horizontal and vertical screen states of the electronic device 100 when the image is captured will be described below, and the first parameter may be a direction parameter or a rotation parameter.
In some embodiments of the present application, after the electronic device 100 determines the first landscape screen state when the image is captured, the orientation parameter of the image is determined according to the first landscape screen state. The electronic device 100 saves the image and the orientation parameters of the image.
In some embodiments of the present application, when the horizontal and vertical screen state of the electronic device 100 when capturing the image is vertical screen up, the direction parameter of the image is equal to 1; when the horizontal and vertical screen state during shooting is left landscape, the direction parameter of the image is equal to 8; when it is vertical screen down, the direction parameter of the image is equal to 3; and when it is right landscape, the direction parameter of the image is equal to 6.
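Expressed as a lookup table, with the state names used in the classification sketch above (which are assumptions), this mapping is:

```python
DIRECTION_PARAMETER = {
    "portrait_up": 1,      # vertical screen up
    "landscape_left": 8,   # left landscape
    "portrait_down": 3,    # vertical screen down
    "landscape_right": 6,  # right landscape
}

# e.g. DIRECTION_PARAMETER[state_from_angle(-88)] == 8 for the fig. 13 example
```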
In this embodiment, in addition to the above representation manner, the direction parameter may also represent the horizontal and vertical screen states of the electronic device 100 through other numerical values or characters, which is not specifically limited herein.
In some embodiments of the present application, after the electronic device 100 determines the first horizontal and vertical screen state of the electronic device 100 when the image is captured, the electronic device determines the rotation parameter of the image according to the first horizontal and vertical screen state. The electronic device 100 saves the image and the rotation parameter of the image. The rotation parameter is used to represent the second rotation angle required for the electronic device 100 to display the image when the user views it with the device vertical screen up.
As shown in fig. 4A, when the electronic device 100 takes a photo, if the horizontal and vertical screen state is vertical screen up, the rotation parameter of the photo should be equal to 0°; as shown in fig. 4B, if the horizontal and vertical screen state is left landscape, the rotation parameter of the photo should be equal to -90°; as shown in fig. 4C, if the horizontal and vertical screen state is vertical screen down, the rotation parameter of the photo should be equal to 180°; as shown in fig. 4D, if the horizontal and vertical screen state is right landscape, the rotation parameter of the photo should be equal to 90°.
It is to be understood that the face reference line of the first reference image may be a projection of the reference line of the user's actual face onto the screen of the electronic device 100, and the standard reference line may be parallel to the side of the electronic device 100. Suppose the face reference line is perpendicular to the binocular connecting line and the standard reference line is rotated by the first included angle relative to the face reference line of the first reference image. To ensure that the image the user views with the electronic device 100 held vertical screen up is consistent with the preview image the user saw when the image was shot, the image displayed on the display screen of the electronic device 100 is the preview image rotated by the first included angle, and at this time the length-width ratio of the displayed image is the same as that of the preview image. Since the image displayed by the electronic device 100 can only be rotated by one of four angles, 0°, 90°, 180°, and -90° (270°), the rotation parameter is equal to 0° when the first included angle is between -45° and 45°; -90° when the first included angle is between -135° and -45°; 180° when the first included angle is between 135° and 225°; and 90° when the first included angle is between 45° and 135°.
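The quantisation described in this paragraph can be sketched as follows; it snaps the first included angle to the nearest of the four allowed display rotations. Function and variable names are assumptions for illustration.

```python
def rotation_parameter_from_angle(first_angle: float) -> int:
    """Rotation parameter (0, 90, 180 or -90 degrees) for a first included angle."""
    a = first_angle % 360            # work in [0, 360)
    if a > 315 or a <= 45:
        return 0                     # -45 to 45 degrees
    if a <= 135:
        return 90                    # 45 to 135 degrees
    if a <= 225:
        return 180                   # 135 to 225 degrees
    return -90                       # -135 to -45 degrees
```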
To sum up, when the orientation of the face reference line points from the chin to the top of the head along the perpendicular of the binocular connecting line, Table 1 shows the correspondence between the first included angle, the first horizontal and vertical screen state, the direction parameter, and the rotation parameter provided in the embodiment of the present application.
TABLE 1
First included angle | First horizontal and vertical screen state | Direction parameter | Rotation parameter
-45° to 45° | Vertical screen up | 1 | 0°
-135° to -45° | Left landscape | 8 | -90°
135° to 225° | Vertical screen down | 3 | 180°
45° to 135° | Right landscape | 6 | 90°
To sum up, when the orientation of the face reference line points from the bottom of the first reference image to the top of the first reference image along the binocular connecting line, Table 2 shows the correspondence between the first included angle, the first horizontal and vertical screen state, the direction parameter, and the rotation parameter provided in the embodiment of the present application.
TABLE 2
First included angle | First horizontal and vertical screen state | Direction parameter | Rotation parameter
45° to 135° | Vertical screen up | 1 | 0°
-45° to 45° | Left landscape | 8 | -90°
-135° to -45° | Vertical screen down | 3 | 180°
135° to 225° | Right landscape | 6 | 90°
The following describes how, when the user views the captured first image, the electronic device 100 determines the first rotation angle required to display the first image according to the second horizontal and vertical screen state of the electronic device 100 and the first parameter of the image. The cases in which the first parameter is a direction parameter and a rotation parameter are described separately.
(1) The first parameter is a direction parameter.
In this embodiment, the electronic device 100 receives a viewing operation of the user, and in response to the detected viewing operation, the electronic device 100 determines the first rotation angle required by the image according to the current horizontal and vertical screen state of the electronic device 100 (that is, the second horizontal and vertical screen state) and the direction parameter of the first image, and displays the first image according to the first rotation angle. The four possible values of the second horizontal and vertical screen state are described below.
In some embodiments of the present application, when the user views the first image with the device vertical screen up, the electronic device 100 determines that the second horizontal and vertical screen state of the electronic device 100 is vertical screen up. Referring to fig. 4A, when the direction parameter of the image the user intends to view represents vertical screen up, the first rotation angle required by the image is 0°; referring to fig. 4B, when the direction parameter represents left landscape, the first rotation angle required by the image is -90°; referring to fig. 4C, when the direction parameter represents vertical screen down, the first rotation angle required by the image is 180°; referring to fig. 4D, when the direction parameter represents right landscape, the first rotation angle required by the image is 90°. When the second horizontal and vertical screen state is vertical screen up, the first rotation angle is equal to the second rotation angle.
In some embodiments of the present application, when the user performs the viewing operation in left landscape, the electronic device 100 determines that the second horizontal and vertical screen state of the electronic device 100 is left landscape, and the first rotation angle required by the first image is the angle required for vertical-screen-up viewing plus a further 90°. Therefore, when the direction parameter of the first image represents vertical screen up, the first rotation angle required by the image is 90°; when the direction parameter represents left landscape, the first rotation angle is 0°; when the direction parameter represents vertical screen down, the first rotation angle is -90°; and when the direction parameter represents right landscape, the first rotation angle is 180°.
In some embodiments of the application, similarly, when the user performs the viewing operation with the device vertical screen down, the electronic device 100 determines that the second horizontal and vertical screen state of the electronic device 100 is vertical screen down, and the first rotation angle required by the first image is the angle required for vertical-screen-up viewing plus a further 180°. Therefore, when the direction parameter of the first image represents vertical screen up, the first rotation angle required by the image is 180°; when the direction parameter represents left landscape, the first rotation angle is 90°; when the direction parameter represents vertical screen down, the first rotation angle is 0°; and when the direction parameter represents right landscape, the first rotation angle is -90°.
In some embodiments of the present application, similarly, when the user performs the viewing operation in right landscape, the electronic device 100 determines that the second horizontal and vertical screen state of the electronic device 100 is right landscape, and the first rotation angle required by the first image is the angle required for vertical-screen-up viewing plus a further -90°. Therefore, when the direction parameter of the first image represents vertical screen up, the first rotation angle required by the image is -90°; when the direction parameter represents left landscape, the first rotation angle is 180°; when the direction parameter represents vertical screen down, the first rotation angle is 90°; and when the direction parameter represents right landscape, the first rotation angle is 0°.
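The four cases above amount to looking up a base angle from the direction parameter (the angle needed for vertical-screen-up viewing) and adding an offset for the actual viewing state. A sketch, with assumed names, is:

```python
BASE_ANGLE = {1: 0, 8: -90, 3: 180, 6: 90}   # angle for vertical-screen-up viewing
VIEWING_OFFSET = {
    "portrait_up": 0,
    "landscape_left": 90,
    "portrait_down": 180,
    "landscape_right": -90,
}

def first_rotation_angle_from_direction(direction_parameter: int,
                                        second_state: str) -> int:
    total = BASE_ANGLE[direction_parameter] + VIEWING_OFFSET[second_state]
    return {0: 0, 90: 90, 180: 180, 270: -90}[total % 360]
```

For example, first_rotation_angle_from_direction(8, "landscape_right") returns 180°, matching the paragraph above.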
(2) The first parameter is a rotation parameter.
It can be understood that the rotation parameter is the second rotation angle required to display the image when the electronic device 100 is vertical screen up. Like the direction parameter, the rotation parameter also represents the horizontal and vertical screen state at the time of shooting; the two differ only in their form of expression.
In this embodiment of the application, the electronic device 100 receives a viewing operation of a user for a first image, and in response to the detected viewing operation, the electronic device 100 determines a first rotation angle required by the image according to a second landscape screen state of the electronic device 100 and a rotation parameter of the first image, and displays the first image according to the first rotation angle.
In some embodiments of the present application, when the user views the first image with the device vertical screen up (that is, the second horizontal and vertical screen state is vertical screen up), the first rotation angle required by the first image is equal to the rotation parameter. For the rotation parameter, reference may be made to the foregoing embodiments, and details are not described herein again.
Similar to the direction parameter, in the embodiment of the present application, when the user performs the viewing operation in left landscape (that is, the second horizontal and vertical screen state is left landscape), the first rotation angle required by the first image may be the rotation parameter plus 90°. When the user performs the viewing operation with the device vertical screen down (that is, the second horizontal and vertical screen state is vertical screen down), the first rotation angle required by the first image may be the rotation parameter plus 180°. When the user performs the viewing operation in right landscape (that is, the second horizontal and vertical screen state is right landscape), the first rotation angle required by the first image may be the rotation parameter plus -90°.
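Because the rotation parameter already equals the angle needed for vertical-screen-up viewing, this case reduces to adding the same viewing-state offset. A self-contained sketch, with names assumed as before:

```python
VIEWING_OFFSET = {"portrait_up": 0, "landscape_left": 90,
                  "portrait_down": 180, "landscape_right": -90}

def first_rotation_angle_from_rotation(rotation_parameter: int,
                                       second_state: str) -> int:
    total = rotation_parameter + VIEWING_OFFSET[second_state]
    return {0: 0, 90: 90, 180: 180, 270: -90}[total % 360]
```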
In summary, Table 3 shows the correspondence, provided in the embodiment of the present application, between the second horizontal and vertical screen state, the direction parameter, the rotation parameter, and the first rotation angle required by the photo to be displayed.
TABLE 3
Second horizontal and vertical screen state | Direction parameter | Rotation parameter | First rotation angle
Vertical screen up | 1 | 0° | 0°
Vertical screen up | 8 | -90° | -90°
Vertical screen up | 3 | 180° | 180°
Vertical screen up | 6 | 90° | 90°
Left landscape | 1 | 0° | 90°
Left landscape | 8 | -90° | 0°
Left landscape | 3 | 180° | -90°
Left landscape | 6 | 90° | 180°
Vertical screen down | 1 | 0° | 180°
Vertical screen down | 8 | -90° | 90°
Vertical screen down | 3 | 180° | 0°
Vertical screen down | 6 | 90° | -90°
Right landscape | 1 | 0° | -90°
Right landscape | 8 | -90° | 180°
Right landscape | 3 | 180° | 90°
Right landscape | 6 | 90° | 0°
In some embodiments, when the second landscape screen state is the preset landscape screen state, the first rotation angle required by the electronic device 100 to display the first image is zero. That is, when the second landscape screen state is the preset landscape screen state, the direction of the first image displayed by the electronic apparatus 100 coincides with the direction of the preview image when the first image is captured. For example, the preset horizontal and vertical screen state is a downward vertical screen, and when the user views the first image through the downward vertical screen, the first rotation angle required by the first image is zero. The preset horizontal and vertical screen states may be set by the electronic device 100 by default or preset by the user.
The method for displaying an image provided by the embodiment of the application is described in detail below.
Fig. 15 is a schematic flowchart of a method for displaying an image according to an embodiment of the present application. As shown in fig. 15, the method for displaying an image provided by the embodiment of the present application includes, but is not limited to, steps S101 to S108. Possible implementations of embodiments of the method are described further below.
S101, the electronic device 100 receives shooting operation of a user.
For example, as shown in fig. 8 or 9, the electronic device 100 may receive a shooting operation of the user through the camera application, where the shooting operation may be that the user clicks the shooting control 202 of the shooting interface 11.
And S102, in response to the detected shooting operation, the electronic device 100 shoots a first image by using the first camera and acquires a first reference image collected by the second camera.
In some embodiments of the present application, the first camera is a rear camera and the second camera may be a front camera; in other embodiments, the first camera is a front camera and the second camera may also be a front camera. This is not particularly limited herein.
The first camera is not limited to be controlled to shoot by clicking the shooting control 202 with the finger of the user, and the first camera can be controlled to shoot by other user operations in the embodiment of the application. And is not particularly limited herein.
Illustratively, the first reference image is shown in fig. 9.
In some embodiments of the present application, in response to a photographing operation by a user, the electronic apparatus 100 photographs a first image using a first camera and acquires a first reference image using a second camera.
In some embodiments of the present application, after the electronic device 100 is powered on, the electronic device 100 uses the second camera to capture images periodically. The first reference image may be the image most recently captured by the second camera when the electronic device 100 receives the shooting operation of the user.

In some embodiments of the present application, when the camera is turned on and the camera currently used by the electronic device 100 for taking pictures is the first camera, the electronic device 100 uses the second camera to capture images periodically. The first reference image may be the image most recently captured by the second camera when the electronic device 100 receives the shooting operation of the user.

In some embodiments of the present application, the camera is turned on, and when the electronic device 100 meets a preset condition, the electronic device 100 uses the second camera to capture images. The first reference image may be the image most recently captured by the second camera when the electronic device 100 receives the shooting operation of the user.
In some embodiments of the present application, the preset condition may be that an included angle between the screen of the electronic device 100 and the horizontal plane is smaller than a preset angle. The preset condition may be that the gravitational acceleration of the three axes of the electronic device 100 is within a first preset range. The preset condition may be that an included angle between the gravity acceleration direction and three axes of the electronic device 100 is within a second preset range.
For example, the above-mentioned preset angle is equal to 10 °, 20 °, or 30 °.
It is understood that when the electronic device 100 does not satisfy the preset condition, the horizontal and vertical screen states of the electronic device 100 may be determined in a conventional manner. For example, the horizontal and vertical screen states of the electronic apparatus 100 are determined using a gravitational acceleration sensor or a direction sensor. When the electronic device 100 meets the preset condition, that is, when the electronic device 100 is placed approximately horizontally, the horizontal and vertical screen states of the electronic device 100 may be determined by using the scheme provided in the embodiment of the present application.
It should be noted that, the second camera mentioned in the embodiments of the present application may be a low power consumption camera.
Illustratively, as shown in fig. 8, the user clicks the shooting control 202, and in response to the user operation, the electronic device 100 shoots an image with the first camera and captures the first reference image with the second camera.
And S103, when the first reference image comprises a human face, determining a human face reference line of the first reference image.
In some embodiments of the present application, the face reference line may be determined according to one or more of the positions of the eyes, the nose, and the mouth of the face. For example, the face reference line passes through at least two of the midpoint between the eyes, the midpoint of the nose, and the midpoint of the mouth of the face.
For example, as shown in fig. 11A, the orientation of the face reference line may point from the chin to the top of the head along the perpendicular of the binocular connecting line. Illustratively, as shown in fig. 11B, the orientation of the face reference line may also point from the bottom of the first reference image to the top of the first reference image along the binocular connecting line.
It can be understood that the orientation of the face reference line shown in fig. 11A and 11B is an exemplary orientation provided in the embodiment of the present application, and should not be limited to the embodiment of the present application, and the face reference line may have other orientations in the embodiment of the present application.
In some embodiments of the application, when the first reference image includes a face of a preset user, the face reference line of the preset user in the first reference image is determined to be the face reference line of the first reference image.
In some embodiments of the present application, when the first reference image includes one or more faces, the electronic device 100 determines the face reference line of the face with the largest area among the one or more faces as the face reference line of the first reference image, as illustrated in fig. 12.
In this embodiment of the application, if the first reference image does not include a human face, the electronic device 100 may determine the horizontal and vertical screen states of the current electronic device 100 according to a conventional manner.
The method for determining the face reference line is not limited to the above determination method, and other methods may also be used to determine the face reference line in the embodiment of the present application, which is not limited specifically here. How to determine whether the first reference image includes a face may refer to the foregoing embodiments, and details are not repeated herein.
And S104, determining a first horizontal and vertical screen state when the electronic device 100 takes a picture according to a first included angle between the standard reference line of the first reference image and the face reference line in the first reference image.
The standard reference line may be oriented along the first column of the first reference image, pointing towards the top of the first reference image (its first row). For example, the standard reference line may be as shown in fig. 10. The first included angle is the angle of the standard reference line relative to the face reference line in the first reference image. For example, the first included angle may be as shown in fig. 13 or fig. 14.
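A minimal sketch of computing the first included angle from two points on the face reference line is given below. The image coordinate system (x growing to the right, y growing downward) and the sign convention are assumptions; the sign may need to be flipped to match the values shown in figs. 13 and 14.

```python
import math

def first_included_angle(face_start, face_end) -> float:
    """Signed angle, in degrees within (-180, 180], of the standard reference
    line (pointing towards the top row of the image) relative to the face
    reference line running from face_start to face_end."""
    fx, fy = face_end[0] - face_start[0], face_end[1] - face_start[1]
    sx, sy = 0.0, -1.0               # standard reference line points "up"
    return math.degrees(math.atan2(fx * sy - fy * sx, fx * sx + fy * sy))
```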
It is understood that the first reference image is the image most recently acquired by the second camera when the electronic device 100 receives the shooting operation of the user, so the horizontal and vertical screen state when the electronic device 100 acquires the first reference image can be regarded as the horizontal and vertical screen state when the electronic device 100 performs shooting in response to the shooting operation.
In some embodiments of the present application, when the first included angle is within a first range, it is determined that the first horizontal and vertical screen state is an upward vertical screen state; when the first included angle is in a second range, determining that the first horizontal and vertical screen state is a left horizontal screen; when the first included angle is in a third range, determining that the first horizontal and vertical screen state is a downward vertical screen state; and when the first included angle is in the fourth range, determining that the first horizontal and vertical screen state is the right horizontal screen.
In some embodiments of the present application, when the orientation of the face reference line is from the chin to the vertex along the above-mentioned perpendicular line of the binocular connecting line, the correspondence between the first angle and the landscape and portrait states may be referred to in table 1, in which case the first range may be greater than-45 ° and equal to or less than 45 °, the second range may be greater than-135 ° and equal to or less than-45 °, the third range may be greater than 135 ° and equal to or less than 225 °, and the fourth range may be greater than 45 ° and equal to or less than 135 °. When the orientation of the face reference line is from the bottom of the first reference image to the top of the first reference image along the binocular connecting line, table 2 may be referred to for the corresponding relationship between the first included angle and the horizontal and vertical screen states. And will not be described in detail herein.
S105, the electronic device 100 stores the first image and a first parameter corresponding to the image, wherein the first parameter is used for representing a first horizontal and vertical screen state when the image is shot.
The image saved by the electronic device 100 is typically comprised of two parts. One part is the image data itself, which records the color value of each pixel in the image; the other part is the file header of the image, in which the attribute information of the image, such as the width, height, etc. of the image, is recorded.
In some embodiments of the application, the electronic device represents a first horizontal and vertical screen state through a first parameter, a first parameter of a first value represents an upward vertical screen, a first parameter of a second value represents a left horizontal screen, a first parameter of a third value represents a downward vertical screen, and a first parameter of a fourth value represents a right horizontal screen.
In some embodiments of the present application, the first parameter may be a direction parameter. For example, the first parameter is the "Orientation" parameter of the Exif information in the image file header. In some embodiments, a direction parameter of 1 represents vertical screen up; a direction parameter of 8 represents left landscape; a direction parameter of 3 represents vertical screen down; and a direction parameter of 6 represents right landscape.
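As one possible way to persist such a direction parameter in the Exif Orientation tag of a JPEG file, the sketch below uses the third-party piexif library; this is an illustration only, not the device's actual storage path.

```python
import piexif

def save_orientation(jpeg_path: str, direction_parameter: int) -> None:
    """Write the direction parameter into the Exif Orientation tag of a JPEG."""
    exif_dict = piexif.load(jpeg_path)
    exif_dict["0th"][piexif.ImageIFD.Orientation] = direction_parameter
    piexif.insert(piexif.dump(exif_dict), jpeg_path)
```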
In some embodiments of the present application, the first parameter may be a rotation parameter. Referring to fig. 4A to 4D, a rotation parameter equal to 0° indicates that the first horizontal and vertical screen state is vertical screen up; a rotation parameter equal to -90° indicates left landscape; a rotation parameter equal to 180° indicates vertical screen down; and a rotation parameter equal to 90° indicates right landscape.
In some embodiments of the application, when the electronic device 100 is powered on, or when the camera is turned on and the camera currently used by the electronic device 100 for taking pictures is the first camera, or when the camera is turned on and the electronic device 100 meets the preset condition, the electronic device 100 uses the second camera to capture images periodically. In the above cases, in one implementation, the electronic device 100 may generate the first parameter of each image as the second camera periodically acquires it; when the electronic device 100 receives a shooting operation of the user, the electronic device 100 directly obtains the first parameter already generated for the most recently captured image. In this way, the first parameter can be obtained more quickly. In another implementation, the electronic device 100 generates the first parameter of the image most recently captured by the second camera only when it receives the shooting operation of the user. In this way, power consumption of the electronic device 100 may be saved.
Different from the conventional method for determining the first parameter, the method for determining the first parameter provided by the embodiment of the present application accurately obtains the horizontal and vertical screen states of the electronic device 100 by detecting the behavior intention of the user, and then determines the first parameter. Therefore, when the user views the photos, the photos that meet the user's behavioral intent may be displayed.
S106, the electronic device 100 receives the viewing operation of the user.
Illustratively, the viewing operation may be that the user clicks the album control 203 on the shooting interface 11.
S107, in response to the detected viewing operation, the electronic device 100 determines a first rotation angle required to display the first image according to the first parameter of the first image and the current horizontal and vertical screen state of the electronic device 100.
The current landscape screen state of the electronic device 100 may refer to the second landscape screen state in the foregoing embodiments.
In some embodiments of the present application, when the electronic device 100 receives a viewing operation of a user, if the electronic device 100 meets the preset condition, the electronic device 100 determines a second horizontal and vertical screen state of the electronic device 100 through a reference image acquired by a second camera; if the electronic device 100 does not satisfy the preset condition, the electronic device 100 determines the second landscape state of the electronic device 100 in a conventional manner. In some embodiments, the electronic device 100 determines the second landscape state of the electronic device 100 in a conventional manner without considering the aforementioned preset condition.
In some embodiments of the present application, in response to the detected viewing operation, the electronic device 100 acquires a second reference image captured by the second camera, and determines the second horizontal and vertical screen state of the electronic device 100 according to the second reference image. Then, the electronic device 100 determines the first rotation angle required to display the image according to the first parameter of the first image and the second horizontal and vertical screen state of the electronic device 100.
Similar to the first horizontal and vertical screen states, the electronic device 100 may determine the second horizontal and vertical screen states of the electronic device 100 according to an angle between the standard reference line of the second reference image and the face reference line of the second reference image. And will not be described in detail herein.
Referring to the foregoing embodiment, when the first parameter is a direction parameter or a rotation parameter, the corresponding relationship between the first parameter, the second horizontal and vertical screen states of the electronic device 100, and the first rotation angle required for displaying the photo can be shown in table 3.
S108, the electronic device 100 displays the first image according to the first rotation angle required by the first image.
The electronic device 100 determines a display area of the first image on the display screen according to the first rotation angle, draws the first image, and displays the drawn first image in the display area.
In the embodiment of the application, the shooting interface may also be called a preview interface, and the preview image area of the shooting interface may also be called a first area; the shooting operation may also be referred to as a first user operation; the viewing operation may also be referred to as a second user operation; the first reference image may also be referred to as a second image; the image display area of the album interface may also be referred to as a second area; and the second reference image may also be referred to as a third image. This is not particularly limited herein.
Next, an electronic apparatus related to the embodiment of the present application is described. The type of the electronic device is not particularly limited, and the electronic device 100 may be a portable electronic device such as a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a wearable device, and a laptop computer (laptop). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices that carry an iOS, android, microsoft, or other operating system. The portable electronic device may also be other portable electronic devices such as laptop computers (laptop) with touch sensitive surfaces (e.g., touch panels) and the like. It should also be understood that in some other embodiments of the present application, the electronic device may not be a portable electronic device, but may be a desktop computer with a touch-sensitive surface (e.g., a touch panel), a smart television, or the like.
Referring to fig. 16, fig. 16 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The NPU may perform artificial intelligence operations, for example convolutional neural network (CNN) processing. For example, a CNN model may be used to identify and screen large amounts of information, so that training and recognition for intelligent scene detection can be achieved.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a bus or Universal Serial Bus (USB) interface, and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini-LED), a micro light-emitting diode (Micro-LED), a micro OLED (Micro-OLED), a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments of the present application, the interface content currently output by the system is displayed in the display screen 194. For example, the interface content is an interface provided by an instant messaging application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. The ISP is used to process the data fed back by the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some alternative embodiments of the present application, the pressure sensor 180A may be configured to capture a pressure value generated when a user's finger portion contacts the display screen and transmit the pressure value to the processor, so that the processor identifies which finger portion the user entered the user action.
The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
In some alternative embodiments of the present application, the pressure sensor 180A may transmit the detected capacitance value to the processor, so that the processor recognizes through which finger portion (knuckle or pad, etc.) the user inputs the user operation. In some alternative embodiments of the present application, the pressure sensor 180A may also calculate the number of touch points from the detected signals and transmit the calculated value to the processor, so that the processor recognizes the user operation by the single-finger or multi-finger input.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C to assist in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The terminal 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D. In some embodiments, when the terminal 100 is a folder, the terminal 100 may detect the opening and closing of the folder according to the magnetic sensor 180D. And then according to the opening and closing state of the leather sheath or the opening and closing state of the flip cover, the automatic unlocking of the flip cover is set.
The acceleration sensor 180E may detect the magnitude of acceleration of the terminal 100 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the terminal 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications. In some alternative embodiments of the present application, the acceleration sensor 180E may be used to capture acceleration values generated when a user's finger portion contacts the display screen and transmit the acceleration values to the processor, so that the processor identifies which finger portion the user entered the user operation.
A distance sensor 180F for measuring a distance. The terminal 100 may measure the distance by infrared or laser. In some embodiments, the scene is photographed and the terminal 100 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal 100 emits infrared light outward through the light emitting diode and detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal 100; when insufficient reflected light is detected, the terminal 100 may determine that there is no object near it. The terminal 100 can use the proximity light sensor 180G to detect that the user is holding the terminal 100 close to the ear for a call, so as to automatically turn off the display screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The terminal 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In still other embodiments, the electronic device 100 boosts the output voltage of the battery 142 when the temperature is below a further threshold, likewise to avoid an abnormal shutdown due to low temperature.
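The temperature processing strategy above is threshold-driven; as a minimal sketch only (with made-up threshold values and hypothetical control hooks, not the embodiment's actual policy), it could be expressed as follows.

```java
// Illustrative thermal policy: throttle when hot, protect the battery when cold.
// All thresholds and the DeviceControls interface are assumptions for this sketch.
public final class ThermalPolicy {

    private static final float THROTTLE_THRESHOLD_C = 45.0f;
    private static final float HEAT_BATTERY_BELOW_C = 0.0f;
    private static final float BOOST_VOLTAGE_BELOW_C = -10.0f;

    public void onTemperatureReported(float celsius, DeviceControls controls) {
        if (celsius > THROTTLE_THRESHOLD_C) {
            controls.reduceProcessorPerformance();   // lower power consumption, thermal protection
        } else if (celsius < BOOST_VOLTAGE_BELOW_C) {
            controls.boostBatteryOutputVoltage();    // avoid low-temperature shutdown
        } else if (celsius < HEAT_BATTERY_BELOW_C) {
            controls.heatBattery();                  // keep the battery warm enough to supply power
        }
    }

    /** Hypothetical hooks into the device's power and thermal management. */
    public interface DeviceControls {
        void reduceProcessorPerformance();
        void heatBattery();
        void boostBatteryOutputVoltage();
    }
}
```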
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and together they form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation applied on or near it, that is, an operation in which the user's hand, elbow, stylus, or the like contacts the display screen 194. The touch sensor can pass the detected touch operation to the application processor to determine the type of touch event. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
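As a hedged example of how a detected touch operation reaches application code for event-type handling, the following Android view sketch reacts to down, move, and up events; it is illustrative only and is not the touch-processing path of the embodiment.

```java
// Minimal sketch: a view receiving touch events forwarded from the touch screen
// and branching on the event type reported with the MotionEvent.
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class TouchAwareView extends View {

    public TouchAwareView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                // a finger (or stylus) made contact with the display
                return true;
            case MotionEvent.ACTION_MOVE:
                // the contact point moved; event.getX()/getY() give the coordinates
                return true;
            case MotionEvent.ACTION_UP:
                // the contact lifted; a short down-up sequence is treated as a tap
                performClick();
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }

    @Override
    public boolean performClick() {
        return super.performClick();
    }
}
```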
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may parse a voice signal based on the vibration signal of the vocal-part bone acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving messages, alarm clocks, games, etc.) may likewise correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate the charging state, a change in battery level, a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be connected to or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195.
The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 17 is a block diagram of a software configuration of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 17, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
In this application, the application layer may further add a floating window starting component (floating launcher), which serves as the default application displayed in the above-mentioned floating window and provides the user with an entry for entering another application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 17, the application framework layer may include a window manager (window manager), a content provider, a view system, a phone manager, a resource manager, a notification manager, an activity manager (activity manager), and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether a status bar exists, lock the screen, capture screenshots, and the like. In this application, a window type can be extended based on the Android native PhoneWindow, dedicated to displaying the above-mentioned floating window so as to distinguish it from a common window; such a window has the attribute of being displayed floating on the topmost layer of the series of windows. In some alternative embodiments, the window size may be given a suitable value according to the size of the actual screen and an optimal display algorithm. In some possible embodiments, the aspect ratio of the window may default to the screen aspect ratio of a conventional mainstream mobile phone. Meanwhile, in order to make it convenient for the user to close, exit, or hide the floating window, a close key and a minimize key can additionally be drawn at its upper right corner.
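A minimal sketch of how such a topmost floating window could be added through the window manager is given below; it assumes the SYSTEM_ALERT_WINDOW permission has been granted, and the one-third-of-screen-width sizing and the helper class name are illustrative choices rather than the embodiment's actual parameters.

```java
// Illustrative only: adding a floating view on top of other windows via WindowManager,
// keeping the screen's own aspect ratio as described above.
import android.content.Context;
import android.graphics.PixelFormat;
import android.util.DisplayMetrics;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public final class FloatingWindowHelper {

    public static void show(Context context, View content) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        DisplayMetrics dm = context.getResources().getDisplayMetrics();

        // Assumed sizing rule: one third of the screen width, preserving the screen's aspect ratio.
        int width = dm.widthPixels / 3;
        int height = Math.round(width * ((float) dm.heightPixels / dm.widthPixels));

        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                width,
                height,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // floats above other windows
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.END;

        wm.addView(content, lp);
    }
}
```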
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, viewing history and bookmarks, phone books, etc.
The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures. In this application, button views for operations such as closing and minimizing the floating window can be added correspondingly and bound to the floating window in the window manager.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager allows applications to display notification information in the status bar 207. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system's top status bar, such as notifications of applications running in the background, or notifications that appear on the display in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, an indicator light flashes, and so on.
The activity manager is used for managing the activities and services running in the system, including process, application, service, and task information, and the like. In this application, an activity task stack dedicated to managing the application activities displayed in the floating window can be added to the activity manager module, so that the application activities and tasks in the floating window do not conflict with the application displayed full-screen on the screen.
In this application, a motion detector may additionally be arranged in the application framework layer, for performing logical judgment on an acquired input event and identifying the type of the input event. For example, it determines whether the input event is a knuckle touch event or a finger-pad touch event based on information such as the touch coordinates and the timestamp of the touch operation contained in the input event. Meanwhile, the motion detector can also record the track of the input event, judge the gesture pattern of the input event, and respond with different operations according to different gestures.
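The classification rule itself is not specified here; the following is a purely hypothetical heuristic (made-up thresholds, illustrative class name) showing the kind of logical judgment such a motion detector could perform on an input event.

```java
// Hypothetical heuristic, not the patent's classifier: deciding whether an input event
// looks like a knuckle touch or a finger-pad touch from the contact size reported with
// the MotionEvent plus a peak accelerometer value captured around the touch time.
import android.view.MotionEvent;

public final class MotionDetector {

    public enum TouchType { KNUCKLE, FINGER_PAD }

    private static final float MAX_KNUCKLE_TOUCH_SIZE = 0.05f; // knuckles contact a small area (assumed)
    private static final float MIN_KNUCKLE_ACCEL = 12.0f;      // knuckle taps produce a sharp spike, m/s^2 (assumed)

    public TouchType classify(MotionEvent event, float peakAcceleration) {
        boolean smallContact = event.getSize() < MAX_KNUCKLE_TOUCH_SIZE;
        boolean sharpImpact = peakAcceleration > MIN_KNUCKLE_ACCEL;
        return (smallContact && sharpImpact) ? TouchType.KNUCKLE : TouchType.FINGER_PAD;
    }
}
```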
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: an input manager, an input dispatcher, a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The input manager is responsible for acquiring event data from the underlying input driver, parsing and encapsulating the event data, and then passing it to the input dispatcher.
The input dispatcher is used for storing window information. After receiving an input event from the input manager, it looks up a suitable window among the stored windows and dispatches the event to that window.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording in a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following exemplarily describes the workflow of the software and hardware of the electronic device 100 in connection with a photo-capturing scenario.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap operation and the control corresponding to the tap being the camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer and captures a still image or video through the camera 193.
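At the application level, the tail end of this flow can be sketched as follows; this is an illustrative analogy using a standard capture intent, not the framework-internal implementation described above, and the class name and request code are assumptions.

```java
// Illustrative only: the tap on the camera icon is delivered as a touch event, the
// control's click listener runs, and a camera activity is started through a standard
// capture intent (the camera driver is brought up on its behalf by the system).
import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;
import android.view.View;

public final class CameraIconClickHandler implements View.OnClickListener {

    private static final int REQUEST_IMAGE_CAPTURE = 1;
    private final Activity activity;

    public CameraIconClickHandler(Activity activity) {
        this.activity = activity;
    }

    @Override
    public void onClick(View cameraIcon) {
        Intent captureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (captureIntent.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(captureIntent, REQUEST_IMAGE_CAPTURE);
        }
    }
}
```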
Through the description of the foregoing embodiments, it will be clear to those skilled in the art that, for convenience and simplicity of description, only the division of the functional modules is illustrated, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A method of displaying an image, comprising:
the method comprises the steps that electronic equipment displays a preview interface, wherein the preview interface comprises a first image collected by a first camera, and the first image is displayed in a first area of a display screen;
detecting a first user operation in the preview interface, responding to the first user operation, shooting the first image through a first camera, and shooting a second image through a second camera;
when the second image comprises a face of a preset user, determining a first horizontal and vertical screen state of the electronic equipment according to the second image; when the second image does not comprise the face of the preset user, determining the first horizontal and vertical screen state of the electronic equipment by using a sensor; and storing the first image and the first horizontal and vertical screen state in an associated manner, wherein the first horizontal and vertical screen state comprises: a left or right horizontal screen;
detecting a second user operation for viewing the first image, responding to the second user operation, determining a second horizontal and vertical screen state of the electronic equipment, and displaying the first image in a second area of the display screen; a first rotation angle of the first image displayed in a second area on the display screen relative to the first image displayed in the first area is determined by the first horizontal and vertical screen state and the second horizontal and vertical screen state, and the second horizontal and vertical screen state comprises: the first rotation angle comprises 180 degrees, and the size proportion of the second area is the same as that of the first area.
2. The method of claim 1, wherein the determining the second horizontal and vertical screen state of the electronic device comprises:
collecting a third image through a second camera; the third image comprises a human face;
and determining a second horizontal and vertical screen state of the electronic equipment according to the third image.
3. The method according to claim 1, wherein the electronic device characterizes the first horizontal and vertical screen state by a first parameter, takes the first parameter with a first value to characterize an upward vertical screen, takes the first parameter with a second value to characterize a left horizontal screen, takes the first parameter with a third value to characterize a downward vertical screen, and takes the first parameter with a fourth value to characterize a right horizontal screen;
the storing the first image and the first horizontal and vertical screen state in an associated manner comprises:
and storing the first image and the first parameter in an associated manner.
4. The method of claim 1, wherein the determining the first horizontal and vertical screen state of the electronic device according to the second image comprises:
and when the included angle between the screen of the electronic equipment and the horizontal plane is smaller than a preset angle, determining the first horizontal and vertical screen states of the electronic equipment according to the second image.
5. The method of claim 1, wherein the determining the first horizontal and vertical screen state of the electronic device according to the second image comprises:
and determining the first horizontal and vertical screen state according to a first included angle between the standard reference line of the second image and the face reference line corresponding to the second image.
6. The method according to claim 5, wherein, when the second image includes a plurality of faces, the face reference line corresponding to the second image is the face reference line of the face with the largest area among the plurality of faces in the second image.
7. The method according to claim 5 or 6, wherein the determining the first horizontal and vertical screen state according to the first included angle between the standard reference line of the second image and the face reference line corresponding to the second image comprises:
when the first included angle is within a first range, determining that the first horizontal and vertical screen state is an upward vertical screen;
when the first included angle is within a second range, determining that the first horizontal and vertical screen state is a left horizontal screen;
when the first included angle is within a third range, determining that the first horizontal and vertical screen state is a downward vertical screen;
and when the first included angle is within a fourth range, determining that the first horizontal and vertical screen state is a right horizontal screen.
8. The method of claim 7, wherein the standard reference line of the second image is oriented along a side edge of the second image, pointing from the bottom of the second image to the top of the second image; the face reference line corresponding to the second image is oriented perpendicular to the line connecting the two eyes of the face in the second image, pointing from the chin to the top of the head; the first range is greater than -45° and less than or equal to 45°, the second range is greater than -135° and less than or equal to -45°, the third range is greater than 135° and less than or equal to 225°, and the fourth range is greater than 45° and less than or equal to 135°.
9. The method according to claim 3, wherein when the second horizontal and vertical screen state is an upward vertical screen, the first rotation angle is equal to the first parameter; when the second horizontal and vertical screen state is a left horizontal screen, the first rotation angle is equal to the first parameter plus 90 degrees; when the second horizontal and vertical screen state is a downward vertical screen, the first rotation angle is equal to the first parameter plus 180 degrees; and when the second horizontal and vertical screen state is a right horizontal screen, the first rotation angle is equal to the first parameter plus -90 degrees.
10. The method according to claim 1, wherein the first horizontal and vertical screen state is an upward vertical screen state, and when the second horizontal and vertical screen state is an upward vertical screen state, the first rotation angle is equal to 0 °;
the first horizontal and vertical screen state is a left horizontal screen, and when the second horizontal and vertical screen state is an upward vertical screen, the first rotation angle is equal to-90 degrees;
the first horizontal and vertical screen state is a downward vertical screen, and when the second horizontal and vertical screen state is an upward vertical screen, the first rotation angle is equal to 180 degrees;
the first horizontal and vertical screen state is a right horizontal screen, and when the second horizontal and vertical screen state is an upward vertical screen, the first rotation angle is equal to 90 degrees.
11. The method of claim 1, wherein the first camera is a rear camera and the second camera is a front camera.
12. An electronic device, comprising: a first camera, a second camera, a display screen, one or more processors, and one or more memories for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed on the one or more processors, cause the electronic device to perform the method of any of claims 1-11.
13. A computer-readable storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-11.
CN202010480991.3A 2020-05-30 2020-05-30 Method and related device for displaying image Active CN111669462B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010480991.3A CN111669462B (en) 2020-05-30 2020-05-30 Method and related device for displaying image

Publications (2)

Publication Number Publication Date
CN111669462A CN111669462A (en) 2020-09-15
CN111669462B true CN111669462B (en) 2022-09-02

Family

ID=72385330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010480991.3A Active CN111669462B (en) 2020-05-30 2020-05-30 Method and related device for displaying image

Country Status (1)

Country Link
CN (1) CN111669462B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465910B (en) * 2020-11-26 2021-12-28 成都新希望金融信息有限公司 Target shooting distance obtaining method and device, storage medium and electronic equipment
CN112637497B (en) * 2020-12-21 2022-04-01 维沃移动通信有限公司 Shooting control method and device and electronic equipment
CN112702527A (en) * 2020-12-28 2021-04-23 维沃移动通信(杭州)有限公司 Image shooting method and device and electronic equipment
CN115484390B (en) * 2021-06-16 2023-12-19 荣耀终端有限公司 Video shooting method and electronic equipment
CN115484392B (en) * 2021-06-16 2023-12-19 荣耀终端有限公司 Video shooting method and electronic equipment
CN113918766B (en) * 2021-08-12 2023-10-13 荣耀终端有限公司 Video thumbnail display method, apparatus and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102065183A (en) * 2010-12-31 2011-05-18 惠州Tcl移动通信有限公司 Mobile phone and image pickup processing method thereof
CN107943443A (en) * 2017-12-13 2018-04-20 广东欧珀移动通信有限公司 Display control method, device, storage medium and the electronic equipment of photo
CN108540718A (en) * 2018-04-08 2018-09-14 Oppo广东移动通信有限公司 Image pickup method, device, mobile terminal and storage medium
CN110059211A (en) * 2019-03-28 2019-07-26 华为技术有限公司 Record the method and relevant apparatus of user feeling

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6895126B2 (en) * 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
CN103294358B (en) * 2013-05-17 2016-12-28 小米科技有限责任公司 A kind of control the method for screen rotation, device and equipment
CN104346030A (en) * 2013-08-01 2015-02-11 腾讯科技(深圳)有限公司 Display direction switching method, device and electronic equipment
CN104991723A (en) * 2015-07-08 2015-10-21 上海斐讯数据通信技术有限公司 Transverse and vertical screen switching system and method
CN106203254B (en) * 2016-06-23 2020-02-07 青岛海信移动通信技术股份有限公司 Method and device for adjusting photographing direction

Also Published As

Publication number Publication date
CN111669462A (en) 2020-09-15

Similar Documents

Publication Publication Date Title
CN111669462B (en) Method and related device for displaying image
CN109766036B (en) Message processing method and electronic equipment
CN114816210B (en) Full screen display method and device of mobile terminal
WO2021000881A1 (en) Screen splitting method and electronic device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN111010506A (en) Shooting method and electronic equipment
CN112328130B (en) Display processing method and electronic equipment
CN113556461A (en) Image processing method and related device
CN110471606B (en) Input method and electronic equipment
WO2020029306A1 (en) Image capture method and electronic device
CN112751954B (en) Operation prompting method and electronic equipment
CN110633043A (en) Split screen processing method and terminal equipment
CN111669459A (en) Keyboard display method, electronic device and computer readable storage medium
CN110059211B (en) Method and related device for recording emotion of user
CN112150499A (en) Image processing method and related device
CN113986070A (en) Quick viewing method for application card and electronic equipment
CN113010076A (en) Display element display method and electronic equipment
CN111768352A (en) Image processing method and device
CN110138999B (en) Certificate scanning method and device for mobile terminal
CN113746961A (en) Display control method, electronic device, and computer-readable storage medium
CN112449101A (en) Shooting method and electronic equipment
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN114283195B (en) Method for generating dynamic image, electronic device and readable storage medium
CN112637477A (en) Image processing method and electronic equipment
CN115115679A (en) Image registration method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant