CN110795187A - Image display method and electronic equipment - Google Patents

Image display method and electronic equipment

Info

Publication number
CN110795187A
Authority
CN
China
Prior art keywords
electronic device
image
display
user
display screen
Prior art date
Legal status
Pending
Application number
CN201910936340.8A
Other languages
Chinese (zh)
Inventor
窦纯
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910936340.8A priority Critical patent/CN110795187A/en
Publication of CN110795187A publication Critical patent/CN110795187A/en
Priority to PCT/CN2020/116587 priority patent/WO2021057673A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An image display method and an electronic device relate to the field of terminal technologies. The method is applied to an electronic device that includes an outer foldable display screen with a first display area and a second display area, and includes: obtaining the position of the user's face relative to the electronic device; when the display screen is in the folded state, processing a first image and a second image of a target object according to the position of the user's face relative to the electronic device; and displaying the processed first image in the first display area and the processed second image in the second display area, so that the target object appears on the electronic device with the display effect it would present when viewed from the user's position relative to the electronic device. This technical solution helps make the image display effect more vivid and thereby improves user experience.

Description

Image display method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an image display method and an electronic device.
Background
Currently, electronic devices (e.g., mobile phones, tablet computers, etc.) display images on a display screen. However, the display effect of such images is static and monotonous, which easily degrades user experience. Research on how electronic devices can present images to users more vividly is therefore of great significance for improving user experience.
Disclosure of Invention
The embodiments of the application provide an image display method and an electronic device, so that the electronic device can present an image with the corresponding display effect as the user's head turns, which improves the vividness of the image display effect and thereby improves user experience.
In a first aspect, an embodiment of the present application provides an image display method. The method is applied to an electronic device that includes an outer foldable display screen, where the display screen includes a first display area and a second display area. The method includes the following steps:

acquiring the position of the user's face relative to the electronic device; when the display screen is in the folded state, processing a first image and a second image of a target object according to the position of the user's face relative to the electronic device, where the first image and the second image each describe one of two opposite faces of the target object; and then displaying the processed first image in the first display area and the processed second image in the second display area, so that the target object presents, on the electronic device, the display effect it would have when viewed from the user's position relative to the electronic device.

When the display screen is in the folded state, the first display area is located on a first face of the electronic device, and the second display area is located on a second face of the electronic device.

In the embodiments of the application, the electronic device adapts what it displays to the position of the user's face relative to the device, so that it presents an image with the corresponding display effect as the user's head turns. This improves the vividness of the image display effect and thereby improves user experience.
In one possible design, the first image and the second image are pre-stored in the electronic device as a target file. This helps simplify the implementation.

In one possible design, the first image and the second image are captured by the electronic device through a camera. This helps improve the flexibility of acquiring the first image and the second image.

In a possible design, when the display screen is in the folded state and an operation of previewing the target file is detected, the first image and the second image of the target object are processed according to the position of the user's face relative to the electronic device. This helps the electronic device present the image to the user more vividly.

In a possible design, the target file is set as the wallpaper of the lock screen interface; when the display screen is in the folded state and an event that triggers display of the lock screen interface is detected, the first image and the second image of the target object are processed according to the position of the user's face relative to the electronic device.
In a second aspect, an embodiment of the present application provides an electronic device that includes means for performing the method according to the first aspect and any one of the possible designs of the first aspect.

In a third aspect, an embodiment of the present application provides a chip configured to call and execute program instructions stored in a memory, so as to perform the method according to the first aspect and any one of the possible designs of the first aspect.

In a fourth aspect, an embodiment of the present application provides a computer storage medium storing program instructions that, when run on an electronic device, cause the electronic device to perform the method according to the first aspect and any one of the possible designs of the first aspect.

In a fifth aspect, an embodiment of the present application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the method according to the first aspect and any one of the possible designs of the first aspect.

In addition, for the technical effects of any of the possible designs in the second to fifth aspects, reference may be made to the technical effects of the corresponding designs described for the method above; details are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic view of an electronic device with an outer foldable display screen according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of an image display method according to an embodiment of the present application;
Fig. 4 is a front view and a back view of the moon according to an embodiment of the present application;
Fig. 5 is a schematic interface diagram according to an embodiment of the present application;
Fig. 6 is a schematic view of the moon displayed in the first display area when the user's face is at different positions relative to the electronic device, according to an embodiment of the present application;
Fig. 7 is a schematic interface diagram according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the display effect of an image presented on an electronic device more vivid, an embodiment of the application provides an image display method in which the display effect of the image changes as the relative position between the user's head and the electronic device changes, which helps improve user experience.
It should be understood that in this application, "/" means "or" unless otherwise indicated. For example, A/B may represent A or B. In this application, "and/or" merely describes an association between associated objects and indicates that three relationships may exist. For example, "A and/or B" may represent the following cases: only A exists, both A and B exist, or only B exists. "At least one" means one or more, and "a plurality of" means two or more.
In this application, "exemplary," "in some embodiments," "in other embodiments," and the like are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs; rather, these terms are intended to present concepts in a concrete fashion.
Furthermore, the terms "first," "second," and the like are used herein for descriptive purposes only, and shall not be understood as indicating or implying relative importance, or as implicitly indicating the number or order of the technical features concerned.
For example, the electronic device of the embodiments of the present application may be a portable electronic device, such as a mobile phone, a tablet computer, a wearable device, or an augmented reality (AR)/virtual reality (VR) device. Exemplary embodiments of the portable electronic device include, but are not limited to, devices running an operating system such as iOS or Android, or another operating system. In other embodiments, the electronic device according to the embodiments of the present application may also be another electronic device, such as a notebook computer.
By way of example, fig. 1 shows a hardware structure diagram of an electronic device according to an embodiment of the present application. As shown in fig. 1, the electronic device includes a processor 110, an internal memory 121, an external memory interface 122, a camera 130, a display 140, a sensor module 150, an audio module 160, a speaker 161, a receiver 162, a microphone 163, an earphone interface 164, a Subscriber Identification Module (SIM) card interface 171, a Universal Serial Bus (USB) interface 172, a charging management module 180, a power management module 181, a battery 182, a mobile communication module 191, and a wireless communication module 192. Further, in other embodiments, the electronic device may also include motors, indicators, keys, and the like.
It should be understood that the hardware configuration shown in fig. 1 is only one example. The electronic devices of the embodiments of the application may have more or fewer components than the electronic devices shown in the figures, may combine two or more components, or may have different configurations of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Processor 110 may include one or more processing units, among others. For example, the processor 110 may include an Application Processor (AP), a modem, a Graphics Processor (GPU), an Image Signal Processor (ISP), a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. In particular implementations, the different processing units may be separate devices or may be integrated in one or more devices.
In some embodiments, a buffer may also be provided in the processor 110 for storing programs and/or data. The programs in the embodiments of the present application may also be referred to as program instructions, computer programs, code instructions, and the like, without limitation. As an example, the buffer in the processor 110 may be a cache memory. The buffer may be used to hold programs and/or data that the processor 110 has just used, generated, or reused. If the processor 110 needs to use the program and/or data again, it can be called directly from the buffer. This helps reduce the time for the processor 110 to acquire programs or data, thereby improving system efficiency.
The internal memory 121 may be used to store programs and/or data. In some embodiments, the internal memory 121 includes a program storage area and a data storage area. The program storage area may store an operating system (e.g., Android or iOS), a program required by at least one function (e.g., an image display function), and the like. The data storage area may store data created, preset, and/or acquired during use of the electronic device (e.g., images captured by a camera or images received via a network), and the like. For example, the processor 110 may implement one or more functions by calling programs and/or data stored in the internal memory 121, so that the electronic device executes a corresponding method. For example, the processor 110 calls some programs and/or data in the internal memory so that the electronic device executes the display method provided in the embodiments of the present application, for example automatically moving the position of a small window, moving the position of a large window, or adjusting the transparency of the small window, so as to automatically prevent the small window from blocking a face displayed in the large window without manual operation by the user, which helps improve user experience. The internal memory 121 may be a high-speed random access memory, a nonvolatile memory, or the like. For example, the nonvolatile memory may include at least one of one or more magnetic disk storage devices, flash memory devices, and/or universal flash storage (UFS) devices.
The external memory interface 122 may be used to connect an external memory card (e.g., a Micro SD card) to extend the storage capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 122 to implement a data storage function. For example, the electronic apparatus can save contents of images, music, videos, documents, and the like in the external memory card through the external memory interface 122.
The camera 130 may be used to capture moving pictures, still images, and the like. Typically, the camera 130 includes a lens and an image sensor. The optical image generated by an object through the lens is projected on the image sensor and then converted into an electrical signal for subsequent processing. For example, the image sensor may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The image sensor converts the optical signal into an electrical signal and then transmits it to the ISP to be converted into a digital image signal. It should be noted that, in the embodiments of the present application, the electronic device may include one or more cameras 130, which is not limited herein. Illustratively, the electronic device includes 5 cameras 130, e.g., 3 rear cameras and 2 front cameras. As another example, the electronic device includes 3 cameras 130, e.g., 2 rear cameras and 1 front camera.
The display screen 140 may include a display panel. Different interfaces can be displayed on the display screen 140 according to the user's needs. Specifically, the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini LED, a micro LED, a quantum dot light-emitting diode (QLED), or the like. For example, the electronic device may implement the display function via the GPU, the display screen 140, the application processor, and/or the like. It should be noted that, in the embodiments of the present application, the electronic device may include one or more display screens 140, which is not limited herein. The display screen 140 may be a foldable screen or a non-foldable screen, which is not limited herein.
The sensor module 150 may include one or more sensors. For example, a touch sensor 150A, a pressure sensor 150B, etc. In other embodiments, the sensor module 150 may further include one or more of a gyroscope, an acceleration sensor, a fingerprint sensor, an ambient light sensor, a distance sensor, a proximity light sensor, a bone conduction sensor, a temperature sensor, a positioning sensor (e.g., a Global Positioning System (GPS) sensor), etc., without limitation.
The touch sensor 150A may also be referred to as a "touch panel". The touch sensor 150A may be disposed on the display screen 140. When the touch sensor 150A is disposed on the display screen 140, the touch sensor 150A and the display screen 140 form what is commonly called a "touchscreen". The touch sensor 150A is used to detect a touch operation applied to it or near it. The touch sensor 150A can pass the detected touch operation to the application processor to determine the type of touch event. The electronic device may provide visual output related to the touch operation through the display screen 140. For example, the electronic device may switch interfaces in response to the touch sensor 150A detecting a touch operation on or near it, and display the switched interface on the display screen 140. In other embodiments, the touch sensor 150A may be disposed on a surface of the electronic device at a position different from that of the display screen 140.
The pressure sensor 150B is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. For example, the pressure sensor 150B may be disposed on the display screen 140. The touch operations which act on the same touch position but have different touch operation intensities can correspond to different operation instructions.
The electronic device may implement audio functions through the audio module 160, the speaker 161, the receiver 162, the microphone 163, the headphone interface 164, and the application processor, etc. Such as an audio play function, a recording function, a voice wake-up function, etc.
The audio module 160 may be used to perform digital-to-analog conversion, and/or analog-to-digital conversion on the audio data, and may also be used to encode and/or decode the audio data. For example, the audio module 160 may be disposed in the processor 110, or some functional modules of the audio module 160 may be disposed in the processor 110.
The speaker 161, also called a "loudspeaker", converts audio data into sound and plays the sound. For example, the electronic device may play music, conduct a hands-free call, or issue a voice prompt through the speaker 161.
A receiver 162, also called "earpiece", is used to convert audio data into sound and play the sound. For example, when the electronic device answers the call, the user can answer the call by placing the receiver 162 close to the ear of the user.
The microphone 163, also referred to as a "mic" or "mike", is used to collect sound (e.g., ambient sound, including sound made by people or by devices) and convert the sound into audio electrical data. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 163, and the microphone 163 collects the sound made by the user. It should be noted that the electronic device may be provided with at least one microphone 163. For example, two microphones 163 may be provided in the electronic device to implement a noise reduction function in addition to sound collection. As another example, three, four, or more microphones 163 may be provided in the electronic device, so that sound source recognition, directional recording, or the like can be implemented on the basis of sound collection and noise reduction.
The earphone interface 164 is used to connect a wired earphone. The earphone interface 164 may be the USB interface 172, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
The SIM card interface 171 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 171. The electronic device can support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 171 may support a Nano SIM card, a Micro SIM card, a SIM card, or the like. Multiple cards can be inserted into the same SIM card interface 171 at the same time; the types of the cards may be the same or different. The SIM card interface 171 may also be compatible with different types of SIM cards, and may also be compatible with an external memory card. The electronic device implements functions such as voice calls, video calls, and data communication through interaction between the SIM card and the network. In some embodiments, the electronic device uses an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from it.
The USB interface 172 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 172 may be used to connect a charger to charge the electronic device, and may also be used to transmit data between the electronic device and a peripheral device. It may also be used to connect an earphone and play sound through the earphone. For example, in addition to serving as the earphone interface 164, the USB interface 172 may be used to connect other electronic devices, such as an AR device or a computer.
The charging management module 180 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 180 may receive charging input from a wired charger via the USB interface 172. In some wireless charging embodiments, the charging management module 180 may receive wireless charging input through a wireless charging coil of the electronic device. While charging the battery 182, the charging management module 180 may also supply power to the electronic device through the power management module 181.
The power management module 181 is used to connect the battery 182, the charging management module 180 and the processor 110. The power management module 181 receives input from the battery 182 and/or the charging management module 180 to power the processor 110, the internal memory 121, the camera 130, the display screen 140, and the like. The power management module 181 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), and the like. In some other embodiments, the power management module 181 may also be disposed in the processor 110. In other embodiments, the power management module 181 and the charging management module 180 may be disposed in the same device.
The mobile communication module 191 may provide a solution including 2G/3G/4G/5G wireless communication, etc. applied to the electronic device. The mobile communication module 191 may include a filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like.
The wireless communication module 192 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLANs), such as wireless fidelity (Wi-Fi) networks, Bluetooth (BT), Global Navigation Satellite Systems (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 192 may be one or more devices that integrate at least one communication processing module.
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 191 and the antenna 2 is coupled to the wireless communication module 192, so that the electronic device can communicate with other devices. Specifically, the mobile communication module 191 may communicate with other devices through the antenna 1, and the wireless communication module 192 may communicate with other devices through the antenna 2.
The image display method according to the embodiment of the present application will be described below with reference to the hardware configuration shown in fig. 1.
Take the display screen 140 being an outer foldable display screen as an example. It should be understood that an electronic device having an outer foldable display screen may also be referred to as an outer-foldable-screen electronic device. Illustratively, the display screen 140 in the unfolded state is shown in fig. 2A, and the display screen 140 in the folded state is shown in fig. 2B. The display screen 140 includes a first display area 1401 and a second display area 1402. As shown in fig. 2A, when the display screen 140 is in the unfolded state, the first display area 1401 and the second display area 1402 are located on the same face of the electronic device. As shown in fig. 2B, when the display screen 140 is in the folded state, the first display area 1401 and the second display area 1402 are located on different faces of the electronic device. Taking the case where the first display area 1401 is located on a first face of the electronic device and the second display area 1402 is located on a second face as an example, the first face and the second face may be two opposite faces of the electronic device; for example, when the first face of the electronic device faces the user, the second face faces away from the user.
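To make this geometry easier to follow, the following minimal Python sketch models the two fold states and which display area sits on which face of the device. The names used here (FoldState, Face, FoldableDisplay, area_on) are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from enum import Enum


class FoldState(Enum):
    UNFOLDED = 0   # areas 1401 and 1402 lie on the same face, forming one flat screen
    FOLDED = 1     # areas 1401 and 1402 lie on opposite faces of the device


class Face(Enum):
    FIRST = 0      # the face currently toward the user
    SECOND = 1     # the face currently away from the user


@dataclass
class FoldableDisplay:
    state: FoldState

    def area_on(self, face: Face) -> str:
        """Return which display area is visible on the given device face."""
        if self.state is FoldState.UNFOLDED:
            # Unfolded: both areas form one continuous screen on a single face.
            return "first + second display areas (one continuous screen)"
        # Folded: area 1401 is on the first face, area 1402 on the second face.
        return "first display area (1401)" if face is Face.FIRST else "second display area (1402)"


# Example: in the folded state, the user-facing side shows area 1401.
print(FoldableDisplay(FoldState.FOLDED).area_on(Face.FIRST))
```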
Illustratively, as shown in fig. 3, the image display method according to an embodiment of the present application includes the following steps.
Step 301, the electronic device obtains the position of the face of the user relative to the electronic device.
Illustratively, the electronic device obtains the position of the user's face relative to the electronic device via the processor 110. For example, the processor 110 obtains the position of the user's face relative to the electronic device from an image of the user's face captured by the camera 130. Taking the display screen 140 in the folded state as an example, when the user faces the first face of the electronic device, the camera located on the first face of the electronic device may capture an image of the user's face, from which the position of the user's face relative to the electronic device is obtained. For another example, when the user faces the second face of the electronic device, the camera on the second face of the electronic device may capture an image of the user's face, from which the position of the user's face relative to the electronic device is obtained.
In some embodiments, the electronic device may periodically or in real-time acquire the position of the user's face relative to the electronic device. For example, the period for acquiring the position of the face of the user relative to the electronic device may be configured in the electronic device in advance, for example, may be configured in an internal memory of the electronic device in advance, or may be determined by the electronic device according to a preset policy, which is not limited herein.
Further, the electronic device may, for example, acquire the position of the user's face relative to the electronic device periodically or in real time in response to a first trigger event. For example, the first trigger event may be the electronic device detecting an operation of folding the display screen 140 while the display screen is in the unfolded state; that is, the electronic device may monitor the position of the user's face relative to the electronic device periodically or in real time while the display screen 140 is in the folded state. As another example, the first trigger event may be the electronic device detecting a screen-locking operation while the display screen 140 is in the folded state. As yet another example, the first trigger event may be the electronic device detecting an image preview operation while the display screen 140 is in the folded state. The first trigger event is not limited in the embodiments of the present application.
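As one concrete, purely illustrative way to obtain the face position described in step 301, the sketch below detects the user's face in a frame from the facing camera and returns the offset of the face center from the frame center, normalized to [-1, 1]. It uses OpenCV's stock Haar-cascade detector; the patent does not prescribe any particular detection algorithm, and the function name and the normalization scheme are assumptions.

```python
import cv2

_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def face_offset_from_camera(frame_bgr):
    """Return the face-center offset (dx, dy), each normalized to [-1, 1],
    relative to the center of the camera frame; None if no face is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    # Treat the largest detected face as the user's face.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    frame_h, frame_w = gray.shape
    dx = ((x + w / 2) - frame_w / 2) / (frame_w / 2)   # < 0: face left of center
    dy = ((y + h / 2) - frame_h / 2) / (frame_h / 2)   # < 0: face above center
    return dx, dy


# Example (assuming `frame` is a BGR image from the camera on the user-facing side):
# offset = face_offset_from_camera(frame)
```

The offset can then be refreshed periodically or per camera frame, as described for step 301, and fed to the processing of step 302.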
Step 302, when the display screen 140 is in a folded state, the electronic device processes the first image and the second image of the target object according to the position of the face of the user relative to the electronic device; the first image and the second image are each used to describe one of two opposing faces of the target object.
By way of example, two opposing faces of the target object may be understood as: front and back of the target object, or left and right of the target object, or above and below the target object, etc. For example, the target object may be an animal, a human, a plant, a vehicle (e.g., an automobile, an airplane), an electronic device (e.g., a mobile phone), furniture, a house, and the like, but is not limited thereto. Taking the target object as the moon as an example, for example, the first image is a front view of the moon as shown in fig. 4A, and the second image is a back view of the moon as shown in fig. 4B.
It should be noted that the first image and the second image of the target object may be captured by the electronic device through the camera 130, or may be acquired by the electronic device from another device (for example, an application server or a cloud). Further, to facilitate storing them, the first image and the second image of the target object are, for example, stored in the electronic device as a target file in a designated folder.
Take the case where the electronic device acquires the first image and the second image of the target object through the camera 130 as an example. For example, in a three-dimensional (3D) mode, the electronic device photographs a first face of the target object through the camera in response to a first operation of the user, acquiring the first image of the target object, and then prompts the user to photograph the second face of the target object. The user adjusts the shooting angle of the camera 130 according to the prompt, and the electronic device photographs the second face of the target object through the camera in response to a second operation of the user, acquiring the second image of the target object. The electronic device then stores the first image and the second image of the target object as a target file in a designated folder. For example, the name of the designated folder may be "3D display". Further, to make it easy for the electronic device to recognize that the first image and the second image are images of the same target object, the electronic device may, for example, name the first image target object_a and the second image target object_b.
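As a rough sketch of the storage convention just described, the helper below writes the two opposite-face images of one target object into a designated folder using the _a/_b naming from the example. The function itself, and the choice of PNG files, are assumptions made for illustration rather than the patent's implementation.

```python
import os

import cv2


def save_target_object(name, first_image, second_image, folder="3D display"):
    """Store the two opposite-face images of one target object as a pair of
    files that can later be recognized as belonging to the same object."""
    os.makedirs(folder, exist_ok=True)
    first_path = os.path.join(folder, f"{name}_a.png")    # e.g. the front view
    second_path = os.path.join(folder, f"{name}_b.png")   # e.g. the back view
    cv2.imwrite(first_path, first_image)
    cv2.imwrite(second_path, second_image)
    return first_path, second_path


# Example: after shooting both sides of the moon model in 3D mode.
# save_target_object("moon", front_view_bgr, back_view_bgr)
```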
Take the target object being a moon model as an example. For example, with the display screen 140 in the folded state, the electronic device displays the interface 510 shown in fig. 5A in the first display area 1401. The interface 510 includes a camera icon 501. In response to the user clicking the camera icon 501, the electronic device displays the preview interface 520 in the first display area 1401. When the image captured by the camera 130 of the electronic device is the front of the moon model, a front view of the moon model is displayed in the preview interface 520. For example, as shown in fig. 5B, in the 3D mode the electronic device may acquire the front view of the moon model, i.e., the first image of the target object, in response to the user clicking the shooting key 502. The electronic device then prompts the user to photograph the other face of the target object; for example, as shown in fig. 5C, the electronic device prompts the user through a prompt box 503. When the image captured by the camera 130 of the electronic device is the back of the moon model, the back of the moon model is displayed in the preview interface 520. The electronic device may acquire the back view of the moon model, i.e., the second image of the target object, in response to the user clicking the shooting key 502.
In some embodiments, the electronic device may simulate a 3D effect by having the processor 110 deform or stretch the first image and the second image of the target object based on the position of the user's face relative to the electronic device. The algorithm used for deforming or stretching the first image and the second image is not limited in the embodiments of the present application.
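The deformation algorithm is deliberately left open by the text. One simple illustration is a perspective ("keystone") warp whose strength follows the face offset, so the flat image appears slightly rotated toward the viewer's position. The sketch below uses OpenCV; the `strength` parameter and the corner mapping are assumptions, not the patent's method. The second image, shown on the opposite face of the device, can be warped with the horizontal offset mirrored.

```python
import cv2
import numpy as np


def warp_for_viewpoint(image, dx, dy, strength=0.15):
    """Perspective-warp `image` to mimic viewing a flat card from a face offset
    (dx, dy) in [-1, 1]; dx > 0 means the viewer has moved to the right."""
    h, w = image.shape[:2]
    a = dx * strength * h   # keystone amount for horizontal viewer offset
    b = dy * strength * w   # keystone amount for vertical viewer offset
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # The edge nearer to the viewer is enlarged, the far edge is shrunk.
    dst = np.float32([
        [0 + b, 0 + a],
        [w - b, 0 - a],
        [w + b, h + a],
        [0 - b, h - a],
    ])
    m = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, m, (w, h))


# Steps 302/303, sketched: warp each image and hand the results to the two areas.
# shown_first = warp_for_viewpoint(first_image, dx, dy)     # -> first display area 1401
# shown_second = warp_for_viewpoint(second_image, -dx, dy)  # -> second display area 1402
```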
In step 303, the electronic device displays the processed first image in the first display area 1401 and displays the processed second image in the second display area 1402, so that the target object presents, on the electronic device, the display effect it would have when viewed from the user's position relative to the electronic device.

For example, taking the first image as the front view of the moon shown in fig. 4A, when the user faces the first display area 1401: if the user's face views the first display area 1401 from its left side, the electronic device presents the image shown in fig. 6A in the first display area 1401; if the user's face is directly facing the first display area 1401, the electronic device presents the image shown in fig. 6B; and if the user's face views the first display area 1401 from its right side, the electronic device presents the image shown in fig. 6C.
In some embodiments, when the electronic device detects an operation of previewing the target file while the display screen 140 is in the folded state, the electronic device processes the first image and the second image of the target object according to the position of the user's face relative to the electronic device, displays the processed first image in the first display area 1401, and displays the processed second image in the second display area 1402, so that the target object presents, on the electronic device, the display effect it would have when viewed from the user's position relative to the electronic device.
Further, in some embodiments, the electronic device may process and display the first image and the second image of the target object according to the position of the user relative to the electronic device in real time without exiting the file preview.
Take fig. 7 as an example. The first display area 1401 displays the interface 710 shown in fig. 7A, where the interface 710 includes a gallery icon 701. In response to the user clicking the gallery icon 701, the electronic device displays an interface of the gallery in the first display area 1401. For example, the interface of the gallery is the interface 720 shown in fig. 7B. The interface 720 includes a plurality of folder icons, such as icons for the my album, video, and 3D display folders. Take the 3D display folder being the folder used to store 3D files as an example. In response to the user clicking the 3D display folder icon, the electronic device opens the folder and displays the interface 730 in the first display area 1401. The interface 730 includes a plurality of 3D file identifiers, such as 3D file 1, 3D file 2, 3D file 3, and 3D file 4. Taking 3D file 2 as the target file, when the display screen 140 is in the folded state, the electronic device may, in response to the user clicking 3D file 2, display the first image and the second image of the target object according to the position of the user relative to the electronic device. Further, while the electronic device is previewing the target file, if the position of the user relative to the electronic device changes, the first image and the second image of the target object displayed on the display screen 140 change correspondingly.
The operation of previewing the target file may be, for example, a shortcut gesture operation or a voice instruction, but is not limited thereto.
In other embodiments, when the target file is set as the wallpaper of the lock screen interface, if an event that triggers display of the lock screen interface is detected while the display screen 140 is in the folded state, the first image and the second image of the target object are processed according to the position of the user's face relative to the electronic device. For example, when the electronic device is locked with the screen off, the lock screen interface is displayed in response to the user pressing the power key or clicking the home screen key.

Further, in other embodiments, while the lock screen interface is displayed with the display screen 140 in the folded state, the electronic device may also process and display the first image and the second image of the target object in real time according to the position of the user relative to the electronic device. That is, when the electronic device displays the lock screen interface with the display screen 140 in the folded state, if the position of the user relative to the electronic device changes, the first image and the second image of the target object displayed on the display screen 140 change correspondingly.

Alternatively, when the target file is set as the wallpaper of the screen-off interface, if a screen-off trigger event is detected while the display screen is in the folded state, the first image and the second image of the target object are processed according to the position of the user's face relative to the electronic device. For example, when the screen-off display mode of the electronic device is turned on, the screen-off trigger event may be the user pressing the power key, a screen-locking operation, or the like, or may be the electronic device detecting no user operation for longer than a preset duration, which is not limited herein. It should be noted that the screen-off display mode may also be referred to as the off-screen display mode.

Further, in other embodiments, when the display screen 140 is off in the folded state, if the position of the user relative to the electronic device changes, the first image and the second image of the target object displayed on the display screen 140 change correspondingly.
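Pulling the trigger conditions of the preceding embodiments together, the sketch below shows one possible dispatch rule: the face-position-driven rendering runs only while the screen is folded, and, for the lock-screen and screen-off cases, only if the target file has been set as the corresponding wallpaper. All names here (Trigger, should_render_3d) are invented for illustration and do not come from the patent.

```python
from enum import Enum, auto


class Trigger(Enum):
    PREVIEW_TARGET_FILE = auto()   # the user opens the target file, e.g. from the gallery
    SHOW_LOCK_SCREEN = auto()      # power key / home key pressed while locked
    SCREEN_OFF = auto()            # screen-off (off-screen display) event


def should_render_3d(trigger, folded, wallpaper_is_target_file):
    """Decide whether to run the face-position-driven rendering of steps 301-303."""
    if not folded:
        return False
    if trigger is Trigger.PREVIEW_TARGET_FILE:
        return True
    # Lock-screen and screen-off rendering additionally require that the target
    # file has been set as the corresponding wallpaper.
    return (trigger in (Trigger.SHOW_LOCK_SCREEN, Trigger.SCREEN_OFF)
            and wallpaper_is_target_file)


# Example: a lock-screen event while folded, with the target file set as wallpaper.
print(should_render_3d(Trigger.SHOW_LOCK_SCREEN, folded=True, wallpaper_is_target_file=True))
```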
It should be noted that, for the case where the display screen 140 is an outer foldable display screen, the electronic device may also determine, during the folding of the display screen 140, the images displayed in the first display area and the second display area according to the position of the user relative to the electronic device, so that the display effect of the images changes as the relative position between the user's head and the electronic device changes, which helps improve user experience.

In addition, for a display screen 140 that is not foldable or that folds inward, the electronic device may adjust the image displayed on the display screen 140 according to the position of the user's face relative to the electronic device when the target file is previewed, or when the target file is set as the wallpaper of the lock screen interface or of the screen-off interface. For example, if the first image of the target object is displayed on the display screen 140, the electronic device may process the first image according to the position of the user's face relative to the electronic device, combine it with the second image of the target object, and display the result on the display screen 140 accordingly.
The above embodiments can be used alone or in combination with each other to achieve different technical effects.
In the embodiments provided in the present application, the method is described from the perspective of the electronic device as the execution subject. To implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a combination of both. Whether a particular function is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
Based on the same concept, fig. 8 illustrates an electronic device 800 provided by the present application, which is used for executing the image display method illustrated in fig. 3. Illustratively, the electronic device 800 includes a processing module 801, and a display module 802.
Illustratively, the processing module 801 is configured to process the first image and the second image of the target object according to a position of a face of the user relative to the electronic device. The display module 802 is configured to display the processed first image in a first display area and display the processed second image in a second display area.
Based on the same concept, fig. 9 illustrates an electronic device 900 provided in the present application. The electronic device 900 includes at least one processor 901, and a display 902. The processor 901 may be, for example, a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
The display screen 902 is used to display images.
In some embodiments, the electronic device 900 also includes a memory 903. The memory 903 is used to store program instructions and/or data. In the embodiments of the present application, the memory 903 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory such as a random-access memory (RAM). The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
In some embodiments, electronic device 900 also includes a camera 904.
The processor 901 is coupled to the memory 903, the display screen 902, and the camera 904. The coupling in this embodiment of the application is an indirect coupling or a communication connection between devices, units, or modules, which may be electrical, mechanical, or in another form and is used for information exchange between the devices, units, or modules. The connection manner is not limited in this embodiment of the application; for example, the processor 901, the memory 903, the display screen 902, and the camera 904 may be connected through a bus, and the bus may be divided into an address bus, a data bus, a control bus, and the like.
It should be understood that the electronic device 800 and the electronic device 900 may be used to implement the method shown in fig. 3 according to the embodiment of the present application, and reference may be made to the above for related features, which are not described herein again.
It is clear to those skilled in the art that the embodiments of the present application may be implemented in hardware, firmware, or a combination thereof. When implemented in software, the functions described above may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation: the computer-readable medium may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection is properly termed a computer-readable medium. For example, if software is transmitted from a website, a server, or another remote source using a coaxial cable, a fiber optic cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium. Disk and disc, as used in the embodiments of the present application, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
In short, the above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modifications, equivalents, improvements and the like made in accordance with the disclosure of the present application are intended to be included within the scope of the present application.

Claims (8)

1. An image display method is applied to an electronic device, wherein the electronic device comprises an outer foldable display screen, and the display screen comprises a first display area and a second display area; the method comprises the following steps:
acquiring the position of the face of a user relative to the electronic equipment;
when the display screen is in a folded state, processing a first image and a second image of a target object according to the position of the face of the user relative to the electronic equipment; the first image and the second image are respectively used for describing one of two opposite surfaces of the target object;
displaying the processed first image in the first display area and the processed second image in the second display area, so that the target object presents, on the electronic device, the display effect it would have when viewed from the user's position relative to the electronic device;
when the display screen is in a folded state, the first display area is located on a first face of the electronic device, and the second display area is located on a second face of the electronic device.
2. The method of claim 1, wherein the first image and the second image are pre-stored in the electronic device as an object file.
3. The method of claim 1 or 2, wherein the first image and the second image are acquired by the electronic device through a camera.
4. The method of claim 2, wherein before the processing of the first image and the second image of the target object according to the position of the face of the user relative to the electronic device, the method further comprises:
and when the display screen is in a folded state, detecting the operation of previewing the target file.
5. The method of claim 2, wherein before the processing of the first image and the second image of the target object according to the position of the face of the user relative to the electronic device, the method further comprises:
setting the target file as a wallpaper of a screen locking interface;
when the display screen is in a folded state, an event triggering display of the screen locking interface is detected.
6. An electronic device, comprising a processor, a memory, and an outer folding display screen;
the memory has stored therein program instructions;
the program instructions, when executed, cause the electronic device to perform the method of any of claims 1 to 5.
7. A chip, wherein the chip is coupled to a memory in an electronic device, such that when run, the chip invokes program instructions stored in the memory to implement the method of any of claims 1 to 5.
8. A computer-readable storage medium, comprising program instructions which, when run on a device, cause the device to perform the method of any of claims 1 to 5.
CN201910936340.8A 2019-09-29 2019-09-29 Image display method and electronic equipment Pending CN110795187A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910936340.8A CN110795187A (en) 2019-09-29 2019-09-29 Image display method and electronic equipment
PCT/CN2020/116587 WO2021057673A1 (en) 2019-09-29 2020-09-21 Image display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910936340.8A CN110795187A (en) 2019-09-29 2019-09-29 Image display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN110795187A true CN110795187A (en) 2020-02-14

Family

ID=69439961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910936340.8A Pending CN110795187A (en) 2019-09-29 2019-09-29 Image display method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110795187A (en)
WO (1) WO2021057673A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116027877A (en) * 2021-10-25 2023-04-28 华为终端有限公司 Screen-off display method and terminal equipment
CN116841350A (en) * 2022-03-23 2023-10-03 华为技术有限公司 3D display method and device
CN114816169B (en) * 2022-06-29 2022-11-04 荣耀终端有限公司 Desktop icon display method and device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10324736B2 (en) * 2017-01-13 2019-06-18 Zspace, Inc. Transitioning between 2D and stereoscopic 3D webpage presentation
CN110795187A (en) * 2019-09-29 2020-02-14 华为技术有限公司 Image display method and electronic equipment

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294369A (en) * 2012-03-05 2013-09-11 联想(北京)有限公司 Display method and electronic equipment
CN104866271A (en) * 2015-06-15 2015-08-26 联想(北京)有限公司 Display method and electronic equipment
JP2017062650A (en) * 2015-09-25 2017-03-30 セイコーエプソン株式会社 Display system, display unit, information display method, and program
CN106126142A (en) * 2016-06-21 2016-11-16 上海井蛙科技有限公司 Double-screen display system and method
CN106201392A (en) * 2016-06-27 2016-12-07 北京小米移动软件有限公司 The display control method of curve screens and device
CN107644395A (en) * 2016-07-21 2018-01-30 华为终端(东莞)有限公司 Image processing method and mobile device
CN108762859A (en) * 2018-04-13 2018-11-06 Oppo广东移动通信有限公司 Wallpaper displaying method, device, mobile terminal and storage medium
CN110035270A (en) * 2019-02-28 2019-07-19 努比亚技术有限公司 A kind of 3D rendering display methods, terminal and computer readable storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021057673A1 (en) * 2019-09-29 2021-04-01 华为技术有限公司 Image display method and electronic device
CN116112597A (en) * 2020-09-03 2023-05-12 荣耀终端有限公司 Off-screen display method, electronic device and storage medium
CN116112597B (en) * 2020-09-03 2023-10-20 荣耀终端有限公司 Electronic equipment with off-screen display function, method for displaying off-screen interface of electronic equipment and storage medium
US11823603B2 (en) 2020-09-03 2023-11-21 Honor Device Co., Ltd. Always-on-display method and electronic device
CN115277929A (en) * 2021-04-30 2022-11-01 荣耀终端有限公司 Terminal equipment and method for multi-window display
CN115277929B (en) * 2021-04-30 2023-08-08 荣耀终端有限公司 Terminal equipment and method for multi-window display

Also Published As

Publication number Publication date
WO2021057673A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
WO2021213120A1 (en) Screen projection method and apparatus, and electronic device
CN110347269B (en) Empty mouse mode realization method and related equipment
CN110795187A (en) Image display method and electronic equipment
WO2020062344A1 (en) Data transmission method and electronic device
WO2020078273A1 (en) Photographing method, and electronic device
WO2021180089A1 (en) Interface switching method and apparatus and electronic device
WO2021208723A1 (en) Full-screen display method and apparatus, and electronic device
CN112751954A (en) Operation prompting method and electronic equipment
CN109729246A (en) A kind of multi-camera circuit structure, terminal and computer readable storage medium
WO2023241209A9 (en) Desktop wallpaper configuration method and apparatus, electronic device and readable storage medium
CN117413245A (en) Display control method, electronic device, and computer storage medium
WO2022037725A1 (en) System service recovery method and apparatus, and electronic device
JP7204902B2 (en) File transfer method and electronic device
CN113593567B (en) Method for converting video and sound into text and related equipment
WO2020051852A1 (en) Method for recording and displaying information in communication process, and terminals
CN114089902A (en) Gesture interaction method and device and terminal equipment
CN111061410B (en) Screen freezing processing method and terminal
CN113901485B (en) Application program loading method, electronic device and storage medium
CN114999535A (en) Voice data processing method and device in online translation process
JP2023552731A (en) Data transmission methods and electronic devices
CN114079809A (en) Terminal and input method and device thereof
WO2024032400A1 (en) Picture storage method and apparatus, and terminal device
WO2024027374A1 (en) Method for displaying hidden information, and device, chip system, medium and program product
CN114691066A (en) Application display method and electronic equipment
CN114363820A (en) Electronic equipment searching method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20200214)