CN106502606B - Method for displaying edge image and electronic equipment - Google Patents

Method for displaying edge image and electronic equipment

Info

Publication number
CN106502606B
CN106502606B
Authority
CN
China
Prior art keywords
image
edge image
edge
electronic device
attribute information
Legal status
Active
Application number
CN201610912250.1A
Other languages
Chinese (zh)
Other versions
CN106502606A (en)
Inventor
西蒙·埃克斯特兰德
弗雷德里克·安德烈亚森
约翰·拉斯比
黄雪妍
刘波
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201610912250.1A
Publication of CN106502606A
Priority to PCT/CN2017/102378 (WO2018072581A1)
Application granted
Publication of CN106502606B


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 — Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Abstract

The embodiment of the invention discloses a method for displaying edge images and electronic equipment, wherein the method comprises the following steps: acquiring an edge image corresponding to a reference image included in an image displayed on a main screen; and determining the display position of the edge image on the side screen according to the position or the shape of the reference image, and displaying the edge image at the display position of the side screen. According to the method provided by the embodiment of the invention, the reference image displayed on the main screen is determined, and the display position of the edge image on the side screen is determined according to the position or the shape of the reference image, so that the visual dynamic effect and the three-dimensional feeling are added to a user, and the satisfaction degree of user experience is improved.

Description

Method for displaying edge image and electronic equipment
Technical Field
The embodiment of the invention relates to the field of image display, in particular to a method for displaying an edge image and electronic equipment.
Background
A curved screen differs from a conventional flat screen in that its edge has a certain curvature, so that an electronic device with a curved screen has a richer display effect than an electronic device with a flat screen.
At present, when an electronic device with a curved screen displays an image, it displays the image separately on the main screen (i.e., the screen located at the central part of the curved screen) and the side screen (i.e., the screen located at the edge of the curved screen). For example, a curved-screen mobile phone completes the display process as soon as the image is shown on the side screen and performs no other processing, so the visual experience is relatively monotonous and lacks flexibility, resulting in a poor user experience.
Disclosure of Invention
In view of this, the embodiment of the present invention discloses a method for displaying an edge image, which adds a visual dynamic effect and a stereoscopic feeling to a user and improves the satisfaction degree of user experience by determining an edge image corresponding to a reference image included in an image displayed on a main screen and determining the display position of the edge image on a side screen according to the position or the shape of the reference image.
In a first aspect, an embodiment of the present invention provides a method for displaying an edge image, where the method includes: acquiring an edge image corresponding to a reference image included in the image displayed on the main screen; determining the display position of the edge image on the side screen according to the position or the shape of the reference image; and displaying the edge image at the display position of the side screen.
In this method, the terminal device determines the display position of the edge image according to the reference image displayed on the main screen at the current moment: the reference image is determined from the image displayed on the main screen, and the display position of the edge image on the side screen is determined according to the position or the shape of the reference image. The position of the edge image can thus be dynamically combined with the shape or position of the image displayed on the main screen, which enhances the visual experience of the terminal device and improves the satisfaction degree of user experience.
Optionally, the determining a display position of the edge image on the side screen according to the shape of the reference image includes: determining reference pixel points of the reference image according to the shape of the reference image, wherein the reference pixel points are pixel points on the edge of the reference image, which are farthest from the center of the reference image; and determining the display position of the edge image on the side screen according to the position of the reference pixel point.
Optionally, the reference image includes an incoming call image, the incoming call image is used to indicate an incoming call object, the edge image is an elliptical image, and a position of the elliptical image changes with a position change of the reference pixel point.
In a second aspect, an embodiment of the present invention provides a method for displaying an edge image, where the method includes: acquiring attribute information and an attribute information threshold of sound represented by audio data, wherein the audio data is data obtained by recording with a microphone, or the audio data is data used for playing with an earphone or a loudspeaker; when the value indicated by the attribute information is greater than or equal to the attribute information threshold value, determining the display size of the edge image according to the value of the attribute information, and thus determining the edge image; and displaying the edge image at the display position of the side screen.
According to the method for displaying the edge image, whether the value indicated by the sound information within a period of time is the effective sound attribute value or not is determined through the attribute information threshold, so that the influence of calculation errors or background noise on the sound attribute value can be reduced, the drastic change of the size of the edge image is avoided, and the satisfaction degree of user experience is improved.
Optionally, the determining a display size of the edge image according to the value of the attribute information, so as to determine the edge image, includes: determining the change rate of the size of the edge image according to the value of the attribute information, a preset maximum threshold value and a preset minimum threshold value; and determining the display size of the edge image according to the change rate and the initial size of the edge image.
According to the method for displaying the edge image, the change rate of the size of the edge image is obtained by processing the effective sound attribute value through the preset maximum threshold and the preset minimum threshold, so that the influence of calculation errors or background noise on the sound attribute value can be reduced, the drastic change of the size of the edge image is avoided, and the satisfaction degree of user experience is improved.
Optionally, the attribute information is an amplitude or a phase of a frequency component of the sound.
Optionally, the edge image includes a first edge image and a second edge image, the size of the first edge image is obtained according to the amplitude of the first frequency component of the sound, the size of the second edge image is obtained according to the amplitude of the second frequency component of the sound, and the frequency band of the first frequency component is different from the frequency band of the second frequency component.
According to the method for displaying the edge image, the edge images corresponding to different sound attribute values are displayed on the side screens at different positions, so that the perception of a user on the attribute information of the audio data is enhanced, the interaction between the user and a terminal is enhanced, and the satisfaction degree of user experience is improved.
Optionally, the method further comprises: updating the attribute information threshold according to the value of the sound attribute.
According to the method for displaying the edge image, provided by the embodiment of the invention, the attribute information threshold value of the current time interval is determined according to the effective sound attribute value of the last time interval, so that the attribute information threshold value can be optimized in real time, the size change of the edge image is softer, and the satisfaction degree of user experience is improved.
Optionally, the edge image is an elliptical image, and a display size of the edge image includes a minor axis length of the elliptical image.
Optionally, the audio data comprises music data, voice data, recording data, alarm data or voice assistant data.
In a third aspect, an embodiment of the present invention provides a method for displaying an edge image, where the method includes: acquiring an edge image corresponding to a reference image included in the image displayed on the main screen; or determining the information of the external environment where the electronic equipment is located at the current moment; determining the color of an edge image according to the attribute of the reference image or the information of the external environment, so as to determine the edge image; and displaying the edge image at the display position of the side screen.
According to the method for displaying the edge image, the color of the edge image is determined according to the attribute of the reference image displayed on the main screen or the information of the current external environment where the electronic equipment is located, so that the visual experience of the electronic equipment is enhanced, and the satisfaction degree of user experience is improved.
Optionally, the external environment information includes: weather information, temperature information or humidity information of an external environment where the electronic device is located.
In a fourth aspect, an embodiment of the present invention provides an electronic device for displaying an edge image, where the electronic device may implement the functions performed by the electronic device in the method according to the first aspect, and the functions may be implemented by hardware or by hardware executing corresponding software. The hardware or software includes one or more units or modules corresponding to the above functions.
In one possible design, the structure of the electronic device includes a processor configured to support the electronic device to perform the corresponding functions of the above method. The electronic device may also include a memory, coupled to the processor, that retains program instructions and data necessary for the electronic device.
In a fifth aspect, an embodiment of the present invention provides an electronic device for displaying an edge image, where the electronic device may implement the functions performed by the electronic device in the method according to the second aspect, where the functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more units or modules corresponding to the above functions.
In one possible design, the structure of the electronic device includes a processor configured to support the electronic device to perform the corresponding functions of the above method. The electronic device may also include a memory, coupled to the processor, that retains program instructions and data necessary for the electronic device.
In a sixth aspect, an embodiment of the present invention provides an electronic device for displaying an edge image, where the electronic device may implement the functions performed by the electronic device in the method according to the third aspect, where the functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more units or modules corresponding to the above functions.
In one possible design, the structure of the electronic device includes a processor configured to support the electronic device to perform the corresponding functions of the above method. The electronic device may also include a memory, coupled to the processor, that retains program instructions and data necessary for the electronic device.
In a seventh aspect, an embodiment of the present invention provides a computer storage medium for storing software instructions for the electronic device, which includes a program designed to execute the above aspects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 is a schematic diagram of an electronic device for displaying an edge image according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart diagram of a method for displaying an edge image according to an embodiment of the present invention;
FIG. 3A is a schematic diagram of a reference image position and an edge image position according to an embodiment of the present invention;
FIG. 3B is a schematic diagram of another reference image location and edge image location provided by an embodiment of the invention;
FIG. 4 is a reference image provided by an embodiment of the present invention;
FIG. 5 is a diagram illustrating a process of changing a shape of a reference image according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a method for determining a display position of an edge image according to a reference point according to an embodiment of the present invention;
FIG. 7 is another reference image provided by embodiments of the present invention;
FIG. 8 is an image of another reference image after being warped according to an embodiment of the present invention;
FIG. 9 is another image of another reference image after being warped according to an embodiment of the present invention;
FIG. 10 is a further image of another reference image after being warped according to an embodiment of the present invention;
FIG. 11 is a further image of another reference image after being warped according to an embodiment of the present invention;
FIG. 12A is a schematic diagram of the relative positions of a reference image and an elliptical image at a first time provided by an embodiment of the invention;
FIG. 12B is a diagram illustrating the relative positions of the reference image and the elliptical image at a second time according to an embodiment of the invention;
FIG. 13 is a schematic flow chart diagram of another method for displaying an edge image provided by an embodiment of the invention;
FIG. 14 is a diagram illustrating a relationship between sound attribute values and frequencies according to an embodiment of the present invention;
FIG. 15 is a schematic flow chart diagram of another method for displaying an edge image according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of a method for determining color of an edge image according to an embodiment of the present invention;
fig. 17A is a schematic structural diagram of a possible electronic device according to an embodiment of the present invention;
FIG. 17B is a schematic diagram of another possible electronic device according to an embodiment of the invention;
fig. 18A is a schematic structural diagram of yet another possible electronic device provided by an embodiment of the invention;
FIG. 18B is a schematic diagram of a possible electronic device according to an embodiment of the invention;
fig. 19A is a schematic structural diagram of yet another possible electronic device provided by an embodiment of the invention;
fig. 19B is a schematic structural diagram of another possible electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic diagram of an electronic device for displaying an edge image according to an embodiment of the present invention.
As shown in fig. 1, the electronic device 100 includes a main screen 101, a side screen 102 and a side screen 103. The main screen 101 and the side screen 102 are located on different planes, the main screen 101 and the side screen 103 are located on different planes, and the main screen 101, the side screen 102 and the side screen 103 together form the curved screen of the electronic device 100. The type of image displayed on the main screen 101 is related to the current application scene of the electronic device 100; the positions of the images displayed on the side screen 102 and the side screen 103 are related to the position or shape of the image displayed on the main screen 101; the sizes of the images displayed on the side screen 102 and the side screen 103 are related to the audio currently played by the electronic device 100; and the colors of the images displayed on the side screen 102 and the side screen 103 are related to the current environment of the electronic device 100 or to the type of image displayed on the main screen 101.
For example, when the electronic device 100 is currently answering a call, the main screen 101 displays an incoming call image indicating the incoming call object, the side screens 102 and 103 display edge images associated with the incoming call image, and the positions at which the edge images are displayed on the side screens 102 and 103 are related to the position or shape of the incoming call image displayed on the main screen 101. When the electronic device 100 plays music, the main screen 101 displays a music image indicating the singer or the album of the music, the side screens 102 and 103 display edge images associated with the music image, and the positions at which the edge images are displayed on the side screens 102 and 103 are related to the position or shape of the music image displayed on the main screen 101. In this way, the visual experience of the electronic device 100 can be enhanced.
As another example, when the electronic device 100 is playing music, the side screen 102 and the side screen 103 display edge images related to the sound attributes of the music, and the size of the edge images is related to those sound attributes. The sound data corresponding to the music may be processed by software to filter out sound attribute values that are too large or too small, so that the size of the edge images changes smoothly with the sound attributes of the music and does not change too violently when the sound attributes change sharply, thereby enhancing the visual experience of the electronic device 100.
As another example, when the temperature of the external environment where the electronic device 100 is located is high, the edge images displayed on the side screen 102 and the side screen 103 may be red images; when the temperature of the external environment where the electronic device 100 is located is low, the edge images displayed by the side screens 102 and 103 may be blue images; when the reference image displayed on the main screen 101 is a steamer image, the edge images displayed on the side screens 102 and 103 may be dark yellow images, so that the visual experience of the electronic device 100 may be enhanced.
The above embodiments are merely examples, and the embodiments of the present invention are not limited thereto. The electronic device 100 may include only the main screen 101 and the side screen 102, or may include more side screens in addition to the side screen 102 and the side screen 103; the positions and shapes of the side screen 102 and the side screen 103 are not limited to those shown in fig. 1; the edge images displayed on the side screen 102 and the side screen 103 may be the same or different; and the electronic device 100 may further include other functional modules.
The electronic device 100 may be a mobile phone, a Digital camera, a tablet computer, a smart watch, a Personal computer or a Personal Digital Assistant (PDA), and may also be a future smart device including a curved screen.
Therefore, according to the electronic device for displaying an edge image provided by the embodiment of the invention, the position at which the edge image is displayed on the side screen is determined according to the position or the shape of the reference image displayed on the main screen, the size of the edge image is determined according to the sound attribute value of the audio data, and the color of the edge image is determined according to the attribute of the reference image or the information of the external environment where the electronic device is located, so that the visual experience of the electronic device is enhanced and the satisfaction degree of the user is improved.
Hereinafter, a method of displaying an edge image according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 2 is a schematic flow chart of a method for displaying an edge image according to an embodiment of the present invention. The execution subject of the method may be a terminal device, as shown in fig. 2, the method 200 includes:
and S210, acquiring an edge image corresponding to a reference image included in the image displayed on the main screen.
According to different application scenes, the images displayed on the main screen are different. For example, when the terminal device answers a call, the image displayed on the main screen is a call interface image; when the terminal device plays music, the image displayed on the main screen is a music interface image. The application scenes of the terminal device further include using a voice assistant, unlocking by voice, recording and alarm clock. The terminal device determines a reference image from the image displayed on the main screen, and the position or shape of the reference image is used to determine the position where the edge image is displayed on the side screen.
When the main screen displays the call interface image, the terminal device may determine, from the call interface image, an incoming call image indicating the incoming call object as the reference image; for example, the incoming call image may include an avatar of the incoming call object or may include the name of the incoming call object. When the main screen displays the music interface image, the terminal device may determine the singer's avatar as the reference image from the music interface image, or may determine the album image as the reference image from the music interface image.
And S220, determining the display position of the edge image on the side screen according to the position or the shape of the reference image.
After the terminal device determines the reference image, the position of the edge image can be determined according to the position or the shape of the reference image. Fig. 3A and 3B are schematic diagrams respectively showing a method of determining the position of an edge image from the position of a reference image.
As shown in fig. 3A, the reference image includes an incoming call image for indicating the contact corresponding to an incoming call and an irregular curve outside the incoming call image, and the center portion of the incoming call image is the avatar of the contact. The edge image is an elliptical image, and the position of the elliptical image is determined according to the position of the incoming call image. For example, a coordinate system is established with the left frame of the terminal device shown in fig. 3A as the Y axis and the upper frame as the X axis; the Y-coordinate value of the center of the elliptical image (i.e., the intersection of the major axis and the minor axis of the ellipse) may then be equal to the Y-coordinate value of the center of the incoming call image (i.e., the center of the circular avatar). The Y-coordinate value of the center of the elliptical image may also be different from the Y-coordinate value of the center of the incoming call image, and the Y-coordinate value of the center of the left elliptical image may be the same as or different from that of the right elliptical image.
As shown in fig. 3B, the reference image includes an alarm clock image, the alarm clock image includes a circular remaining-time image and an irregular curve around the remaining-time image, the edge image is an elliptical image, and the position of the elliptical image is determined according to the position of the alarm clock image. For example, a coordinate system is established with the left frame of the terminal device shown in fig. 3B as the Y axis and the upper frame as the X axis; the Y-coordinate value of the center of the elliptical image (i.e., the intersection of the major axis and the minor axis of the ellipse) may then be smaller than the Y-coordinate value of the center of the alarm clock image (i.e., the center of the circular remaining-time image), and the Y-coordinate value of the center of the left elliptical image may be the same as or different from that of the right elliptical image.
It should be noted that, although the reference image in the above embodiments displays the outline of the circular image, the reference image may alternatively not display this outline. Moreover, the method for determining the display position of the edge image according to the position of the reference image is merely an example, and the embodiment of the present invention is not limited thereto.
The terminal device may also determine the display position of the edge image according to the shape of the reference image. For example, when the reference image is a circular image, the terminal device may determine that the position at which the edge image is displayed is on the same horizontal plane as the reference image (with the terminal device placed vertically); when the reference image is an irregular image, a reference point may be determined from the reference image, and the display position of the edge image may be determined based on the position of the reference point.
And S230, displaying the edge image at the display position of the side screen.
The shape or position of the reference image may change with time, and the terminal device determines the display position of the edge image at the present time based on the shape or position of the reference image at the present time.
Therefore, the terminal device determines the display position of the edge image according to the reference image displayed on the main screen at the current moment, and determines the display position of the edge image on the side screen according to the position or the shape of the reference image by determining the reference image from the image displayed on the main screen, so that the position of the edge image and the shape or the position of the image displayed on the main screen can be dynamically combined, the visual experience of the terminal device is enhanced, and the satisfaction degree of user experience is improved.
Optionally, the determining a display position of an edge image on the side screen according to the shape of the reference image includes:
S221, determining a reference pixel point of the reference image according to the shape of the reference image, where the reference pixel point is the pixel point on the edge of the reference image that is farthest from the center of the reference image; and
S222, determining the display position of the edge image on the side screen according to the position of the reference pixel point.
Optionally, the reference image includes an incoming call image, the incoming call image is used to indicate an incoming call object, the edge image is an elliptical image, and a position of the elliptical image changes with a position change of the reference pixel point.
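A minimal sketch of the procedure in S221 and S222 above (illustrative only; it assumes the edge pixel points of the reference image and its center are already known, and all coordinates are hypothetical):

```python
import math

def reference_pixel(edge_points, center):
    # S221: the reference pixel point is the edge point farthest from the center of the reference image.
    cx, cy = center
    return max(edge_points, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))

def ellipse_center_on_side_screen(edge_points, center, side_screen_x):
    # S222: align the center of the side-screen elliptical image with the Y coordinate of the reference pixel point.
    _, ref_y = reference_pixel(edge_points, center)
    return (side_screen_x, ref_y)

edge = [(120, 80), (160, 100), (90, 140), (150, 170)]          # hypothetical edge pixel points
print(ellipse_center_on_side_screen(edge, center=(125, 125), side_screen_x=0))
```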
For convenience of understanding the embodiments of the present invention, the following describes the embodiments of the present invention by taking an incoming call scene as an example, but the method for displaying an edge image provided by the present invention is not limited thereto.
The reference image may be obtained by fusing a large circle with a radius R and three small circles with a radius r through a Metaball algorithm. Obtaining the above-mentioned incoming call image through the Metaball algorithm may be implemented by the prior art, and is not described herein again.
As shown in fig. 4, it is possible to arrange that the centers of three small circles are distributed on the circumference of a large circle in an equilateral triangle manner, and to use this state as the initial state of the reference image. The centers of the three small circles can move on the radius of the large circle. For a clearer description of the reference image, fig. 4 shows the outer contour of three small circles and the shape of an equilateral triangle, and in fact, the reference image seen by the user is a polygonal image including bumps.
The large circle and the small circle are fused by Metaball algorithm to form a new visual image, as shown in FIG. 5, the fused image is an irregular image with a plurality of bulges on the edge, and the size and the position of the bulges can be controlled by controlling the position relation of the small circle and the large circle. When the small circle is inscribed in the large circle, the distance between the reference pixel point (i.e., the point of the protrusion having the largest distance from the center of the large circle, hereinafter referred to as "reference point") and the center of the large circle is the smallest, the smallest distance is R, and the shape of the fused reference image is shown as "minimum protrusion state" in fig. 5; when the center of the small circle is located on the circumference of the large circle, the distance between the reference point and the center of the large circle is the largest, the largest distance is R + R, and the shape of the fused reference image is shown as the "maximum convex state" in fig. 5. In the process of ringing an incoming call, the reference image may be static or may move, and the movement mode of the reference image may be, for example, that the incoming call image is static, the protrusions of the edge continuously rotate, and each protrusion of the reference image continuously changes between the maximum protrusion state and the minimum protrusion state.
The center of the large circle is the center of the fused reference image; for a centrosymmetric reference image, the center of symmetry is the center of the reference image, and for a non-centrosymmetric reference image, the center can be determined by a method in the prior art, which is not described herein again.
The above embodiments are merely examples, and the embodiments of the present invention are not limited thereto, and the number of small circles may also be other numbers, the reference image may also be obtained by fixing one small circle and moving the center of the large circle on the radius of the small circle, and the reference image may also be other images.
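The following sketch illustrates the geometry described above, under the assumption that each small circle's center slides along a radius of the large circle; the distance from the center of the large circle to the tip of a bump then varies between R (minimum protrusion state, small circle inscribed in the large circle) and R + r (maximum protrusion state, small-circle center on the circumference). The numbers are hypothetical.

```python
def bump_tip_distance(R, r, t):
    """t in [0, 1]: 0 = minimum protrusion state, 1 = maximum protrusion state."""
    center_offset = (R - r) + t * r   # distance from the large-circle center to the small-circle center
    return center_offset + r          # distance from the large-circle center to the tip of the bump

R, r = 100.0, 20.0
print(bump_tip_distance(R, r, 0.0))   # 100.0 -> R (minimum protrusion state)
print(bump_tip_distance(R, r, 1.0))   # 120.0 -> R + r (maximum protrusion state)
```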
To describe the method for determining the display position of the edge image according to the reference point, a coordinate system as shown in fig. 6 is established, where the origin of coordinates (0, 0) is located at the upper left corner of the main screen. The edge image may be an elliptical image, and the Y-coordinate value of the center of the elliptical image (i.e., the intersection of the major axis and the minor axis of the ellipse) may be set to be the same as the Y-coordinate value of the reference point, or may be set to be greater than or less than the Y-coordinate value of the reference point.
As an alternative embodiment, the terminal device may determine the initial position of the elliptical image from the position of the reference image, and thereafter determine the position of the elliptical image from the position of the reference point of the reference image, the position of the elliptical image varying with the movement of the reference point. The size of the elliptical image is related to the size of the reference image: when the reference image is larger, the elliptical image is larger, and when the reference image is smaller, the elliptical image is smaller. The maximum size of the elliptical image may be set to 2 times the initial size. In the embodiment of the present invention, a change in the size of the elliptical image refers to a change in the length of its minor axis, and the length of the major axis of the elliptical image may be preset to 20 px (pixels). When the reference image is large, the initial size of the elliptical image may be set to be large, and when the reference image is small, the initial size of the elliptical image may be set to be small.
Alternatively, the elliptical image may be generated by processing the reference image using a preset method.
As shown in fig. 7, the reference image has a width a and a length b at the current time, and the width and the length of the reference image may vary with the change of the sound property of the audio.
The reference image is deformed according to a certain proportion. For example, the width of the reference image is set to a1 = 0.5a and the length is set to b1 = 1.5b, and the image shown in fig. 8 is obtained.
The image shown in fig. 8 is divided, for example, by cutting the left half of the image shown in fig. 8, as indicated by a broken line in fig. 8, which is a vertical line passing through the highest point of the image shown in fig. 8, and the image obtained after the division processing is shown in fig. 9.
The image shown in fig. 9 is mapped into an ellipse by a preset algorithm or software; for example, the image shown in fig. 9 can be processed by a plug-in to obtain the ellipse A shown in fig. 10, where the width of the ellipse A is a2 and the length is b2.
Preset deformation processing is performed on the ellipse A to obtain the elliptical image displayed on the side screen. For example, the ellipse A may be transversely compressed to 10% and longitudinally elongated to 135%, that is, the initial size of the elliptical image displayed on the side screen is width a3 = 10% * a2 and length b3 = 135% * b2. The resulting elliptical image displayed on the side screen is shown in fig. 11.
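A numeric sketch of the final deformation step above (the width and length of the ellipse A used here are hypothetical; only the 10% and 135% factors come from the text):

```python
def side_screen_ellipse_size(a2, b2):
    # Compress the width of ellipse A to 10% and stretch its length to 135%.
    a3 = 0.10 * a2
    b3 = 1.35 * b2
    return a3, b3

print(side_screen_ellipse_size(a2=180.0, b2=240.0))   # (18.0, 324.0)
```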
For example, the Y coordinate of the center point of the elliptical image is kept consistent with the Y coordinate of the point (i.e., the reference point) on the edge of the reference image that is farthest from the center point of the reference image, so that in visual effect the position of the elliptical image changes with the position of the reference point. The method for calculating the Y coordinate of the reference point is shown in fig. 6; for convenience of description, a schematic diagram of the reference image before fusion is used. In fig. 6, the point P indicates the point on the edge of the reference image farthest from the center point of the reference image, the coordinates of the point P are (X, Y), θ indicates the included angle between the line OP and the X axis, and the coordinates of the point P are:
X=(R+r)*cosθ,Y=(R+r)*sinθ。
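A sketch of the reference-point calculation above (coordinates here are taken relative to the center of the reference image, which is an assumption made for this sketch; R and r are hypothetical values):

```python
import math

def reference_point(R, r, theta):
    # Tip of the bump at angle theta, at distance R + r from the center of the reference image.
    x = (R + r) * math.cos(theta)
    y = (R + r) * math.sin(theta)
    return x, y

R, r = 100.0, 20.0
for theta in (0.0, math.pi / 4, math.pi / 2):
    x, y = reference_point(R, r, theta)
    # The center of the elliptical image tracks the Y coordinate of the reference point.
    print(f"theta={theta:.2f}  P=({x:.1f}, {y:.1f})  ellipse center Y={y:.1f}")
```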
Fig. 12A and 12B show the process in which the position of the elliptical image changes with the rotation of the reference image. The dynamic ellipse 1 indicates the position of the elliptical image at the first time, the Y-coordinate value of the reference point changes with the rotation of the reference image, and the dynamic ellipse 2 indicates the position of the elliptical image at the second time. The reference image may rotate according to the playing of the audio data; when the audio data stops playing (for example, when the call is over or the user clicks a pause button), all the data parsed by the terminal device are 0, and the elliptical image is either no longer displayed on the side screen or displayed at a fixed position on the side screen.
The above embodiments are merely examples, and the embodiments of the present invention are not limited thereto. The embodiment of the invention can also be applied to a music playing scene, a recording scene, an alarm clock scene, a scene using a voice assistant or other possible application scenes, correspondingly, different reference images and edge images can be determined according to different application scenes, and the positions of the edge images can be determined according to the shapes or the positions of the reference images.
According to the method for displaying the edge image, provided by the embodiment of the invention, the display position of the edge image on the side screen is determined according to the position or the shape of the reference image, so that the visual dynamic effect and the three-dimensional feeling are added for a user, and the satisfaction degree of user experience is improved.
In addition to determining the position of the edge image from the position or shape of the reference image, the electronic device may determine the shape of the edge image from the sound properties of the played audio.
Fig. 13 is a schematic flow chart of another method for displaying an edge image according to an embodiment of the present invention. The execution subject of the method may be a terminal device, as shown in fig. 13, and the method 300 includes:
S310, acquiring attribute information and an attribute information threshold of sound represented by audio data, where the audio data is data obtained by recording with a microphone, or the audio data is data used for playing with an earphone or a loudspeaker.
S320, when the value indicated by the attribute information is greater than or equal to the attribute information threshold, determining the display size of the edge image according to the value of the attribute information, so as to determine the edge image.
S330, displaying the edge image at the display position of the side screen.
In the embodiment of the present invention, the terminal device may obtain the audio data by receiving an instruction of playing the audio data by a user, for example, the user triggers music to start playing through a "play" button in a touch interface of the terminal device, and the terminal device obtains the played audio data; the terminal device may also obtain the audio data by detecting other information, such as the incoming call ring data obtained when the terminal device receives an incoming call. The terminal equipment analyzes the acquired attribute information of the audio data, determines an image corresponding to the attribute information of the audio data according to a preset rule, displays the image to a user, displays the attribute information of the audio data to the user in a dynamic edge image mode for the audio data played in time sequence, increases the perception of the user, and improves the user experience.
The attribute information of the sound includes phase information, frequency information, period information, amplitude information, and the like, and the method for displaying the edge image according to the embodiment of the present invention will be described below by taking the phase information as an example.
Depending on the sampling frequency (e.g., 10000 Hz), different amounts of frequency information can be obtained. As shown in fig. 14, the abscissa represents the frequency and the ordinate represents the phase spectrum value obtained by computing the frequency-domain data using a Fourier transform. The phase spectrum value can be, for example, a value represented by 8 bits with a range of (-128, 128), and there are N phase spectrum values in total, corresponding to N frequency values respectively.
The method of determining the shape of the edge image from the attribute information is as follows:
The phase spectrum values within a period of time (e.g., 1 ms) are summed; for example, the phase spectrum shown in fig. 14 corresponds to 1 ms, and the N phase spectrum values shown in fig. 14 are added to obtain the sum of the phase spectrum values (i.e., the value of the attribute information of the sound represented by the audio data), denoted SUM1. SUM1 is compared with the attribute information threshold: if SUM1 is smaller than the attribute information threshold, SUM1 is determined to be invalid and the shape of the edge image is kept unchanged; if SUM1 is greater than or equal to the attribute information threshold, SUM1 is determined to be a valid sound attribute value, and the edge image is processed according to the following procedure. The attribute information threshold may be preset or may be obtained by other methods, for example, by adding the sums of the phase spectrum values of the m-th to n-th periods (assuming that these sums are all valid sound attribute values, there are n - m + 1 of them in total) and dividing by n - m + 1 to obtain an average value, which may be used as the attribute information threshold, where the m-th to n-th periods are all located before the current period of time.
After the effective sound attribute value is obtained, the display size of the edge image can be determined according to the relation between the preset sound attribute value and the display size of the edge image, so that the edge image is determined, or the edge image can be determined according to other methods in the prior art, and the edge image is displayed at the display position of the side screen.
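A minimal sketch of the threshold check described above (it assumes the per-millisecond phase spectrum values and the attribute information threshold are already available; the numbers are hypothetical):

```python
def valid_sound_attribute(phase_spectrum_values, threshold):
    # Sum the phase spectrum values of one period (e.g., 1 ms); below the threshold the
    # value is discarded and the edge image keeps its current size.
    s = sum(phase_spectrum_values)
    return s if s >= threshold else None

window = [12, -7, 30, 5, 22]                          # hypothetical values for one 1 ms period
print(valid_sound_attribute(window, threshold=40))    # 62 -> valid sound attribute value
print(valid_sound_attribute(window, threshold=100))   # None -> keep the edge image unchanged
```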
According to the method for displaying the edge image, whether the value indicated by the sound information within a period of time is the effective sound attribute value or not is determined through the attribute information threshold, so that the influence of calculation errors or background noise on the sound attribute value can be reduced, the drastic change of the size of the edge image is avoided, and the satisfaction degree of user experience is improved.
Optionally, the determining a display size of the edge image according to the value of the attribute information, so as to determine the edge image, includes:
s321, determining the change rate of the size of the edge image according to the value of the attribute information, a preset maximum threshold and a preset minimum threshold;
s322, determining the display size of the edge image according to the change rate and the initial size of the edge image.
After the valid sound attribute value (i.e., SUM1) is obtained, the change rate of the size of the edge image is determined according to the valid sound attribute value, the preset maximum threshold and the preset minimum threshold. The maximum threshold may be denoted as bassMax and the minimum threshold as bassMin, where bassMax and bassMin are values obtained from statistical analysis of a large number of different types of sound files. The change rate k1 may be obtained according to the formula k1 = (SUM1 - bassMin) / (bassMax - bassMin). The above formula is merely an example, and k1 may also be obtained by other formulas.
The display size of the edge image is then determined according to k1 and the initial size of the edge image. For example, the display size d1 of the edge image is obtained by the formula d1 = (k1 + 1) * p, where p is the initial size of the edge image; the size of the edge image in the period adjacent to and immediately before the first period may also be referred to as the initial size. When the edge image is an elliptical image, the display size may indicate the length of the minor axis of the elliptical image. The above formula is merely illustrative, and d1 may also be obtained by other formulas.
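A sketch of the two formulas above; bassMax, bassMin and the initial size p are hypothetical values (the text obtains bassMax and bassMin from statistical analysis of many sound files):

```python
def display_minor_axis(sum1, bass_min, bass_max, p):
    k1 = (sum1 - bass_min) / (bass_max - bass_min)   # change rate of the size of the edge image
    d1 = (k1 + 1) * p                                # display size (minor-axis length) of the edge image
    return d1

print(display_minor_axis(sum1=62, bass_min=20, bass_max=120, p=20))   # 28.4
```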
According to the method for displaying the edge image, the change rate of the size of the edge image is obtained by processing the effective sound attribute value through the preset maximum threshold and the preset minimum threshold, so that the influence of calculation errors or background noise on the sound attribute value can be reduced, the drastic change of the size of the edge image is avoided, and the satisfaction degree of user experience is improved.
Optionally, the attribute information is an amplitude or a phase of a frequency component of the sound.
Optionally, the edge image includes a first edge image and a second edge image, the size of the first edge image is obtained according to the amplitude or the phase of the first frequency component of the sound, the size of the second edge image is obtained according to the amplitude or the phase of the second frequency component of the sound, and the frequency band of the first frequency component is different from the frequency band of the second frequency component.
In the embodiment of the present invention, the frequency band of the first frequency component of the sound is different from the frequency band of the second frequency component of the sound. For example, according to the sound frequency, sounds in different frequency ranges may be divided into bass (20 Hz to 250 Hz), midrange (250 Hz to 4 kHz) and treble (4 kHz to 20 kHz); the phase spectrum values of the low-, middle- and high-frequency sounds are counted separately, and the phase spectrum value of the first frequency component may be the phase spectrum value corresponding to the bass.
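A sketch of the band split described above (the frequency bins and phase spectrum values are hypothetical):

```python
def band_sums(freqs_hz, phase_values):
    # Group phase spectrum values into bass (20-250 Hz), midrange (250 Hz-4 kHz) and treble (4-20 kHz).
    bands = {"bass": 0.0, "mid": 0.0, "treble": 0.0}
    for f, v in zip(freqs_hz, phase_values):
        if 20 <= f < 250:
            bands["bass"] += v
        elif 250 <= f < 4000:
            bands["mid"] += v
        elif 4000 <= f <= 20000:
            bands["treble"] += v
    return bands

freqs = [50, 120, 300, 1000, 5000, 12000]
vals = [30, 25, 10, 8, 4, 2]
print(band_sums(freqs, vals))   # the bass sum can size the first edge image, another band the second
```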
In the embodiment of the present invention, the curved screen includes a first side screen and a second side screen, and the first side screen and the second side screen may be, for example, side screens located on the left and right sides of the terminal device (where the main screen faces the user). After determining the display size of the first edge image and the display size of the second edge image, the terminal device displays the first edge image and the second edge image on the two side screens, respectively. As another example, the left side screen may display a real-time edge image, and the right side screen may display an edge image delayed by a period of time (e.g., 800 ms).
Therefore, the method for displaying the edge image in the embodiment of the invention enhances the perception of the user on the attribute information of the audio data, enhances the interaction between the user and the terminal and improves the satisfaction degree of the user experience by displaying the edge image corresponding to different sound attribute values on the side screens at different positions.
Optionally, the method 300 further comprises:
S340, updating the attribute information threshold according to the value of the sound attribute.
For example, the time range of the first time interval may be determined to be 1-100 ms. The phase spectrum values of each sub-interval (e.g., 1 ms) in the first time interval are summed, and the result is denoted SUMi (i = 1 to 100); that is, the sum of the phase spectrum values in the 1st ms is SUM1, the sum of the phase spectrum values in the 2nd ms is SUM2, and so on. Assuming that all 100 sums of phase spectrum values in the first time interval are valid sound attribute values, the 100 sums are added and divided by 100, and the result, denoted SUM, can be used as the attribute information threshold for the second time interval. Assuming that the time range of the second time interval is 101 ms to 200 ms, the sum of the phase spectrum values of the audio data in the 101st ms is obtained as SUM101. If SUM101 is smaller than SUM, SUM101 is determined not to be a valid sound attribute value, and the next operation is not performed on the edge image; if SUM101 is greater than SUM, SUM101 is determined to be a valid sound attribute value, and the display size of the edge image for the 101st ms is determined. If the program is run for the first time, a value set in advance may be used as the attribute information threshold.
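A sketch of the threshold update described above (the per-millisecond sums are hypothetical):

```python
def next_threshold(valid_sums, fallback):
    # The threshold for the next interval is the mean of the valid sums of the previous interval;
    # on the first run a preset value (fallback) is used.
    return sum(valid_sums) / len(valid_sums) if valid_sums else fallback

previous_interval = [62, 55, 80, 47]                    # valid SUMi values from the 1-100 ms interval
print(next_threshold(previous_interval, fallback=50))   # attribute information threshold for 101-200 ms
```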
Therefore, the method for displaying the edge image, provided by the embodiment of the invention, determines the attribute information threshold of the current time interval according to the effective sound attribute value of the last time interval, so that the attribute information threshold can be optimized in real time, the size change of the edge image is softer, and the satisfaction degree of user experience is improved.
Optionally, the edge image is an elliptical image, and a display size of the edge image includes a minor axis length of the elliptical image.
Optionally, the audio data comprises music data, voice data, recording data, alarm data or voice assistant data.
According to the method for displaying the edge image, whether the value indicated by the attribute information of the sound within a period of time is the effective sound attribute value is determined through the attribute information threshold, and the change rate of the size of the edge image is obtained by processing the effective sound attribute value through the preset maximum threshold and the preset minimum threshold, so that the influence of calculation errors or background noise on the sound attribute value can be reduced, the drastic change of the size of the edge image is avoided, and the satisfaction degree of user experience is improved.
Fig. 15 is a schematic flow chart of a method for displaying an edge image according to another embodiment of the present invention. The execution subject of the method may be a terminal device, as shown in fig. 15, and the method 400 includes:
S410, acquiring an edge image corresponding to a reference image included in an image displayed on a main screen, or determining the information of the external environment where the electronic device is located at the current moment;
S420, determining the color of the edge image according to the attribute of the reference image or the information of the external environment, so as to determine the edge image; and
S430, displaying the edge image at the display position of the side screen.
According to different application scenes, the images displayed on the main screen are different. For example, when the terminal device answers a call, the image displayed on the main screen is a call interface image; when the terminal device plays music, the image displayed on the main screen is a music interface image. The application scenes of the terminal device further include using a voice assistant, unlocking by voice, recording and alarm clock. The terminal device needs to determine a reference image from the image displayed on the main screen.
When the main screen displays the call interface image, the terminal device may determine, from the call interface image, an incoming call image indicating the incoming call object as the reference image; for example, the incoming call image may include an avatar of the incoming call object or may include the name of the incoming call object. When the main screen displays the music interface image, the terminal device may determine the singer's avatar as the reference image from the music interface image, or may determine the album image as the reference image from the music interface image.
After determining the reference image, the terminal device determines the color of the edge image according to the attributes of the reference image, where the attributes of the reference image include color, content, transparency, shape and the like. For example, when the color of the reference image belongs to a cool color system, the terminal device may select one of preset cool colors (e.g., blue, green) as the color of the edge image; when the content displayed in the reference image is a steam engine, one of preset dark colors (e.g., dark yellow, black-and-white) may be selected as the color of the edge image; when the transparency of the reference image is high, the terminal device may select one of a preset light color system (e.g., white, light yellow) as the color of the edge image; and when the reference image is a circular image, the terminal device may select rainbow colors as the color of the edge image.
The terminal device may also determine the color of the edge image according to the external environment, such as the weather condition, the temperature and other conditions. For example, when the temperature is higher than a preset temperature, the color of the edge image may be set to a warm color; when the temperature is lower than the preset temperature, the color of the edge image may be set to a cool color.
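An illustrative sketch only; the specific colors and the temperature pivot below are assumptions, since the text only requires warm colors for high temperature and cool colors for low temperature:

```python
def edge_color_from_temperature(temperature_c, preset_temperature_c=25.0):
    # Warm color above the preset temperature, cool color below it (values are assumptions).
    return "red" if temperature_c > preset_temperature_c else "blue"

print(edge_color_from_temperature(32.0))   # "red"  (warm)
print(edge_color_from_temperature(10.0))   # "blue" (cool)
```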
Therefore, according to the method for displaying the edge image provided by the embodiment of the invention, the color of the edge image is determined according to the attribute of the reference image displayed on the main screen or the information of the external environment where the current electronic equipment is located, so that the visual experience of the electronic equipment is enhanced, and the satisfaction degree of user experience is improved.
Optionally, the external environment information includes: weather information, temperature information or humidity information of an external environment where the electronic device is located.
As an alternative embodiment, the color of the edge image may be a pure color or a gradient color, and the edge image may also have a certain transparency. A pure color is a single color that contains no gray; for example, when the attributes of a pixel are represented by HSB (hue, saturation, brightness), a color with a saturation equal to 50% is called a pure color.
Taking the edge image as an elliptical image as an example, the color filling method of the elliptical image is as follows:
A rectangular coordinate system is established with the intersection O(x0, y0) of the major axis and the minor axis of the elliptical image (hereinafter abbreviated as the ellipse) as the origin. With the length of the major axis of the ellipse set to 2a and the length of the minor axis set to 2b, the ellipse can be expressed by the following formula:

x^2 / a^2 + y^2 / b^2 = C (where C is a constant).
The value of a can be preset to 20 px, and the value range of b is [R, R + R]. Let P(xp, yp) be any point in the plane where the ellipse is located. Substituting the coordinates of P into the above ellipse equation gives the distance d from that point to the point O, that is,

d = xp^2 / a^2 + yp^2 / b^2.
By calculating d/C, it can be seen that the value at the center of the ellipse is 0, the value at a point on the edge of the ellipse is 1, the value varies uniformly from 0 to 1 from the center of the ellipse to its edge, and the value is greater than 1 outside the ellipse.
The transparency of any point in the plane where the ellipse is located is defined by the formula

Alpha = 1 - d/C.
The transparency value of each point can be obtained from this formula, producing a gradient effect in which the transparency changes gradually from the center of the ellipse toward the periphery.
The attributes of each point in the side screen display area are represented by (R, G, B, A), where the color components take the preset RGB values of the pure color and the transparency (the A value, i.e., Alpha) of each point is the value calculated by the above formula; in this way, the pure-color fill effect is obtained.
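Putting the two formulas together, a minimal Python sketch of the pure-color fill might look as follows. The pixel-grid representation, the default values of b and C, and the clamping of alpha to 0 outside the ellipse are assumptions rather than details from the patent; a = 20 px follows the preset value mentioned above.

```python
def solid_fill_rgba(width, height, rgb, a=20.0, b=40.0, C=1.0):
    """Fill a side-screen region with a pure color whose alpha fades from 1 at the
    ellipse center O to 0 at the ellipse edge, using Alpha = 1 - d/C."""
    x0, y0 = width / 2.0, height / 2.0        # ellipse center O
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            # "Distance" d of point P(x, y) to O under the ellipse equation.
            d = ((x - x0) ** 2) / (a ** 2) + ((y - y0) ** 2) / (b ** 2)
            alpha = max(0.0, 1.0 - d / C)     # clamp to 0 outside the ellipse (assumption)
            row.append((rgb[0], rgb[1], rgb[2], alpha))
        pixels.append(row)
    return pixels

# Example: a 40 x 80 pure-blue patch that is opaque at the center and fades outward.
patch = solid_fill_rgba(40, 80, rgb=(0, 0, 255))
print(patch[40][20])   # near the center: alpha close to 1
```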
As another alternative embodiment, the edge image may be filled with a gradient color. As shown in fig. 16, startColor (start color) and endColor (end color) are the two fill colors used to generate the gradient, and color is the fill color at any point. Taking startColor as the starting point and endColor as the end point, the proportion of startColor at the starting point is 100% and the proportion of endColor at the end point is 100%. For any point between the starting point and the end point, a two-color gradient fill effect can be obtained by adjusting the ratio of startColor to endColor; the color at any point can be expressed by the following formula,
color=startColor+(endColor-startColor)*ratio。
where ratio is the proportion of endColor at that point and ranges from 0 to 1.
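A per-channel Python sketch of this interpolation is shown below; applying the formula independently to each RGB channel is an assumption, since the patent writes the formula for a single color value.

```python
def lerp_color(start_color, end_color, ratio):
    """color = startColor + (endColor - startColor) * ratio, applied per RGB channel;
    ratio is the proportion of endColor at the point and lies in [0, 1]."""
    return tuple(s + (e - s) * ratio for s, e in zip(start_color, end_color))

# Example: the midpoint between red and blue.
print(lerp_color((255, 0, 0), (0, 0, 255), 0.5))   # (127.5, 0.0, 127.5)
```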
In this way, the color value and the transparency value of any point of the edge image can be obtained.
Therefore, according to the method for displaying the edge image provided by the embodiment of the invention, the color of the edge image is determined according to the attribute of the reference image displayed on the main screen or the information of the external environment where the current electronic equipment is located, so that the visual experience of the electronic equipment is enhanced, and the satisfaction degree of user experience is improved.
The above embodiments describe the method for displaying an edge image. To implement the method, an electronic device for displaying an edge image includes hardware structures and/or software modules corresponding to the respective functions. Those skilled in the art will readily appreciate that, in combination with the exemplary units and algorithm steps described in the embodiments disclosed herein, the present invention can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiment of the present invention, functional units of the electronic device displaying the edge image may be divided according to the above method example; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiment of the present invention is schematic and is only a logical function division; there may be other division manners in actual implementation.
In the case of an integrated unit, fig. 17A shows a schematic diagram of a possible structure of the electronic device involved in the above-described embodiment. The electronic device 1700 includes: a processing unit 1702 and a curved screen 1703, the processing unit 1702 being configured to control and manage actions of the electronic device 1700, for example, the processing unit 1702 being configured to enable the electronic device 1700 to perform S210 and S220 shown in fig. 2, and/or other processes for the techniques described herein. Curved screen 1703 is used to support electronic device 1700 in displaying images, such as the edge image described in FIG. 2. The electronic device 1700 may further include a storage unit 1701 for storing program codes and data of the electronic device 1700.
The processing unit 1702 is configured to acquire an edge image corresponding to a reference image included in the image displayed on the main screen, determine the display position of the edge image on the side screen according to the position or the shape of the reference image, and control the curved screen 1703 to display the edge image at the display position of the side screen.
The processing unit 1702 may be a processor or a controller, such as a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and it may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of devices implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The curved screen 1703 may be, for example, an Organic Light-Emitting Diode (OLED) screen. The storage unit 1701 may be a memory.
When the processing unit 1702 is a processor and the storage unit 1701 is a memory, the electronic device displaying an edge image according to the embodiment of the present invention may be the electronic device shown in fig. 17B.
Referring to fig. 17B, the electronic device 1710 includes: a processor 1712, a memory 1711, and a curved screen 1713. Optionally, electronic device 1710 can also include bus 1714. The curved screen 1713, the processor 1712, and the memory 1711 may be connected to each other through a bus 1714; the bus 1714 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 1714 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 17B, but this is not intended to represent only one bus or type of bus.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Therefore, according to the electronic device for displaying the edge image provided by the embodiment of the invention, the reference image is determined from the image displayed on the main screen, and the display position of the edge image on the side screen is determined according to the position or the shape of the reference image, so that the position of the edge image and the shape or the position of the image displayed on the main screen can be dynamically combined, the visual experience of the terminal device is enhanced, and the satisfaction degree of user experience is improved.
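As a loose illustration of how a display position on the side screen can follow the reference image (in the reference-pixel-point formulation recited in claim 1 later in this document), the Python sketch below picks the edge pixel farthest from the image center. The pixel-list representation and the choice to reuse that pixel's vertical coordinate as the side-screen position are assumptions, not details from the patent.

```python
import math

def side_screen_anchor(ref_pixels):
    """Pick a side-screen display position from the shape of the reference image.

    ref_pixels is a list of (x, y) coordinates belonging to the reference image.
    The reference pixel point is the pixel on the image edge farthest from the
    image center; the edge image is then anchored at that pixel's vertical
    coordinate on the side screen.
    """
    cx = sum(x for x, _ in ref_pixels) / len(ref_pixels)
    cy = sum(y for _, y in ref_pixels) / len(ref_pixels)
    ref_x, ref_y = max(ref_pixels, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
    return ref_y  # vertical display position on the side screen (assumption)

# Example: a small square reference image; any corner is a valid reference pixel point.
print(side_screen_anchor([(10, 10), (10, 20), (20, 10), (20, 20), (15, 15)]))
```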
In the case of an integrated unit, fig. 18A shows a schematic diagram of a possible structure of the electronic device involved in the above-described embodiment. The electronic device 1800 includes: a processing unit 1802 and a curved screen 1803, the processing unit 1802 configured to control and manage actions of the electronic device 1800, for example, the processing unit 1802 configured to enable the electronic device 1800 to perform S310, S320, and S330 of FIG. 13, and/or other processes for the techniques described herein. The curved screen 1803 is used to support the electronic device 1800 to display images, such as the edge image shown in FIG. 13. The electronic device 1800 may also include a storage unit 1801 for storing program codes and data for the electronic device 1800.
The processing unit 1802 is configured to obtain attribute information of a sound represented by audio data and an attribute information threshold, where the audio data is data recorded by a microphone or data to be played through an earphone or a speaker; to determine, when the value indicated by the attribute information is greater than or equal to the attribute information threshold, the display size of the edge image according to the value of the attribute information, thereby determining the edge image; and to control the curved screen 1803 to display the edge image at the display position of the side screen.
The processing unit 1802 may be a processor or a controller, such as a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and it may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of devices implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The curved screen 1803 may be, for example, an Organic Light-Emitting Diode (OLED) screen. The storage unit 1801 may be a memory.
When the processing unit 1802 is a processor and the storage unit 1801 is a memory, the electronic device for displaying an edge image according to the embodiment of the present invention may be the electronic device shown in fig. 18B.
Referring to fig. 18B, the electronic device 1810 includes: a processor 1812, a memory 1811, and a curved screen 1813. Optionally, electronic device 1810 may also include bus 1814. Among them, the curved screen 1813, the processor 1812, and the memory 1811 may be connected to each other through a bus 1814; the bus 1814 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 1814 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 18B, but this does not indicate only one bus or one type of bus.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Therefore, according to the electronic device for displaying the edge image provided by the embodiment of the invention, whether the sound attribute values within a period of time are valid sound attribute values is determined by means of a reference value, and the rate of change of the size of the edge image is obtained by processing the valid sound attribute values with the preset maximum threshold and the preset minimum threshold. This reduces the influence of calculation errors or background noise on the sound attribute values, avoids drastic changes in the size of the edge image, and improves the satisfaction of user experience.
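For illustration, this size computation can be sketched in Python as below. The formulas k1 = (SUM1 - bassMin) / (bassMax - bassMin) and d1 = (k1 + 1) * p are those recited in claim 3 later in this document; the clamping of k1 to [0, 1], the handling of values below the attribute information threshold, and the function name are assumptions.

```python
def edge_image_size(sum1, attr_threshold, bass_min, bass_max, p):
    """Map a sound attribute value SUM1 onto a display size d1 for the edge image,
    using k1 = (SUM1 - bassMin) / (bassMax - bassMin) and d1 = (k1 + 1) * p,
    where p is the initial size of the edge image."""
    if sum1 < attr_threshold:
        # Below the attribute information threshold: keep the initial size (assumption).
        return p
    k1 = (sum1 - bass_min) / (bass_max - bass_min)
    k1 = max(0.0, min(k1, 1.0))   # clamp so the size stays within [p, 2p] (assumption)
    return (k1 + 1.0) * p

# Example: a value of 750 between thresholds 500 and 1000 scales a 10 px image to 15 px.
print(edge_image_size(750, 600, 500, 1000, 10))   # 15.0
```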
In the case of an integrated unit, fig. 19A shows a schematic view of a possible structure of the electronic device involved in the above-described embodiment. The electronic device 1900 includes: a processing unit 1902 and a curved screen 1903, the processing unit 1902 being configured to control and manage actions of the electronic device 1900, e.g., the processing unit 1902 being configured to support the electronic device 1900 to perform S410, S420, and S430 shown in fig. 15, and/or other processes for the techniques described herein. Curved screen 1903 is used to support electronic device 1900 for displaying images, such as the edge image described in FIG. 15. The electronic device 1900 may further include a storage unit 1901 for storing program codes and data of the electronic device 1900.
The processing unit 1902 is configured to acquire an edge image corresponding to a reference image included in the image displayed on the main screen, or to determine information of the external environment where the electronic device is located at the current moment; to determine the color of the edge image according to the attribute of the reference image or the information of the external environment, thereby determining the edge image; and to control the curved screen 1903 to display the edge image at the display position of the side screen.
The processing unit 1902 may be a processor or a controller, such as a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and it may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of devices implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The curved screen 1903 may be, for example, an Organic Light-Emitting Diode (OLED) screen. The storage unit 1901 may be a memory.
When the processing unit 1902 is a processor and the storage unit 1901 is a memory, the electronic device displaying an edge image according to an embodiment of the present invention may be the electronic device shown in fig. 19B.
Referring to fig. 19B, the electronic device 1910 includes: a processor 1912, a memory 1911, and a curved screen 1913. Optionally, electronic device 1910 may also include a bus 1914. Among them, the curved screen 1913, the processor 1912, and the memory 1911 may be connected to each other through a bus 1914; the bus 1914 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 1914 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 19B, but it is not intended that there be only one bus or one type of bus.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Therefore, according to the electronic device for displaying the edge image, provided by the embodiment of the invention, the color of the edge image is determined according to the attribute of the reference image displayed on the main screen or the information of the external environment where the current electronic device is located, so that the visual experience of the electronic device is enhanced, and the satisfaction degree of user experience is improved.
In the embodiment of the present invention, the sequence number of each process does not mean the execution sequence, and the execution sequence of each process should be determined by the function and the internal logic of the process, and should not limit the implementation process of the embodiment of the present invention at all.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software executed by a processor. The software instructions may consist of corresponding software modules that may be stored in Random Access Memory (RAM), flash memory, Read-Only Memory (ROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), a register, a hard disk, a removable disk, a Compact Disc Read-Only Memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. Additionally, the ASIC may reside in an electronic device. Of course, the processor and the storage medium may also reside as discrete components in an electronic device.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in this invention may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The above embodiments further describe the objects, technical solutions, and advantages of the present invention in detail. It should be understood that the above embodiments are only exemplary embodiments of the present invention and are not intended to limit the scope of the present invention; any modifications, equivalent substitutions, improvements, and the like made on the basis of the technical solutions of the present invention shall be included in the scope of the present invention.

Claims (16)

1. A method for displaying an edge image, applied to an electronic device having a curved screen including a main screen and a side screen, the method comprising:
acquiring an edge image corresponding to a reference image included in the image displayed on the main screen;
determining the display position of an edge image on the side screen according to the shape of the reference image;
displaying the edge image at a display position of the side screen;
wherein the determining the display position of the edge image on the side screen according to the shape of the reference image comprises:
determining reference pixel points of the reference image according to the shape of the reference image, wherein the reference pixel points are pixel points on the edge of the reference image, which are farthest from the center of the reference image;
and determining the display position of the edge image on the side screen according to the position of the reference pixel point.
2. The method according to claim 1, wherein the reference image includes an incoming call image indicating an incoming call object, and the edge image is an elliptical image whose position changes with a change in the position of the reference pixel point.
3. A method for displaying an edge image, applied to an electronic device having a curved screen including a main screen and a side screen, the method comprising:
acquiring attribute information and an attribute information threshold of sound represented by audio data, wherein the audio data is data obtained by recording with a microphone, or the audio data is data used for playing with an earphone or a loudspeaker;
when the value indicated by the attribute information is greater than or equal to the attribute information threshold value, determining the display size of the edge image according to the value of the attribute information, and thus determining the edge image;
displaying the edge image at a display position of the side screen;
wherein the determining a display size of an edge image according to the value of the attribute information, thereby determining the edge image includes:
determining a rate of change k1 of the size of the edge image according to a value SUM1 of the attribute information, a preset maximum threshold bassMax, and a preset minimum threshold bassMin; and
determining a display size d1 of the edge image according to the rate of change k1 and an initial size p of the edge image;
wherein k1 = (SUM1 - bassMin) / (bassMax - bassMin), and d1 = (k1 + 1) * p.
4. The method according to claim 3, wherein the attribute information is an amplitude or a phase of a frequency component of the sound.
5. The method of claim 3,
the edge images comprise a first edge image and a second edge image, the size of the first edge image is obtained according to the amplitude or the phase of a first frequency component of the sound, the size of the second edge image is obtained according to the amplitude or the phase of a second frequency component of the sound, and the frequency band of the first frequency component is different from the frequency band of the second frequency component.
6. The method of claim 3, further comprising:
and updating the attribute information threshold according to the value of the sound attribute information.
7. The method of claim 3, wherein the edge image is an elliptical image, and wherein the display size of the edge image comprises a minor axis length of the elliptical image.
8. The method of claim 3, wherein the audio data comprises music data, voice data, recording data, alarm data, or voice assistant data.
9. An electronic device for displaying an edge image, the electronic device comprising a curved screen, the curved screen comprising a main screen and a side screen, the electronic device further comprising a processing unit, the processing unit being configured to:
acquiring an edge image corresponding to a reference image included in the image displayed on the main screen;
determining the display position of an edge image on the side screen according to the shape of the reference image;
controlling the side screen to display the edge image at the display position of the side screen;
wherein the processing unit is specifically configured to:
determining reference pixel points of the reference image according to the shape of the reference image, wherein the reference pixel points are pixel points on the edge of the reference image, which are farthest from the center of the reference image;
and determining the display position of the edge image on the side screen according to the position of the reference pixel point.
10. The electronic device according to claim 9, wherein the reference image includes an incoming call image indicating an incoming call object, and wherein the edge image is an elliptical image whose position changes with a change in position of the reference pixel point.
11. An electronic device for displaying an edge image, the electronic device comprising a curved screen, the curved screen comprising a main screen and a side screen, the electronic device further comprising a processing unit, the processing unit being configured to:
acquiring attribute information and an attribute information threshold of sound represented by audio data, wherein the audio data is data obtained by recording with a microphone, or the audio data is data used for playing with an earphone or a loudspeaker;
when the value indicated by the attribute information is greater than or equal to the attribute information threshold value, determining the display size of the edge image according to the value of the attribute information, and thus determining the edge image;
controlling the side screen to display the edge image at the display position of the side screen;
wherein the processing unit is specifically configured to:
determining a rate of change k1 of the size of the edge image according to a value SUM1 of the attribute information, a preset maximum threshold bassMax, and a preset minimum threshold bassMin; and
determining a display size d1 of the edge image according to the rate of change k1 and an initial size p of the edge image;
wherein k1 = (SUM1 - bassMin) / (bassMax - bassMin), and d1 = (k1 + 1) * p.
12. The electronic device according to claim 11, wherein the attribute information is an amplitude or a phase of a frequency component of the sound.
13. The electronic device of claim 11,
the edge images comprise a first edge image and a second edge image, the size of the first edge image is obtained according to the amplitude or the phase of a first frequency component of the sound, the size of the second edge image is obtained according to the amplitude or the phase of a second frequency component of the sound, and the frequency band of the first frequency component is different from the frequency band of the second frequency component.
14. The electronic device of claim 11, wherein the processing unit is further configured to:
update the attribute information threshold according to the value of the attribute information of the sound.
15. The electronic device of claim 11, wherein the edge image is an elliptical image, and wherein a display size of the edge image comprises a minor axis length of the elliptical image.
16. The electronic device of claim 11, wherein the audio data comprises music data, voice data, recording data, alarm data, or voice assistant data.
CN201610912250.1A 2016-10-19 2016-10-19 Method for displaying edge image and electronic equipment Active CN106502606B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610912250.1A CN106502606B (en) 2016-10-19 2016-10-19 Method for displaying edge image and electronic equipment
PCT/CN2017/102378 WO2018072581A1 (en) 2016-10-19 2017-09-20 Edge image display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610912250.1A CN106502606B (en) 2016-10-19 2016-10-19 Method for displaying edge image and electronic equipment

Publications (2)

Publication Number Publication Date
CN106502606A CN106502606A (en) 2017-03-15
CN106502606B true CN106502606B (en) 2020-04-28

Family

ID=58294362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610912250.1A Active CN106502606B (en) 2016-10-19 2016-10-19 Method for displaying edge image and electronic equipment

Country Status (2)

Country Link
CN (1) CN106502606B (en)
WO (1) WO2018072581A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106502606B (en) * 2016-10-19 2020-04-28 华为机器有限公司 Method for displaying edge image and electronic equipment
CN107230428B (en) * 2017-05-27 2020-07-03 北京小米移动软件有限公司 Curved screen display method and device and terminal
EP3644166A4 (en) 2017-08-31 2020-07-15 Huawei Technologies Co., Ltd. Display screen and mobile terminal
CN107680048B (en) * 2017-09-05 2020-11-10 信利(惠州)智能显示有限公司 Edge display effect processing method
CN112438007A (en) * 2018-09-26 2021-03-02 深圳市欢太科技有限公司 Charging prompting method and related product
CN111127543B (en) * 2019-12-23 2024-04-05 北京金山安全软件有限公司 Image processing method, device, electronic equipment and storage medium
CN111796788A (en) * 2020-07-01 2020-10-20 芯颖科技有限公司 Method and system for compressing and accessing edge information of arc-shaped display screen
CN112034984A (en) * 2020-08-31 2020-12-04 北京字节跳动网络技术有限公司 Virtual model processing method and device, electronic equipment and storage medium
CN113885829B (en) * 2021-10-25 2023-10-31 北京字跳网络技术有限公司 Sound effect display method and terminal equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574453A (en) * 2013-10-17 2015-04-29 付晓宇 Software for expressing music with images
CN105579949A (en) * 2013-07-11 2016-05-11 三星电子株式会社 User terminal device for displaying contents and methods thereof
CN105892981A (en) * 2016-03-25 2016-08-24 乐视控股(北京)有限公司 Display method and device and mobile equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101577114B (en) * 2009-06-18 2012-01-25 无锡中星微电子有限公司 Method and device for implementing audio visualization
KR101515629B1 (en) * 2012-01-07 2015-04-27 삼성전자주식회사 Method and apparatus for providing event of portable device having flexible display unit
KR101515623B1 (en) * 2012-05-14 2015-04-28 삼성전자주식회사 Method and apparatus for operating functions of portable terminal having bended display
CN104598215B (en) * 2014-05-30 2018-01-05 小米科技有限责任公司 Audio figure methods of exhibiting and device
CN104317499B (en) * 2014-10-23 2018-08-10 广州三星通信技术研究有限公司 Device and method for the wallpaper that screen is arranged in the terminal
CN106502606B (en) * 2016-10-19 2020-04-28 华为机器有限公司 Method for displaying edge image and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105579949A (en) * 2013-07-11 2016-05-11 三星电子株式会社 User terminal device for displaying contents and methods thereof
CN104574453A (en) * 2013-10-17 2015-04-29 付晓宇 Software for expressing music with images
CN105892981A (en) * 2016-03-25 2016-08-24 乐视控股(北京)有限公司 Display method and device and mobile equipment

Also Published As

Publication number Publication date
CN106502606A (en) 2017-03-15
WO2018072581A1 (en) 2018-04-26

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant