CN117714876A - Image display method, storage medium, electronic device and chip - Google Patents


Info

Publication number
CN117714876A
CN202310852019.8A (application) · CN117714876A (publication)
Authority
CN
China
Prior art keywords
image
zoom
sampling
timestamp
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310852019.8A
Other languages
Chinese (zh)
Inventor
赵聪
李越
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310852019.8A priority Critical patent/CN117714876A/en
Publication of CN117714876A publication Critical patent/CN117714876A/en
Legal status: Pending

Landscapes

  • Studio Devices (AREA)

Abstract

The application relates to the field of terminal technologies and discloses an image display method, a storage medium, an electronic device and a chip. The method includes: smoothing a plurality of zoom values obtained by converting sampling point data during a user's zoom operation, to obtain a plurality of smoothed zoom values; acquiring a target timestamp of a frame to be displayed; acquiring, from the mark timestamps of the smoothed zoom values, the zoom value of a mark timestamp that satisfies a time-distance relation with the target timestamp; and determining an estimated zoom value of the frame to be displayed from the zoom value of that mark timestamp. In this way, during zooming on the mobile phone, the zoom value of each frame to be displayed is estimated from its target timestamp and the zoom values at the mark timestamps satisfying the time-distance relation. This prevents the mobile phone from zooming directly with uneven zoom values when its sampling is uneven, ensures that the zoomed image changes uniformly, and improves the user's visual experience.

Description

Image display method, storage medium, electronic device and chip
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an image display method, a storage medium, an electronic device, and a chip.
Background
When shooting or previewing an image with an electronic device such as a mobile phone, a user can zoom the image in or out by sliding a zoom bar, clicking a zoom point, or sliding and pinching with two fingers. The mobile phone samples the data of the points at which the user clicks or slides on the screen, determines a zoom value for the user's zoom operation from the sampling point data, and enlarges or shrinks the displayed image based on that zoom value.
However, the mobile phone's data sampling may be uneven, which in turn causes stuttering and non-uniform image changes while the image is being enlarged or shrunk, degrading the user's visual experience.
Disclosure of Invention
The embodiment of the application provides an image display method, a storage medium, electronic equipment and a chip.
In a first aspect, the present application provides an image display method, applied to an electronic device, including: displaying a first image; and in response to a zoom operation by a user, performing zoom processing from a first focal length to a second focal length on the first image, so that the first image corresponding to the first focal length changes to be displayed as a second image corresponding to the second focal length, wherein the zoom processing includes: acquiring a plurality of first zoom values during the zooming from the first focal length to the second focal length, and a first timestamp corresponding to each first zoom value; determining a preset second timestamp while the first image changes to be displayed as the second image; determining the first timestamps satisfying a first time-distance relationship with the second timestamp; determining a second zoom value based on the first zoom values corresponding to the first timestamps satisfying the first time-distance relationship; and displaying a third image based on the second zoom value.
In this embodiment of the present application, during zooming, the mobile phone estimates the zoom value (the second zoom value) of each frame to be displayed (the third image) from the frame's target timestamp (the second timestamp) and the zoom values (the first zoom values) at the timestamps (the first timestamps) that satisfy the time-distance relationship with the target timestamp. This avoids zooming directly with uneven zoom values when sampling is uneven, ensures that the image changes uniformly during zooming, and improves the user's visual experience.
In a possible implementation manner of the first aspect, the obtaining a plurality of first zoom values during a zooming process from the first focal length to the second focal length, and a first timestamp corresponding to each first zoom value includes: sampling the sliding track of the zooming operation to obtain a plurality of sampling points; determining a first zoom value corresponding to each sampling point based on the characteristic data of each sampling point, wherein the characteristic data comprises a change position and a change speed corresponding to each sampling point; based on the sampling time of each sampling point, a first timestamp of a first zoom value corresponding to each sampling point is determined.
In a possible implementation manner of the first aspect, the determining, based on the feature data of each sampling point, a first zoom value corresponding to each sampling point includes: determining first zoom values of i sampling points based on the feature data of the i sampling points, wherein i is greater than or equal to 1; determining an initial zoom value of the m-th sampling point based on the feature data of the m-th sampling point; and determining a first zoom value of the m-th sampling point based on the first zoom values of the i sampling points and the initial zoom value, wherein the sampling times of the i sampling points are all later than that of the m-th sampling point; or, the sampling times of the i sampling points are all earlier than that of the m-th sampling point; or, when i is greater than 1, the sampling times of one part of the i sampling points are later than that of the m-th sampling point and those of the other part are earlier.
In the embodiment of the application, determining the zoom value output for the current sampling point (the m-th point) as a weighted combination of the zoom values already output for other sampling points (the i points) and the current point's own data reduces the difference between the zoom values of adjacent sampling points in the zoom operation. This keeps the subsequently generated zoom curve smooth and avoids outliers, such as an abnormally large or small zoom value at some point of the curve, so that subsequent zoom processing based on the smooth zoom curve keeps the image changing uniformly during zooming.
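As an illustrative sketch (not the patent's implementation), the weighted combination described above can be written in Python; the specific weight values below are this sketch's assumptions:

```python
def smooth_zoom_value(other_zoom_values, initial_zoom, weights):
    """Weighted combination of the zoom values already output for i other
    sampling points with the m-th point's own initial zoom value.
    The weight distribution is an assumption of this sketch."""
    assert len(weights) == len(other_zoom_values) + 1
    assert abs(sum(weights) - 1.0) < 1e-9  # weights form a distribution
    smoothed = weights[-1] * initial_zoom  # last weight goes to the m-th point
    for w, z in zip(weights, other_zoom_values):
        smoothed += w * z
    return smoothed

# two earlier points weighted 0.25 each, the current initial value weighted 0.5
print(smooth_zoom_value([1.0, 1.2], 2.0, [0.25, 0.25, 0.5]))
```

Because the current point's initial value only contributes part of the output, a single outlier sample moves the output less than it would if used directly.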
In a possible implementation of the first aspect, the method further includes: the i sampling points are the first i sampling points of the mth sampling point; or, the i sampling points are the last i sampling points of the mth sampling point.
In a possible implementation of the first aspect, the second timestamp is the timestamp of a third image, where the third image is any frame displayed while the first image changes into the second image; the timestamp of the first image is taken as the second timestamp of the 1st-frame third image. Determining the preset second timestamp while the first image changes into the second image includes: acquiring the (n-1)-th-frame third image and its second timestamp; and determining the second timestamp of the n-th-frame third image based on a preset time interval.
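A minimal Python sketch of the timestamp recurrence above, assuming millisecond timestamps; the 33 ms preset interval (roughly a 30 fps display) is an illustrative value:

```python
def frame_timestamps(first_ts_ms, interval_ms, num_frames):
    """The 1st frame's second timestamp is the first image's timestamp; each
    n-th frame's timestamp is the (n-1)-th frame's plus the preset interval."""
    ts = [first_ts_ms]
    for _ in range(num_frames - 1):
        ts.append(ts[-1] + interval_ms)
    return ts

print(frame_timestamps(0, 33, 4))  # 33 ms is an assumed preset interval
```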
In a possible implementation of the first aspect, determining the second zoom value based on the first zoom values corresponding to the first timestamps satisfying the first time-distance relationship includes: obtaining k first timestamps satisfying the first time-distance relationship with the second timestamp, wherein k is greater than or equal to 1; and determining the second zoom value based on the first zoom values corresponding to the k first timestamps and the second timestamp, wherein the k first timestamps are all later than the second timestamp; or, the k first timestamps are all earlier than the second timestamp; or, when k is greater than 1, one part of the k first timestamps is later than the second timestamp and the other part is earlier.
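The selection of the k nearest first timestamps can be sketched as follows; linear interpolation between the two nearest neighbours is this sketch's assumption, since the claim only requires a time-distance relationship:

```python
def estimate_second_zoom(second_ts, first_ts, first_zoom, k=2):
    """Take the k first timestamps nearest in time to the second timestamp
    and estimate the second zoom value; with two neighbours, interpolate
    linearly between their first zoom values."""
    nearest = sorted(zip(first_ts, first_zoom),
                     key=lambda p: abs(p[0] - second_ts))[:k]
    if len(nearest) == 1:
        return nearest[0][1]
    (t0, z0), (t1, z1) = sorted(nearest[:2])  # order the two by time
    if t0 == t1:
        return z0
    # clamp so extrapolation outside [t0, t1] does not overshoot
    alpha = min(max((second_ts - t0) / (t1 - t0), 0.0), 1.0)
    return z0 + alpha * (z1 - z0)

print(estimate_second_zoom(25, [0, 20, 40], [1.0, 2.0, 3.0]))
```

With k = 1 this reduces to the "zoom value of the timestamp closest to the target timestamp" case mentioned later in the description.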
In a possible implementation of the first aspect, the method further includes: the k first time stamps are the first k time stamps of the second time stamp; or, k first time stamps are the last k time stamps of the second time stamp; or alternatively.
In a possible implementation of the first aspect, displaying the third image based on the second zoom value includes: and carrying out zooming processing on the previous frame image of the third image based on the second zooming value to obtain the third image.
In a possible implementation of the first aspect, the first image includes at least one of the following images: when a user performs shooting preview through the electronic equipment, displaying an image by the electronic equipment; when a user photographs through the electronic equipment, displaying an image by the electronic equipment; when a user views a photo or video through the electronic device, the electronic device displays an image.
In a second aspect, embodiments of the present application provide a readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to implement any one of the image display methods provided in the first aspect and various possible implementations of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory for storing instructions for execution by one or more processors of the electronic device; and a processor, which is one of the processors of the electronic device, for executing the instructions stored in the memory to implement the above first aspect and any one of the image display methods provided by the various possible implementations of the above first aspect.
In a fourth aspect, embodiments of the present application provide a program product, where the program product includes instructions that, when executed by an electronic device, enable the electronic device to implement any one of the image display methods provided in the first aspect and various possible implementations of the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a data interface, and the processor reads instructions stored on a memory through the data interface to execute any one of the image display methods provided in the first aspect and various possible implementations of the first aspect.
Drawings
FIGS. 1A-1C illustrate some scene graphs of an image display method, according to some embodiments of the present application;
FIG. 2 illustrates a software architecture diagram of a mobile phone, according to some embodiments of the present application;
FIG. 3 illustrates a schematic view of a zoom curve, according to some embodiments of the present application;
FIGS. 4A-4B illustrate schematic diagrams of another zoom curve, according to some embodiments of the present application;
FIG. 5 illustrates a schematic diagram of a filtering algorithm, according to some embodiments of the present application;
FIG. 6 illustrates a schematic diagram of an estimation algorithm, according to some embodiments of the present application;
FIGS. 7A-7C illustrate schematic diagrams of some data smoothing modules, according to some embodiments of the present application;
FIG. 8 illustrates a flowchart of a method of displaying images, according to some embodiments of the present application;
FIG. 9 illustrates a hardware architecture diagram of a mobile phone, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, an image display method, a storage medium, an electronic device, and a chip.
The technical solutions of the present application will be described with reference to fig. 1 to 9.
In some embodiments, a user may implement photographing, video recording, and the like through a photographing function of an electronic device such as a mobile phone. For example, as shown in fig. 1A, when the user wants to take a picture, the user may open a camera application in the mobile phone 10, the camera application opens a "take a picture" function, and the mobile phone 10 displays a picture taking interface. Subsequently, when the user clicks the button 11, the mobile phone 10 can take a picture of the display image of the photographing area 12 in the photographing interface through the camera application.
In some embodiments, the user may zoom in or out of the display image of the photographing region 12 by a zoom operation such as sliding a zoom bar, clicking a zoom point, or a two-finger sliding, pinching, or the like.
For example, as shown in fig. 1A, when the user clicks the zoom bar 13, the mobile phone 10 jumps from the photographing interface shown in fig. 1A to the zoom interface at the "1x" zoom value (zoom ratio) shown in fig. 1B. As shown in fig. 1B, the zoom bar 13 is enlarged and displays the zoom button 14. The zoom value of the zoom bar 13 shown in fig. 1B is "1x", that is, the display image in the photographing region 12 shown in fig. 1B is at a 1x focal length. In some embodiments, the zoom bar 13 displays a plurality of zoom values, and the user may slide the zoom button 14 up and down with a finger to select a corresponding zoom value.
For example, as shown in fig. 1B, when the user slides the zoom button 14 down and selects the "5.0x" zoom value, the mobile phone 10 jumps from the "1x" zoom interface shown in fig. 1B to the "5.0x" zoom interface shown in fig. 1C, and the focal length of the display image in the photographing area 12 changes from 1x to 5x. As shown in fig. 1C, the zoom value of the zoom bar 13 is "5.0x", that is, the display image in the photographing region 12 is at a 5x focal length. It will be appreciated that the display image in the capture area 12 of fig. 1C is magnified five times compared to the display image of the capture area 12 of fig. 1B.
For a better understanding of the embodiments of the present application, a zoom implementation procedure is exemplified below.
In some embodiments, FIG. 2 illustrates a software architecture diagram of the handset 10. As shown in fig. 2, the handset 10 includes an application (android application package, APK) layer 21, a service (service) layer 22, a hardware abstraction (hardware abstract layer, HAL) layer 23, and a Driver (DRV) layer 24.
In some embodiments, APK layer 21 includes a camera (camera) APK module 211; the service layer 22 includes a touch service module 221 and a camera service module 222; the HAL layer 23 includes a HAL module 231 (e.g., a Qualcomm HAL module); the DRV layer 24 includes a touch DRV module 241 and a camera DRV module 242.
In some embodiments, when the user opens the camera application and performs a zoom operation, the touch DRV module 241 samples the sliding track that the user makes on the screen of the mobile phone 10, obtaining a plurality of sampling point data. For example, the touch DRV module 241 samples the sliding track of the user sliding down the zoom button 14 shown in fig. 1B, obtaining a plurality of sampling point data corresponding to the track of the user's zoom operation on the mobile phone 10. Subsequently, the touch DRV module 241 sends the sampling point data to the touch service module 221 through the HAL layer 23, and the touch service module 221 sends it to the APK layer 21. The APK layer 21 generates a camera zoom touch event based on the sampling point data and reports it to the camera APK module 211. In some embodiments, the touch DRV module 241 samples the sampling point data at, for example, 120Hz, i.e., 120 times per second. It will be appreciated that in the data transmission process shown on the left side of fig. 2, all transmitted data is 120Hz sampling data.
In some embodiments, after the camera APK module 211 receives the camera zoom touch event, it samples the 120Hz sampling point data in the event at a sampling frequency of, for example, 60Hz, i.e., 60 times per second, and sends the resulting 60Hz sampling point data to the camera service module 222. The camera service module 222 samples the received 60Hz data at 30Hz, i.e., 30 times per second, through a rotation buffer, converts the sampled 30Hz sampling point data into zoom values, and adds a corresponding mark timestamp to each zoom value according to the sampling order and sampling time. Subsequently, the camera service module 222 generates a zoom request from the latest zoom value obtained by real-time conversion and the mobile phone's current image frame to be processed (hereinafter simply the frame to be processed).
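The three-stage sampling chain above (120Hz driver data, 60Hz at the APK layer, 30Hz at the service layer) can be sketched in Python; picking every second sample by index is an illustrative simplification, since the actual modules sample by time:

```python
def downsample(samples, in_hz, out_hz):
    """Keep every (in_hz // out_hz)-th sample, mimicking one stage of the
    chain: driver 120Hz -> APK layer 60Hz -> service layer 30Hz."""
    step = in_hz // out_hz
    return samples[::step]

raw_120hz = list(range(12))                  # 12 touch samples from the driver
apk_60hz = downsample(raw_120hz, 120, 60)    # APK layer keeps every 2nd sample
svc_30hz = downsample(apk_60hz, 60, 30)      # service layer keeps every 2nd
print(apk_60hz, svc_30hz)
```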
It can be understood that the frame to be processed represents the current image displayed by the mobile phone, and the zoom processing is performed on the frame to be processed through the zoom value, so that the current frame to be displayed (hereinafter, simply referred to as the frame to be displayed) of the mobile phone corresponding to the zoom value can be obtained.
It will be appreciated that the frame to be displayed represents the image that the handset is about to display. In some embodiments, the frame to be processed is typically the previous frame image of the frame to be displayed. For example, if the current displayed image of the mobile phone is the first frame image, the first frame image is the frame to be processed, and the second frame image is the frame to be displayed, and the second frame image can be obtained by performing zooming processing on the first frame image.
For another example, take the display image at the "1x" zoom value in the display area shown in fig. 1B as the frame to be processed; that is, the image currently displayed by the mobile phone is the display image shown in fig. 1B. After the user performs the zoom operation shown in fig. 1B, the mobile phone converts the sampling point data of that operation into a zoom value of "5.0x" and zooms the "1x" display image of fig. 1B accordingly, obtaining the "5.0x" display image in the display area shown in fig. 1C. That is, when the display image of fig. 1B is the frame to be processed, the display image of fig. 1C is the next frame image, i.e., the frame to be displayed.
In other embodiments, after the camera APK module 211 sends the 60Hz sampling point data to the camera service module 222, the camera service module 222 may also resample the received 60Hz sampling point data through a touch point resampler (zoom ratio filter) at a sampling frequency of, for example, 30Hz, i.e., 30 times per second. Subsequently, the rotation buffer acquires the latest sampling point data from the resampled data at a sampling frequency of, for example, 30Hz, converts it into a zoom value, and generates a zoom request from the latest zoom value obtained by real-time conversion and the frame to be processed.
In some embodiments, the camera service module 222 sends the zoom request to the HAL module 231. The HAL module 231 invokes the camera DRV module 242 to control the camera hardware in the mobile phone 10, such as an ultra-wide-angle camera or a telephoto camera, to adjust the focal length based on the zoom request, so that the display image of the mobile phone 10 changes to the zoomed image at the requested zoom value.
As described above, when the user opens the camera application and performs a zoom operation, the touch DRV module 241 samples the corresponding sliding track on the screen of the mobile phone 10. In some embodiments, the sliding speed of the user's finger may be uneven, so that the sampling point data sampled by the touch DRV module 241 is also uneven, and the zoom curve formed when the sampling point data is subsequently converted into zoom values sent to the camera DRV module 242 is not smooth. For example, as shown in fig. 3, the zoom curve may be uneven due to variations in the force or speed of the user's hand, so that when the camera DRV module 242 adjusts the displayed image with each zoom value on the curve, the picture stutters and changes unevenly, affecting the user's visual experience.
In other embodiments, even if the user's finger slides at a uniform rate during the zoom operation, the zoom curve may still be uneven. For example, the carousel buffer includes a plurality of buffers, each of which normally acquires the latest sampling point data from the received data in turn at a preset time and converts it into a zoom value. As shown in fig. 4A, the carousel buffer includes buffers 011, 022 and 033, and the sampling point data includes data 01 to 05. The camera APK module 211 sends one sampling point datum to the camera service module 222 every 15ms, and the rotation buffer in the camera service module 222 samples once every 30ms. Accordingly, buffer 011 should acquire data 01, buffer 022 should acquire data 03, and buffer 033 should acquire data 05.
However, a buffer may be delayed by software processing time, making the acquired values uneven. For example, as shown in fig. 4B, the processing time of buffer 022 is 40ms. Thus, when the camera APK module 211 sends data 03 to the camera service module 222, buffer 022 is not idle and cannot acquire data 03. Once buffer 022 becomes idle, it acquires the latest data sent by the camera APK module 211, which by then may be data 04. Buffer 022 therefore acquires data 04 instead of data 03, the values in the rotation buffer become uneven, the subsequent zoom curve is not smooth, and the picture changes alternately fast and slow and unevenly, affecting the user's visual experience.
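The uneven acquisition described above can be reproduced with a small Python sketch; the arrival times, tick period, and busy interval are illustrative values matching the figs. 4A-4B narrative:

```python
def carousel_pick(arrivals, tick_ms, busy_until_ms):
    """Each buffer should take the newest sample at its scheduled tick, but a
    buffer still busy at its tick only samples once it becomes idle, so it
    may pick a later sample than intended. arrivals maps arrival time (ms)
    to data; busy_until_ms[i] is when the i-th buffer becomes idle."""
    picked = []
    for i, busy_until in enumerate(busy_until_ms):
        ready = max(i * tick_ms, busy_until)
        latest = max(t for t in arrivals if t <= ready)  # newest available
        picked.append(arrivals[latest])
    return picked

# data arrives every 15 ms; buffers tick every 30 ms;
# buffer 022 (the 2nd buffer) is busy until 45 ms due to 40 ms processing
arrivals = {0: "data01", 15: "data02", 30: "data03", 45: "data04", 60: "data05"}
print(carousel_pick(arrivals, 30, [0, 45, 60]))
```

With no delay the buffers would pick data01/data03/data05; the busy second buffer skips data03 and picks data04 instead, which is the unevenness the method addresses.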
Therefore, the application provides an image display method. During zooming, the mobile phone does not determine the zoom value (an example of the second zoom value) of a frame to be displayed directly from the zoom values (examples of the first zoom value) converted from the sampling point data of the user's zoom operation. Instead, using the target timestamp (an example of the second timestamp) of the frame to be displayed, it selects, from the mark timestamps (examples of the first timestamp) of the zoom values converted from the sampling point data, the zoom values whose mark timestamps satisfy a time-distance relation with the target timestamp, and estimates the zoom value of the frame to be displayed from them. For example, the estimated zoom value may be determined from the zoom value whose timestamp is closest in time to the target timestamp. After obtaining the estimated zoom value, the mobile phone performs image processing with it, for example zooming the previous frame image, to obtain the frame to be displayed (an example of the third image) corresponding to the estimated zoom value.
In this way, during zooming, the zoom value of each frame to be displayed is estimated from the frame's target timestamp and the zoom values at the mark timestamps satisfying the time-distance relation. There may be multiple frames to be displayed (third images); they are the image frames that appear while zooming from the initial image (an example of the first image) to the target image (an example of the second image). This avoids zooming directly with uneven zoom values when the mobile phone's sampling is uneven, ensures uniform image changes during zooming, and improves the user's visual experience.
It can be understood that, because the mobile phone determines the zoom value of a frame to be displayed from the zoom values at the timestamps satisfying the time-distance relation with the frame's target timestamp, the speed at which the zoomed picture changes tracks the speed of the user's zoom operation, improving the interactive experience. For example, the faster the user's zoom operation, the faster the zoom value changes, and the faster the corresponding picture changes.
In some embodiments, when the user performs a zoom operation, the mobile phone may determine the type of the zoom operation from the sliding direction of the corresponding sliding track on the screen. For example, as shown in fig. 1B, when the sliding track of the zoom operation is a downward slide, the operation enlarges the image; conversely, when the sliding track is an upward slide, the operation shrinks the image. In other embodiments, the user may zoom with a two-finger slide: if the two fingers move toward each other, the zoom operation shrinks the image; if they move apart, it enlarges the image. This is not particularly limited.
In some embodiments, after determining the type of the user's zoom operation, the mobile phone may determine the zoom value during the operation from the sliding distance, sliding speed, and the like of the sliding track. For example, the larger the sliding distance of the track, the larger the change in the zoom value: a 1mm slide changes the zoom value by a factor of 1, a 2mm slide by a factor of 2, and so on. Likewise, the faster the sliding speed of the track, the faster the zoom value changes: at a sliding speed of 1mm/s the zoom value changes by a factor of 1 within 1s; at 2mm/s, by a factor of 2 within 1s; and so on. In other embodiments, the type of zoom operation and the zoom value during it may also be determined in other ways, without particular limitation.
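The example mappings above can be sketched as follows; the 1x-per-mm and 1x-per-(mm/s) scales are only the illustrative values from this paragraph, not values fixed by the method:

```python
def zoom_change(slide_distance_mm, slide_speed_mm_s):
    """Illustrative mapping: each 1 mm of slide distance changes the zoom
    value by a factor of 1, and each 1 mm/s of slide speed changes it by a
    factor of 1 per second. Both scales are assumptions of this sketch."""
    change_magnitude = slide_distance_mm * 1.0  # 2 mm slide -> 2x change
    change_rate = slide_speed_mm_s * 1.0        # 2 mm/s -> 2x within 1 s
    return change_magnitude, change_rate

print(zoom_change(2.0, 2.0))
```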
In some embodiments, the mobile phone samples a plurality of sampling points on the sliding track during the user's zoom operation, and from the data of each sampling point, such as its change position and change speed, determines the type of zoom operation and the zoom value corresponding to each sampling point's data. The mobile phone may then add a corresponding timestamp to each zoom value based on the sampling time of the corresponding sampling point.
In some embodiments, after the mobile phone obtains the zoom values converted from the plurality of sampling points, it may smooth the obtained zoom values to ensure the smoothness of the subsequently generated zoom curve, and thereby the uniform change of the mobile phone image during subsequent zooming. The smoothing process is exemplified below.
In some embodiments, as shown in fig. 5, the smoothing process may be implemented by a filtering algorithm shown in the following formula (1):

y[n] = (B/A) · y[n-1] + ((A - B)/A) · x[n]    (1)

In formula (1), y[n] represents the zoom value output for the n-th sampling point data, y[n-1] the zoom value output for the (n-1)-th sampling point data, x[n] the n-th sampling point data, B/A the weight of the zoom value corresponding to the (n-1)-th sampling point data, and (A - B)/A the weight of the n-th sampling point data.
It will be appreciated that with the filtering algorithm shown in formula (1), the zoom value y[n] output for the current sampling point data can be determined from the zoom value y[n-1] output for the previous sampling point data and the current sampling point data x[n]. For example, when the user starts zooming, the focal length of the image displayed by the mobile phone before zooming is taken as the zoom value of the first sampling point data. The zoom value of each subsequent sampling point can thus be determined from the zoom value output for the previous sampling point and the current sampling point data.
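A Python sketch of the recursion in formula (1); the weights B = 4 and A = 5 (i.e., B/A = 0.8) are assumed values for illustration:

```python
def filter_zoom(samples, B=4, A=5):
    """Recursive filter of formula (1):
    y[n] = (B/A) * y[n-1] + ((A - B)/A) * x[n].
    Here the first sample stands in for the focal length displayed before
    zooming starts, seeding y for the first sampling point."""
    y = [samples[0]]
    for x in samples[1:]:
        y.append((B / A) * y[-1] + ((A - B) / A) * x)
    return y

# the jump to 5.0 in x[n] is damped in y[n], smoothing the zoom curve
print(filter_zoom([1.0, 1.0, 5.0, 1.0]))
```

Since B/A weights the previous output, a larger B/A holds on to history longer and damps single-sample jumps more strongly.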
In some embodiments, the zoom value of the current sampling point data may also be determined from a plurality of preceding sampling point data, such as the previous two sampling point data of the current sampling point data; or from one or more subsequent sampling point data of the current sampling point data; or from both one or more preceding and one or more subsequent sampling point data of the current sampling point data, and the like, which is not particularly limited herein.
It can be understood that, compared with directly outputting the zoom value of the current sampling point data from the current sampling point data alone, determining the zoom value of the current sampling point data by weighting it together with the zoom values of other sampling point data in the zooming operation reduces the difference between the zoom values of adjacent sampling point data, for example between the zoom value of the current sampling point data and that of the previous sampling point data. This ensures the smoothness of the subsequently generated zoom curve, avoids outliers such as an abnormally large or small zoom value at a certain point of the curve, and, since subsequent zooming is performed through the smooth zoom curve, ensures uniform change display of the mobile phone image during the subsequent zooming process.
In other embodiments, the smoothing process may be implemented by other algorithms or in other ways, for example by filtering out sampling point data greater than or less than a preset smoothing threshold, so as to make the subsequently generated zoom curve smoother, and the like, which is not particularly limited.
In some embodiments, after the mobile phone obtains the smoothed zoom values, the target timestamp of the frame to be displayed (hereinafter denoted t_current) may be estimated from the timestamp of the frame to be processed, that is, the image of the frame preceding the frame to be displayed; alternatively, the target timestamp t_current may be estimated from the timestamp of the original image frame, that is, the first frame image.
For example, when the user performs a zooming operation, the image displayed by the mobile phone before the zooming operation is taken as the first frame image, and the system time at which the mobile phone last displayed the first frame image is obtained as the timestamp of the first frame image. In this way, based on the image frame rate of the contact image sensor (contact image sensor, CIS) in the mobile phone, the target timestamp t_current of each subsequent frame to be displayed after the first frame image in the zooming process can be estimated.
For example, suppose the frame interval of the mobile phone CIS is 30 ms/frame, i.e. the mobile phone updates and displays a new image every 30 ms. That is, if the timestamp of the first frame image is 1 ms, the timestamp corresponding to the second frame image should be 31 ms, the timestamp corresponding to the third frame image should be 61 ms, and so on; in this way, the timestamp of each frame image after the first frame image in the zooming process can be estimated. It will be appreciated that the target timestamp of the frame to be displayed is determined by the frame interval of the mobile phone CIS, for example based on an integer multiple of that interval. In some embodiments, after the mobile phone estimates the target timestamp t_current, it may select, from the mark timestamps of the smoothed zoom values, the zoom values corresponding to the mark timestamps that satisfy the preset time-distance relation with the target timestamp t_current, and determine the estimated zoom value of the frame to be displayed accordingly.
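The timestamp estimation above (first-frame timestamp plus integer multiples of the sensor frame interval) can be sketched as:

```python
# A minimal sketch of estimating target timestamps from the first frame's
# timestamp and the sensor frame interval. The 30 ms interval matches the
# example above; actual values depend on the device's CIS.
def target_timestamps(first_ts_ms, frame_interval_ms, num_frames):
    """Return the estimated timestamp of each frame after the first."""
    return [first_ts_ms + k * frame_interval_ms for k in range(1, num_frames + 1)]
```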
For example, as shown in fig. 6, based on the target timestamp t_current, one mark timestamp t_1 is selected from the mark timestamps of the smoothed zoom values that is smaller than the target timestamp t_current but closest to it; at the same time, another mark timestamp t_2 is selected that is greater than the target timestamp t_current but closest to it. Based on the two mark timestamps t_1 and t_2 and their corresponding zoom values ZoomRatio_1 and ZoomRatio_2, the estimated zoom value ZoomRatio_current of the frame to be displayed is determined.
In some embodiments, the estimated zoom value may be obtained by an estimation algorithm shown in the following formula (2):

ZoomRatio_current = ZoomRatio_1 + ((t_current - t_1) / (t_2 - t_1)) × (ZoomRatio_2 - ZoomRatio_1)    (2)
In formula (2), ZoomRatio_current represents the estimated zoom value of the frame to be displayed, t_current represents the target timestamp of the frame to be displayed, t_1 represents one mark timestamp, ZoomRatio_1 represents the zoom value corresponding to mark timestamp t_1, t_2 represents the other mark timestamp, and ZoomRatio_2 represents the zoom value corresponding to mark timestamp t_2.
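The interpolation of formula (2) can be sketched as follows, assuming the two mark timestamps bracket the target timestamp (t_1 < t_current < t_2) as in fig. 6:

```python
# A minimal sketch of formula (2): linear interpolation between the two
# smoothed zoom values whose mark timestamps t1 < t_current < t2 bracket
# the target timestamp of the frame to be displayed.
def estimate_zoom(t_current, t1, zoom1, t2, zoom2):
    """Estimate the zoom value at t_current from the bracketing samples."""
    if t2 == t1:
        return zoom1  # degenerate case: both mark timestamps coincide
    ratio = (t_current - t1) / (t2 - t1)
    return zoom1 + ratio * (zoom2 - zoom1)
```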
It can be understood that the estimation algorithm shown in formula (2) estimates the zoom value of the frame to be displayed from two zoom values originally sampled during the user's zooming operation. Therefore, when the mobile phone's sampling is uneven, data that was missed or mis-sampled due to the uneven sampling can be estimated through the estimation algorithm, so that zooming is not performed directly with unevenly sampled zoom values, thereby avoiding an uneven zoom curve, stuttering picture changes, non-uniform image changes and other phenomena caused by uneven sampling.
It can be appreciated that the mark timestamp satisfying the time-distance relation with the target timestamp t_current may be the timestamp attached to the zoom value, or may be the sampling time of the sampling point data corresponding to the zoom value, which is not particularly limited herein.
It can be understood that, by determining the estimated zoom value corresponding to the frame to be displayed from the zoom values of the mark timestamps closest to the target timestamp of each frame image, the change speed of the zoomed image can better follow the operation speed of the user's zooming operation, improving the visual experience of interactive operation. For example, the faster the user's zooming operation, the faster the zoom value changes, and the faster the corresponding picture changes.
In other embodiments, the estimated zoom value may also be determined by other algorithms or in other ways. For example, the zoom value at the target timestamp t_current may be determined from the zoom values of a plurality of mark timestamps preceding t_current, such as the previous two mark timestamps; or from the zoom values of a plurality of mark timestamps following t_current, such as the next two; or from the zoom values of both a plurality of preceding and a plurality of following mark timestamps, and the like, which is not particularly limited herein.
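As one of the alternatives mentioned above, when only mark timestamps before t_current are available, the zoom value can be linearly extrapolated from the previous two marks. This is an illustrative sketch of that variant, not the patented formula:

```python
# A minimal sketch of extrapolating the zoom value at t_current from the
# two most recent mark timestamps t1 < t2 <= t_current (an alternative when
# no later mark timestamp is available yet).
def extrapolate_zoom(t_current, t1, zoom1, t2, zoom2):
    """Linearly extend the trend of the last two smoothed zoom values."""
    if t2 == t1:
        return zoom2
    slope = (zoom2 - zoom1) / (t2 - t1)
    return zoom2 + slope * (t_current - t2)
```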
In some embodiments, as shown in fig. 7A, a data estimation module 70 is generated based on the filtering algorithm and the estimation algorithm described above. Along the path shown in fig. 7A, the filtering algorithm in the data estimation module 70 smooths the zoom values obtained directly from the sampling point data of the user's zooming operation, so as to reduce the difference between the zoom value of the previous sampling point data and that of the current sampling point data, ensure the smoothness of the subsequently generated zoom curve, and avoid outliers.
Further, after the filtering algorithm obtains the smoothed zoom values, the estimation algorithm in the data estimation module 70 can estimate the zoom value of the frame to be displayed from two zoom values originally sampled during the user's zooming operation. Therefore, when the mobile phone samples unevenly, data that was missed or mis-sampled due to the uneven sampling can be estimated through the estimation algorithm. Moreover, since the data used by the estimation algorithm has already been smoothed, the data output by the estimation algorithm is smoother, which ensures the smoothness of the subsequent zoom curve, further ensures uniform change display of the image in the zooming process, and improves the user's visual experience.
In some embodiments, the input data of the data estimation module 70, i.e. the data to be smoothed, may be data at the 120 Hz sampling frequency in the APK layer 21, at the 60 Hz sampling frequency in the service layer 22, or at the 30 Hz sampling frequency in the HAL layer 23. In other embodiments, the data to be smoothed may be mixed data from any two or more of the APK layer 21, the service layer 22 and the HAL layer 23, which is not particularly limited. It will be appreciated that the higher the data sampling frequency, the higher the accuracy of the resulting data.
For example, as shown in fig. 7B, when the data to be smoothed is set as the data at the 60 Hz sampling frequency in the service layer 22, the data estimation module 70 may be provided in the service layer 22. Or, for example, as shown in fig. 7C, when the data to be smoothed is set as the data at the 30 Hz sampling frequency in the HAL layer 23, the data estimation module 70 may be provided in the HAL layer 23.
It will be appreciated that the handset 10 described above is merely one example of an electronic device. In other embodiments, the electronic device may be any electronic device having an image display function, including, but not limited to, a tablet, a wearable device, an in-vehicle device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a notebook, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), etc., without limitation.
It is understood that the photographing preview scene shown in fig. 1A to 1C described above is only one example of application scenes of the present application. In other embodiments, the application scene may further include, without limitation, a zoom scene of an image when capturing an image by a camera application, a zoom scene of an image when previewing a photograph or playing a video by an album application, and the like.
Some embodiments of the present application will be illustrated below by taking the mobile phone 10 as an example of an electronic device through the photographing preview scene shown in fig. 1A to 1C and the frame shown in fig. 7C.
Fig. 8 illustrates a flow chart of steps of an image display method, according to some embodiments of the present application. As shown in fig. 8 and 7C, the steps include:
s801: the DRV layer 24 samples the user zoom operation at a first frequency to obtain a plurality of first sample point data.
In some embodiments, when the user opens the camera application and performs a zooming operation, the touch DRV module 241 in the DRV layer 24 samples the user's sliding track on the screen of the mobile phone 10 at the first frequency, for example a sampling frequency of 120 Hz, to obtain a plurality of first sampling point data. For example, the touch DRV module 241 samples the sliding track of the user sliding the zoom button 14 shown in fig. 1B downward, so as to obtain a plurality of first sampling point data corresponding to the sliding track on the mobile phone 10 when the user performs the zooming operation.
S802: the DRV layer 24 transmits the first sample point data to the HAL layer 23.
S803: the HAL layer 23 sends the first sample point data to the service layer 22.
S804: the service layer 22 transmits the first sample point data to the APK layer 21.
In some embodiments, after obtaining the first sample point data, the DRV layer 24 sends the first sample point data to the APK layer 21 sequentially through the HAL layer 23 and the service layer 22.
S805: the APK layer 21 samples the first sampling point data according to the second frequency to obtain second sampling point data.
In some embodiments, the APK layer 21 generates a camera zoom touch event based on the first sampling point data and reports the camera zoom touch event to the camera APK module 211 in the APK layer 21. When the camera APK module 211 receives the camera zoom touch event, the first sampling point data in the camera zoom touch event is sampled according to a second frequency, for example, a 60Hz sampling frequency, so as to obtain second sampling point data.
S806: the APK layer 21 transmits the second sample point data to the service layer 22.
S807: the service layer 22 samples the second sampling point data according to the third frequency, and converts the sampled data into corresponding first zoom values.
In some embodiments, the APK layer 21 sends the second sampling point data to the service layer 22 through the camera APK module 211. Subsequently, the service layer 22 resamples the received second sampling point data at a third frequency, such as a sampling frequency of 30 Hz, through the touch point resampler, and converts the resampled data into the corresponding first zoom values.
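The layered resampling of steps S801 to S807 (120 Hz raw touch samples reduced to 60 Hz, then to 30 Hz, then converted to zoom values) can be sketched as follows. Stride decimation and the linear position-to-zoom mapping are illustrative assumptions, not the patent's specified conversions:

```python
# A minimal sketch of the layered resampling in S801-S807. The decimation
# strategy and the position-to-zoom mapping are hypothetical examples.
def decimate(samples, factor):
    """Keep every factor-th sample (e.g. 120 Hz -> 60 Hz with factor=2)."""
    return samples[::factor]

def to_zoom_values(positions, base_zoom=1.0, gain=0.01):
    """Map slide positions to zoom values (hypothetical linear mapping)."""
    return [base_zoom + gain * p for p in positions]

raw_120hz = list(range(8))          # first sampling point data (S801)
apk_60hz = decimate(raw_120hz, 2)   # second sampling point data (S805)
svc_30hz = decimate(apk_60hz, 2)    # resampled in the service layer (S807)
first_zoom_values = to_zoom_values(svc_30hz)
```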
S808: the service layer 22 sends the first zoom value to the HAL layer 23.
S809: the HAL layer 23 performs smoothing processing on the first zoom value based on a filtering algorithm, and predicts a second zoom value of the frame to be displayed based on the smoothed first zoom value and an estimation algorithm.
In some embodiments, the service layer 22 sends the converted first zoom value to the data estimation module 70 in the HAL layer 23 through the touch point resampler. The data estimation module 70 performs smoothing processing on the first zoom value through a filtering algorithm, and then the data estimation module 70 estimates a zoom value of the frame to be displayed through an estimation algorithm and the smoothed first zoom value. Specific implementation can be seen from the descriptions of the above formula (1) and formula (2), and the description is omitted here.
It can be understood that, based on the filtering algorithm in the data estimation module 70, the zoom values obtained directly from the sampling point data of the user's zooming operation can be smoothed, so as to reduce the difference between the zoom value of the previous sampling point data and that of the current sampling point data, ensure the smoothness of the subsequently generated zoom curve, and avoid outliers.
It can be appreciated that after the filtering algorithm obtains the smoothed zoom values, the estimation algorithm in the data estimation module 70 may estimate the zoom value of the frame to be displayed from two zoom values originally sampled during the user's zooming operation. Therefore, when the mobile phone samples unevenly, data that was missed or mis-sampled due to the uneven sampling can be estimated through the estimation algorithm. Moreover, since the data used by the estimation algorithm has already been smoothed, the data output by the estimation algorithm is smoother, which ensures the smoothness of the subsequent zoom curve, further ensures uniform change display of the image in the zooming process, and improves the user's visual experience.
S810: the HAL layer 23 sends the second zoom value to the DRV layer 24.
In some embodiments, after obtaining the second zoom value of the frame to be displayed, the data estimation module 70 sends it to the HAL module 231 in the HAL layer 23. The HAL module 231 generates a zoom request based on the second zoom value of the frame to be displayed and the frame to be processed, i.e. the frame preceding the frame to be displayed, and sends the zoom request to the DRV layer 24.
S811: the DRV layer 24 performs zooming processing on the frame to be processed based on the second zoom value, resulting in a frame to be displayed.
In some embodiments, the DRV layer 24 controls, through the camera DRV module 242, the camera hardware in the mobile phone 10, such as the ultra-wide-angle camera and the telephoto camera, to perform focal length adjustment based on the zoom request, so that the frame to be processed displayed by the mobile phone 10 is replaced by the frame to be displayed with the second zoom value, thereby implementing zoom adjustment of the image.
In some embodiments, if the second zoom values of the plurality of frames to be displayed need to be determined, the image display method may be repeated to obtain the second zoom values of the corresponding frames to be displayed.
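The per-frame processing of S809 can be sketched end to end: smooth the incoming first zoom values with formula (1), then interpolate the second zoom value at each frame's target timestamp with formula (2). This is a minimal sketch; the filter coefficients and timestamps are illustrative assumptions:

```python
import bisect

# A minimal end-to-end sketch of S809: formula (1) smoothing followed by
# formula (2) interpolation at each frame's target timestamp. Coefficients
# (a, b) are illustrative assumptions.
def second_zoom_values(timestamps, zooms, frame_ts, a=4.0, b=3.0):
    """Smooth (timestamp, zoom) samples, then interpolate at each frame timestamp."""
    smoothed = [zooms[0]]
    for x in zooms[1:]:
        smoothed.append((b / a) * smoothed[-1] + ((a - b) / a) * x)
    out = []
    for t in frame_ts:
        i = bisect.bisect_right(timestamps, t)   # first mark timestamp > t
        i = min(max(i, 1), len(timestamps) - 1)  # clamp to a valid bracket
        t1, t2 = timestamps[i - 1], timestamps[i]
        z1, z2 = smoothed[i - 1], smoothed[i]
        out.append(z1 + (t - t1) / (t2 - t1) * (z2 - z1))
    return out
```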
It can be understood that in the zooming process, the zoom value of each frame to be displayed is estimated through its target timestamp and the zoom values corresponding to the mark timestamps that satisfy the time-distance relation with that target timestamp, which avoids performing zooming directly with uneven zoom values when the mobile phone samples unevenly, ensures uniform change display of the image in the zooming process, and improves the user's visual experience.
It can be understood that in the zooming process, the mobile phone determines the zoom value of the frame to be displayed from the zoom values of the mark timestamps satisfying the time-distance relation with the target timestamp of the frame to be displayed, so that the change speed of the zoomed image can better follow the operation speed of the user's zooming operation, improving the visual experience of interactive operation. For example, the faster the user's zooming operation, the faster the zoom value changes, and the faster the corresponding picture changes.
Fig. 9 is a schematic diagram illustrating a hardware structure of the mobile phone 10 according to some embodiments of the present application.
It should be understood that the structure illustrated in the embodiments of the present application is not intended to limit the specific implementation of the handset 10. In other embodiments of the present application, the handset 10 may include more or fewer components than shown, or certain components may be combined, certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
As shown in fig. 9, the handset 10 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-Network Processor (NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
For example, in some embodiments, the processor 110 may execute instructions corresponding to the methods provided in the foregoing embodiments.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated access is avoided, the waiting time of the processor 110 is reduced, and the processing efficiency is improved.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect to a charger to charge the mobile phone 10, or may be used to transfer data between the mobile phone 10 and peripheral devices. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not limited to the structure of the mobile phone 10. In other embodiments of the present application, the mobile phone 10 may also use different interfacing manners, or a combination of multiple interfacing manners, as in the above embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the handset 10. The charging management module 140 can also supply power to the mobile phone 10 through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 10 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 10 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G/6G, etc. applied to the handset 10. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or video through display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied to the handset 10. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the handset 10 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset 10 can communicate with a network and other devices via wireless communication technology. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include, among other things, a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite-based augmentation systems, SBAS).
The handset 10 implements display functions through a GPU, a display 194, an application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the handset 10 may include 1 or N display screens 194, N being a positive integer greater than 1.
The cell phone 10 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like. The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to the naked eye. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the handset 10 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the handset 10 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, etc.
Video codecs are used to compress or decompress digital video. The handset 10 may support one or more video codecs. Thus, the handset 10 can play or record video in a variety of encoding formats, such as moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the signal-transmission mode between human brain neurons, and can also continuously self-learn. The NPU enables intelligent applications of the mobile phone 10, such as image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to interface with an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 10. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the handset 10, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the handset 10 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
For example, in some embodiments, the internal memory 121 may be used to temporarily store instructions of the methods provided by the foregoing embodiments.
The handset 10 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from it, to bring the card into contact with or separate it from the handset 10. The handset 10 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards, of the same or different types, may be inserted into the same SIM card interface 195 simultaneously. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The mobile phone 10 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the handset 10 employs an eSIM, i.e., an embedded SIM card. The eSIM card may be embedded in the handset 10 and cannot be separated from it.
In some embodiments, embodiments of the present application also provide a computer readable medium having program code stored thereon, which when run on a computer causes the computer to perform the methods of the above aspects.
In some embodiments, embodiments of the present application also provide a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of the above aspects.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module is a logic unit/module. Physically, one logic unit/module may be one physical unit/module, part of one physical unit/module, or a combination of multiple physical units/modules; the physical implementation of the logic unit/module itself is not the most important aspect, and the combination of functions implemented by the logic units/modules is the key to solving the technical problem posed by the present application. Furthermore, to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem presented by the present application; this does not indicate that the above device embodiments contain no other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the present application.

Claims (12)

1. An image display method applied to an electronic device, comprising:
displaying a first image;
in response to a zoom operation by a user, performing a zoom process from a first focal length to a second focal length on the first image, so that the first image corresponding to the first focal length changes to be displayed as a second image corresponding to the second focal length, wherein the zoom process includes:
acquiring a plurality of first zoom values in the zooming process from the first focal length to the second focal length, and a first timestamp corresponding to each first zoom value;
determining a preset second timestamp during the process in which the first image changes to be displayed as the second image;
determining the first timestamp satisfying a first time distance relationship with the second timestamp;
determining a second zoom value based on the first zoom value corresponding to the first timestamp satisfying the first time-distance relationship;
and displaying a third image based on the second zoom value.
2. The method of claim 1, wherein the obtaining a plurality of first zoom values during the zooming process from the first focal length to the second focal length, and a first timestamp corresponding to each of the first zoom values, comprises:
sampling a sliding track of the zoom operation to obtain a plurality of sampling points;
determining the first zoom value corresponding to each sampling point based on the characteristic data of each sampling point, wherein the characteristic data comprises a change position and a change speed corresponding to each sampling point;
and determining the first time stamp of the first zoom value corresponding to each sampling point based on the sampling time of each sampling point.
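Outside the claim language, the conversion in claim 2 — sampling a slide gesture and mapping each sampling point's characteristic data (change position and change speed) to a first zoom value with a first timestamp — can be sketched roughly as follows. The linear mapping and the gain constants are illustrative assumptions for the sketch, not the formula used by the patent:

```python
def zoom_values_from_samples(samples, base_zoom=1.0, pos_gain=0.01, speed_gain=0.002):
    """Map gesture sampling points to (first timestamp, first zoom value) pairs.

    Each sample is (timestamp_ms, position, velocity). The linear
    position/velocity mapping below is a hypothetical stand-in for the
    device's actual feature-to-zoom conversion.
    """
    stamped_zooms = []
    for t, pos, vel in samples:
        # Hypothetical mapping: zoom grows with gesture displacement and speed.
        z = base_zoom + pos * pos_gain + vel * speed_gain
        stamped_zooms.append((t, z))
    return stamped_zooms
```

Because touch events arrive at irregular times, the timestamps in the output are generally unevenly spaced, which is the situation the later claims address.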
3. The method according to claim 2, wherein determining the first zoom value corresponding to each sampling point based on the feature data of each sampling point includes:
determining first zoom values of i sampling points based on the characteristic data of the i sampling points, wherein i is greater than or equal to 1;
determining an initial zoom value of the mth sampling point based on the characteristic data of the mth sampling point;
determining a first zoom value of the mth sampling point based on the first zoom values of the i sampling points and the initial zoom value, wherein,
the sampling times of the i sampling points are all later than the sampling time of the mth sampling point; or,
the sampling times of the i sampling points are all earlier than the sampling time of the mth sampling point; or,
corresponding to i being greater than 1, the sampling times of one part of the i sampling points are later than the sampling time of the mth sampling point, and the sampling times of the other part are earlier than the sampling time of the mth sampling point.
4. A method according to claim 3, characterized in that the method further comprises:
the i sampling points are the i sampling points immediately preceding the mth sampling point; or,
the i sampling points are the i sampling points immediately following the mth sampling point.
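As one hedged illustration of claims 3 and 4, deriving the mth first zoom value from its own initial zoom value together with i neighbouring sampling points can be read as windowed smoothing. The window sizes and the plain average below are assumptions for the sketch, not the claimed computation:

```python
def smooth_zoom(stamped_zooms, i_before=2, i_after=0):
    """Smooth each raw zoom value using neighbouring samples.

    stamped_zooms is a list of (timestamp, raw_zoom) pairs in sampling
    order. i_before > 0 draws on preceding samples, i_after > 0 on
    following ones; averaging over the window is an illustrative choice
    of smoothing function.
    """
    smoothed = []
    for m, (t, _) in enumerate(stamped_zooms):
        lo = max(0, m - i_before)           # window start (clamped)
        hi = min(len(stamped_zooms), m + i_after + 1)  # window end (clamped)
        window = [z for _, z in stamped_zooms[lo:hi]]
        smoothed.append((t, sum(window) / len(window)))
    return smoothed
```

Smoothing removes sample-to-sample jitter, but the smoothed values still carry the uneven original timestamps; the remaining claims resample them onto the display's frame times.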
5. The method of claim 1, wherein the second timestamp is a timestamp of the third image, and the third image is any frame of image displayed during the process in which the first image changes to be displayed as the second image;
a timestamp of the first image is taken as the second timestamp of the 1st frame of the third image;
the determining a preset second timestamp during the process in which the first image changes to be displayed as the second image comprises:
acquiring the (n-1)th frame of the third image and the second timestamp of the (n-1)th frame of the third image;
and determining the second timestamp of the nth frame of the third image based on a preset time interval.
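The timestamp recursion in claim 5 — the 1st frame inherits the first image's timestamp, and each later frame's second timestamp is the previous frame's plus a preset time interval — amounts to a fixed display cadence. A minimal sketch; the 16 ms default (roughly a 60 Hz refresh) is an assumed value, not specified by the patent:

```python
def frame_timestamps(first_image_ts, interval_ms=16, n_frames=4):
    """Second timestamps for frames 1..n_frames.

    Frame 1 takes the first image's timestamp; frame n is frame n-1
    plus the preset display interval (claim 5's recursion).
    """
    ts = [first_image_ts]
    for _ in range(n_frames - 1):
        ts.append(ts[-1] + interval_ms)
    return ts
```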
6. The method of any of claims 1 to 5, wherein the determining a second zoom value based on the first zoom value corresponding to the first timestamp satisfying the first time-distance relationship comprises:
obtaining k first timestamps satisfying a first time-distance relation with the second timestamp, wherein k is greater than or equal to 1;
determining the second zoom value based on the first zoom values corresponding to the k first timestamps and the second timestamp, wherein,
the k first timestamps are all later than the second timestamp; or,
the k first timestamps are all earlier than the second timestamp; or,
corresponding to k being greater than 1, one part of the k first timestamps are later than the second timestamp, and the other part of the k first timestamps are earlier than the second timestamp.
7. The method of claim 6, wherein the method further comprises:
the k first timestamps are the k timestamps immediately preceding the second timestamp; or,
the k first timestamps are the k timestamps immediately following the second timestamp.
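Taken together, claims 6 and 7 estimate a frame's zoom value from the k first timestamps nearest its second timestamp. One common realisation of such a time-distance relation, shown here purely as an assumed sketch for the k = 2 case (one earlier and one later sample), is linear interpolation:

```python
import bisect

def zoom_at(frame_ts, stamped_zooms):
    """Estimate a second zoom value at frame_ts.

    stamped_zooms is a list of (first timestamp, first zoom value)
    pairs sorted by timestamp. The value is linearly interpolated
    between the nearest earlier and later samples, clamping at the ends.
    """
    times = [t for t, _ in stamped_zooms]
    j = bisect.bisect_left(times, frame_ts)
    if j == 0:                       # frame falls before the first sample
        return stamped_zooms[0][1]
    if j == len(times):              # frame falls after the last sample
        return stamped_zooms[-1][1]
    (t0, z0), (t1, z1) = stamped_zooms[j - 1], stamped_zooms[j]
    w = (frame_ts - t0) / (t1 - t0)  # time-distance weight in [0, 1]
    return z0 + w * (z1 - z0)
```

Evaluated at the uniform frame timestamps of claim 5, this yields a uniformly changing zoom even when the touch samples themselves arrive unevenly, which is the effect the abstract describes.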
8. The method of any of claims 1 to 5, wherein displaying a third image based on the second zoom value comprises:
and performing zoom processing on the previous frame image of the third image based on the second zoom value to obtain the third image.
9. The method of any one of claims 1 to 5, wherein the first image comprises at least one of the following images:
an image displayed by the electronic device when a user performs a shooting preview through the electronic device;
an image displayed by the electronic device when a user takes a photograph through the electronic device;
and an image displayed by the electronic device when a user views a photo or video through the electronic device.
10. A computer readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to implement the method of any of claims 1 to 9.
11. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the electronic device;
and a processor, being one of the processors of the electronic device, for executing instructions stored in the memory to implement the method of any one of claims 1 to 9.
12. A chip comprising a processor and a data interface, the processor reading instructions stored on a memory via the data interface to perform the method of any one of claims 1 to 9.
CN202310852019.8A 2023-07-11 2023-07-11 Image display method, storage medium, electronic device and chip Pending CN117714876A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310852019.8A CN117714876A (en) 2023-07-11 2023-07-11 Image display method, storage medium, electronic device and chip


Publications (1)

Publication Number Publication Date
CN117714876A true CN117714876A (en) 2024-03-15

Family

ID=90163014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310852019.8A Pending CN117714876A (en) 2023-07-11 2023-07-11 Image display method, storage medium, electronic device and chip

Country Status (1)

Country Link
CN (1) CN117714876A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102404497A (en) * 2010-09-16 2012-04-04 北京中星微电子有限公司 Digital zoom method and device for image pickup equipment
JP2012151658A (en) * 2011-01-19 2012-08-09 Ricoh Co Ltd Imaging apparatus and imaging method
US20170201691A1 (en) * 2014-07-02 2017-07-13 Sony Corporation Zoom control device, zoom control method, and program
JP2017224940A (en) * 2016-06-14 2017-12-21 キヤノン株式会社 Image processing method and image processing apparatus
WO2022200570A1 (en) * 2021-03-26 2022-09-29 Fotonation Limited Method of controlling a camera



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination