CN117667654A - Dynamic effect test method and electronic equipment

Dynamic effect test method and electronic equipment

Info

Publication number
CN117667654A
Authority
CN
China
Prior art keywords
dynamic effect
dynamic
curve
effective
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211058836.8A
Other languages
Chinese (zh)
Inventor
刘咪咪
李涛
汤化
谌知学
廉文杰
田孝俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211058836.8A
Publication of CN117667654A
Legal status: Pending


Abstract

The application provides a dynamic effect test method and an electronic device, applied in the technical field of terminals, for solving the problem of low efficiency caused by having to obtain a large amount of data manually in advance during dynamic effect testing. The method includes: acquiring video data corresponding to a dynamic effect to be tested; de-framing the video data corresponding to the dynamic effect to be tested to obtain effective image frames, where the effective image frames are the image frames remaining after deleting repeated frames among adjacent image frames; obtaining effective data according to the effective image frames, where the effective data includes parameters generated during the motion of the effective image frames; obtaining an actual dynamic effect curve according to the effective data; and fitting the actual dynamic effect curve to a preset curve to obtain a fitting degree, where the fitting degree is used to represent the dynamic effect test result. The method and the device are applied to the dynamic effect design process of electronic devices.

Description

Dynamic effect test method and electronic equipment
Technical Field
The application relates to the technical field of terminals, and in particular to a dynamic effect test method and an electronic device.
Background
With the popularization and development of mobile internet technology, the products that users encounter have become more mature and diverse, users perceive them more deeply, and users have gradually become more critical. As a result, more and more products strive to improve the user experience and differentiate themselves in various ways. Among these means, dynamic effect design is a particularly fine-grained and deep one. Taking the dynamic effect generated by the motion of an image as an example, a dynamic effect is, in short, the effect produced by the continuous motion of an image. To improve the user experience, a designed dynamic effect needs to be tested so that it can be continuously optimized.
In existing dynamic effect test schemes, a large amount of data needs to be obtained in advance, and the dynamic effect process is analyzed from that data. Obtaining the data, however, consumes considerable manpower and time, so the efficiency of the dynamic effect test is low.
Disclosure of Invention
The application provides a dynamic effect test method and an electronic device, which can improve the efficiency of dynamic effect testing.
In order to achieve the above purpose, the following technical solutions are adopted in the embodiments of the present application.
In a first aspect, the present application provides a dynamic effect test method, including: obtaining video data corresponding to a dynamic effect to be tested; de-framing the video data corresponding to the dynamic effect to be tested to obtain effective image frames, where the effective image frames are the image frames remaining after deleting repeated frames among adjacent image frames; obtaining effective data according to the effective image frames, where the effective data includes parameters generated during the motion process of the effective image frames; obtaining an actual dynamic effect curve according to the effective data; and fitting the actual dynamic effect curve to a preset curve to obtain a fitting degree, where the fitting degree is used to represent the dynamic effect test result.
Therefore, by automatically analyzing the video data corresponding to the dynamic effect to be tested, the fitting degree of the actual dynamic effect curve to the preset curve can be obtained, yielding the dynamic effect test result, which can improve the efficiency of dynamic effect testing.
In one possible implementation, the dynamic effect is a dynamic effect generated by translating an interface, for example, translating a display interface up, down, left, or right (which may also be described as sliding); in this scene, the effective data includes the translation distance and the translation direction of the interface. Alternatively, the dynamic effect is generated when a service or a folder is opened or closed, for example, when an icon of an application (app) is opened or during the process of opening a folder; in this scene, the effective data includes the width and the scaling direction of the icon corresponding to the service or the folder. Alternatively, the dynamic effect is generated by window rotation, for example, when a video window in video software is adjusted from landscape to portrait or from portrait to landscape; in this scene, the effective data includes the rotation angle and the rotation direction of the window.
That is, dynamic effects in different scenes correspond to their respective effective data. Obtaining the corresponding effective data according to the specific situation of the dynamic effect can improve the accuracy of the dynamic effect test.
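As a concrete illustration only, the per-scene effective data described above could be recorded as simple per-frame records. The following Python sketch is an assumption made for illustration; the class and field names do not come from the patent.

```python
from dataclasses import dataclass

# Hypothetical records for the per-scene "effective data" described above.
# All names are illustrative assumptions, not taken from the patent.

@dataclass
class PanFrameData:          # interface translation (sliding)
    timestamp_s: float       # time of the frame within the video
    distance_px: float       # translation distance of the interface
    direction: str           # e.g. "left", "right", "up", "down"

@dataclass
class ZoomFrameData:         # opening/closing a service or folder
    timestamp_s: float
    icon_width_px: float     # width of the icon being scaled
    direction: str           # "enlarge" or "shrink"

@dataclass
class RotateFrameData:       # window rotation (landscape <-> portrait)
    timestamp_s: float
    angle_deg: float         # rotation angle of the window
    direction: str           # "clockwise" or "counterclockwise"
```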
In one possible implementation, obtaining the actual dynamic effect curve according to the effective data includes: normalizing the effective data; and plotting the normalized effective data to obtain the actual dynamic effect curve.
When the dynamic effect is generated by translating an interface, opening or closing a service or folder, or rotating a window, the corresponding effective data differ, and the actual dynamic effect curve plotted from the normalized effective data is clearer and easier to understand.
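As an illustration of the normalization step, the sketch below min-max rescales the frame times and the measured values to the [0, 1] interval before plotting. Min-max normalization is an assumption here; the patent does not name a specific normalization formula.

```python
import numpy as np

def normalize(values):
    """Min-max normalize a 1-D sequence to [0, 1] (assumed scheme)."""
    v = np.asarray(values, dtype=float)
    span = v.max() - v.min()
    if span == 0:
        return np.zeros_like(v)
    return (v - v.min()) / span

# Example: frame timestamps and panning distances from the valid frames.
t = normalize([0.0, 16.7, 33.3, 50.0, 66.7])        # ms -> [0, 1]
d = normalize([0.0, 120.0, 310.0, 520.0, 600.0])    # px -> [0, 1]
points = list(zip(t, d))  # points of the actual dynamic effect curve
```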
In one possible implementation, the preset curve includes a Bezier curve. A Bezier curve conforms to the visual law of the human eye: the higher the fitting degree between the actual dynamic effect curve and the Bezier curve, the smoother the dynamic effect process perceived by the human eye.
In one possible implementation manner, the preset curve may be another type of curve, which is not limited in the embodiment of the present application.
In one possible implementation, the method further includes: obtaining the Bezier curve according to a known preset number of control points; or determining a preset number of control points from the actual dynamic effect curve using the least squares method, and obtaining the Bezier curve according to the preset number of control points.
By way of example, a smooth Bezier curve may be drawn from the coordinates of four arbitrary points, each of which may be referred to as a control point. A control point may be designed by a developer according to the ideal dynamic effect to be achieved, or may be obtained by back-deriving from the actual dynamic effect curve; a Bezier curve can then be obtained according to the control points.
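For reference, a cubic Bezier curve can be sampled directly from four control points using the standard Bernstein form. This is a general property of Bezier curves rather than code from the patent; the example control points below are the familiar ease-in-out values and are only illustrative.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=100):
    """Sample n points of the cubic Bezier curve defined by four control points."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Example: an ease-in-out style preset curve in normalized coordinates.
preset = cubic_bezier(np.array([0.0, 0.0]), np.array([0.42, 0.0]),
                      np.array([0.58, 1.0]), np.array([1.0, 1.0]))
```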
In one possible implementation, the method further includes: if the fitting degree of the actual dynamic effect curve to the preset curve is lower than a preset fitting threshold, adjusting the dynamic effect to be tested or instructing to adjust the dynamic effect to be tested.
For example, if the fitting degree of the actual dynamic effect curve to the preset curve is lower than the preset fitting threshold, the electronic device performing the dynamic effect test adjusts the dynamic effect to be tested; or, if the fitting degree is lower than the preset fitting threshold, the electronic device performing the dynamic effect test instructs another electronic device to adjust the dynamic effect to be tested.
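The patent does not spell out how the fitting degree is computed. One plausible reading is a goodness-of-fit score such as R² between the actual curve and the preset curve, sampled at the same normalized time points; the sketch below follows that assumption, with a hypothetical fitting threshold.

```python
import numpy as np

def fitting_degree(actual_y, preset_y):
    """R^2-style fitting degree between two curves sampled at the same
    normalized time points (an assumed metric; the patent only says
    'fitting degree')."""
    actual = np.asarray(actual_y, dtype=float)
    preset = np.asarray(preset_y, dtype=float)
    ss_res = np.sum((actual - preset) ** 2)
    ss_tot = np.sum((actual - actual.mean()) ** 2)
    return 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0

FIT_THRESHOLD = 0.95  # hypothetical preset fitting threshold

fit = fitting_degree([0.0, 0.3, 0.7, 1.0], [0.0, 0.35, 0.65, 1.0])
print("fitting degree:", round(fit, 4),
      "pass" if fit >= FIT_THRESHOLD else "adjust the dynamic effect")
```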
In a second aspect, the application provides a dynamic effect test method, including: displaying a first interface, where the first interface includes prompt information for prompting a user to upload video data corresponding to a dynamic effect to be tested; receiving a first operation input by the user on the first interface, where the first operation is used to input the video data corresponding to the dynamic effect to be tested; and performing a dynamic effect test on the video data corresponding to the dynamic effect to be tested, and displaying the dynamic effect test result.
In one possible implementation manner, performing a dynamic effect test on the video data corresponding to the dynamic effect to be tested includes: de-framing the video data corresponding to the dynamic effect to be tested to obtain effective image frames, where the effective image frames are the image frames remaining after deleting repeated frames among adjacent image frames; obtaining effective data according to the effective image frames, where the effective data includes parameters generated during the motion process of the effective image frames; obtaining an actual dynamic effect curve according to the effective data; and fitting the actual dynamic effect curve to a preset curve to obtain a fitting degree, where the fitting degree is used to represent the dynamic effect test result.
In one possible implementation, the first interface includes a first control, and the method further includes: in response to a second operation of the user on the first interface, displaying a second interface, where the second interface includes dynamic effect test result history information, and the second operation is a triggering operation on the first control. The dynamic effect test result history information includes at least one of a serial number of a historical dynamic effect, a test task type corresponding to the historical dynamic effect, a task name, a task start time, a task end time, an execution result, and a second control; the second control is used to display detailed data of the historical dynamic effect task after being triggered.
In one possible implementation, the method further includes: in response to a third operation of the user on the second interface, displaying a third interface, where the third interface includes the detailed data of the historical dynamic effect task, and the third operation is a triggering operation on the second control. The detailed data of the historical dynamic effect task includes at least one of a dynamic effect fitting degree, a dynamic effect fitting diagram, and a dynamic effect fitting process diagram.
In one possible implementation, the dynamic effect fitting diagram includes the actual dynamic effect curve and the preset curve; the actual dynamic effect curve and the preset curve are fitted to obtain the dynamic effect test result.
In one possible implementation, the dynamic effect fitting process diagram includes some or all of the effective image frames; the effective image frames are the image frames remaining after deleting repeated frames among adjacent image frames.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device has the function of implementing the dynamic effect test method described in the first aspect and any one of its possible implementations; alternatively, the electronic device has the function of implementing the dynamic effect test method described in the second aspect and any one of its possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a fourth aspect, the present application provides an electronic device, including: a processor and a memory coupled to the processor, the memory for storing computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the method of any one of the first aspects or the second aspect.
In a fifth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on a computer, cause the computer to perform the method of any one of the first aspect, or to perform the method of any one of the second aspect.
In a sixth aspect, the present application provides a chip system comprising at least one processor and at least one interface circuit, the at least one interface circuit being configured to perform a transceiving function and to send instructions to the at least one processor, the at least one processor performing the method according to any of the first aspects or the method according to any of the second aspects when the at least one processor executes the instructions.
In a seventh aspect, the present application also provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to any one of the first aspects or to perform the method according to any one of the second aspects.
In an eighth aspect, embodiments of the present application provide a circuit system comprising a processing circuit configured to perform the method of the first aspect or any one of the embodiments of the first aspect; alternatively, the processing circuitry is configured to perform the second aspect or a method of any one of the embodiments of the second aspect.
The technical effects corresponding to the second aspect to the eighth aspect and any implementation manner of the second aspect to the eighth aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, and are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a system according to an embodiment of the present application;
Fig. 2A is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2B is a schematic structural diagram of another electronic device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of an account login interface according to an embodiment of the present application;
Fig. 4 is a schematic diagram of an interface of a dynamic effect fitting task according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a video type interface according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a dynamic effect test history record according to an embodiment of the present application;
Fig. 7 is a schematic diagram of another interface of a dynamic effect fitting task according to an embodiment of the present application;
Fig. 8A is a schematic diagram of a mobile phone interface according to an embodiment of the present application;
Fig. 8B is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 8C is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 9A is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 9B is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 9C is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 10A is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 10B is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 10C is a schematic diagram of another mobile phone interface according to an embodiment of the present application;
Fig. 11 is a flowchart of a dynamic effect test method according to an embodiment of the present application;
Fig. 12 is a flowchart of another dynamic effect test method according to an embodiment of the present application;
Fig. 13 is a flowchart of another dynamic effect test method according to an embodiment of the present application;
Fig. 14 is a flowchart of another dynamic effect test method according to an embodiment of the present application;
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The dynamic effect test method and the electronic device provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The terms "comprising" and "having" and any variations thereof, as used in the description of embodiments of the present application, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone.
A dynamic effect is a change of an element's displacement, posture, size, visibility, and the like over time. The element may be an image; thus, in short, a dynamic effect may be represented as the effect produced by the continuous motion of an image. If an image or control moves from one location to another without any dynamic effect, the change is instantaneous, which greatly reduces the user's visual experience; therefore, dynamic effects are needed in most applications and in much of a terminal's own system. The fluency of a dynamic effect has a great influence on the user's visual experience: the smoother the dynamic effect, the better the visual experience. However, dynamic effects are often affected by the performance of the device and the application program, so testing the smoothness of a dynamic effect is very important for improving the user's visual experience.
In the related art, CurveExpert software can perform curve fitting on various analysis data to help a user carry out effective analysis. For example, displacement data and time data of a dynamic effect are obtained, a suitable curve model is selected to fit them, a curve equation of the dynamic effect is obtained, and the relationship between the two variables is analyzed from the fitted equation. The user can import multiple data sets of the dynamic effect at once to obtain a curve equation. However, the software cannot collect or process data by itself: the user needs to collect and process the dynamic effect data in advance, paste the processed data into the corresponding position in the software, and select a corresponding curve model for fitting to obtain the dynamic effect curve equation. Collecting and processing the data in advance consumes considerable manpower and time, so the efficiency is low.
In addition, DoraemonKit and PerfDog can also be used for dynamic effect testing. DoraemonKit is a full-featured, full-lifecycle development platform for general front-end products, but its performance problems greatly affect the user experience: it occupies many resources, involves redundant steps, and starts slowly. PerfDog is a tool platform for performance testing and analysis, but it mainly targets game performance and cannot focus on testing the fluency of dynamic effects.
In order to solve the above problems, the embodiments of the present application provide a dynamic effect test method and an electronic device. In the method, the electronic device can de-frame the video corresponding to the dynamic effect to be tested and select the effective image frames from all the frames obtained by de-framing; perform data analysis on the effective image frames to obtain effective data; draw an actual dynamic effect curve from the effective data; fit the actual dynamic effect curve to a preset curve to obtain its fitting degree; and determine the dynamic effect test result according to the fitting degree of the actual dynamic effect curve.
According to this technical scheme, by automatically analyzing the video corresponding to the dynamic effect to be tested, the fitting degree of the actual dynamic effect curve in the video can be obtained, whether the current dynamic effect curve is smooth can be confirmed, a dynamic effect test result is obtained, and the efficiency of dynamic effect testing can be improved.
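Putting the steps together, the following Python sketch shows one way the described pipeline could look. Only the OpenCV frame-reading calls are real API; the remaining helper names are illustrative assumptions, and the corresponding steps are sketched individually later in this description.

```python
import cv2

def read_frames(video_path):
    """De-frame a screen-recording video into a list of image frames."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    ok, frame = cap.read()
    while ok:
        frames.append(frame)
        ok, frame = cap.read()
    cap.release()
    return frames

def test_dynamic_effect(video_path, preset_curve, fit_threshold=0.95):
    """End-to-end sketch of the pipeline described above. The helpers other
    than read_frames are assumptions: drop_duplicate_frames and
    fitting_degree are sketched later in this description, and
    extract_valid_data / build_actual_curve stand for the per-scene data
    analysis and normalization-plus-plotting steps."""
    frames = read_frames(video_path)
    valid = drop_duplicate_frames(frames)   # delete repeated adjacent frames
    data = extract_valid_data(valid)        # e.g. translation distance per frame
    actual_curve = build_actual_curve(data) # normalized (time, value) points
    fit = fitting_degree(actual_curve, preset_curve)
    return fit >= fit_threshold, fit
```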
In one possible implementation manner, the embodiments of the present application may be applied to an electronic device, where the electronic device may implement the dynamic effect test method described above. The electronic device may be a personal computer (PC), a mobile phone, a tablet (Pad), a notebook computer, a desktop computer, a computer with transceiving functions, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical, a wireless terminal in smart grid, a wireless terminal in transportation safety, a wireless terminal in smart city, a wireless terminal in smart home, a wearable device, a vehicle-mounted device, or the like. The embodiments of the present application do not particularly limit the specific form of the electronic device.
In another possible implementation manner, the embodiments of the present application may be applied to a system formed by a plurality of electronic devices, where the system may implement the dynamic effect test method described above. The system may be as shown in fig. 1.
Fig. 1 is a schematic diagram of a system to which a dynamic test method according to an embodiment of the present application is applied. As shown in fig. 1, the system includes a first electronic device 100 and a second electronic device 200.
Alternatively, the first electronic device 100 may be, for example, a personal computer, a mobile phone, a tablet computer, a notebook computer, a desktop computer, a computer with transceiving functions, a virtual reality terminal device, an augmented reality terminal device, a wireless terminal in industrial control, a wireless terminal in unmanned driving, a wireless terminal in telemedicine, a wireless terminal in smart grid, a wireless terminal in transportation security, a wireless terminal in smart city, a wireless terminal in smart home, a wearable device, a vehicle-mounted device, or the like. The embodiments of the present application do not particularly limit the specific form of the first electronic device 100.
Alternatively, the second electronic device 200 may be, for example, a PC, a mobile phone, a tablet computer, a VR terminal device, an AR terminal device, a wireless terminal in industrial control, a wireless terminal in unmanned driving, a wireless terminal in telemedicine, a wireless terminal in smart grid, a wireless terminal in transportation security, a wireless terminal in smart city, a wireless terminal in smart home, a wearable device, a vehicle-mounted device, or the like. The embodiment of the present application does not particularly limit the specific form of the second electronic device 200.
The system of the first electronic device 100 and the second electronic device 200 may be one of the Android system, Apple's iOS, and HarmonyOS (HOS). The systems of the first electronic device 100 and the second electronic device 200 may also be other systems, which is not particularly limited in the embodiments of the present application.
In some embodiments, the first electronic device may obtain the video corresponding to the dynamic effect to be tested, analyze the video to obtain the actual dynamic effect curve, and obtain the fitting degree of the actual dynamic effect curve according to the actual dynamic effect curve and the preset curve, thereby confirming whether the dynamic effect curve is smooth and obtaining the dynamic effect test result. Alternatively, the higher the fitting degree of the actual dynamic effect curve, the smoother the corresponding dynamic effect. When the fitting degree is greater than a preset fitting threshold, the dynamic effect is smooth and does not need optimization; when the fitting degree is less than or equal to the preset fitting threshold, the dynamic effect is not smooth and needs further optimization.
Optionally, the first electronic device may acquire a video corresponding to the to-be-tested dynamic effect from the second electronic device, and may also acquire a video corresponding to the to-be-tested dynamic effect from the first electronic device. Optionally, the video corresponding to the dynamic effect to be tested may be a screen recording video obtained by recording a dynamic effect process by the first electronic device or the second electronic device.
In some embodiments, the first electronic device de-frames the video corresponding to the dynamic effect to be tested to obtain the effective image frames; analyzes the effective image frames to obtain effective data; normalizes the effective data; and plots the normalized effective data into the actual dynamic effect curve. The actual dynamic effect curve is fitted to the preset curve to obtain its fitting degree, and the dynamic effect test result is determined according to the fitting degree. The effective image frames are the image frames remaining after deleting repeated frames among adjacent image frames. For example, de-framing the video corresponding to the dynamic effect to be tested yields 100 image frames, where the first 20 frames are identical, one pair of adjacent frames among the middle 70 frames is identical, and the last 10 frames are identical. In this case, 19 of the first 20 frames are deleted, one frame of the repeated adjacent pair among the middle 70 frames is deleted, and 9 of the last 10 frames are deleted; the remaining 71 frames are valid frames. In general, for a run of k repeated image frames, k-1 of them are deleted at random and one image frame is kept.
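As a concrete illustration of the duplicate-frame deletion, the sketch below keeps one frame out of each run of repeated frames. The comparison criterion (mean absolute pixel difference against the last kept frame) is an assumption; the patent only states that repeated adjacent frames are deleted, and which frame of a run survives does not affect the resulting curve.

```python
import cv2
import numpy as np

def drop_duplicate_frames(frames, diff_threshold=1e-3):
    """Keep one frame from each run of repeated frames. A frame is treated
    as 'repeated' when its mean absolute pixel difference from the last
    kept frame falls below diff_threshold (an assumed criterion)."""
    if not frames:
        return []
    valid = [frames[0]]
    for frame in frames[1:]:
        diff = np.mean(cv2.absdiff(frame, valid[-1])) / 255.0
        if diff >= diff_threshold:  # frame differs from the last kept frame
            valid.append(frame)
    return valid
```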
In one possible implementation, the preset curve may be a Bezier curve. A Bezier curve conforms to the visual law of the human eye: the higher the fitting degree between the actual dynamic effect curve and the Bezier curve, the smoother the dynamic effect process perceived by the human eye. The preset curve may also be any other type of curve, which is not limited in the embodiments of the present application.
A Bezier curve (also written Bézier curve) is a mathematical curve applied in two-dimensional graphics applications. A smooth cubic Bezier curve can be drawn from the coordinates of any four points; each of these four points is referred to as a control point.
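In standard notation, the cubic Bezier curve determined by control points P0 to P3 is:

```latex
B(t) = (1-t)^3 P_0 + 3(1-t)^2 t \, P_1 + 3(1-t)\, t^2 P_2 + t^3 P_3,
\qquad t \in [0, 1]
```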
Taking the preset curve as a Bezier curve as an example, in some embodiments the first electronic device may obtain the Bezier curve according to known control points; illustratively, the control points are designed by a developer according to the ideal dynamic effect to be achieved. In other embodiments, the first electronic device may also back-derive the control points from the actual dynamic effect curve and obtain the Bezier curve according to those control points. This process is described in detail below.
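For the back-derivation case, the control points can be recovered from sampled points of the actual dynamic effect curve by linear least squares, since the Bernstein form above is linear in the control points. The sketch below follows that reading; parameterizing t uniformly over the samples is an assumption, as the patent only says the control points are determined by the least squares method.

```python
import numpy as np

def fit_control_points(points):
    """Back-derive four cubic Bezier control points from sampled points of
    the actual dynamic effect curve by linear least squares (assumed
    parameterization: t spaced uniformly over the samples)."""
    pts = np.asarray(points, dtype=float)          # shape (n, 2): (time, value)
    t = np.linspace(0.0, 1.0, len(pts))[:, None]
    # Bernstein basis matrix: row i holds the weights of P0..P3 at t[i].
    basis = np.hstack([(1 - t) ** 3,
                       3 * (1 - t) ** 2 * t,
                       3 * (1 - t) * t ** 2,
                       t ** 3])                    # shape (n, 4)
    ctrl, *_ = np.linalg.lstsq(basis, pts, rcond=None)
    return ctrl                                    # rows are P0, P1, P2, P3
```

Sampling a Bezier curve with the returned control points (for example with the cubic_bezier sketch earlier in this description) then yields the curve to compare against the actual one.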
According to this scheme, a large amount of data does not need to be obtained in advance; the fitting degree of the actual dynamic effect curve of the video corresponding to the dynamic effect to be tested can be obtained simply and efficiently, and the dynamic effect test result is then determined according to that fitting degree.
Alternatively, the first electronic device 100 and the second electronic device 200 in the embodiments of the present application may be implemented by different devices. For example, the first electronic device 100 and the second electronic device 200 in the embodiments of the present application may be implemented by the electronic devices in fig. 2A and fig. 2B.
In this embodiment, the first electronic device 100 is taken to be a PC as an example. Please refer to fig. 2A, which is a schematic structural diagram of a PC according to an embodiment of the present application. The method in the following embodiments may be implemented in a PC having such a hardware structure.
As shown in fig. 2A, the PC may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a camera 193, a display 194, and the like. Optionally, the PC may also include a mobile communication module 150 or the like.
It will be appreciated that the configuration illustrated in this embodiment does not constitute a specific limitation on the PC. In other embodiments, the PC may include more or fewer components than shown, or may combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of a PC. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a USB interface, and the like.
The charge management module 140 is configured to receive a charge input from a charger. The charging management module 140 may also supply power to the PC through the power management module 141 while charging the battery 142. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 may also receive an input from the battery 142 to power the PC.
The wireless communication function of the PC can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the PC may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
When the PC includes the mobile communication module 150, the mobile communication module 150 may provide a solution including 2G/3G/4G/5G or the like wireless communication applied to the PC. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), NFC, infrared (IR), etc. applied on a PC. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the PC is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the PC can communicate with the network and other devices through wireless communication technology. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The PC performs display functions via the GPU, display screen 194, and application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the PC may include 1 or N display screens 194, where N is a positive integer greater than 1.
The PC can implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like. In some embodiments, the PC may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card to enable expansion of the memory capabilities of the PC. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the PC and performs data processing by executing the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and application programs required for at least one function (such as a sound playing function, an image playing function, etc.). The data storage area may store data created during use of the PC (such as audio data, a phonebook, etc.). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The PC may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
Fig. 2B is a schematic hardware structure of an electronic device according to an embodiment of the present application. The electronic device may be the first electronic device and/or the second electronic device described above. The electronic device comprises at least one processor 201, communication lines 202, a memory 203 and at least one communication interface 204. Wherein the memory 203 may also be included in the processor 201.
The processor 201 may be a central processing unit (CPU), another general purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or any conventional processor.
Communication line 202 may include a pathway to transfer information between the aforementioned components.
A communication interface 204 for communicating with other devices. In the embodiment of the application, the communication interface may be a module, a circuit, a bus, an interface, a transceiver, or other devices capable of implementing a communication function, for communicating with other devices. Alternatively, when the communication interface is a transceiver, the transceiver may be a separately provided transmitter that is operable to transmit information to other devices, or a separately provided receiver that is operable to receive information from other devices. The transceiver may also be a component that integrates the functions of transmitting and receiving information, and the embodiments of the present application do not limit the specific implementation of the transceiver.
The memory 203 may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). The memory may also be any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be stand-alone and connected to the processor 201 via the communication line 202, or the memory 203 may be integrated with the processor 201.
The memory 203 is used for storing the computer-executable instructions for implementing the embodiments of the present application, and their execution is controlled by the processor 201. The processor 201 is configured to execute the computer-executable instructions stored in the memory 203, thereby implementing the dynamic effect test method provided in the following embodiments of the present application.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application code, instructions, computer programs, or other names, and the embodiments of the present application are not limited in detail.
In a particular implementation, as one embodiment, processor 201 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 2B.
In a particular implementation, as one embodiment, an electronic device may include multiple processors, such as processor 201 and processor 205 in FIG. 2B. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The electronic device may be a general-purpose device or a special-purpose device, and the embodiment of the application is not limited to the type of the electronic device.
It is to be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the first electronic device 100 and the second electronic device 200. In other embodiments of the present application, the first electronic device 100 and the second electronic device 200 may include more or less components than illustrated, or may combine certain components, or may split certain components, or may have different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The first electronic device and the second electronic device can establish a communication connection and communicate through the communication connection. In the embodiments of the present application, after the first electronic device 100 and the second electronic device 200 establish the communication connection, they communicate through it: the user may send content from the first electronic device 100 to the second electronic device 200, or send content from the second electronic device 200 to the first electronic device 100.
The dynamic effect test method provided in the embodiments of the present application will be described below by taking the first electronic device 100 being a PC and the second electronic device 200 being a mobile phone as an example.
In some scenarios, the dynamic effect test function may be integrated in a "dynamic effect test tool" on the PC side, or in a "dynamic effect test application" on the PC side. The PC receives the video corresponding to the dynamic effect to be tested from the mobile phone. Optionally, the video corresponding to the dynamic effect to be tested may be a screen-recording video obtained by recording a dynamic effect process on the mobile phone. Optionally, the dynamic effect scene may be at least one of a translation scene, a scaling scene, and a rotation scene.
For example, when the dynamic effect scene is a panning scene, it may be panning the current interface of the mobile phone left, right, up, or down (sliding). For example, in a shopping application, if the current interface is the "to-be-shipped" interface, the interface may be slid to the left to reach the "to-be-received" interface.
When the dynamic effect scene is a zoom scene, it may be, for example, opening a folder, closing a folder, opening a service, or closing a service on the desktop of the mobile phone display. Optionally, when the dynamic effect scene is opening or closing a folder on the desktop, the folder may be, for example, a music folder, a video folder, or a shopping folder. The folder includes a plurality of application icons, such as several music applications. When the music folder is closed, the music application icons sit inside the folder and are small; when the music folder is opened, the folder grows from small to large, and the music application icons inside it also grow from small to large and are displayed on the desktop. When the dynamic effect scene is opening or closing a service, the service may be a service provided by the minus-one screen or a service on the desktop of the mobile phone display. For example, the service may include an application or an atomic service. When the service is closed, it may be displayed as an application icon or an atomic service icon; when the service is opened, the icon becomes larger and the interface corresponding to the service can be displayed. Opening the service, during which the service icon grows from small to large, is the enlarging scene; closing the service, during which the icon shrinks from large to small, is the shrinking scene. The embodiments of the present application are mainly described by taking the zoom-scene dynamic effects of opening and closing a folder as an example.
When the dynamic effect scene is a rotation scene, it may be, for example, the video window of video software A turning from landscape to portrait, the video window of video software A turning from portrait to landscape, the video window of video software B turning from landscape to portrait, or the video window of video software B turning from portrait to landscape.
Optionally, when the dynamic effect scene is the video window of video software A turning between landscape and portrait, the video window is small in portrait mode; when it turns from portrait to landscape, the window rotates and enlarges to the size of the mobile phone display.
Alternatively, the dynamic effect scene may be another scene, which is not limited in the embodiments of the present application.
In the embodiments of the present application, the dynamic effect test function is integrated in the "dynamic effect test tool" on the PC side. Optionally, in the embodiments of the present application, the preset curve is described by taking a Bezier curve as an example.
In the embodiments of the present application, the user can register an account in the "dynamic effect test tool". Subsequently, the user can log in with the account to test dynamic effects.
As shown in fig. 3, for example, the account login interface 30 provided in the embodiment of the present application includes an account input box 301, a password input box 302, and a verification code box 303.
The account input box 301 may be used to input an account of a user, such as an account of the user a, an account of the user B, an account of the user C, and the like.
The password input box 302 is used to input the user's password. The user can set the password freely, or set it according to a password-setting rule. For example, to increase password security, the server may require the password to contain one or more of uppercase letters, lowercase letters, numbers, and symbols.
The verification code box 303 is used to input the verification code required for the user to log in to the account. To prevent malicious cracking of the password, in some examples the verification code may be at least one of an animation verification code, a mobile phone short message verification code, a mobile phone voice verification code, a video verification code, and the like.
In some examples, after the user completes the information in the account input box 301, the password input box 302, and the verification code box 303, the user may click the "login immediately" button to trigger the PC to enter the service platform of the "dynamic effect test tool".
As shown in fig. 4, the service platform includes a dynamic effect fitting task module 401 and a history query module 402. Clicking the dynamic effect fitting task module 401 allows the user to upload the video corresponding to the dynamic effect to be tested and perform dynamic effect fitting to obtain the fitting degree of the dynamic effect contained in the video. The interface 40 corresponding to the dynamic effect fitting task module includes selection boxes 403, 404, and 405 for the device type, the task name, and the fitting method, respectively, and also includes an "upload" button 406, an "execute" button 407, a "reset" button 408, and the logged-in user name A 409. Clicking the "upload" button 406 uploads the video corresponding to the dynamic effect to be tested, clicking the "execute" button 407 fits the uploaded video corresponding to the dynamic effect to be tested, and clicking the "reset" button 408 allows the video corresponding to the dynamic effect to be tested to be re-uploaded.
Optionally, the mobile phone in the embodiments of the present application may be an Android phone, an Apple phone, or a HarmonyOS phone. Thus, as shown in fig. 5 (a), the device type may be one of the Android system, iOS, and HarmonyOS, and the device type selection box 403 contains these three device types. The device type is the system type of the device that generated the video corresponding to the dynamic effect to be tested. Alternatively, there may be phones with other system types, which is not particularly limited in the embodiments of the present application.
Different dynamic effects are used in different scenes, and the dynamic effect scene may be one of a translation scene, a scaling scene, and a rotation scene. For example, the dynamic effect of the panning scene may be the dynamic effect of the interface panning left or right; the dynamic effect of the zoom scene may be the dynamic effect of opening and closing a folder in the interface, or of opening and closing the minus-one screen; the dynamic effect of the rotation scene may be the dynamic effect of the video window in video software A turning between landscape and portrait, or of the video window in video software B turning between landscape and portrait. The task name may be the name of a specific scene's dynamic effect; as shown in fig. 5 (b), the task name selection box 404 may include task names such as pan, zoom (folder), zoom (minus-one screen), rotate (video software A), and rotate (video software B). Optionally, the task name may also be customized, such as task 1, task 2, and so on.
As shown in fig. 5 (c), the fitting method 405 may be one of "given control point" and "reverse control point". "Given control point" means the control points corresponding to the Bezier curve are already known to the PC, and the Bezier curve is drawn directly from them; "reverse control point" means the PC does not know the control points corresponding to the Bezier curve and must back-derive them before drawing the Bezier curve. Optionally, when the preset curve is another type of curve, the method of obtaining the preset curve may differ, and the fitting method 405 may differ accordingly.
The specific content of the interface 40 corresponding to the dynamic effect fitting task module illustrated in the embodiment of the present application may also be other content, which is not limited in the embodiment of the present application.
Optionally, a prompt message "upload video, at most one video, recommended format mp4" is provided above the "upload" button 406 to prompt the user to upload a video meeting the requirements. After the user selects the device type, task name, and fitting method for the video corresponding to the dynamic effect to be tested, the user clicks the "upload" button 406 to upload the video corresponding to the dynamic effect to be tested. Clicking the "execute" button 407 causes the PC to perform fitting analysis on the uploaded video to obtain the fitting degree of the actual dynamic effect curve, so that the dynamic effect test result is determined according to the fitting degree. Alternatively, if the wrong video was selected, the "reset" button 408 may be clicked to re-upload the video corresponding to the dynamic effect to be tested.
Illustratively, the user selects the device type of the video corresponding to the dynamic effect to be tested as the HarmonyOS system, the task name as pan, and the fitting method as given control points. After uploading the video, the user clicks the "execute" button 407, and the PC performs fitting analysis on the video corresponding to the dynamic effect to be tested to obtain the fitting degree of the actual dynamic effect curve, so that the dynamic effect test result is determined according to the fitting degree of the actual dynamic effect curve.
FIG. 6 illustrates the interface 50 corresponding to the history query module 402 in the service platform of the "dynamic effect test tool". The history query module 402 includes the history of all dynamic effect fitting tasks. Specifically, the history may include the serial number, task type, task name, task start time, task end time, execution result, detailed data of each task, and the like.
Illustratively, the task type of the dynamic effect fitting task with serial number 1 is "curve fitting task", the task name is "pan", the task start time is "2022-05-08 14:38:24", the task end time is "2022-05-08 14:39:56", and the execution result is "success". The task type of the dynamic effect fitting task with serial number 2 is "curve fitting task", the task name is "zoom (folder)", the task start time is "2022-05-09 15:30:24", the task end time is "2022-05-09 15:32:14", and the execution result is "success". The task type of the dynamic effect fitting task with serial number 3 is "curve fitting task", the task name is "rotate (video software A)", the task start time is "2022-05-11 13:20:34", the task end time is "2022-05-11 13:22:04", and the execution result is "success". The task type of the dynamic effect fitting task with serial number 4 is "curve fitting task", the task name is "rotate (video software A)", the task start time is "2022-05-11 13:20:34", the task end time is "2022-05-11 13:22:04", and the execution result is "failure".
Optionally, the "failure" of the execution result may be that the video corresponding to the dynamic effect to be tested is not successfully uploaded due to network reasons, and at this time, the execution result is "failure".
Optionally, the user may click the "view" button in the detailed data column to view the detailed data corresponding to a dynamic effect fitting task.
As shown in fig. 7, the detailed data interface 60 of the dynamic effect fitting task includes: the dynamic effect fitting degree and the dynamic effect fit map.
In one possible implementation, the detailed data interface 60 of the dynamic effect fitting task also includes a "view dynamic effect fitting process map" button. Clicking the "view dynamic effect fitting process map" button allows the images of the dynamic effect fitting process to be viewed.
Optionally, the dynamic effect fitting process provided in the embodiment of the present application includes: the PC de-frames the video corresponding to the dynamic effect to be tested and selects the effective image frames from all images obtained by de-framing; the effective image frames are the image frames remaining after deleting the repeated image frames among adjacent image frames. The PC performs data analysis on the effective image frames to obtain effective data and draws the actual dynamic effect curve according to the effective data; it then fits the actual dynamic effect curve with a preset curve to obtain the fitting degree of the actual dynamic effect curve, and determines the dynamic effect test result according to that fitting degree. Thus, clicking the "view dynamic effect fitting process map" button allows the images associated with the above dynamic effect fitting process to be viewed.
Optionally, when the dynamic effect scenes in the videos corresponding to the dynamic effects to be tested differ, the effective image frames obtained from the videos differ, the effective data obtained from the effective image frames differ, the actual dynamic effect curves obtained from the effective data differ, and the corresponding Bezier curves also differ. The different dynamic effect scenes are described in detail below.
In one possible scenario, as shown in fig. 7 (a), the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the translation scene. After the "view" button in the detailed data is clicked in the history query module 402, the interface 60 corresponding to the detailed data includes: the dynamic effect fitting degree 601, the dynamic effect fit map 602, and the "view dynamic effect fitting process map" button 603.
Illustratively, the dynamic effect fitting degree 601 is 95%. The dynamic effect fit map 602 has the frame number on the abscissa and the translation distance on the ordinate, and includes the actual dynamic effect curve and the Bezier curve. Clicking the "view dynamic effect fitting process map" button 603 allows the images of the dynamic effect fitting process to be viewed.
As shown in fig. 8A, the mobile phone interface translates from the interface 81 to the interface 82. Illustratively, the image frames (a)-(f) in fig. 8A are all the image frames obtained by de-framing the video corresponding to the dynamic effect to be tested. The image frame (a) is adjacent to and repeats the image frame (b); in this case, either of the image frame (a) and the image frame (b) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (b) as the effective image frame. The image frame (c) does not repeat the images of the adjacent image frame (b) and image frame (d), and is therefore an effective image frame. The image frame (d) repeats the adjacent image frame (e); in this case, either of the image frame (d) and the image frame (e) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (d) as the effective image frame. The image frame (f) does not repeat the image of the adjacent image frame (e), and is therefore an effective image frame. Therefore, the effective image frames of the video corresponding to the dynamic effect to be tested are the image frame (b), the image frame (c), the image frame (d), and the image frame (f). It should be understood that fig. 8A is only an example, and de-framing an actual video corresponding to a dynamic effect to be tested may produce more image frames.
Optionally, the translation distance is the distance from the boundary point of the interface 81 and the interface 82 to the rightmost end of the interface, with the screen width as its maximum value. When the interface 81 translates to the left, the translation distance changes from small to large; when the interface 81 translates to the right, the translation distance changes from large to small. Alternatively, the translation distance may be represented in other ways. This embodiment takes the distance from the boundary point of the interface 81 and the interface 82 to the rightmost end of the interface as the translation distance.
As shown in fig. 8A (c), the boundary point between the interface 81 and the interface 82 is point A, and the distance m from point A to the rightmost end of the interface is obtained. If the value of m increases as the number of effective image frames increases, the interface 81 translates to the left; if the value of m decreases as the number of effective image frames increases, the interface 81 translates to the right.
Optionally, when the dynamic effect scene is the translation scene, the effective data may be the interface translation distance and the interface translation direction.
As shown in fig. 8B, as the number of effective image frames increases, the value of m becomes larger, and the interface 81 translates to the left.
As shown in fig. 8C, as the number of effective image frames increases, the value of m becomes smaller, and the interface 81 translates to the right.
In the embodiment of the present application, taking the leftward translation of the interface 81 as an example, the actual dynamic effect curve and the corresponding Bezier curve are drawn according to the effective data, the fitting degree of the actual dynamic effect curve is obtained from the actual dynamic effect curve and the Bezier curve, and the dynamic effect test result is determined according to that fitting degree. Optionally, the translation distance m of each image in the effective image frames is obtained, and the frame numbers of the effective image frames and the corresponding translation distances m are normalized so that both the abscissa and the ordinate lie between 0 and 1.
For example, suppose there are 11 effective image frames, the minimum translation distance is 0 mm, and the maximum translation distance is 50 mm. The translation distance m in the 1st frame is 0 mm, and after normalization the corresponding coordinates are (0, 0); the translation distance m in the 2nd frame is 3 mm, and after normalization the corresponding coordinates are (0.1, 0.06); the translation distance m in the 3rd frame is 7 mm, and after normalization the corresponding coordinates are (0.2, 0.14); ...; the translation distance m in the 10th frame is 46 mm, and after normalization the corresponding coordinates are (0.9, 0.92); the translation distance m in the 11th frame is 50 mm, and after normalization the corresponding coordinates are (1, 1). The actual curve (actual dynamic effect curve) is drawn from the normalized data.
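To make the normalization above concrete, the following is a minimal Python sketch (Python is assumed here because the text later names the Python OpenCV library); the intermediate distances not listed in the text are hypothetical fill-ins.

    # A minimal sketch of the normalization: the frame index is scaled by the
    # last index and the translation distance m is min-max scaled, so both
    # coordinates lie in [0, 1]. Only the listed values come from the text;
    # the distances 12-43 mm are hypothetical.
    def normalize_curve(distances_mm):
        n = len(distances_mm)
        lo, hi = min(distances_mm), max(distances_mm)
        return [(i / (n - 1), (d - lo) / (hi - lo)) for i, d in enumerate(distances_mm)]

    points = normalize_curve([0, 3, 7, 12, 18, 25, 32, 39, 43, 46, 50])
    print(points[:3])   # [(0.0, 0.0), (0.1, 0.06), (0.2, 0.14)]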
Optionally, the abscissa of the dynamic effect fit map in fig. 7 (a) is the frame number after normalization of the effective image frames, and the ordinate is the translation distance after normalization. The actual translation dynamic effect curve is drawn from the normalized frame number and translation distance of each effective image frame.
In one possible implementation, the PC may know the control points of the dynamic effect curve and obtain the Bezier curve according to the control points.
Alternatively, the embodiment of the present application uses a cubic Bezier curve, whose formula is the following formula (1):
B(x) = P0(1-x)^3 + 3P1x(1-x)^2 + 3P2x^2(1-x) + P3x^3    (1)
where B(x) is the dependent variable, x is the independent variable, and P0, P1, P2, P3 are the coordinates of the control points.
Alternatively, in actual Bezier curve applications, P0 = 0 and P3 = 1; substituting these into formula (1) gives the Bezier curve formula (2):
B(x) = 3P1x(1-x)^2 + 3P2x^2(1-x) + x^3    (2)
where B(x) is the dependent variable, x is the independent variable, and P1, P2 are the coordinates of the control points.
For example, when the control points are P1(x1, y1) and P2(x2, y2), with the values of x1, x2, y1, y2 all in [0, 1], the following formulas (3) and (4) are obtained:
X(x) = 3·x1·x(1-x)^2 + 3·x2·x^2(1-x) + x^3    (3)
Y(x) = 3·y1·x(1-x)^2 + 3·y2·x^2(1-x) + x^3    (4)
where X(x) and Y(x) are the dependent variables (the abscissa and ordinate of a point on the curve), x is the independent variable, x1 and y1 are the abscissa and ordinate of P1, x2 and y2 are the abscissa and ordinate of P2, and the values of x1, x2, y1, y2 all belong to the range [0, 1].
X(x) and Y(x) are obtained according to formula (3) and formula (4), and the corresponding Bezier curve is drawn.
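As an illustration, the following Python sketch evaluates formulas (3) and (4) to draw the Bezier curve from given control points, assuming P0 = (0, 0) and P3 = (1, 1) as stated above. It is a sketch, not the tool's implementation; the parameter is written t instead of x to avoid confusion with the abscissa X(x), and the control point values in the usage line are hypothetical.

    # A minimal sketch of formulas (3) and (4): sample the cubic Bezier curve
    # defined by the given control points P1 and P2, with P0 = (0, 0) and
    # P3 = (1, 1).
    def cubic_bezier(p1, p2, samples=100):
        (x1, y1), (x2, y2) = p1, p2
        points = []
        for i in range(samples + 1):
            t = i / samples  # the independent variable of formulas (3) and (4)
            x = 3 * x1 * t * (1 - t) ** 2 + 3 * x2 * t ** 2 * (1 - t) + t ** 3
            y = 3 * y1 * t * (1 - t) ** 2 + 3 * y2 * t ** 2 * (1 - t) + t ** 3
            points.append((x, y))
        return points

    curve = cubic_bezier((0.25, 0.1), (0.25, 1.0))  # hypothetical control points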
In another possible implementation, the PC does not know the control points of the dynamic effect curve; it can back-push the control points from the actual dynamic effect curve using the least squares method and obtain the Bezier curve from those control points. The least squares method is a standard regression-analysis approach for finding an approximate solution of an overdetermined system, i.e., a system with more equations than unknowns. The unknown control points can be obtained simply and conveniently by the least squares method, which minimizes the sum of squared errors between the dynamic effect curve (Bezier curve) corresponding to the obtained control points and the actual dynamic effect curve.
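The back-push can be illustrated as follows. The sketch assumes the simplified scalar form of formula (2) (P0 = 0, P3 = 1), in which the model is linear in P1 and P2, so an ordinary least squares solve recovers them; it sketches the idea and is not the tool's actual implementation.

    # A minimal sketch of back-pushing the control points with least squares:
    # given normalized samples (t_i, y_i) of the actual dynamic effect curve,
    # solve y ≈ 3*P1*t*(1-t)^2 + 3*P2*t^2*(1-t) + t^3 for scalars P1 and P2.
    import numpy as np

    def back_push_control_points(t, y):
        t = np.asarray(t, dtype=float)
        y = np.asarray(y, dtype=float)
        # the model is linear in P1 and P2, so build the design matrix directly
        A = np.column_stack([3 * t * (1 - t) ** 2, 3 * t ** 2 * (1 - t)])
        b = y - t ** 3                 # move the known P3 term to the right-hand side
        (p1, p2), *_ = np.linalg.lstsq(A, b, rcond=None)
        return p1, p2

    # hypothetical samples drawn from a curve with P1 = 0.2 and P2 = 0.9:
    t = np.linspace(0, 1, 11)
    y = 3 * 0.2 * t * (1 - t) ** 2 + 3 * 0.9 * t ** 2 * (1 - t) + t ** 3
    print(back_push_control_points(t, y))   # approximately (0.2, 0.9)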
As shown in fig. 7 (a), the Bezier curve is drawn from the control points. Each point in the Bezier curve corresponds to an effective image frame, and each effective image frame corresponds to a translation distance.
Optionally, the actual dynamic effect curve is fitted with the Bezier curve to obtain the fitting degree of the actual dynamic effect curve, that is, the dynamic effect fitting degree 601.
Optionally, the translation distance (ordinate) corresponding to each effective image frame (abscissa) may be compared between the drawn actual dynamic effect curve and the Bezier curve. If the difference in translation distance between the actual dynamic effect curve and the Bezier curve at the same image frame is smaller than a preset threshold, the image frame in the actual dynamic effect curve is reasonable, and the fitting degree of the actual dynamic effect curve is the proportion of reasonable image frames among all image frames (effective image frames). The higher the fitting degree of the actual dynamic effect curve, the smoother the actual dynamic effect and the better the effect. For example, in the dynamic effect fit map, the preset threshold may be 0.015; that is, a difference of less than 0.015 between the ordinate values of the actual dynamic effect curve and the Bezier curve indicates that the image frame in the actual dynamic effect curve is reasonable.
For example, if there are 60 effective image frames and 57 reasonable image frames, the fitting degree of the actual dynamic effect curve is 57/60 = 95%. Optionally, if the fitting degree of the actual dynamic effect curve is higher than a preset fitting threshold, the dynamic effect is smooth and needs no optimization; if the fitting degree is lower than the fitting threshold, the dynamic effect is not smooth and needs further optimization. Optionally, the preset fitting threshold may be 92%; a dynamic effect fitting degree of 95% indicates that the leftward-translation dynamic effect is smooth and needs no optimization.
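Under the stated assumptions (one normalized ordinate per effective image frame on each curve, preset threshold 0.015), the fitting degree computation can be sketched in Python as:

    # A minimal sketch of the fitting degree: the proportion of effective
    # image frames whose ordinates on the actual dynamic effect curve and
    # the Bezier curve differ by less than the preset threshold.
    def fitting_degree(actual_y, bezier_y, threshold=0.015):
        reasonable = sum(1 for a, b in zip(actual_y, bezier_y) if abs(a - b) < threshold)
        return reasonable / len(actual_y)

With 57 reasonable frames out of 60 this returns 0.95, matching the 95% in the text; compared with the preset fitting threshold of 92%, that dynamic effect needs no optimization.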
Optionally, clicking on the "view action fitting process map" button 603 may view an image of the action fitting process that is translated to the left. Optionally, the method comprises the steps of. The image of the motion-effect fit process shifted to the left may be part or all of the effective image frame. Some or all of the active image frames are arranged in time order. When the image in the dynamic effect fitting process is a part of effective image frames, the image in the dynamic effect fitting process can extract a part of image frames from the effective image frames at fixed intervals, and can also extract a part of image frames from the effective image frames at fixed intervals. The embodiments of the present application are not limited in this regard.
In the dynamic effect of the translation scene, the PC takes the distance m from the boundary point A of the interface 81 and the interface 82 to the rightmost end of the interface as the translation distance, draws the actual dynamic effect curve of the translation scene according to the change of the translation distance m, and fits it with the Bezier curve, thereby obtaining the fitting degree of the actual dynamic effect curve and determining the dynamic effect test result according to the fitting degree.
In another possible scenario, as shown in fig. 7 (b), the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the zoom scene, specifically the dynamic effect when a folder is opened and closed. After the "view" button in the detailed data is clicked in the history query module 402, the interface 70 corresponding to the detailed data includes: the dynamic effect fitting degree 701, the dynamic effect fit map 702, and the "view dynamic effect fitting process map" button 703.
Optionally, the dynamic effect fitting degree 701 includes the dynamic effect fitting degree 1 and the dynamic effect fitting degree 2, where the dynamic effect fitting degree 1 is the fitting degree when the folder is opened and the dynamic effect fitting degree 2 is the fitting degree when the folder is closed. Illustratively, the dynamic effect fitting degree 1 is 91.67% and the dynamic effect fitting degree 2 is 95%. Optionally, the dynamic effect fit map 702 has the frame number on the abscissa and the folder width on the ordinate, and includes the actual dynamic effect curve and the Bezier curve. Optionally, clicking the "view dynamic effect fitting process map" button 703 allows the images of the dynamic effect fitting process to be viewed.
Optionally, the effective image frames corresponding to different dynamic effect scenes differ, and the effective data obtained from the effective image frames also differ. Fig. 9A shows the specific process of obtaining the effective image frames in the zoom scene.
As shown in fig. 9A, the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the zoom scene: the interface of the mobile phone transitions from the interface 91 to the interface 92, and the folder is opened. Illustratively, the image frames (a)-(f) in fig. 9A are all the image frames obtained by de-framing the video corresponding to the dynamic effect to be tested. The image frame (a) is adjacent to and repeats the image frame (b); in this case, either of the image frame (a) and the image frame (b) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (b) as the effective image frame. The image frame (c) does not repeat the images of the adjacent image frame (b) and image frame (d), and is therefore an effective image frame. The image frame (d) repeats the adjacent image frame (e); in this case, either of the image frame (d) and the image frame (e) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (d) as the effective image frame. The image frame (f) does not repeat the image of the adjacent image frame (e), and is therefore an effective image frame. Therefore, the effective image frames of the video corresponding to the dynamic effect to be tested are the image frame (b), the image frame (c), the image frame (d), and the image frame (f). It should be understood that fig. 9A is only an example, and de-framing an actual video corresponding to a dynamic effect to be tested may produce more image frames.
Alternatively, when the folder is opened, the folder width changes from small to large as the number of effective image frames increases; when the folder is closed, the folder width changes from large to small as the number of effective image frames increases.
As shown in fig. 9A (c), the folder width is n. If the value of n increases as the number of effective image frames increases, the dynamic effect is the folder opening; if the value of n decreases as the number of effective image frames increases, the dynamic effect is the folder closing.
Optionally, when the dynamic effect scene is the dynamic effect of opening or closing a folder in the zoom scene, the effective data may be the folder width and the opening or closing state of the folder.
As shown in fig. 9B, as the number of effective image frames increases, the value of n becomes larger, which is the dynamic effect of the folder opening. In this case, the actual dynamic effect curve is drawn from the effective data.
Optionally, the folder width n of each image in the effective image frames is obtained, and the frame numbers of the effective image frames and the corresponding folder widths n are normalized so that both the normalized abscissa and ordinate lie between 0 and 1.
By way of example, suppose there are 11 effective image frames, the minimum folder width is 15 mm, and the maximum folder width is 40 mm. The folder width n in the 1st frame is 15 mm, and after normalization the corresponding coordinates are (0, 0); the folder width n in the 2nd frame is 17 mm, and after normalization the corresponding coordinates are (0.1, 0.08); the folder width n in the 3rd frame is 20 mm, and after normalization the corresponding coordinates are (0.2, 0.2); ...; the folder width n in the 10th frame is 38 mm, and after normalization the corresponding coordinates are (0.9, 0.92); the folder width n in the 11th frame is 40 mm, and after normalization the corresponding coordinates are (1, 1). The actual dynamic effect curve is drawn from the normalized data.
As shown in fig. 9C, as the number of effective image frames increases, the value of n becomes smaller, which is the dynamic effect of the folder closing. In this case, the actual dynamic effect curve is drawn from the effective data.
By way of example, suppose there are 11 effective image frames, the minimum folder width is 15 mm, and the maximum folder width is 40 mm. The folder width n in the 1st frame is 40 mm, and after normalization the corresponding coordinates are (0, 1); the folder width n in the 2nd frame is 37 mm, and after normalization the corresponding coordinates are (0.1, 0.88); the folder width n in the 3rd frame is 35 mm, and after normalization the corresponding coordinates are (0.2, 0.8); ...; the folder width n in the 10th frame is 17 mm, and after normalization the corresponding coordinates are (0.9, 0.08); the folder width n in the 11th frame is 15 mm, and after normalization the corresponding coordinates are (1, 0). The actual dynamic effect curve is drawn from the normalized data.
Optionally, the abscissa of the dynamic effect fit map in fig. 7 (b) is the frame number after normalization of the effective image frames, and the ordinate is the folder width after normalization. The actual dynamic effect curve is drawn from the normalized frame number and folder width of each effective image frame.
In one possible implementation, the PC may know the control points of the dynamic effect curve and obtain the Bezier curve according to the control points.
In another possible implementation, the PC does not know the control points of the dynamic effect curve; it can back-push the control points from the actual dynamic effect curve using the least squares method and obtain the Bezier curve from those control points.
As shown in fig. 7 (b), the Bezier curve is drawn from the control points. Each point in the Bezier curve corresponds to an effective image frame, and each effective image frame corresponds to a folder width.
Optionally, the actual dynamic effect curve is fitted with the Bezier curve to obtain the fitting degree of the actual dynamic effect curve, that is, the dynamic effect fitting degree 701.
Optionally, the folder width (ordinate) corresponding to each effective image frame (abscissa) may be compared between the drawn actual dynamic effect curve and the Bezier curve. If the difference in folder width between the actual dynamic effect curve and the Bezier curve at the same image frame is smaller than a preset threshold, the image frame in the actual dynamic effect curve is reasonable, and the fitting degree of the actual dynamic effect curve is the proportion of reasonable image frames among all image frames (effective image frames). The higher the fitting degree of the actual dynamic effect curve, the smoother the actual dynamic effect and the better the effect. For example, in the dynamic effect fit map, the preset threshold may be 0.015; that is, a difference of less than 0.015 between the ordinate values of the actual dynamic effect curve and the Bezier curve indicates that the image frame in the actual dynamic effect curve is reasonable.
For example, if there are 60 effective image frames for opening the folder and 55 reasonable image frames, the fitting degree of the actual dynamic effect curve for opening the folder is 55/60 = 91.67%. If there are 60 effective image frames for closing the folder and 57 reasonable image frames, the fitting degree of the actual dynamic effect curve for closing the folder is 57/60 = 95%. Optionally, if the fitting degree of the actual dynamic effect curve is higher than a preset fitting threshold, the dynamic effect is smooth and needs no optimization; if it is lower than the fitting threshold, the dynamic effect is not smooth and needs further optimization. Optionally, the preset fitting threshold may be 92%: the fitting degree of 91.67% for opening the folder indicates that this dynamic effect is not smooth and needs further optimization, while the fitting degree of 95% for closing the folder indicates that this dynamic effect is smooth and needs no optimization.
Optionally, clicking on the "view dynamic fit process map" button 703 may view the image of the dynamic fit process of the scaled scene. Optionally, the method comprises the steps of. The image of the dynamic fit process of the scaled scene may be a part or all of the valid image frames. Some or all of the active image frames are arranged in time order. When the image in the dynamic effect fitting process is a part of effective image frames, the image in the dynamic effect fitting process can extract a part of image frames from the effective image frames at fixed intervals, and can also extract a part of image frames from the effective image frames at fixed intervals. The embodiments of the present application are not limited in this regard.
In the dynamic effect of the zoom scene, the PC obtains the folder width data, draws the actual dynamic effect curve of the zoom scene according to the change of the folder width n, and fits it with the Bezier curve, thereby obtaining the fitting degree of the actual dynamic effect curve and determining the dynamic effect test result according to the fitting degree.
In another possible scenario, as shown in fig. 7 (c), the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the rotation scene, specifically the dynamic effect when the video window of video software A turns from horizontal screen to vertical screen and vice versa. After the "view" button in the detailed data is clicked in the history query module 402, the interface 80 corresponding to the detailed data includes: the dynamic effect fitting degree 801, the dynamic effect fit map 802, and the "view dynamic effect fitting process map" button 803.
Optionally, the dynamic effect fitting degree 801 includes the dynamic effect fitting degree 1 and the dynamic effect fitting degree 2, where the dynamic effect fitting degree 1 is the fitting degree when the video window turns from horizontal screen to vertical screen and the dynamic effect fitting degree 2 is the fitting degree when the video window turns from vertical screen to horizontal screen. Illustratively, the dynamic effect fitting degree 1 is 95% and the dynamic effect fitting degree 2 is 90%. Optionally, the dynamic effect fit map 802 has the frame number on the abscissa and the rotation angle on the ordinate, and includes the actual dynamic effect curve and the Bezier curve. Optionally, clicking the "view dynamic effect fitting process map" button 803 allows the images of the dynamic effect fitting process to be viewed.
As shown in fig. 10A, the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the rotation scene: the interface 101 of the mobile phone transitions to the interface 102, and the video window turns from horizontal screen to vertical screen. Illustratively, the image frames (a)-(f) in fig. 10A are all the image frames obtained by de-framing the video corresponding to the dynamic effect to be tested. The image frame (a) is adjacent to and repeats the image frame (b); in this case, either of the image frame (a) and the image frame (b) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (b) as the effective image frame. The image frame (c) does not repeat the images of the adjacent image frame (b) and image frame (d), and is therefore an effective image frame. The image frame (d) repeats the adjacent image frame (e); in this case, either of the image frame (d) and the image frame (e) may be selected at random as the effective frame. Illustratively, this embodiment selects the image frame (d) as the effective image frame. The image frame (f) does not repeat the image of the adjacent image frame (e), and is therefore an effective image frame. Therefore, the effective image frames of the video corresponding to the dynamic effect to be tested are the image frame (b), the image frame (c), the image frame (d), and the image frame (f). It should be understood that fig. 10A is only an example, and de-framing an actual video corresponding to a dynamic effect to be tested may produce more image frames.
Optionally, when the video window turns from horizontal screen to vertical screen, the included angle between the video window and the mobile phone boundary changes from small to large as the number of effective image frames increases; when the video window turns from vertical screen to horizontal screen, the included angle changes from large to small as the number of effective image frames increases. Optionally, as shown in fig. 10A (c), the included angle between the video window and the mobile phone boundary may be the included angle q between the left boundary of the video window and the left boundary of the mobile phone interface, i.e., the angle at the intersection point C; it may also be the included angle p between the upper boundary of the video window and the upper boundary of the mobile phone interface, i.e., the angle at the intersection point D. The included angle between the video window and the mobile phone boundary lies in the range of 0-90 degrees.
As shown in fig. 10A (c), when the included angle between the video window and the mobile phone boundary is q: if the value of q becomes larger as the number of effective image frames increases, the dynamic effect is the video window turning from horizontal screen to vertical screen; if the value of q becomes smaller as the number of effective image frames increases, the dynamic effect is the video window turning from vertical screen to horizontal screen.
Optionally, when the change of the included angle q between the video window and the mobile phone boundary is not obvious as the number of effective image frames increases, the rotation of the dynamic effect can be judged from the included angle p between the video window and the mobile phone boundary. If the value of p becomes larger as the number of effective image frames increases, the dynamic effect is the video window turning from horizontal screen to vertical screen; if the value of p becomes smaller as the number of effective image frames increases, the dynamic effect is the video window turning from vertical screen to horizontal screen.
Optionally, when the dynamic effect scene is the dynamic effect of the video window horizontal screen to vertical screen and vertical screen to horizontal screen of the video software a in the rotating scene, the effective data may be the included angle between the video window and the boundary of the mobile phone and the rotating direction of the video window.
As shown in fig. 10B, as the number of effective image frames increases, the value of q becomes larger, which is the dynamic effect of the video window turning from horizontal screen to vertical screen. In this case, the actual dynamic effect curve is drawn from the effective data.
Optionally, the included angle q between the video window and the mobile phone boundary is obtained for each image in the effective image frames, and the frame numbers of the effective image frames and the corresponding included angles q are normalized so that both the normalized abscissa and ordinate lie between 0 and 1.
For example, suppose there are 11 effective image frames, the minimum included angle q between the video window and the mobile phone boundary is 0 degrees, and the maximum is 90 degrees. The included angle q in the 1st frame is 0 degrees, and after normalization the corresponding coordinates are (0, 0); the included angle q in the 2nd frame is 7.2 degrees, and after normalization the corresponding coordinates are (0.1, 0.08); the included angle q in the 3rd frame is 18 degrees, and after normalization the corresponding coordinates are (0.2, 0.2); ...; the included angle q in the 10th frame is 82 degrees, and after normalization the corresponding coordinates are (0.9, 0.91); the included angle q in the 11th frame is 90 degrees, and after normalization the corresponding coordinates are (1, 1). The actual dynamic effect curve is drawn from the normalized data.
As shown in fig. 10C, as the number of effective image frames increases, the value of q becomes smaller, which is the dynamic effect of the video window turning from vertical screen to horizontal screen. In this case, the actual dynamic effect curve is drawn from the effective data.
For example, suppose there are 11 effective image frames, the minimum included angle q between the video window and the mobile phone boundary is 0 degrees, and the maximum is 90 degrees. The included angle q in the 1st frame is 90 degrees, and after normalization the corresponding coordinates are (0, 1); the included angle q in the 2nd frame is 80 degrees, and after normalization the corresponding coordinates are (0.1, 0.89); the included angle q in the 3rd frame is 71 degrees, and after normalization the corresponding coordinates are (0.2, 0.79); ...; the included angle q in the 10th frame is 7.2 degrees, and after normalization the corresponding coordinates are (0.9, 0.08); the included angle q in the 11th frame is 0 degrees, and after normalization the corresponding coordinates are (1, 0). The actual dynamic effect curve is drawn from the normalized data.
Optionally, the abscissa of the dynamic effect fit map in fig. 7 (c) is the frame number after normalization of the effective image frames, and the ordinate is the rotation angle after normalization of the included angle between the video window in the image and the mobile phone boundary. The actual dynamic effect curve is drawn from the normalized frame number and included angle of each effective image frame.
In one possible implementation, the PC may know the control points of the dynamic effect curve and obtain the Bezier curve according to the control points.
In another possible implementation, the PC does not know the control points of the dynamic effect curve; it can back-push the control points from the actual dynamic effect curve using the least squares method and obtain the Bezier curve from those control points.
As shown in fig. 7 (c), the Bezier curve is drawn from the control points. Each point in the Bezier curve corresponds to an effective image frame and to the included angle between the video window in that effective image frame and the mobile phone boundary.
Optionally, the actual dynamic effect curve is fitted with the Bezier curve to obtain the fitting degree of the actual dynamic effect curve, that is, the dynamic effect fitting degree 801.
Optionally, the included angle (ordinate) between the video window and the mobile phone boundary corresponding to each effective image frame (abscissa) may be compared between the drawn actual dynamic effect curve and the Bezier curve. If the difference between these included angles on the actual dynamic effect curve and the Bezier curve at the same image frame is smaller than a preset threshold, the image frame in the actual dynamic effect curve is reasonable, and the fitting degree of the actual dynamic effect curve is the proportion of reasonable image frames among all image frames (effective image frames). The higher the fitting degree of the actual dynamic effect curve, the smoother the actual dynamic effect and the better the effect. For example, in the dynamic effect fit map, the preset threshold may be 0.015; that is, a difference of less than 0.015 between the ordinate values of the actual dynamic effect curve and the Bezier curve indicates that the image frame in the actual dynamic effect curve is reasonable.
For example, if there are 60 effective image frames for the video window turning from horizontal screen to vertical screen and 57 reasonable image frames, the fitting degree of the actual dynamic effect curve for turning from horizontal screen to vertical screen is 57/60 = 95%. If there are 60 effective image frames for turning from vertical screen to horizontal screen and 54 reasonable image frames, the fitting degree of the actual dynamic effect curve for turning from vertical screen to horizontal screen is 54/60 = 90%. Optionally, if the fitting degree of the actual dynamic effect curve is higher than a preset fitting threshold, the dynamic effect is smooth and needs no optimization; if it is lower than the fitting threshold, the dynamic effect is not smooth and needs further optimization. Optionally, the preset fitting threshold may be 92%: the fitting degree of 95% for turning from horizontal screen to vertical screen indicates that this dynamic effect is smooth and needs no optimization, while the fitting degree of 90% for turning from vertical screen to horizontal screen indicates that this dynamic effect is not smooth and needs further optimization.
Optionally, clicking on the "view action fitting process map" button 903 may view an image of the action fitting process of the rotating scene. Alternatively, the image of the dynamic fitting process of the rotated scene may be part or all of the valid image frames. Some or all of the active image frames are arranged in time order. When the image in the dynamic effect fitting process is a part of effective image frames, the image in the dynamic effect fitting process can extract a part of image frames from the effective image frames at fixed intervals, and can also extract a part of image frames from the effective image frames at fixed intervals. The embodiments of the present application are not limited in this regard.
In the dynamic effect of the rotation scene, the PC draws the actual dynamic effect curve of the rotation scene according to the change of the included angle q between the video window and the mobile phone boundary and fits it with the Bezier curve, thereby obtaining the fitting degree of the actual dynamic effect curve and determining the dynamic effect test result according to the fitting degree.
The dynamic effect fit map in the embodiment of the present application can clearly and intuitively display the change of the current dynamic effect, and displays the difference between the current actual dynamic effect curve and the Bezier curve through the fitting degree of the actual dynamic effect curve. In the effective image frame acquisition process, when conditions such as frame loss affect the dynamic effect, this is intuitively visible in the dynamic effect fit map.
Further, as shown in fig. 11, the dynamic effect fitting process performed by the PC on the video corresponding to the dynamic effect to be tested may be specifically implemented as the following steps S101 to S105.
S101, the PC acquires video data corresponding to the dynamic effect to be tested.
Optionally, the video data corresponding to the dynamic effect to be tested is the video corresponding to the dynamic effect to be tested.
Optionally, the video corresponding to the dynamic effect to be tested may be a screen-recording video obtained by recording the dynamic effect process on the mobile phone, or a screen-recording video obtained by recording the dynamic effect process on the PC. Referring to fig. 11, in this embodiment of the present application, the video corresponding to the dynamic effect to be tested is a screen-recording video from a mobile phone.
As shown in fig. 4, in the dynamic effect fitting task interface, the device type, the task name, and the fitting method of the video corresponding to the dynamic effect to be tested are selected, and the "upload" button 406 is clicked to upload the video corresponding to the dynamic effect to be tested.
Optionally, as shown in fig. 5, the device type is the system type of the mobile phone that generates the video corresponding to the dynamic effect to be tested, and may be one of the Android system, the Apple system, and the HarmonyOS system, or another type of system. The task name may reflect the different dynamic effect scenes, such as pan, zoom (folder), zoom (minus one screen), rotate (video software A), rotate (video software B), and the like, or may be a customized task name. The fitting method may differ according to the preset curve used in fitting. Taking the Bezier curve as the preset curve as an example, the fitting method may be either given control points or back-pushed control points (the control points are not given and must be back-pushed).
S102, the PC de-frames the video data corresponding to the dynamic effect to be tested to obtain the effective image frames.
Optionally, the effective image frames are the image frames remaining after deleting the repeated images among adjacent image frames. As shown in fig. 8A, when the dynamic effect in the video corresponding to the dynamic effect to be tested is the dynamic effect of the translation scene, de-framing the video yields all the image frames (a)-(f), of which the effective image frames are the image frame (b), the image frame (c), the image frame (d), and the image frame (f).
Optionally, the PC may de-frame the video corresponding to the dynamic effect to be tested using the VideoCapture.read() method of the Python OpenCV library to obtain each frame image, and store each frame image under a fixed path.
Optionally, the image frames other than the effective image frames are deleted, and the effective image frames are sorted by video time.
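A minimal Python sketch of step S102 is given below. It uses the OpenCV VideoCapture.read() call named above; the duplicate test (mean absolute pixel difference below a small threshold) is an assumption, since the text does not specify how repeated adjacent frames are detected.

    # De-frame the video and keep the effective image frames: one frame out
    # of each run of (near-)identical adjacent frames, in video-time order.
    import cv2
    import numpy as np

    def effective_frames(video_path, diff_threshold=1.0):
        cap = cv2.VideoCapture(video_path)
        kept, prev = [], None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # keep the frame only if it differs from the previous frame;
            # which duplicate of a repeated pair is kept may be chosen freely
            if prev is None or np.mean(cv2.absdiff(frame, prev)) > diff_threshold:
                kept.append(frame)
            prev = frame
        cap.release()
        return kept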
S103, the PC obtains effective data according to the effective image frames.
Optionally, the effective data includes the parameters generated during the motion process in the effective image frames, and these parameters are used to draw the actual dynamic effect curve.
Optionally, the effective image frames of the videos corresponding to the dynamic effects to be tested differ across dynamic effect scenes, and so does the effective data obtained from them; that is, different dynamic effects correspond to different effective data.
As shown in fig. 12, when the dynamic effect scene is the translation scene, that is, when the dynamic effect is generated by interface translation, the process of obtaining the effective data may be specifically implemented as the following steps S201 to S204.
S201, the PC acquires coordinates of an intersection point of the source interface and the target interface.
The source interface is the interface currently being translated, the target interface is the interface after translation, and the intersection point is the point where the two interfaces meet during the interface translation.
S202, the PC obtains the distance from the intersection point of the source interface and the target interface to the target edge as the translation distance.
Alternatively, the target edge may be the rightmost edge of the mobile phone display screen.
S203, the PC judges how the translation distance changes as the number of effective image frames increases.
S204, the PC determines the translation direction according to the change of the translation distance of the source interface.
Optionally, when the dynamic effect is generated by interface translation, the effective data is the translation distance and the translation direction of the source interface, and the direction can be judged as sketched below.
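Under the assumptions above (m measured from the boundary point to the rightmost edge), steps S203-S204 reduce to reading the trend of m across the effective image frames; a minimal Python sketch:

    # A minimal sketch of steps S203-S204: judge how the translation
    # distance m changes over the effective image frames and derive the
    # translation direction of the source interface.
    def translation_direction(m_values):
        if m_values[-1] > m_values[0]:
            return "left"    # m grows with the frame number: leftward translation
        return "right"       # m shrinks with the frame number: rightward translation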
In another scenario, as shown in fig. 13, when the dynamic effect scene is the zoom scene, that is, when the dynamic effect is generated by opening or closing a folder, the process of obtaining the effective data may be specifically implemented as the following steps S301 to S304.
S301, the PC acquires the position coordinates of the service in the on state or the off state.
Optionally, when the service is in the off state, the width of the icon corresponding to the service is at its minimum; when the service is in the on state, the width of the icon corresponding to the service is at its maximum. Whether the width of the icon corresponding to the service increases or decreases is judged according to the position coordinates of the icon corresponding to the service.
By way of example, the service may be a folder in a desktop, a service in a minus one screen, and so on.
S302, the PC acquires the width of the icon corresponding to the service in each effective image frame.
Optionally, the width of the icon corresponding to the service is obtained as the difference between the distance from the right boundary of the icon corresponding to the service to the target edge and the distance from the left boundary of the icon corresponding to the service to the target edge.
Optionally, the target edge is the leftmost edge of the mobile phone display screen.
S303, the PC judges how the width of the icon corresponding to the service changes as the number of effective image frames increases.
S304, the PC determines the zoom direction according to the change of the width of the icon corresponding to the service.
Optionally, when the dynamic effect is generated by opening or closing a folder, the effective data is the width of the icon corresponding to the service and the zoom direction.
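Analogously, steps S302-S304 can be sketched as follows in Python, assuming both boundary distances are measured from the leftmost edge of the display screen as stated above:

    # A minimal sketch of steps S302-S304: the icon width n is the difference
    # of the two boundary distances, and the zoom direction follows from how
    # n changes over the effective image frames.
    def icon_width(left_boundary_dist, right_boundary_dist):
        return right_boundary_dist - left_boundary_dist

    def zoom_direction(widths):
        return "opening" if widths[-1] > widths[0] else "closing"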
In another scenario, as shown in fig. 14, when the dynamic effect scene is the rotation scene, that is, when the dynamic effect is generated by window rotation, the process of obtaining the effective data may be specifically implemented as the following steps S401 to S404.
S401, the PC acquires coordinate values of the intersection point of the window and the target edge.
Optionally, the window is a window that is currently being rotated. Optionally, the window is a video window in a mobile phone display screen interface.
Optionally, the target edge may be the leftmost edge and/or the uppermost edge of the mobile phone display screen. The intersection point may be an intersection of the window boundary and the mobile phone display screen boundary, and there may be one or more intersection points.
S402, the PC acquires the rotation angle of the window in each effective image frame.
Optionally, the rotation angle may be the included angle between the left boundary of the window and the leftmost edge of the mobile phone display screen, or the included angle between the upper boundary of the window and the uppermost edge of the mobile phone display screen. The embodiments of the present application are not limited in this regard.
Alternatively, the reference data may be returned by abs function, and the PC calculates the rotation angle from the reference data by getAngle function.
S403, the PC judges how the rotation angle of the window changes as the number of effective image frames increases.
S404, the PC determines the rotation according to the change of the rotation angle of the window.
Optionally, when the dynamic effect is generated by window rotation, the effective data is the rotation angle and the rotation direction of the window.
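A sketch of step S402 is given below; it assumes the rotation angle q is recovered from two points on the window's left boundary as the deviation of that boundary from the vertical left edge of the screen (the abs/getAngle helpers mentioned above are not specified in detail, so plain trigonometry is used instead):

    # A minimal sketch of step S402: compute the included angle (in degrees)
    # between the window's left boundary, given by two of its points, and the
    # vertical left edge of the mobile phone display screen.
    import math

    def rotation_angle(top_point, bottom_point):
        dx = bottom_point[0] - top_point[0]
        dy = bottom_point[1] - top_point[1]
        return abs(math.degrees(math.atan2(dx, dy)))  # 0 when vertical, 90 when horizontal

    print(rotation_angle((100, 200), (80, 400)))  # hypothetical points: about 5.7 degrees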
Alternatively, the following step S104 may be performed according to the above effective data.
S104, the PC obtains an actual dynamic effect curve according to the effective data.
Optionally, the PC draws actual dynamic curves under different scenes according to different effective data.
Optionally, the effective data is normalized, and the actual dynamic effect curve is obtained by plotting points from the normalized effective data.
In one scene, the dynamic effect is generated by interface translation. Illustratively, as shown in fig. 7 (a), the actual dynamic effect curve of the translation scene is drawn from the effective data.
In another scene, the dynamic effect is generated by opening or closing a folder. Illustratively, as shown in fig. 7 (b), the actual dynamic effect curves of opening and closing the folder in the zoom scene are drawn from the effective data.
In another scene, the dynamic effect is generated by window rotation. Illustratively, as shown in fig. 7 (c), the actual dynamic effect curve of the rotation scene is drawn from the effective data.
S105, the PC fits the actual dynamic effect curve with a preset curve to obtain a dynamic effect test result.
Optionally, in the embodiment of the present application, the preset curve is described by taking a bezier curve as an example.
Alternatively, the PC may obtain a bezier curve according to a known preset number of control points; or determining a preset number of control points by using a least square method according to the actual dynamic effect curve, and obtaining the Bezier curve according to the preset number of control points.
Illustratively, as in (a) of fig. 7, a bezier curve of the panning scene is plotted according to the control points. As in fig. 7 (b), a bezier curve of the zoom scene is plotted according to the control points. As in fig. 7 (c), a bezier curve of the rotated scene is plotted according to the control points.
Optionally, the PC fits the actual dynamic effect curve with the Bezier curve to obtain the fitting degree of the actual dynamic effect curve, and determines the dynamic effect test result according to the fitting degree. Optionally, if the fitting degree of the actual dynamic effect curve is higher than a preset fitting threshold, the dynamic effect is smooth and needs no optimization; if it is lower than the fitting threshold, the dynamic effect is not smooth and needs further optimization. Optionally, the preset fitting threshold may be 92%; a dynamic effect fitting degree of 95% indicates that the leftward-translation dynamic effect is smooth and needs no optimization.
In one scene, when the video corresponding to the motion effect to be tested is the motion effect video of the translation scene, the actual motion effect curve is fitted with the Bezier curve. For example, the fitting degree of the obtained actual dynamic effect curve is 95%, and is higher than a preset fitting threshold, so that the dynamic effect is smooth and does not need to be optimized.
In another scenario, when the video corresponding to the dynamic effect to be tested is the dynamic effect video of the zoom scenario, the actual dynamic effect curve is fitted with the Bezier curve. For example, the fitting degree of the actual dynamic effect curve when the folder is opened is 91.67%, and is lower than the fitting threshold, the dynamic effect is not smooth, and further optimization is needed; the fitting degree of the actual dynamic effect curve is 95% when the folder is closed, the dynamic effect is higher than a preset fitting threshold, and the dynamic effect is smooth and does not need to be optimized.
In another scenario, when the video corresponding to the dynamic effect to be tested is the dynamic effect video of the rotating scenario, the actual dynamic effect curve is fitted with the Bezier curve. The fitting degree of the actual dynamic effect curve is 95% when the video window is turned from a horizontal screen to a vertical screen, and the dynamic effect is smooth and does not need to be optimized when the fitting degree is higher than a preset fitting threshold; the fitting degree of the actual dynamic effect curve is 90% when the video window is turned from the vertical screen to the horizontal screen, and is lower than a preset fitting threshold, the dynamic effect is not smooth, and further optimization is needed.
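Putting the pieces together, steps S103-S105 for the translation scene can be sketched end to end in Python by composing the helpers sketched earlier in this section (normalize_curve, back_push_control_points, fitting_degree — all illustrative names, not the tool's API):

    # A minimal end-to-end sketch for the translation scene: normalize the
    # per-frame translation distances, back-push the Bezier control points
    # when they are not given, and compute the fitting degree (S103-S105).
    def test_dynamic_effect(m_values, control_points=None, fit_threshold=0.92):
        points = normalize_curve(m_values)               # S103-S104
        t = [p[0] for p in points]
        y = [p[1] for p in points]
        if control_points is None:                       # back-push if not given
            control_points = back_push_control_points(t, y)
        p1, p2 = control_points
        bezier_y = [3 * p1 * s * (1 - s) ** 2 + 3 * p2 * s ** 2 * (1 - s) + s ** 3
                    for s in t]
        degree = fitting_degree(y, bezier_y)
        return degree, degree >= fit_threshold           # smooth if at or above threshold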
Further, the dynamic effect test method includes: displaying a first interface, where the first interface includes prompt information for prompting the user to upload the video data corresponding to the dynamic effect to be tested; receiving a first operation input by the user on the first interface, where the first operation is used to input the video data corresponding to the dynamic effect to be tested; and performing a dynamic effect test on the video data corresponding to the dynamic effect to be tested and displaying the dynamic effect test result.
As shown in fig. 4, the first interface includes a dynamic effect fitting task module 401, a history query module 402, and an interface 40 corresponding to the dynamic effect fitting task module 401; the interface 40 includes prompt information for prompting the user to upload video data corresponding to the dynamic effect to be tested. As shown in fig. 5, the first operation includes inputting a device type in the device type selection box 403, a task name in the task name selection box 404, and a fitting method in the fitting method selection box 405; the first operation further includes clicking the "upload" button 406 in the interface to upload the video corresponding to the dynamic effect to be tested, and clicking the "execute" button 407 to perform a dynamic effect test on the uploaded video data and display the dynamic effect test result.
Optionally, performing a dynamic effect test on the video data corresponding to the dynamic effect to be tested includes: disassembling the video data corresponding to the dynamic effect to be tested into frames to obtain effective image frames, where the effective image frames include the image frames that remain after deleting repeated image frames from among adjacent image frames; obtaining effective data according to the effective image frames, where the effective data includes parameters generated in the motion process of the effective image frames; obtaining an actual dynamic effect curve according to the effective data; and fitting the actual dynamic effect curve with a preset curve to obtain a fitting degree, where the fitting degree is used to represent the dynamic effect test result. For example, the specific process of the dynamic effect test can be seen in steps S101-S105.
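By way of a non-limiting illustration, the frame-disassembly step could be sketched with OpenCV as follows. Treating an adjacent frame as a repeat when its mean absolute pixel difference falls below a small threshold is an assumption, as is the threshold value itself; the patent does not prescribe the comparison.

```python
# Sketch under stated assumptions; not the patent's actual implementation.
import cv2
import numpy as np

def extract_effective_frames(video_path, diff_threshold=1.0):
    """Disassemble the video into frames and keep only effective frames,
    dropping a frame when it repeats the image of its predecessor."""
    cap = cv2.VideoCapture(video_path)
    effective, prev = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Keep the frame only if it differs from the previous frame.
        if prev is None or np.mean(cv2.absdiff(gray, prev)) > diff_threshold:
            effective.append(frame)
        prev = gray
    cap.release()
    return effective
```

The effective data (e.g., the translation distance per frame) would then be measured from these frames to describe the actual dynamic effect curve.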
Optionally, the first interface further includes a first control, which is used to display a second interface in response to a second operation of the user on the first interface, where the second interface includes dynamic effect test result history information; the second operation is a triggering operation on the first control. The dynamic effect test result history information includes at least one of a serial number of a historical dynamic effect, a test task type corresponding to the historical dynamic effect, a task name, a task start time, a task end time, an execution result, and a second control; the second control is used to display detailed data of the historical dynamic effect task after being triggered.
Illustratively, the first control includes the history query module 402, and the second operation includes clicking the history query module 402; as shown in fig. 6, a second interface including an interface 50 corresponding to the history query module is then displayed. The interface 50 includes dynamic effect test result history information, which includes at least one of a serial number of a historical dynamic effect, a test task type corresponding to the historical dynamic effect, a task name, a task start time, a task end time, an execution result, and a second control. When triggered, the second control displays the detailed data of the historical dynamic effect fitting task.
Optionally, in response to a third operation of the user on the second interface, a third interface is displayed, where the third interface includes detailed data of the historical dynamic effect task; the third operation is a triggering operation on the second control. The detailed data of the historical dynamic effect task includes at least one of a dynamic effect fitting degree, a dynamic effect fitting graph, and a dynamic effect fitting process graph.
Illustratively, as shown in fig. 7 (a), the third interface includes a detailed data interface 60 of the historical dynamic effect fitting task, which includes at least one of a dynamic effect fitting degree and a dynamic effect fitting graph. Optionally, the interface may also include a dynamic effect fitting process graph.
Optionally, the dynamic effect fitting graph includes an actual dynamic effect curve and a preset curve; the actual dynamic effect curve and the preset curve are used for fitting to obtain the dynamic effect test result.
Illustratively, as shown in fig. 7 (a), the dynamic effect fitting graph includes an actual dynamic effect curve and a Bezier curve, which are fitted against each other to obtain the dynamic effect test result.
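By way of a non-limiting illustration, such a fitting graph could be rendered along the following lines; matplotlib and the axis labels are assumptions, since the patent does not name a plotting tool.

```python
# Sketch only: one plausible rendering of the dynamic effect fitting graph.
import numpy as np
import matplotlib.pyplot as plt

def plot_fitting_graph(actual_curve, bezier_curve, fit_degree_percent):
    """Plot the actual dynamic effect curve against the Bezier curve."""
    plt.plot(np.linspace(0.0, 1.0, len(actual_curve)), actual_curve,
             label="actual dynamic effect curve")
    plt.plot(np.linspace(0.0, 1.0, len(bezier_curve)), bezier_curve, "--",
             label="Bezier curve")
    plt.xlabel("normalized time")
    plt.ylabel("normalized displacement")
    plt.title(f"Dynamic effect fitting graph (fitting degree: {fit_degree_percent:.2f}%)")
    plt.legend()
    plt.show()
```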
Optionally, the dynamic effect fitting process graph includes a portion or all of the effective image frames; the effective image frames include the image frames that remain after deleting repeated image frames from among adjacent image frames.
Illustratively, as shown in fig. 8A, the effective image frames include image frame (b), image frame (c), image frame (d), and image frame (f). For example, as shown in fig. 8B, the dynamic effect fitting process graph may include three of the effective image frames; it may also include all of the effective image frames.
One or more of the interfaces described above are exemplary, and in other embodiments, other interface designs are possible.
It should be noted that some operations in the flows of the above method embodiments are optionally combined, and/or the order of some operations is optionally changed. The execution order of the steps in each flow is merely exemplary and is not limited to the order described; other execution orders are possible, and the described order is not the only one in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In addition, details of the processes involved in one embodiment apply in a similar manner to other embodiments, and different embodiments may be used in combination.
Moreover, some steps in method embodiments may be equivalently replaced with other possible steps. Alternatively, some steps in method embodiments may be optional and may be deleted in some usage scenarios. Alternatively, other possible steps may be added to the method embodiments.
Moreover, the method embodiments described above may be implemented alone or in combination.
Further embodiments of the present application provide an apparatus, which may be the first electronic device, the second electronic device, or a component (such as a chip system) in the second electronic device described above.
The apparatus may include: a display screen, a memory, and one or more processors. The display, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the functions or steps performed by the mobile phone in the above-described method embodiments. The structure of the electronic device may refer to the structure of the electronic device shown in fig. 2A.
The core structure of the electronic device may be represented as the structure shown in fig. 15, and the electronic device includes: a processing module 151, an input module 152, a storage module 153, a display module 154, and a communication module 155.
The processing module 151 may include at least one of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processing module 151 may perform operations or data processing related to control of and/or communication with at least one of the other elements of the electronic device. Optionally, the processing module 151 is configured to support the first electronic device 100 in executing S102-S105 in fig. 11; and/or S201-S204 in fig. 12; and/or S301-S304 in fig. 13; and/or S401-S404 in fig. 14.
The input module 152 is configured to obtain instructions or data input by a user and transmit them to the other modules of the electronic device. Specifically, the input module 152 may accept input by touch, gesture, screen proximity, and the like, as well as voice input. For example, the input module may be the screen of the electronic device: it acquires the user's input operation, generates an input signal from that operation, and transmits the signal to the processing module 151. Optionally, the input module 152 is configured to obtain the video data corresponding to the dynamic effect to be tested input by the user; reference may be made to the dynamic effect fitting task interface shown in fig. 4.
The storage module 153 may include volatile memory and/or nonvolatile memory, and is used for storing instructions or data related to at least one of the other modules of the user terminal device. Optionally, the storage module 153 is configured to store the video data corresponding to the dynamic effect to be tested acquired by the first electronic device 100 in S101 in fig. 11.
The display module 154 may include, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display, and is used for displaying content viewable by the user (e.g., text, images, videos, icons, symbols, etc.). Optionally, the display module 154 is configured to support the first electronic device 100 in displaying the content shown in figs. 3 to 10C.
The communication module 155 is configured to support communication between the personal terminal and other personal terminals (via a communication network). For example, the communication module may connect to a network via wireless or wired communication to communicate with other personal terminals or a network server. The wireless communication may employ at least one of the cellular communication protocols, such as long term evolution (LTE), long term evolution-advanced (LTE-A), code division multiple access (CDMA), wideband code division multiple access (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM). The wireless communication may also include short-range communication, for example at least one of wireless fidelity (Wi-Fi), Bluetooth, near field communication (NFC), magnetic secure transmission (MST), or GNSS. Optionally, the communication module 155 is configured to support communication between the first electronic device and the second electronic device; reference may be made to the system diagram shown in fig. 1.
The apparatus shown in fig. 15 may include more or fewer components than shown, split certain components, or arrange the components differently; this is not limited in the embodiments of the present application.
Embodiments of the present application also provide a chip system, as shown in fig. 16, that includes at least one processor 161 and at least one interface circuit 162. The processor 161 and the interface circuit 162 may be interconnected by wires. For example, interface circuit 162 may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, interface circuit 162 may be used to send signals to other devices (e.g., processor 161). The interface circuit 162 may, for example, read instructions stored in the memory and send the instructions to the processor 161. The instructions, when executed by the processor 161, may cause the electronic device to perform the various steps of the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
The embodiment of the application also provides a computer storage medium, which comprises computer instructions, when the computer instructions run on the electronic device, the electronic device is caused to execute the functions or steps executed by the mobile phone in the embodiment of the method.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above-mentioned method embodiments.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above. The specific working processes of the above-described systems, devices and modules may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, and the division of modules or units, for example, is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A dynamic effect test method, the method comprising:
acquiring video data corresponding to dynamic effects to be tested;
the video data corresponding to the dynamic effect to be tested is subjected to frame disassembly to obtain effective image frames; the effective image frames comprise the image frames that remain after deleting repeated image frames from among adjacent image frames;
obtaining effective data according to the effective image frames, wherein the effective data comprises parameters generated in the motion process of the effective image frames;
obtaining an actual dynamic effect curve according to the effective data;
fitting the actual dynamic effect curve with a preset curve to obtain fitting degree; the fitting degree is used for representing the dynamic effect test result.
2. The method of claim 1, wherein the dynamic effect is a dynamic effect generated by a translation interface; the effective data comprises a translation distance and a translation direction of the interface;
Or,
the dynamic effect is generated by opening or closing a service or a folder, and the effective data comprises the width and the scaling direction of an icon corresponding to the service or the folder;
or,
the dynamic effect is generated by window rotation, and the effective data comprise the rotation angle and the rotation direction of the window.
3. The method according to claim 1 or 2, wherein the obtaining an actual dynamic effect curve according to the effective data comprises:
normalizing the effective data;
and obtaining the actual dynamic effect curve by plotting points according to the normalized effective data.
4. The method according to any one of claims 1-3, wherein the preset curve comprises a Bezier curve.
5. The method according to claim 4, wherein the method further comprises:
obtaining the Bezier curve according to a known preset number of control points;
or,
determining a preset number of control points by using a least square method according to the actual dynamic effect curve;
and obtaining the Bezier curve according to the preset number of control points.
6. The method according to any one of claims 1-5, further comprising:
if the fitting degree between the actual dynamic effect curve and the preset curve is lower than a preset fitting threshold, adjusting the dynamic effect to be tested or indicating to adjust the dynamic effect to be tested.
7. A dynamic effect test method, the method comprising:
displaying a first interface, wherein the first interface comprises prompt information for prompting a user to upload video data corresponding to dynamic effects to be tested;
receiving a first operation input by a user on the first interface, wherein the first operation is used for inputting video data corresponding to the dynamic effect to be tested;
and performing an action test on the video data corresponding to the action to be tested, and displaying an action test result.
8. The method for testing the dynamic effect according to claim 7, wherein the performing the dynamic effect test on the video data corresponding to the dynamic effect to be tested comprises:
the video data corresponding to the dynamic effect to be tested is subjected to frame disassembly to obtain effective image frames; the effective image frames comprise the image frames that remain after deleting repeated image frames from among adjacent image frames;
obtaining effective data according to the effective image frames, wherein the effective data comprises parameters generated in the motion process of the effective image frames;
obtaining an actual dynamic effect curve according to the effective data;
fitting the actual dynamic effect curve with a preset curve to obtain fitting degree; the fitting degree is used for representing the dynamic effect test result.
9. The method of claim 7 or 8, wherein the first interface comprises: a first control; the method further comprises the steps of:
responding to a second operation of a user on the first interface, and displaying a second interface, wherein the second interface comprises dynamic effect test result history information; the second operation is a triggering operation on the first control;
the dynamic effect test result history information comprises at least one of a serial number of a historical dynamic effect, a test task type corresponding to the historical dynamic effect, a task name, a task start time, a task end time, an execution result, and a second control; and the second control is used for displaying detailed data of the historical dynamic effect task after being triggered.
10. The method according to claim 9, wherein the method further comprises:
responding to a third operation of a user on the second interface, and displaying a third interface, wherein the third interface comprises detailed data of the historical dynamic effect task;
the third operation is a triggering operation on the second control; the detailed data of the historical dynamic effect task comprises at least one of a dynamic effect fitting degree, a dynamic effect fitting graph, and a dynamic effect fitting process graph.
11. The method of claim 10, wherein the dynamic effect fitting graph comprises an actual dynamic effect curve and a preset curve; and the actual dynamic effect curve and the preset curve are used for fitting to obtain the dynamic effect test result.
12. The method of claim 10, wherein the dynamic effect fitting process graph comprises a portion or all of the effective image frames; the effective image frames comprise the image frames that remain after deleting repeated image frames from among adjacent image frames.
13. An electronic device, comprising: a processor and a memory coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 1-6 or 7-12.
14. A computer readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform a dynamic effect test method according to any of claims 1-6 or 7-12.
CN202211058836.8A 2022-08-31 2022-08-31 Dynamic effect test method and electronic equipment Pending CN117667654A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211058836.8A CN117667654A (en) 2022-08-31 2022-08-31 Dynamic effect test method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211058836.8A CN117667654A (en) 2022-08-31 2022-08-31 Dynamic effect test method and electronic equipment

Publications (1)

Publication Number Publication Date
CN117667654A true CN117667654A (en) 2024-03-08

Family

ID=90073938

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211058836.8A Pending CN117667654A (en) 2022-08-31 2022-08-31 Dynamic effect test method and electronic equipment

Country Status (1)

Country Link
CN (1) CN117667654A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination