CN116708751A - Method and device for determining photographing duration and electronic equipment

Info

Publication number: CN116708751A
Application number: CN202211216155.XA
Authority: CN (China)
Prior art keywords: picture, time length, photographing, average value, determining
Legal status: Granted; currently active
Other languages: Chinese (zh)
Other versions: CN116708751B
Inventors: 杨文菊, 萧欣宜
Current and original assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd; priority to CN202211216155.XA; application granted and published as CN116708751B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002: Diagnosis, testing or measuring for television cameras

Abstract

An embodiment of the application provides a method and an apparatus for determining a photographing duration, and an electronic device. The method comprises the following steps: acquiring a video recorded while a camera application of an electronic device under test photographs and previews M pictures, where M is an integer greater than or equal to 1; determining, from the recorded video, a picture preview duration and a picture generation duration for each of the M pictures; and determining the photographing duration of the electronic device under test from the picture preview duration and the picture generation duration of each of the M pictures. With this method, the picture preview duration and the picture generation duration of each of the M pictures can be determined accurately from the video of the photographing and previewing process, and the photographing duration of the electronic device under test can then be determined accurately from these per-picture durations, so that the photographing duration of the electronic device under test is precisely quantified.

Description

Method and device for determining photographing duration and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for determining photographing duration, and an electronic device.
Background
With the development of terminal technology, users' demands on electronic devices (such as mobile phones) keep growing, and so do their demands on the various performance aspects of these devices. For example, users increasingly expect the camera of an electronic device to photograph quickly.
Currently, the photographing speed of a camera of an electronic device is generally measured by the photographing duration of a picture taken by the camera. However, at present the photographing duration is evaluated by manual observation, and there is no good way to quantify it precisely. How to precisely quantify the photographing duration of the camera of an electronic device is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The embodiments of the application provide a method and an apparatus for determining a photographing duration, and an electronic device, so as to solve the problem that there is currently no good way to precisely quantify the photographing duration of a camera of an electronic device.
In a first aspect, an embodiment of the present application provides a method for determining a photographing duration, where the method includes:
acquiring a photographed video of an electronic device under test, where the photographed video comprises a video of the photographing and previewing process in which a camera application of the electronic device under test photographs M pictures, and M is an integer greater than or equal to 1;
determining, from the photographed video, a picture preview duration and a picture generation duration for each of the M pictures, where the picture preview duration of an Nth picture among the M pictures is the duration from the moment the electronic device under test receives the photographing operation for taking the Nth picture to the moment the image thumbnail frame of the camera application completely displays the Nth picture, the picture generation duration of the Nth picture is the duration from the moment the image thumbnail frame completely displays the Nth picture to the moment the detail interface of the camera application completely displays the Nth picture, and N is an integer greater than or equal to 1 and less than or equal to M; and
determining the photographing duration of the electronic device under test from the picture preview duration and the picture generation duration of each of the M pictures.
In this way, the picture preview duration and the picture generation duration of each of the M pictures can be determined accurately from the video recorded while the camera application of the electronic device under test photographs and previews them, and the photographing duration of the device can then be determined accurately from these per-picture durations. The photographing duration is thereby precisely quantified, so that the photographing speed of the electronic device under test can subsequently be evaluated against the quantified duration, giving a more accurate evaluation result. A minimal sketch of this computation is given below.
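As a concrete illustration of the computation above, the following Python sketch combines per-picture frame times into a photographing duration. It is a minimal sketch rather than the patent's implementation; the function name and the (t1, t2, t3) triple layout are assumptions introduced here, and how the three times are detected from the video is the subject of the implementations below.

    def measure_photographing_duration(timestamps):
        # timestamps: one (t1, t2, t3) triple per picture, in seconds, where
        #   t1 = shooting time of the frame in which the photographing
        #        operation is received,
        #   t2 = shooting time of the frame in which the image thumbnail
        #        frame completely displays the picture, and
        #   t3 = shooting time of the frame in which the detail interface
        #        completely displays the picture.
        previews = [t2 - t1 for (t1, t2, t3) in timestamps]     # picture preview durations
        generations = [t3 - t2 for (t1, t2, t3) in timestamps]  # picture generation durations
        m = len(timestamps)
        # Photographing duration = average preview duration + average generation duration.
        return sum(previews) / m + sum(generations) / m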
In a possible implementation manner, determining the picture preview duration and the picture generation duration of the Nth picture from the photographed video includes: determining, from the photographed video, a first shooting time of the first frame in which the electronic device under test receives the photographing operation, a second shooting time of the first frame in which the image thumbnail frame completely displays the Nth picture, and a third shooting time of the first frame in which the detail interface completely displays the Nth picture; and determining the picture preview duration and the picture generation duration of the Nth picture from the first shooting time, the second shooting time, and the third shooting time.
In this way, the picture preview duration and the picture generation duration of the Nth picture taken by the camera application of the electronic device under test can be determined accurately from the shooting times of the individual frames obtained by splitting the photographed video into frames, so that the picture preview duration and the picture generation duration of each of the M pictures can be precisely quantified, and the approach has good applicability. A sketch of the frame-splitting step follows.
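The patent does not prescribe a tool for splitting the photographed video into frames; the sketch below uses OpenCV (an assumption introduced here) and assumes a constant-frame-rate recording, so that the shooting time of each frame is its index divided by the frame rate reported by the container.

    import cv2

    def split_into_frames(video_path):
        # Decode the photographed video into (shooting_time_seconds, frame) pairs.
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        frames = []
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames.append((index / fps, frame))  # constant-frame-rate assumption
            index += 1
        cap.release()
        return frames

With a high-frame-rate recording (for example, hundreds of frames per second), the spacing between consecutive shooting times bounds the timing error of the measurement.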
In a possible implementation manner, determining, from the photographed video, the third shooting time at which the detail interface completely displays the first frame of the Nth picture includes: acquiring, from the photographed video, H frames covering the photographing and previewing process of the Nth picture, where H is an integer greater than or equal to 3; acquiring a sharpness average value and a brightness average value of each of the H frames; and determining the third shooting time from the sharpness average value and the brightness average value of each of the H frames.
In this way, the time at which the Nth picture finishes loading can be determined accurately from the changes in sharpness and brightness of the frames corresponding to the photographing and previewing process of the Nth picture, so that the picture generation duration of the Nth picture, and in turn the photographing duration of the electronic device under test, can be determined accurately. A sketch of the per-frame metrics follows.
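The patent specifies a per-frame sharpness average value and brightness average value without fixing the underlying metrics. The sketch below uses two common stand-ins, both assumptions rather than the patent's prescription: the variance of the Laplacian of the grayscale frame as the sharpness score, and the mean gray level as the brightness score.

    import cv2

    def sharpness_and_brightness(frame_bgr):
        # Convert the decoded BGR frame to grayscale, then compute a sharpness
        # score (variance of the Laplacian, a common focus measure) and a
        # brightness score (mean gray level). The patent fixes neither metric.
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        brightness = float(gray.mean())
        return sharpness, brightness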
In a possible implementation manner, determining the third shooting time from the sharpness average value and the brightness average value of each of the H frames includes: determining, for K in increasing order, the sharpness difference between the sharpness average value of the Kth frame and that of the (K+2)th frame among the H frames, and the brightness difference between the brightness average value of the Kth frame and that of the (K+2)th frame, where K is an integer greater than or equal to 1 and less than or equal to H-2; and if the sharpness difference between the Kth frame and the (K+2)th frame is smaller than a first preset threshold, and the brightness difference between the Kth frame and the (K+2)th frame is smaller than a second preset threshold, determining the shooting time of the Kth frame as the third shooting time.
In this way, the time at which the Nth picture finishes loading can be determined accurately from the changes in sharpness and brightness across the H frames corresponding to the photographing and previewing process of the Nth picture, so that the picture generation duration of the Nth picture, and in turn the photographing duration of the electronic device under test, can be determined accurately. A sketch of this search follows.
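A minimal sketch of the frame-K versus frame-(K+2) search, assuming the H frames are in chronological order; the two thresholds correspond to the first and second preset thresholds above and would need device-specific tuning.

    def find_third_shooting_time(frame_times, metrics,
                                 sharpness_threshold, brightness_threshold):
        # frame_times: shooting time of each of the H frames, in seconds.
        # metrics: one (sharpness_average, brightness_average) pair per frame,
        #          e.g., produced by sharpness_and_brightness above.
        # Scan K in increasing order and return the shooting time of the first
        # frame K whose sharpness and brightness differ from those of frame
        # K+2 by less than the preset thresholds, i.e., the frame at which the
        # Nth picture is taken to have finished loading.
        for k in range(len(metrics) - 2):
            sharpness_diff = abs(metrics[k][0] - metrics[k + 2][0])
            brightness_diff = abs(metrics[k][1] - metrics[k + 2][1])
            if sharpness_diff < sharpness_threshold and brightness_diff < brightness_threshold:
                return frame_times[k]
        return None  # no stable frame found among the H frames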
In a possible implementation manner, determining the photographing duration of the electronic device under test from the picture preview duration and the picture generation duration of each of the M pictures includes: generating a picture preview duration average value and a picture generation duration average value of the M pictures from the picture preview duration and the picture generation duration of each of the M pictures; and determining the sum of the picture preview duration average value and the picture generation duration average value as the photographing duration of the electronic device under test.
In this way, the photographing duration of the electronic device under test is determined from the average picture preview duration and the average picture generation duration over a plurality of pictures, so the determined duration has a smaller error, higher accuracy, and better applicability. A short worked example follows.
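As a worked example with illustrative numbers only (not measured data), the final combination is plain averaging:

    def photographing_duration(previews, generations):
        # Sum of the mean picture preview duration and the mean picture
        # generation duration, per the implementation above (seconds).
        return sum(previews) / len(previews) + sum(generations) / len(generations)

    # Illustrative durations for M = 3 pictures: the mean preview duration is
    # 0.52 s and the mean generation duration is 0.30 s, so the photographing
    # duration is about 0.82 s.
    print(photographing_duration([0.50, 0.54, 0.52], [0.30, 0.28, 0.32]))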
In a second aspect, an embodiment of the present application provides a device for determining a photographing duration, where the device includes:
an acquisition module, configured to acquire a photographed video of an electronic device under test, where the photographed video comprises a video of the photographing and previewing process in which a camera application of the electronic device under test photographs M pictures, and M is an integer greater than or equal to 1;
a first determining module, configured to determine, from the photographed video, a picture preview duration and a picture generation duration for each of the M pictures, where the picture preview duration of an Nth picture among the M pictures is the duration from the moment the electronic device under test receives the photographing operation for taking the Nth picture to the moment the image thumbnail frame of the camera application completely displays the Nth picture, the picture generation duration of the Nth picture is the duration from the moment the image thumbnail frame completely displays the Nth picture to the moment the detail interface of the camera application completely displays the Nth picture, and N is an integer greater than or equal to 1 and less than or equal to M; and
a second determining module, configured to determine the photographing duration of the electronic device under test from the picture preview duration and the picture generation duration of each of the M pictures.
In this way, the picture preview duration and the picture generation duration of each of the M pictures can be determined accurately from the video recorded while the camera application of the electronic device under test photographs and previews them, and the photographing duration of the device can then be determined accurately from these per-picture durations. The photographing duration is thereby precisely quantified, so that the photographing speed of the electronic device under test can subsequently be evaluated against the quantified duration, giving a more accurate evaluation result.
In a possible implementation manner, the first determining module is specifically configured to: determine, from the photographed video, a first shooting time of the first frame in which the electronic device under test receives the photographing operation, a second shooting time of the first frame in which the image thumbnail frame completely displays the Nth picture, and a third shooting time of the first frame in which the detail interface completely displays the Nth picture; and determine the picture preview duration and the picture generation duration of the Nth picture from the first shooting time, the second shooting time, and the third shooting time.
In this way, the picture preview duration and the picture generation duration of the Nth picture taken by the camera application of the electronic device under test can be determined accurately from the shooting times of the individual frames obtained by splitting the photographed video into frames, so that the picture preview duration and the picture generation duration of each of the M pictures can be precisely quantified, and the approach has good applicability.
In a possible implementation manner, in determining the third shooting time at which the detail interface completely displays the first frame of the Nth picture, the first determining module is specifically configured to: acquire, from the photographed video, H frames covering the photographing and previewing process of the Nth picture, where H is an integer greater than or equal to 3; acquire a sharpness average value and a brightness average value of each of the H frames; and determine the third shooting time from the sharpness average value and the brightness average value of each of the H frames.
In this way, the time at which the Nth picture finishes loading can be determined accurately from the changes in sharpness and brightness of the frames corresponding to the photographing and previewing process of the Nth picture, so that the picture generation duration of the Nth picture, and in turn the photographing duration of the electronic device under test, can be determined accurately.
In a possible implementation manner, in determining the third shooting time from the sharpness average value and the brightness average value of each of the H frames, the first determining module is specifically configured to: determine, for K in increasing order, the sharpness difference between the sharpness average value of the Kth frame and that of the (K+2)th frame among the H frames, and the brightness difference between the brightness average value of the Kth frame and that of the (K+2)th frame, where K is an integer greater than or equal to 1 and less than or equal to H-2; and if the sharpness difference is smaller than a first preset threshold and the brightness difference is smaller than a second preset threshold, determine the shooting time of the Kth frame as the third shooting time.
In this way, the time at which the Nth picture finishes loading can be determined accurately from the changes in sharpness and brightness across the H frames corresponding to the photographing and previewing process of the Nth picture, so that the picture generation duration of the Nth picture, and in turn the photographing duration of the electronic device under test, can be determined accurately.
In a possible implementation manner, the second determining module is specifically configured to: generate a picture preview duration average value and a picture generation duration average value of the M pictures from the picture preview duration and the picture generation duration of each of the M pictures; and determine the sum of the picture preview duration average value and the picture generation duration average value as the photographing duration of the electronic device under test.
In this way, the photographing duration of the electronic device under test is determined from the average picture preview duration and the average picture generation duration over a plurality of pictures, so the determined duration has a smaller error, higher accuracy, and better applicability.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors and one or more memories; the one or more memories store a computer program or instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program or instructions that, when executed, perform the method of any implementation of the first aspect.
Therefore, with the method and apparatus for determining a photographing duration and the electronic device provided by the application, the video recorded while the camera application of the electronic device under test photographs and previews M pictures can be obtained, the picture preview duration and the picture generation duration of each of the M pictures can be determined from that video, and the photographing duration of the electronic device under test can then be determined from these per-picture durations. In other words, the method precisely quantifies the photographing duration of the electronic device under test, so that its photographing speed can subsequently be evaluated against the quantified duration, giving a more accurate evaluation result.
Drawings
Fig. 1 is a schematic structural diagram of a photographing duration determining system according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an application scenario according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
Fig. 5 is a flowchart of a method for determining a photographing duration according to an embodiment of the present application;
Fig. 6 is a flowchart of a method for determining the picture preview duration and the picture generation duration of the Nth picture according to an embodiment of the present application;
Fig. 7 is a flowchart of a method for acquiring a third image from the photographed video of the electronic device under test according to an embodiment of the present application;
Fig. 8 is a block diagram of a device for determining a photographing duration according to an embodiment of the present application;
Fig. 9 is a block diagram of a chip according to an embodiment of the present application.
Detailed Description
The technical scheme of the application is described below with reference to the accompanying drawings.
In the description of the present application, unless otherwise indicated, "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. Further, "at least one" means one or more, "at least two" means two or more, and "a plurality" likewise means two or more. Terms such as "first" and "second" do not limit a quantity or an execution order, and objects modified by "first" and "second" are not necessarily different.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "such as" should not be construed as preferred or more advantageous than other embodiments or designs; rather, such words are intended to present related concepts in a concrete fashion.
To facilitate understanding of the technical solution of the present application, an application scenario of the solution is first described by way of example.
With the development of terminal technology, users' demands on electronic devices (such as mobile phones) keep growing, and so do their demands on the various performance aspects of these devices. For example, users increasingly expect the camera of an electronic device to photograph quickly.
Currently, the photographing speed of a camera of an electronic device is generally measured by the photographing duration of a picture taken by the camera. However, at present the photographing duration is evaluated by manual observation, and there is no good way to quantify it precisely. How to precisely quantify the photographing duration of the camera of an electronic device is therefore a technical problem to be solved by those skilled in the art.
To solve this technical problem, the embodiments of the present application provide a method and an apparatus for determining a photographing duration, and an electronic device. In the method, a video recorded while a camera application of an electronic device under test photographs and previews M pictures is first obtained; then the picture preview duration and the picture generation duration of each of the M pictures are determined from the recorded video; and finally the photographing duration of the electronic device under test is determined from these per-picture durations. In other words, the method accurately determines the picture preview duration and the picture generation duration of each of the M pictures from the recorded video and then accurately determines the photographing duration of the electronic device under test, so that the photographing duration is precisely quantified and the photographing speed of the device can subsequently be evaluated against the quantified duration, giving a more accurate evaluation result.
The method for determining a photographing duration provided by the application can be applied to the photographing duration determining system shown in fig. 1. As shown in fig. 1, the system may include: an electronic device under test 10, a third-party recording device 20, a photographic subject 30, a light source device 40, and a test electronic device 50.
The electronic device under test 10 has photographing and previewing functions. The third-party recording device 20 is used to record the photographing and previewing process of the electronic device under test 10. The third-party recording device 20 may have a high frame rate, i.e., it records a large number of frames per second. In the embodiment of the present application, a high-frame-rate camera is taken as an example of the third-party recording device 20.
Both the electronic device under test 10 and the third-party recording device 20 may be mounted on stands (e.g., tripods). The display screen of the electronic device under test 10 is located within the recording field of view of the third-party recording device 20. Illustratively, the third-party recording device 20 may be fixed directly in front of the display screen of the electronic device under test 10 by a tripod.
The photographic subject 30 faces the camera of the electronic device under test 10, and the straight-line distance between the photographic subject 30 and the electronic device under test 10 is 1 meter. In general, the camera modes of the electronic device under test 10 may include portrait scenes and/or non-portrait scenes, and the camera processing algorithm for a portrait scene generally differs from that for a non-portrait scene. Therefore, in the embodiments provided by the present application, the photographing duration of the electronic device under test 10 may be determined based on a portrait scene and/or a non-portrait scene; for details, refer to the subsequent embodiments, which are not described here. Accordingly, the photographic subject 30 may include a chart card (e.g., a standard color card) and/or a head model: the chart card can be used for non-portrait scenes, and the head model for portrait scenes.
The light source device 40 may provide the light sources of various application scenarios, for example, low-brightness, medium-brightness, and high-dynamic-range scenarios. In a specific implementation, the parameters of the light source device 40 may be adjusted so that it serves as the light source for any of the aforementioned application scenarios, which is not limited by the present application. Further, the number of light source devices 40 may be one or more, which is likewise not limited by the present application. Fig. 1 illustrates an example in which two light source devices 40 are used.
The test electronic device 50 may be communicatively coupled to the electronic device under test 10 and/or the third-party recording device 20 through a wired or wireless connection. The test electronic device 50 may obtain, from the third-party recording device 20, the video recorded during the photographing and previewing process of the electronic device under test 10, and may determine the photographing duration of the electronic device under test 10 from that video.
When the photographing duration of the electronic device under test 10 needs to be determined, it may be determined through the system shown in fig. 1.
Specifically, before the photographing duration of the electronic device under test 10 is determined, the photographic subject 30 is first selected; for example, a chart card is selected as the photographic subject 30. The chart card is then fixed to a gray board. Next, the electronic device under test 10 is set at a position 1 meter away from the chart card, and the position of the chart card on the gray board is adjusted so that the chart card is located at the center of the viewfinder frame of the electronic device under test 10 and is completely displayed in it; when the electronic device under test 10 photographs the chart card, the composition of the chart card in the viewfinder frame is thus a standardized composition. After this adjustment, the electronic device under test 10 is fixed by a tripod. The positions of the chart card and the electronic device under test 10 are not changed during the subsequent determination of the photographing duration.
Then, the third-party recording device 20 is disposed in front of the electronic device under test 10, and its position is adjusted so that the electronic device under test 10 lies within the shooting field of view (which may also be referred to as the recording field of view) of the third-party recording device 20; the position of the third-party recording device 20 is likewise not changed during the determination of the photographing duration.
After that, the light source device 40 is turned on, and its parameters are adjusted to the desired application scenario, for example, a low-brightness scenario.
After the parameters of the light source device 40 are set, the third-party recording device 20 is started, and it records the electronic device under test 10 throughout the subsequent photographing and previewing process. Then the camera application of the electronic device under test 10 is launched, and the chart card is photographed and previewed through the camera application.
As shown in (a) of fig. 2, the user may click the shutter button 60 on the preview interface (which may also be referred to as a shooting interface) of the camera application of the electronic device under test 10 to trigger the camera application to call the camera and photograph the chart card, obtaining the 1st picture of the chart card. Thereafter, as shown in (b) of fig. 2, when the 1st picture of the chart card fills the image thumbnail frame 70 in the preview interface (i.e., the 1st picture is completely displayed in the image thumbnail frame 70), the user may click the image thumbnail frame 70 to trigger the electronic device under test 10 to load and display the 1st picture on the detail interface of the camera application. When the 1st picture is completely loaded, it is completely displayed on the detail interface, as shown in (c) of fig. 2. At this point, the process of photographing the chart card and generating its 1st picture by the electronic device under test 10 is complete.
The preview interface of the camera application of the electronic device under test 10 is then returned to, and the 2nd picture of the chart card is taken in the manner shown in fig. 2. Similarly, the 3rd, ..., and Mth pictures of the chart card can be obtained in the manner described above. M may be an integer greater than or equal to 1, and its specific value may be set according to the requirements of the actual application scenario; for example, M may be set to 3.
After the M pictures of the chart card have been captured by the camera application of the electronic device under test 10 in the above manner, the recording by the third-party recording device 20 may be stopped. The video of the photographing and previewing process of the M pictures, i.e., the video recorded by the third-party recording device 20 during that process (hereinafter referred to as the photographed video of the electronic device under test), may then be stored.
It should be noted that, after the M pictures of the chart card are obtained in the above manner, the light source may be switched: for example, the parameters of the light source device 40 may be adjusted to a medium-brightness, high-brightness, or high-dynamic-range scenario, and the photographic subject 30 may be photographed and previewed in the above manner under the switched light source, so as to determine the photographing duration of the electronic device under test 10 in each light-source scenario. Alternatively, the photographic subject 30 may be switched; for example, it may be switched to a head model, which is photographed and previewed by the electronic device under test 10 in a portrait scene, so that the photographing duration of the electronic device under test 10 in a portrait scene can be determined. Subsequently, when the photographing speed of the electronic device under test 10 is evaluated, the photographing durations under the various light sources and application scenarios can be combined for the evaluation.
According to the requirements of the actual application scenario, after the third-party recording device 20 has recorded the photographed videos of the electronic device under test 10 in the various application scenarios, the test electronic device 50 may obtain these videos from the third-party recording device 20 and then determine the photographing duration of the electronic device under test 10 from them; for details, refer to the subsequent embodiments, which are not described here.
It should be noted that the electronic device under test 10 and the test electronic device 50 may be of the same or different types. Their types include, but are not limited to, mobile phones, tablet computers, and the like. The embodiment of the present application does not limit their specific forms; the electronic device under test 10 may be any electronic device with photographing and previewing functions. Fig. 1 illustrates an example in which the electronic device under test 10 is a mobile phone and the test electronic device 50 is a notebook computer.
Illustratively, the specific structure of the electronic device under test 10, as well as the specific structure of the test electronic device 50, may refer to the schematic structural diagram of the electronic device 100 shown in fig. 3.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A. A plurality of speakers 170A may be provided in the electronic apparatus 100, for example, one speaker 170A may be provided at the top of the electronic apparatus 100, one speaker 170A may be provided at the bottom, or the like.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear. In some embodiments, the speaker 170A and the receiver 170B may also be provided as one component, which is not limited in the present application.
The microphone 170C, also referred to as a "mouthpiece" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may comprise at least two parallel plates having a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on a short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon opening of the flip may then be set according to the detected opening and closing state of the holster or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied in applications such as landscape/portrait screen switching and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint features to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint answering of incoming calls, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may transmit the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of a human vocal part. The bone conduction sensor 180M may also contact the pulse of the human body to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to come into contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 4 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the size of the display screen, judge whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, management of call statuses (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and the like.
Android runtime includes a core library and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
An embodiment of the method for determining a photographing duration provided by the present application is described below.
Referring to fig. 5, fig. 5 is a flowchart of a method for determining a photographing duration according to an embodiment of the present application. The method may be applied to a test electronic device, for example, the test electronic device 50 shown in fig. 1. As shown in fig. 5, the method may include the following steps:
Step S101, acquiring a photographing video of the electronic device to be tested.
In implementation, when the photographing duration of any electronic device needs to be determined, that electronic device may be determined as the electronic device to be tested. As can be seen from the foregoing embodiments, the photographing video of the electronic device to be tested may be a video recorded during the photographing and previewing process of M pictures taken by the camera application of the electronic device to be tested; that is, the photographing video of the electronic device to be tested includes the video of the photographing and previewing process of the M pictures taken by the camera application of the electronic device to be tested. M is an integer greater than or equal to 1. The specific value of M may be set according to the requirements of the actual application scene; generally, the larger the value of M, the higher the accuracy of the photographing duration of the electronic device to be tested determined subsequently. For example, M may be set to 3.
In one possible implementation, screen recording software may be installed in the electronic device to be tested, and the photographing video of the electronic device to be tested may be recorded by the screen recording software; the test electronic device may then be communicatively connected to the electronic device to be tested in a wired or wireless manner, and may then obtain the photographing video from the electronic device to be tested. However, recording the photographing video through screen recording software may affect the photographing speed performance of the electronic device to be tested, and thus the accuracy of the determined photographing duration.
In order to improve the accuracy of the photographing duration, in another possible implementation, the photographing video of the electronic device 10 to be tested may be obtained through a three-party recording device (e.g., the three-party recording device 20 shown in fig. 1). The test electronic device may then acquire the photographing video of the electronic device to be tested from the three-party recording device. For the specific implementation, reference may be made to the content of the foregoing embodiments, which is not described herein again.
In the present application, the embodiment shown in fig. 5 describes the method for determining the photographing duration by taking a certain shooting object and a certain light source application scene as an example. For other shooting objects and other light source application scenes, how to determine the photographing duration of the electronic device to be tested may refer to the embodiment shown in fig. 5, and these scenes are not listed here one by one.
Step S102, respectively determining, according to the photographing video, the picture preview duration and the picture generation duration of each of the M pictures taken by the electronic device to be tested.
The picture preview duration of the Nth picture in the M pictures is the duration from when the electronic device to be tested receives a photographing operation for taking the Nth picture (i.e., when the shutter button of the camera application of the electronic device to be tested starts to change) to when the image thumbnail frame of the preview interface (also called the photographing interface) of the camera application completely displays the Nth picture (i.e., the image thumbnail frame is filled with the Nth picture). Before the Nth picture is taken, the user clicks the shutter button on the preview interface of the camera application of the electronic device to be tested; this operation triggers the camera application to call the camera of the electronic device to be tested to take the Nth picture. The picture generation duration of the Nth picture is the duration from when the image thumbnail frame of the preview interface completely displays the Nth picture to when the detail interface of the camera application completely displays the Nth picture (i.e., loading of the Nth picture is completed). N is an integer greater than or equal to 1 and less than or equal to M.
As can be seen from the foregoing embodiments, the three-party recording device may be a high-frame-rate camera. Therefore, in the photographing video of the electronic device to be tested, the photographing and previewing process of each of the M pictures may include multiple frames of images. Before determining the picture preview duration and the picture generation duration of each of the M pictures according to the photographing video, the photographing video therefore needs to be subjected to framing processing to obtain each frame of image included in the photographing video. The picture preview duration and the picture generation duration of each of the M pictures can then be determined according to each frame of image included in the photographing video.
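As an illustrative aid only (not part of the claimed method), the framing processing described above can be sketched as follows in Python; the use of OpenCV and the file name "shooting.mp4" are assumptions made purely for illustration, and the capture time of each frame is derived here from the frame index and the recorded frame rate.

```python
# Illustrative sketch only: split a recorded photographing video into frames and
# attach to each frame its capture (recording) time. OpenCV and the file name
# "shooting.mp4" are assumptions, not details taken from the embodiment.
import cv2

def split_video_into_frames(video_path: str):
    """Return a list of (capture_time_in_seconds, frame) for every frame."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS)  # e.g. a high frame rate recording
    frames = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        # The capture time of frame i is i / fps, relative to the video start.
        frames.append((index / fps, frame))
        index += 1
    capture.release()
    return frames

frames = split_video_into_frames("shooting.mp4")
```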
Next, taking the Nth picture of the M pictures as an example, a specific implementation of determining the picture preview duration and the picture generation duration is described. The picture preview duration and the picture generation duration of each of the M pictures can be determined in the same manner as for the Nth picture.
Illustratively, for determining the picture preview duration and the picture generation duration of the Nth picture, reference may be made to the embodiment shown in fig. 6. Fig. 6 is a flowchart of a method for determining the picture preview duration and the picture generation duration of the Nth picture according to an embodiment of the present application. As shown in fig. 6, the method may include the following steps:
Step S201, determining a first shooting time, a second shooting time and a third shooting time according to a shooting video of the electronic device to be tested.
The first shooting time is the shooting time (also called the recording time) of the first frame image, in the photographing video, in which the electronic device to be tested receives the photographing operation for taking the Nth picture. That is, among all the frame images included in the photographing video, the first shooting time is the shooting time of the first frame image in which the shutter button of the camera application of the electronic device to be tested changes after being clicked before the Nth picture is taken.
The second shooting time is the shooting time of the first frame image, in the photographing video, in which the image thumbnail frame in the camera application of the electronic device to be tested completely displays the Nth picture. That is, among all the frame images included in the photographing video, the second shooting time is the shooting time of the first frame image in which the image thumbnail frame of the camera application is filled after the electronic device to be tested takes the Nth picture.
The third shooting time is the shooting time of the first frame image, in the photographing video, in which the detail interface of the camera application of the electronic device to be tested completely displays the Nth picture. That is, among all the frame images included in the photographing video, the third shooting time is the shooting time of the first frame image in which the Nth picture is completely loaded on the detail interface after the electronic device to be tested takes the Nth picture.
Based on this, determining the first shooting time may be implemented as follows: finding, among all the frame images included in the photographing video of the electronic device to be tested, the first frame image in which the electronic device to be tested receives the photographing operation for taking the Nth picture, that is, the first frame image in which the shutter button of the camera application changes after being clicked before the Nth picture is taken, and recording this frame as the first image; determining the shooting time (recording time) of the first image according to the shooting time of each frame of image included in the photographing video; and determining the shooting time of the first image as the first shooting time.
Determining the second shooting time may be implemented as follows: finding, among all the frame images included in the photographing video of the electronic device to be tested, the first frame image in which the image thumbnail frame of the camera application completely displays the Nth picture, that is, the first frame image in which the image thumbnail frame is filled after the electronic device to be tested takes the Nth picture, and recording this frame as the second image; determining the shooting time of the second image according to the shooting time of each frame of image included in the photographing video; and determining the shooting time of the second image as the second shooting time.
Determining the third shooting time may be implemented as follows: finding, among all the frame images included in the photographing video of the electronic device to be tested, the first frame image in which the detail interface of the camera application completely displays the Nth picture, that is, the first frame image in which loading of the Nth picture is completed after the electronic device to be tested takes the Nth picture, and recording this frame as the third image; determining the shooting time of the third image according to the shooting time of each frame of image included in the photographing video; and determining the shooting time of the third image as the third shooting time.
The first image and the second image can be found among all the frame images included in the photographing video of the electronic device to be tested through a selection operation of a tester.
For acquiring the third image according to the photographing video of the electronic device to be tested, that is, finding the third image among all the frame images included in the photographing video of the electronic device to be tested, reference may be made to the embodiment shown in fig. 7. Fig. 7 is a flowchart of a method for acquiring the third image according to the photographing video of the electronic device to be tested according to an embodiment of the present application. The method may include the following steps:
Step S301, acquiring the H frames of images of the photographing and previewing process of the Nth picture taken by the electronic device to be tested according to the photographing video of the electronic device to be tested.
In step S301, the test electronic device needs to find, among all the frame images included in the photographing video of the electronic device to be tested, all the images of the photographing and previewing process of the Nth picture, H frames of images in total, where H is an integer greater than or equal to 3.
Step S302, respectively acquiring the sharpness average value and the brightness average value of each frame of image in the H frames of images.
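The embodiment does not prescribe particular sharpness or brightness metrics for step S302. As a hedged sketch, the variance of the Laplacian is used below as a stand-in sharpness average, and the mean grayscale intensity as the brightness average; both metric choices are assumptions for illustration only.

```python
# Illustrative sketch only. Laplacian variance (sharpness) and mean grayscale
# intensity (brightness) are assumed stand-in metrics; the embodiment does not
# mandate any particular metric.
import cv2
import numpy as np

def sharpness_average(frame: np.ndarray) -> float:
    """Assumed sharpness metric: variance of the Laplacian of the gray image."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def brightness_average(frame: np.ndarray) -> float:
    """Assumed brightness metric: mean grayscale intensity."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(gray.mean())
```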
Step S303, determining the third image according to the sharpness average value and the brightness average value of each frame of image in the H frames of images.
In a possible implementation, determining the third image according to the sharpness average value and the brightness average value of each frame of image in the H frames of images may be implemented as follows: in ascending order of K, sequentially determining the sharpness difference between the sharpness average value of the Kth frame image and that of the (K+2)th frame image in the H frames of images, and the brightness difference between the brightness average value of the Kth frame image and that of the (K+2)th frame image, where K is an integer greater than or equal to 1 and less than or equal to H-2; if the sharpness difference between the sharpness average value of the Kth frame image and that of the (K+2)th frame image is smaller than a first preset threshold, and the brightness difference between the brightness average value of the Kth frame image and that of the (K+2)th frame image is smaller than a second preset threshold, determining that the Kth frame image is the third image.
That is, the 1st frame image and the 3rd frame image are first extracted from the H frames of images, and the sharpness difference between their sharpness average values and the brightness difference between their brightness average values are determined. If the determined sharpness difference is smaller than the first preset threshold and the brightness difference is smaller than the second preset threshold, the 1st frame image in the H frames of images is determined to be the third image. Otherwise, if the determined sharpness difference is greater than or equal to the first preset threshold and/or the brightness difference is greater than or equal to the second preset threshold, the 2nd frame image and the 4th frame image are extracted from the H frames of images, and the sharpness difference between their sharpness average values and the brightness difference between their brightness average values are determined. If the determined sharpness difference is smaller than the first preset threshold and the brightness difference is smaller than the second preset threshold, the 2nd frame image in the H frames of images is determined to be the third image. And so on, until the third image is determined.
Further, the sharpness difference between the sharpness average value of the Kth frame image and that of the (K+2)th frame image may be determined according to the following formula: ΔC_K = |C_K - C_(K+2)| / C_K, where ΔC_K denotes the sharpness difference, C_K denotes the sharpness average value of the Kth frame image, and C_(K+2) denotes the sharpness average value of the (K+2)th frame image.

Similarly, the brightness difference between the brightness average value of the Kth frame image and that of the (K+2)th frame image may be determined according to the following formula: ΔL_K = |L_K - L_(K+2)| / L_K, where ΔL_K denotes the brightness difference, L_K denotes the brightness average value of the Kth frame image, and L_(K+2) denotes the brightness average value of the (K+2)th frame image.
The first preset threshold and the second preset threshold can be set according to the requirements of the actual application scene. For example, the first preset threshold may be set to 5% and the second preset threshold may be set to 1%.
It should be noted that, in some other optional application scenarios, the third image may also be determined by determining the sharpness difference and the brightness difference between the Kth frame image and the (K+1)th frame image in the H frames of images. Alternatively, the third image may be determined by determining the sharpness difference and the brightness difference between the Kth frame image and the (K+3)th frame image in the H frames of images. The present application is not limited in this regard.
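A minimal sketch of the K versus (K+2) comparison described above, assuming the relative-difference formulas given earlier, the example thresholds of 5% and 1%, and per-frame metric values computed, for instance, by the hypothetical helper functions sketched under step S302:

```python
# Illustrative sketch of steps S302-S303: scan the H frames in ascending order of
# K and return the capture time of the first frame whose sharpness and brightness
# differences against frame K+2 fall below the thresholds (third shooting time).
def find_third_photographing_time(h_frames, c_threshold=0.05, l_threshold=0.01):
    """h_frames: list of (capture_time, sharpness_avg, brightness_avg) tuples,
    ordered by capture time; the average values are assumed to be positive."""
    for k in range(len(h_frames) - 2):
        t_k, c_k, l_k = h_frames[k]
        _, c_k2, l_k2 = h_frames[k + 2]
        delta_c = abs(c_k - c_k2) / c_k  # sharpness difference of frames K, K+2
        delta_l = abs(l_k - l_k2) / l_k  # brightness difference of frames K, K+2
        if delta_c < c_threshold and delta_l < l_threshold:
            return t_k  # capture time of the Kth frame image
    return None  # no stable frame found within the H frames
```

Comparing frame K with frame K+1 or K+3, as mentioned above, only changes the offset used when indexing h_frames.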
Step S202, determining the picture preview duration and the picture generation duration of the Nth picture according to the first shooting time, the second shooting time, and the third shooting time.
In a possible implementation, determining the picture preview duration and the picture generation duration of the Nth picture according to the first shooting time, the second shooting time, and the third shooting time may be implemented as follows: the picture preview duration of the Nth picture is calculated according to the formula T1 = T4 - T3, where T1 denotes the picture preview duration of the Nth picture, T3 denotes the first shooting time, and T4 denotes the second shooting time; and the picture generation duration of the Nth picture is calculated according to the formula T2 = T5 - T4, where T2 denotes the picture generation duration of the Nth picture, T5 denotes the third shooting time, and T4 denotes the second shooting time.
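The two formulas reduce to simple subtractions of the three recording times; a minimal sketch, with variable names mirroring T3, T4, and T5 above:

```python
def picture_durations(t3: float, t4: float, t5: float):
    """t3, t4, t5: first, second, and third shooting times of the Nth picture."""
    t1 = t4 - t3  # picture preview duration of the Nth picture
    t2 = t5 - t4  # picture generation duration of the Nth picture
    return t1, t2
```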
Step S103, determining the photographing time length of the electronic device to be tested according to the picture preview time length and the picture generation time length of each picture in the M pictures.
In a possible implementation, determining the photographing time length of the electronic device to be tested according to the picture preview time length and the picture generation time length of each of the M pictures may be implemented as follows: generating a picture preview time length average value and a picture generation time length average value according to the picture preview time length and the picture generation time length of each of the M pictures, where the picture preview time length average value is the average value of the picture preview time lengths of the M pictures, and the picture generation time length average value is the average value of the picture generation time lengths of the M pictures; and determining the sum of the picture preview time length average value and the picture generation time length average value as the photographing time length of the electronic device to be tested.
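A minimal sketch of this aggregation; representing the per-picture results as a list of (preview duration, generation duration) pairs is an assumption for illustration:

```python
def photographing_duration(durations):
    """durations: one (preview_duration, generation_duration) pair per picture."""
    m = len(durations)
    preview_avg = sum(t1 for t1, _ in durations) / m
    generation_avg = sum(t2 for _, t2 in durations) / m
    # Photographing duration of the device under test: sum of the two averages.
    return preview_avg + generation_avg
```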
It should be noted that the photographing time length of the electronic device to be tested in any light source application scene may be determined according to the methods shown in fig. 5 to 7, and the photographing time length of the electronic device to be tested in a portrait scene or a non-portrait scene may likewise be determined according to the methods shown in fig. 5 to 7. When evaluating the photographing speed performance of the electronic device to be tested, the photographing time length in any single application scene may be used alone to evaluate the photographing speed performance, or the photographing time lengths in multiple application scenes may be fused to evaluate the photographing speed performance of the electronic device to be tested.
In the method for determining the photographing time length provided by the embodiment of the application, the video recorded during the photographing and previewing process of the M pictures taken by the camera application of the electronic device to be tested can be obtained; the picture preview time length and the picture generation time length of each of the M pictures can then be determined according to the recorded video; and the photographing time length of the electronic device to be tested can then be determined according to the picture preview time length and the picture generation time length of each of the M pictures. That is, by the method provided by the application, the picture preview time length and the picture generation time length of each of the M pictures taken by the electronic device to be tested can be accurately determined according to the video recorded during the photographing and previewing process, and the photographing time length of the electronic device to be tested can then be accurately determined from these values, so that the photographing time length of the electronic device to be tested is accurately quantized. Subsequently, the photographing speed performance of the electronic device to be tested can be evaluated according to the accurately quantized photographing time length, and the evaluation result is more accurate.
The method embodiments described herein may be independent schemes or may be combined according to internal logic, and these schemes fall within the protection scope of the present application.
It will be appreciated that in the various method embodiments described above, the methods and operations performed by the test electronic device may also be performed by a component (e.g., a chip, a module, or a circuit) usable in the test electronic device.
The embodiment describes a method for determining the photographing time length. It will be appreciated that the test electronics, in order to achieve the above-described functions, comprise corresponding hardware structures and/or software modules that perform each of the functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the test electronic device according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The method for determining the photographing duration according to the embodiment of the application is described in detail above with reference to fig. 1 to 7. The following describes in detail the apparatus provided in the embodiment of the present application with reference to fig. 8 and 9. It should be understood that the descriptions of the apparatus embodiments and the descriptions of the method embodiments correspond to each other, and thus, descriptions of details not described may be referred to the above method embodiments, which are not repeated herein for brevity.
Referring to fig. 8, fig. 8 is a block diagram of an apparatus for determining a photographing duration according to an embodiment of the present application. The apparatus may be part of an electronic device and applied in the electronic device, or the apparatus may itself be an electronic device, which is not limited in the present application. As shown in fig. 8, the apparatus 800 may include: an acquisition module 801, a first determination module 802, and a second determination module 803. The apparatus 800 may perform the operations performed by the test electronic device in any of the method embodiments described above with respect to fig. 1 to 7.
For example, in an alternative embodiment of the present application, the acquisition module 801 may be configured to acquire a photographing video of an electronic device to be tested, where the photographing video includes a video of the photographing and previewing process of M pictures taken by a camera application of the electronic device to be tested, and M is an integer greater than or equal to 1.
The first determining module 802 may be configured to determine, according to the photographed video, a picture preview duration and a picture generation duration of each picture in the M pictures, where the picture preview duration of an nth picture in the M pictures is a duration from when the electronic device to be tested receives a photographing operation for photographing the nth picture to when an image thumbnail frame of the camera application completely displays the nth picture, and the picture generation duration of the nth picture is a duration from when the image thumbnail frame completely displays the nth picture to when a detail interface of the camera application completely displays the nth picture, and N is an integer greater than or equal to 1 and less than or equal to M.
The second determining module 803 may be configured to determine the photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures.
In a possible implementation, the first determining module 802 is configured to determine, according to the photographing video, the picture preview duration and the picture generation duration of the Nth picture. Specifically, the first determining module 802 is configured to: determine, according to the photographing video, the first shooting time of the first frame image in which the electronic device to be tested receives the photographing operation in the photographing video, the second shooting time of the first frame image in which the image thumbnail frame completely displays the Nth picture, and the third shooting time of the first frame image in which the detail interface completely displays the Nth picture; and determine the picture preview duration and the picture generation duration of the Nth picture according to the first shooting time, the second shooting time, and the third shooting time.
In a possible implementation, the first determining module 802 is configured to determine, according to the photographing video, the third shooting time of the first frame image in which the detail interface completely displays the Nth picture in the photographing video. Specifically, the first determining module 802 is configured to: acquire the H frames of images of the photographing and previewing process of the Nth picture taken by the electronic device to be tested in the photographing video, where H is an integer greater than or equal to 3; respectively acquire the sharpness average value and the brightness average value of each frame of image in the H frames of images; and determine the third shooting time according to the sharpness average value and the brightness average value of each frame of image in the H frames of images.
In a possible implementation, the first determining module 802 is configured to determine the third shooting time according to the sharpness average value and the brightness average value of each frame of image in the H frames of images. Specifically, the first determining module 802 is configured to: in ascending order of K, sequentially determine the sharpness difference between the sharpness average value of the Kth frame image and that of the (K+2)th frame image in the H frames of images, and the brightness difference between the brightness average value of the Kth frame image and that of the (K+2)th frame image, where K is an integer greater than or equal to 1 and less than or equal to H-2; and if the sharpness difference between the sharpness average value of the Kth frame image and that of the (K+2)th frame image is smaller than a first preset threshold, and the brightness difference between the brightness average value of the Kth frame image and that of the (K+2)th frame image is smaller than a second preset threshold, determine that the shooting time of the Kth frame image is the third shooting time.
In a possible implementation, the second determining module 803 is configured to determine the photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures. Specifically, the second determining module 803 is configured to: generate the picture preview duration average value and the picture generation duration average value of the M pictures according to the picture preview duration and the picture generation duration of each of the M pictures; and determine the sum of the picture preview duration average value and the picture generation duration average value as the photographing duration of the electronic device to be tested.
That is, the apparatus 800 may implement steps or processes performed by the test electronic device in the embodiment of the method for determining a photographing duration shown in any of fig. 1 to 7, and the apparatus 800 may include a module for performing the method performed by the test electronic device in the embodiment of the method for determining a photographing duration shown in any of fig. 1 to 7. It should be understood that the specific process of executing the corresponding steps by each module is already described in detail in the above embodiment of the photographing time determining method, and is not described herein for brevity.
The embodiment of the application also provides a processing device which comprises at least one processor and a communication interface. The communication interface is configured to provide information input and/or output to the at least one processor, which is configured to perform the method of the above-described method embodiments.
It should be understood that the processing means may be a chip. For example, referring to fig. 9, fig. 9 is a block diagram of a chip according to an embodiment of the present application. The chip shown in fig. 9 may be a general-purpose processor or a special-purpose processor. The chip 900 may include at least one processor 901. The at least one processor 901 may be configured to support the apparatus shown in fig. 8 to perform the technical solutions shown in any one of the embodiments in fig. 1 to 7.
Optionally, the chip 900 may further include a transceiver 902. The transceiver 902 is controlled by the processor 901 and is configured to support the apparatus shown in fig. 8 in performing the technical solutions shown in any one of the embodiments in fig. 1 to 7. Optionally, the chip 900 shown in fig. 9 may further include a storage medium 903. In particular, the transceiver 902 may be replaced by a communication interface that provides information input and/or output to the at least one processor 901.
It should be noted that the chip 900 shown in fig. 9 may be implemented using the following circuits or devices: one or more field programmable gate arrays (field programmable gate array, FPGA), programmable logic devices (programmable logic device, PLD), application specific integrated chips (application specific integrated circuit, ASIC), system on chip (SoC), central processing unit (central processor unit, CPU), network processors (network processor, NP), digital signal processing circuits (digital signal processor, DSP), microcontrollers (micro controller unit, MCU), controllers, state machines, gate logic, discrete hardware components, any other suitable circuit, or any combination of circuits capable of executing the various functions described throughout this application.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method. To avoid repetition, a detailed description is not provided herein.
It will be appreciated that the memory in embodiments of the application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (read-only memory, ROM), a programmable ROM (programmable ROM, PROM), an erasable PROM (erasable PROM, EPROM), an electrically erasable PROM (electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (random access memory, RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (static RAM, SRAM), dynamic RAM (dynamic RAM, DRAM), synchronous DRAM (synchronous DRAM, SDRAM), double data rate SDRAM (double data rate SDRAM, DDR SDRAM), enhanced SDRAM (enhanced SDRAM, ESDRAM), synchlink DRAM (synchlink DRAM, SLDRAM), and direct rambus RAM (direct rambus RAM, DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
According to the method provided by the embodiment of the application, the embodiment of the application also provides a computer program product, which comprises: computer program or instructions which, when run on a computer, cause the computer to perform the method of any of the embodiments shown in fig. 1 to 7.
According to the method provided by the embodiment of the present application, the embodiment of the present application further provides a computer storage medium storing a computer program or instructions, which when executed on a computer, cause the computer to perform the method of any one of the embodiments shown in fig. 1 to 7.
According to the method provided by the embodiment of the application, the embodiment of the application also provides an electronic device. The electronic device includes, but is not limited to, a mobile phone, a tablet computer, a personal computer, a workstation device, a large screen device (e.g., a smart screen, a smart television, etc.), a handheld game console, a home game console, a virtual reality device, an augmented reality device, a mixed reality device, a vehicle-mounted smart terminal, and the like. The electronic device may include the apparatus for determining a photographing duration provided in the foregoing embodiments of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes of the system, the device module and the electronic apparatus described above may refer to corresponding processes in the foregoing method embodiments, which are not described herein again.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may exist alone physically, or two or more modules may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present application, in essence, or the part thereof contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above-mentioned apparatus for determining a photographing duration, system, processing device, chip, computer storage medium, computer program product, and electronic device provided in the embodiments of the present application are all configured to perform the method described above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
It should be understood that, in the embodiments of the present application, the sequence numbers of the steps do not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and the sequence numbers do not limit the implementation processes of the embodiments.
This specification is described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the embodiments of the apparatus, system, chip, computer storage medium, computer program product, and electronic device for determining a photographing duration are described relatively briefly because they are substantially similar to the method embodiments; for the relevant parts, refer to the description of the method embodiments.
While preferred embodiments of the present application have been described, those skilled in the art may make further changes and modifications to these embodiments once they grasp the basic inventive concept. Therefore, the appended claims are intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the present application.
The embodiments of the present application described above are not intended to limit the scope of protection of the present application.

Claims (12)

1. A method for determining a photographing duration, the method comprising:
acquiring a photographing video of an electronic device to be tested, wherein the photographing video comprises video of a photographing and previewing process in which a camera application program of the electronic device to be tested photographs M pictures, and M is an integer greater than or equal to 1;
determining, according to the photographing video, a picture preview duration and a picture generation duration of each of the M pictures, wherein the picture preview duration of an N-th picture of the M pictures is the duration from the moment the electronic device to be tested receives a photographing operation for photographing the N-th picture to the moment an image thumbnail frame of the camera application program completely displays the N-th picture, the picture generation duration of the N-th picture is the duration from the moment the image thumbnail frame completely displays the N-th picture to the moment a detail interface of the camera application program completely displays the N-th picture, and N is an integer greater than or equal to 1 and less than or equal to M; and
determining a photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures.
2. The method of claim 1, wherein determining the picture preview duration and the picture generation duration of the N-th picture according to the photographing video comprises:
determining, according to the photographing video, a first shooting time of the first frame image in the photographing video in which the electronic device to be tested receives the photographing operation, a second shooting time of the first frame image in which the image thumbnail frame completely displays the N-th picture, and a third shooting time of the first frame image in which the detail interface completely displays the N-th picture; and
determining the picture preview duration and the picture generation duration of the N-th picture according to the first shooting time, the second shooting time, and the third shooting time.
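(Illustrative note, not part of the claims.) The arithmetic implied by claims 1 and 2, together with the averaging of claim 5 below, reduces to simple differences and averages once the three shooting times of each picture have been read off the photographing video. The following is a minimal Python sketch under that assumption; the function name and the layout of the input are hypothetical, not taken from the patent.

def photographing_duration(shot_times):
    """shot_times holds one (t1, t2, t3) triple per picture, in seconds:
    t1 = first shooting time (photographing operation received),
    t2 = second shooting time (thumbnail frame completely displays the picture),
    t3 = third shooting time (detail interface completely displays the picture)."""
    m = len(shot_times)
    preview_avg = sum(t2 - t1 for t1, t2, _ in shot_times) / m   # picture preview duration average
    generate_avg = sum(t3 - t2 for _, t2, t3 in shot_times) / m  # picture generation duration average
    return preview_avg + generate_avg  # claim 5: sum of the two average values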
3. The method of claim 2, wherein determining, according to the photographing video, the third shooting time of the first frame image in which the detail interface in the photographing video completely displays the N-th picture comprises:
acquiring H frame images of the photographing and previewing process in which the electronic device to be tested photographs the N-th picture in the photographing video, wherein H is an integer greater than or equal to 3;
respectively acquiring a sharpness average value and a brightness average value of each of the H frame images; and
determining the third shooting time according to the sharpness average value and the brightness average value of each of the H frame images.
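(Illustrative note, not part of the claims.) Claim 3 requires a per-frame sharpness average and brightness average, but does not fix concrete metrics. The sketch below uses OpenCV and assumes variance of the Laplacian as the sharpness measure and the grayscale mean as the brightness measure; both metric choices are assumptions of this note, not part of the patent.

import cv2

def frame_metrics(video_path):
    """Return one (timestamp_s, sharpness, brightness) tuple per frame."""
    cap = cv2.VideoCapture(video_path)
    metrics = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        t = cap.get(cv2.CAP_PROP_POS_MSEC) / 1000.0        # shooting time of this frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # assumed sharpness metric
        brightness = float(gray.mean())                    # assumed brightness metric
        metrics.append((t, sharpness, brightness))
    cap.release()
    return metrics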
4. The method of claim 3, wherein determining the third shooting time according to the sharpness average value and the brightness average value of each of the H frame images comprises:
sequentially determining, in ascending order of K, a sharpness difference between the sharpness average value of a K-th frame image of the H frame images and the sharpness average value of a (K+2)-th frame image, and a brightness difference between the brightness average value of the K-th frame image and the brightness average value of the (K+2)-th frame image, wherein K is an integer greater than or equal to 1 and less than or equal to H-2; and
if the sharpness difference between the sharpness average value of the K-th frame image and the sharpness average value of the (K+2)-th frame image is smaller than a first preset threshold and the brightness difference between the brightness average value of the K-th frame image and the brightness average value of the (K+2)-th frame image is smaller than a second preset threshold, determining the shooting time of the K-th frame image as the third shooting time.
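(Illustrative note, not part of the claims.) The claim-4 test scans the frames in ascending order of K and stops at the first frame whose sharpness and brightness differ from those of frame K+2 by less than the preset thresholds, i.e., the first frame at which the detail interface has stopped changing. A sketch consuming the per-frame metrics from the claim-3 note above; the threshold values are device-dependent and hypothetical:

def third_shooting_time(metrics, sharp_thresh, bright_thresh):
    """metrics: list of (timestamp_s, sharpness, brightness) in frame order."""
    for k in range(len(metrics) - 2):   # K runs over 1..H-2 (0-based indexing here)
        t_k, sharp_k, bright_k = metrics[k]
        _, sharp_k2, bright_k2 = metrics[k + 2]
        if (abs(sharp_k - sharp_k2) < sharp_thresh
                and abs(bright_k - bright_k2) < bright_thresh):
            return t_k  # shooting time of the K-th frame is the third shooting time
    return None  # no stable frame pair found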
5. The method of any one of claims 1-4, wherein determining the photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures comprises:
generating a picture preview duration average value and a picture generation duration average value of the M pictures according to the picture preview duration and the picture generation duration of each of the M pictures; and
determining the sum of the picture preview duration average value and the picture generation duration average value as the photographing duration of the electronic device to be tested.
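(Illustrative note, not part of the claims.) For example, with M = 2 and hypothetical picture preview durations of 0.45 s and 0.50 s and picture generation durations of 0.75 s and 0.90 s, the picture preview duration average value is 0.475 s, the picture generation duration average value is 0.825 s, and the photographing duration is 0.475 s + 0.825 s = 1.30 s.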
6. A device for determining a photographing duration, the device comprising:
an acquisition module, configured to acquire a photographing video of an electronic device to be tested, wherein the photographing video comprises video of a photographing and previewing process in which a camera application program of the electronic device to be tested photographs M pictures, and M is an integer greater than or equal to 1;
a first determining module, configured to determine, according to the photographing video, a picture preview duration and a picture generation duration of each of the M pictures, wherein the picture preview duration of an N-th picture of the M pictures is the duration from the moment the electronic device to be tested receives a photographing operation for photographing the N-th picture to the moment an image thumbnail frame of the camera application program completely displays the N-th picture, the picture generation duration of the N-th picture is the duration from the moment the image thumbnail frame completely displays the N-th picture to the moment a detail interface of the camera application program completely displays the N-th picture, and N is an integer greater than or equal to 1 and less than or equal to M; and
a second determining module, configured to determine a photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures.
7. The device of claim 6, wherein, in determining the picture preview duration and the picture generation duration of the N-th picture according to the photographing video, the first determining module is specifically configured to:
determine, according to the photographing video, a first shooting time of the first frame image in the photographing video in which the electronic device to be tested receives the photographing operation, a second shooting time of the first frame image in which the image thumbnail frame completely displays the N-th picture, and a third shooting time of the first frame image in which the detail interface completely displays the N-th picture; and
determine the picture preview duration and the picture generation duration of the N-th picture according to the first shooting time, the second shooting time, and the third shooting time.
8. The device of claim 7, wherein, in determining, according to the photographing video, the third shooting time of the first frame image in which the detail interface in the photographing video completely displays the N-th picture, the first determining module is specifically configured to:
acquire H frame images of the photographing and previewing process in which the electronic device to be tested photographs the N-th picture in the photographing video, wherein H is an integer greater than or equal to 3;
respectively acquire a sharpness average value and a brightness average value of each of the H frame images; and
determine the third shooting time according to the sharpness average value and the brightness average value of each of the H frame images.
9. The device of claim 8, wherein, in determining the third shooting time according to the sharpness average value and the brightness average value of each of the H frame images, the first determining module is specifically configured to:
sequentially determine, in ascending order of K, a sharpness difference between the sharpness average value of a K-th frame image of the H frame images and the sharpness average value of a (K+2)-th frame image, and a brightness difference between the brightness average value of the K-th frame image and the brightness average value of the (K+2)-th frame image, wherein K is an integer greater than or equal to 1 and less than or equal to H-2; and
if the sharpness difference between the sharpness average value of the K-th frame image and the sharpness average value of the (K+2)-th frame image is smaller than a first preset threshold and the brightness difference between the brightness average value of the K-th frame image and the brightness average value of the (K+2)-th frame image is smaller than a second preset threshold, determine the shooting time of the K-th frame image as the third shooting time.
10. The device of any one of claims 6-9, wherein, in determining the photographing duration of the electronic device to be tested according to the picture preview duration and the picture generation duration of each of the M pictures, the second determining module is specifically configured to:
generate a picture preview duration average value and a picture generation duration average value of the M pictures according to the picture preview duration and the picture generation duration of each of the M pictures; and
determine the sum of the picture preview duration average value and the picture generation duration average value as the photographing duration of the electronic device to be tested.
11. An electronic device, comprising one or more processors and one or more memories, wherein the one or more memories store a computer program or instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1-5.
12. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program or instructions which, when executed, implement the method of any one of claims 1-5.
CN202211216155.XA 2022-09-30 2022-09-30 Method and device for determining photographing duration and electronic equipment Active CN116708751B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211216155.XA CN116708751B (en) 2022-09-30 2022-09-30 Method and device for determining photographing duration and electronic equipment

Publications (2)

Publication Number Publication Date
CN116708751A 2023-09-05
CN116708751B CN116708751B (en) 2024-02-27

Family

ID=87843998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211216155.XA Active CN116708751B (en) 2022-09-30 2022-09-30 Method and device for determining photographing duration and electronic equipment

Country Status (1)

Country Link
CN (1) CN116708751B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009206827A (en) * 2008-02-27 2009-09-10 Canon Inc Imaging apparatus, and control method of imaging apparatus
CN105491284A (en) * 2015-11-30 2016-04-13 小米科技有限责任公司 Preview image display method and device
US20170244897A1 (en) * 2016-02-18 2017-08-24 Samsung Electronics Co., Ltd. Electronic device and operating method thereof
US20170316270A1 (en) * 2014-10-28 2017-11-02 Zte Corporation Method, system, and device for processing video shooting
CN109218810A (en) * 2018-08-29 2019-01-15 努比亚技术有限公司 A kind of video record parameter regulation method, equipment and computer readable storage medium
CN109413326A (en) * 2018-09-18 2019-03-01 Oppo(重庆)智能科技有限公司 Camera control method and Related product
CN110086985A (en) * 2019-03-25 2019-08-02 华为技术有限公司 A kind of method for recording and electronic equipment of time-lapse photography
WO2020057661A1 (en) * 2018-09-21 2020-03-26 华为技术有限公司 Image capturing method, device, and apparatus
WO2020073959A1 (en) * 2018-10-12 2020-04-16 华为技术有限公司 Image capturing method, and electronic device
WO2020173379A1 (en) * 2019-02-27 2020-09-03 华为技术有限公司 Picture grouping method and device
CN112532857A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Shooting method and equipment for delayed photography
CN112738414A (en) * 2021-04-06 2021-04-30 荣耀终端有限公司 Photographing method, electronic device and storage medium
US20210337136A1 (en) * 2020-04-27 2021-10-28 Beijing Xiaomi Pinecone Electronics Co., Lid. Method and apparatus for processing video, and storage medium
CN113810602A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Shooting method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant