EP4064689B1 - Video shooting method and electronic device - Google Patents

Video shooting method and electronic device

Info

Publication number
EP4064689B1
EP4064689B1
Authority
EP
European Patent Office
Prior art keywords
camera
shooting
frame rate
electronic device
moving object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP21791909.1A
Other languages
German (de)
English (en)
Other versions
EP4064689A4 (fr)
EP4064689A1 (fr)
Inventor
Qiuyang Wei
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Publication of EP4064689A1
Publication of EP4064689A4
Application granted
Publication of EP4064689B1
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/78 Television signal recording using magnetic recording
    • H04N5/782 Television signal recording using magnetic recording on tape
    • H04N5/783 Adaptations for reproducing at a rate different from the recording rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Definitions

  • This disclosure generally relates to the field of artificial intelligence (artificial intelligence, AI) technologies.
  • This invention relates to a video shooting method and an electronic device.
  • an intelligent device provided with a camera can support more and more shooting functions, for example, a function of shooting a slow motion video.
  • a principle of shooting a "slow motion video" is as follows: A shooting frame rate used when the intelligent device shoots a moving object is N times a play frame rate, where N is an integer greater than or equal to 2. In this way, a moving speed of the moving object in the video played by the intelligent device is 1/N of an actual moving speed of the moving object, to obtain the "slow motion video" of the moving object.
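  • As an illustration of this ratio (not part of the claimed method), a minimal Python sketch of the slow-down factor, assuming the shooting frame rate is an integer multiple of the play frame rate:
```python
# Minimal sketch of the slow-motion ratio described above (illustrative only).
# If the shooting frame rate is N times the play frame rate, playback shows the
# motion at 1/N of its real speed.

def slowdown_factor(shooting_fps: int, play_fps: int) -> float:
    """Return N, the factor by which motion appears slowed during playback."""
    if shooting_fps % play_fps != 0:
        raise ValueError("shooting frame rate is assumed to be an integer multiple of the play frame rate")
    return shooting_fps / play_fps

# Example: a video shot at 240 fps and played back at 30 fps appears 8x slower.
print(slowdown_factor(240, 30))  # 8.0
```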
  • a shooting frame rate needs to be manually set by a user in advance. Consequently, a degree of intelligence is relatively low, and a video with the shooting frame rate set by the user cannot achieve a relatively good play effect, resulting in a poor shooting effect.
  • CN 107395972 A discloses a shooting method of a quick moving object and a terminal.
  • the shooting method of a quick moving object is applied to a terminal device with two cameras, and includes the steps: a terminal acquires the monitoring information of a target object at at least two different positions, wherein the moving direction of the target object is from the vision field range of the first camera to the vision field range of the second camera, and at least one position is arranged in the vision field range of the first camera and the vision field range of the second camera at the same time; the terminal determines the moving speed of the target object according to the monitoring information of the at least two different positions; when the moving speed exceeds the first threshold, the terminal determines the time information that the target object moves to the second vision field center of the second camera; and the terminal sends a shooting command to the second camera, and the shooting command indicates the second camera to shoot the target object according to the time information.
  • the shooting method of a quick moving object and the terminal can solve the problem that current shooting of a quick moving object produces unclear images.
  • CN 104967803 A discloses a video recording method and a video recording device.
  • the video recording method comprises the steps of tracking a target tracking object in selected video when the video is recorded, and determining the moving speed of the target tracking object based on a tracking result; and adjusting the current video recording frame rate according to the moving speed of the target tracking object.
  • EP 3 481 048 A1 discloses an electronic device for recording an image using multiple cameras and an operating method thereof.
  • the electronic device may include: a first camera configured to acquire an image at a first frame rate; a second camera configured to acquire an image at a second frame rate higher than the first frame rate; a memory; and a processor, wherein the processor is configured to: identify first capturing attributes corresponding to the first camera; set at least some of second capturing attributes corresponding to the second camera, to correspond to the first capturing attributes; acquire a first image set corresponding to an external object through the first camera based on a signal for capturing the external object; identify whether a designated condition related to a frame rate increase is satisfied; acquire a second image set corresponding to the external object at the second frame rate through the second camera when the designated condition is satisfied; and store, in the memory, at least some first images from the first image set and at least some second images from the second image set, as an image corresponding to the external object.
  • the object of the present invention is to provide a video shooting method and an electronic device with automatic slow motion frame rate setting. This object is solved by the attached independent claims and further embodiments and improvements of the invention are listed in the attached dependent claims.
  • expressions like "...aspect according to the invention”, “according to the invention”, or “the present invention” relate to technical teaching of the broadest embodiment as claimed with the independent claims.
  • this invention provides a video shooting method.
  • the video shooting method is applied to an electronic device, the electronic device is provided with a first camera and a second camera, and the first camera and the second camera are different in at least one of a field of view (field of view, FOV) and a shooting direction.
  • the method includes: receiving a shooting instruction entered by a user; invoking the first camera to perform preview; if it is detected that a target moving object moves in the FOV of the first camera, determining a target shooting frame rate based on a moving speed of the target moving object; and invoking the second camera to shoot a video of the target moving object at the target shooting frame rate.
  • the electronic device to which the method in this application is applied may be an electronic device that includes a first camera and a second camera. Both the first camera and the second camera may be rear-facing cameras of the electronic device or front-facing cameras of the electronic device, or one of the first camera and the second camera is a front-facing camera of the electronic device, and the other is a rear-facing camera of the electronic device. Further, the FOV of the first camera is different from that of the second camera, or the shooting direction of the first camera is different from that of the second camera, or both the FOV and the shooting direction of the first camera are different from those of the second camera.
  • After receiving the shooting instruction, the electronic device invokes the first camera to perform preview. After the target moving object enters the FOV of the first camera, the electronic device can determine a shooting parameter such as the target shooting frame rate based on the moving speed of the target moving object. Then, the electronic device invokes the second camera to shoot a slow motion video of the target moving object by using the shooting parameter such as the determined target shooting frame rate.
  • the electronic device can determine a shooting frame rate of the slow motion video. In this way, not only a degree of intelligence is high, but also a relatively suitable shooting frame rate can be determined, so that a shooting effect can be optimized, and user experience can be improved.
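  • A hypothetical, simplified outline of this two-camera flow follows; the Camera class, the speed value, and the speed-to-frame-rate heuristic are illustrative stand-ins rather than the claimed implementation:
```python
# Hypothetical outline of the two-camera flow described above (not the patented code).

class Camera:
    def __init__(self, name, fov_deg, supported_fps):
        self.name, self.fov_deg, self.supported_fps = name, fov_deg, supported_fps

    def start_preview(self, fps):
        print(f"{self.name}: preview at {fps} fps")

    def record(self, fps):
        print(f"{self.name}: recording slow motion at {fps} fps")

def handle_shooting_instruction(first: Camera, second: Camera,
                                preview_fps: int, object_speed_px_s: float):
    first.start_preview(preview_fps)                     # step 1: preview with the first camera
    best_fps = object_speed_px_s / 20.0                  # step 2: assumed speed-to-rate heuristic
    target_fps = min((f for f in second.supported_fps if f >= best_fps),
                     default=max(second.supported_fps))  # step 3: pick a supported frame rate
    second.record(target_fps)                            # step 4: shoot with the second camera

wide = Camera("wide", fov_deg=120, supported_fps=[30, 60])
main = Camera("main", fov_deg=80, supported_fps=[30, 120, 240, 480, 960])
handle_shooting_instruction(wide, main, preview_fps=30, object_speed_px_s=8000)
```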
  • the method includes: when the shooting instruction is a first shooting instruction, determining a camera with a relatively large FOV as the first camera; or when the shooting instruction is a second shooting instruction, determining a camera with a relatively small FOV as the first camera.
  • the target moving object may move from one point to another point, for example, a moving track of a traveling train.
  • the target moving object may alternatively move around from a central point in an explosive manner, for example, moving tracks of fireworks during blooming.
  • the shooting instruction in this application may be a first shooting instruction or a second shooting instruction.
  • the electronic device is configured to shoot a scenario in which the target moving object may move from one point to another point.
  • the electronic device determines a camera with a relatively large FOV as the first camera.
  • When the shooting instruction is the second shooting instruction, the electronic device is configured to shoot a scenario in which the target moving object moves around from a central point in an explosive manner. Correspondingly, the electronic device determines a camera with a relatively small FOV as the first camera. It can be learned that the video shooting method in this application can be applied to a plurality of shooting scenarios, and is widely applicable.
  • the invoking the first camera to perform preview includes: when the shooting instruction is the first shooting instruction, invoking the first camera to perform preview at a first frame rate; or when the shooting instruction is the second shooting instruction, invoking the first camera to perform preview at a second frame rate.
  • the electronic device may control the first camera to perform preview at the first frame rate, and the first frame rate may be, for example, 30 fps.
  • the electronic device may control the first camera to perform preview at the second frame rate, and the second frame rate is, for example, a highest frame rate supported by the first camera. It can be learned that, in the video shooting method in this application, preview can be performed by using a frame rate corresponding to a shooting scenario, and the video shooting method is widely applicable.
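  • The camera and preview frame rate selection described above can be sketched as follows; the instruction labels, the example frame rates, and the tuple layout are illustrative assumptions:
```python
# Illustrative selection of the preview camera and preview frame rate based on
# the type of shooting instruction (values and data layout are assumptions).

def select_preview_setup(shooting_instruction: str, cameras: list) -> tuple:
    """cameras: list of (name, fov_deg, max_supported_fps) tuples."""
    if shooting_instruction == "first":             # object moves from one point to another
        camera = max(cameras, key=lambda c: c[1])   # the larger-FOV camera previews first
        preview_fps = 30                            # first frame rate, e.g. 30 fps
    else:                                           # "explosive" motion around a central point
        camera = min(cameras, key=lambda c: c[1])   # the smaller-FOV camera previews first
        preview_fps = camera[2]                     # e.g. highest frame rate it supports
    return camera[0], preview_fps

cams = [("wide", 120, 60), ("tele", 45, 960)]
print(select_preview_setup("first", cams))   # ('wide', 30)
print(select_preview_setup("second", cams))  # ('tele', 960)
```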
  • the determining a target shooting frame rate based on a moving speed of the target moving object includes: determining a best frame rate based on the moving speed of the target moving object; and determining, as the target shooting frame rate, a frame rate that is in frame rates supported by the second camera and that is adjacent to the best frame rate and greater than the best frame rate.
  • the electronic device may determine the best frame rate based on the moving speed of the target moving object. Further, in some embodiments, the electronic device supports the best frame rate, and the electronic device may use the best frame rate as the target shooting frame rate. In some other embodiments, the electronic device does not support the best frame rate, and the electronic device includes only two cameras.
  • the electronic device determines, as the target shooting frame rate, a frame rate that is in frame rates supported by the second camera and that is adjacent to the best frame rate and greater than the best frame rate. In this implementation, a best shooting effect can be ensured, thereby improving user experience.
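  • A minimal sketch of this "adjacent and greater" selection, with an assumed fallback to the highest supported rate when no supported rate exceeds the best frame rate (the fallback is not specified in this description):
```python
# Among the frame rates supported by the second camera, take the smallest one
# that is at least the best frame rate; the fallback branch is an assumption.

def target_rate_adjacent_greater(best_fps: float, supported: list) -> int:
    greater = [f for f in sorted(supported) if f >= best_fps]
    return greater[0] if greater else max(supported)

print(target_rate_adjacent_greater(200, [30, 120, 240, 480]))  # 240
```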
  • the determining a target shooting frame rate based on a moving speed of the target moving object includes: determining a best frame rate based on the moving speed of the target moving object; calculating a difference between the best frame rate and each of frame rates supported by the second camera, to obtain one or more differences; determining whether a smallest difference is less than a first threshold, where the smallest difference belongs to the one or more differences, and the first threshold is a value obtained by multiplying a frame rate corresponding to the smallest difference by a preset percentage; and if the smallest difference is less than the first threshold, determining the frame rate corresponding to the smallest difference as the target shooting frame rate.
  • the electronic device may determine the target shooting frame rate based on a difference between the best frame rate and each of frame rates supported by other cameras than the first camera, and then select one camera entity as the second camera. It should be noted that a smaller difference between frame rates indicates a better shooting effect corresponding to a shooting frame rate corresponding to the difference. Based on this, the electronic device may select the smallest difference from the one or more differences, and further, when the smallest difference is less than the value obtained by multiplying the frame rate corresponding to the smallest difference by the preset percentage, determine the frame rate corresponding to the smallest difference as the target shooting frame rate. In addition, the electronic device may further use a camera that supports the frame rate corresponding to the smallest difference as the second camera. In this implementation, a best shooting effect can be ensured, thereby improving user experience.
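  • The smallest-difference selection can be illustrated as below; the 20% preset percentage is an assumed example value, and returning None when no rate is close enough stands in for whatever fallback the device would actually use:
```python
# Sketch of the smallest-difference selection described above.

def target_rate_by_difference(best_fps: float, supported: list,
                              preset_percentage: float = 0.2):
    diffs = {fps: abs(best_fps - fps) for fps in supported}
    fps, smallest = min(diffs.items(), key=lambda kv: kv[1])
    threshold = fps * preset_percentage           # the first threshold
    return fps if smallest < threshold else None  # None: no supported rate is close enough

print(target_rate_by_difference(260, [30, 120, 240, 480]))  # 240 (|260 - 240| = 20 < 48)
```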
  • the moving speed of the target moving object may be represented as L1/T1, where L1 is a distance that the target moving object moves within first duration T1.
  • the first duration may be preset duration.
  • the first duration is 0.05 seconds.
  • the first duration may be a reciprocal of a frame rate used when the first camera performs preview.
  • the electronic device can determine the best frame rate based on the moving speed of the target moving object during preview, and can further determine the target shooting frame rate. In this way, not only a degree of intelligence is high, but also a relatively suitable shooting frame rate can be determined, so that a shooting effect can be optimized.
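  • A worked example of this speed estimate, taking the first duration as the reciprocal of the preview frame rate (one of the options above); the distance value and unit are placeholders:
```python
# Worked example: the object moves a distance L1 within a first duration T1,
# so its moving speed is L1 / T1.

preview_fps = 30
T1 = 1.0 / preview_fps   # first duration in seconds (about 0.033 s)
L1 = 0.4                 # distance moved within T1, in whatever unit is measured
speed = L1 / T1
print(f"T1 = {T1:.3f} s, speed = {speed:.1f} units/s")  # T1 = 0.033 s, speed = 12.0 units/s
```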
  • the method further includes: determining one or more of the following parameters based on the moving speed of the target moving object: a moment of starting the second camera, a trigger location of starting the second camera, an exposure parameter of the second camera, and a focal length of the second camera, where the trigger location is a location to which the target moving object moves.
  • the electronic device can start the second camera at a proper time based on the moving speed of the target moving object or the like, so that the second camera performs shooting by using a proper shooting parameter.
  • the second camera can capture the target moving object in a proper shooting mode, so that the electronic device does not need to cache several videos of a moment at which the target moving object enters a trigger area, thereby saving storage space.
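  • One way to picture the start-moment calculation is sketched below; the start-up latency value and the idea of subtracting it are assumptions for illustration, not the claimed algorithm:
```python
# Illustrative timing: knowing the object's speed and the distance from its
# current position to the trigger location, schedule when to start the second
# camera, allowing for an assumed camera start-up latency.

def second_camera_start_delay(distance_to_trigger: float, speed: float,
                              camera_startup_s: float = 0.1) -> float:
    time_to_trigger = distance_to_trigger / speed
    return max(0.0, time_to_trigger - camera_startup_s)

print(second_camera_start_delay(distance_to_trigger=2.0, speed=4.0))  # 0.4 (seconds)
```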
  • when invoking the second camera to shoot a video of the target moving object at the target shooting frame rate, the method further includes: invoking the first camera to shoot the video of the target moving object at a third shooting frame rate, where the third shooting frame rate is different from the target shooting frame rate.
  • the user may enter an instruction for displaying a video on a plurality of screens to the electronic device.
  • the electronic device may further invoke the first camera to shoot the video of the target moving object.
  • a shooting frame rate of the first camera is different from the target shooting frame rate.
  • the electronic device can simultaneously perform shooting by using at least two cameras, so that videos played at different moving speeds in a same shooting scenario can be obtained.
  • the method further includes: playing a first video file shot by the first camera and a second video file shot by the second camera, where a third shooting frame rate corresponding to the first video file is different from the target shooting frame rate corresponding to the second video file.
  • the electronic device may play two video files.
  • the two video files are shot in a same shooting scenario by using different shooting frame rates.
  • the two video files are presented at different moving speeds. It can be learned that, in this implementation, the electronic device can simultaneously perform shooting by using at least two cameras, so that videos played at different moving speeds in a same shooting scenario can be obtained, thereby improving viewing experience of the user.
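  • The difference between the two video files can be quantified with a small sketch; the 30 fps play frame rate and the 480 fps shooting frame rate are illustrative values:
```python
# Two recordings of the same scene at different shooting frame rates play back
# at different apparent speeds when shown at a common play frame rate.

def apparent_speed_ratio(shooting_fps: int, play_fps: int = 30) -> float:
    return play_fps / shooting_fps   # 1.0 means real time, smaller means slowed down

print(apparent_speed_ratio(30))   # 1.0    (e.g. first camera, real-time video)
print(apparent_speed_ratio(480))  # 0.0625 (e.g. second camera, 16x slow motion)
```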
  • the invention also provides an electronic device.
  • the electronic device has functions of implementing the foregoing method.
  • the functions may be implemented by hardware, or may be implemented by executing corresponding software by hardware.
  • the hardware or software includes one or more modules corresponding to the foregoing functions.
  • a structure of the foregoing electronic device includes a processor and a memory, and the processor is configured to support the electronic device in implementing a corresponding function in the foregoing method.
  • the memory is configured to store program instructions and data that are necessary for the electronic device.
  • the invention also provides a computer storage medium.
  • the computer storage medium stores instructions, and when the instructions are run on a computer, the computer is enabled to perform some or all of the steps of the video shooting method in the first aspect and the possible implementations of the first aspect.
  • this application provides a computer program product.
  • When the computer program product is run on a computer, the computer is enabled to perform some or all of the steps of the video shooting method in the first aspect and the possible implementations of the first aspect.
  • an electronic device including at least two cameras may use one of the at least two cameras to preview a target moving object. Then, the electronic device determines a shooting camera, a shooting frame rate of the shooting camera, a start moment, a focal length, an exposure parameter, and the like based on a moving parameter of the target moving object during preview, an attribute parameter of a preview camera, and the like, and then starts, based on the determined data, the shooting camera in the at least two cameras to shoot a slow motion video. In this way, not only a best shooting frame rate can be determined based on a moving speed of the target moving object, to achieve a high degree of intelligence, but also storage space can be saved, to optimize a shooting effect.
  • Although terms such as first and second may be used to describe an object of a type in the following embodiments, the object should not be limited to these terms. These terms are merely used to distinguish between specific objects of this type.
  • terms such as first and second may be used to describe a camera in the following embodiments, but the camera should not be limited to these terms. These terms are merely used to distinguish between different cameras of an electronic device. The same applies when terms such as first and second are used in the following embodiments to describe an object of another type. Details are not described herein.
  • the following embodiments describe an electronic device and a video shooting method applied to the electronic device.
  • the electronic device in this application may be an electronic device that includes at least two cameras (camera), such as a mobile phone, a tablet computer, a monitoring device, or a vehicle-mounted device.
  • the electronic device may be, for example, a device running iOS ® , Android ® , Microsoft ® , or another operating system.
  • the at least two cameras in this application are different in a field of view (field of view, FOV), or a shooting direction, or both an FOV and a shooting direction.
  • the FOV of the camera is a range of an area that can be captured by the camera.
  • the FOV of one of the at least two cameras is 120 degrees
  • the FOV of the other camera is 60 degrees.
  • the FOV of one of the at least two cameras is 150 degrees
  • the shooting direction of the camera is 30 degrees to the left relative to a screen of the electronic device.
  • the FOV of the other of the at least two cameras is 80 degrees
  • the shooting direction of the camera is 30 degrees to the right relative to the screen of the electronic device.
  • a "user interface (user interface, UI)" used to display a video in this application is a media interface for interaction and information exchange between an application or an operating system and a user, and implements conversion between an internal form of information and an acceptable form of the user.
  • a user interface of an application is source code written in a specific computer language, such as Java or extensible markup language (extensible markup language, XML).
  • the interface source code is parsed and rendered on a terminal device, and finally presented as content that can be recognized by the user, for example, a control such as a video, a picture, a text, or a button.
  • a control (control) is also referred to as a widget (widget), and is a basic element of a user interface.
  • Typical controls include a toolbar (toolbar), a menu bar (menu bar), a text box (text box), a button (button), a scroll bar (scrollbar), a picture, and a text.
  • An attribute and content of the control in the interface are defined by using a label or a node.
  • XML specifies the control included in the interface by using a node such as <Textview>, <ImgView>, or <VideoView>.
  • One node corresponds to one control or one attribute in the interface. After being parsed and rendered, the node is presented as content that is visible to the user.
  • an interface for many applications such as a hybrid application (hybrid application) generally includes a web page.
  • the web page is also referred to as a page, and may be understood as a special control embedded in an application interface.
  • the web page is source code written in a specific computer language, such as hypertext markup language (hypertext markup language, HTML), cascading style sheets (cascading style sheets, CSS), and JavaScript (JavaScript, JS).
  • the source code of the web page may be loaded and displayed, as content recognizable to the user, by using a browser or a web page display component with a function similar to that of a browser.
  • Specific content included in the web page is also defined by using a label or a node in the source code of the web page.
  • HTML defines an element and an attribute of the web page by using <p>, <img>, <video>, and <canvas>.
  • a common presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which is a user interface that is displayed in a graphic form and that is related to a computer operation.
  • the user interface may be an interface element such as an icon, a window, or a control that is displayed on a display of an electronic device.
  • the control may include a visible interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, and a widget.
  • a "frame rate (frame rate)" in this application is a frequency (rate) at which a bitmap image is continuously played or captured in units of frames.
  • the frame rate may be represented by frames per second (frames per second, fps).
  • a frame rate at which a camera captures an image may be referred to as a shooting frame rate.
  • One camera may support a plurality of shooting frame rates, such as 30 fps, 90 fps, 120 fps, and 480 fps. Shooting frame rates supported by different cameras may be different.
  • a play frame rate of the electronic device is usually fixed, for example, 30 fps.
  • the electronic device invokes one camera.
  • the user presets a shooting frame rate (not shown in the figure) and manually sets an area 01, where the area 01 is used to trigger slow motion video shooting.
  • the electronic device shoots the moving object 02 at the frame rate preset by the user, to obtain a slow motion video of the moving object 02.
  • a shooting frame rate and a trigger area need to be manually set by the user, and a degree of intelligence is low.
  • the electronic device cannot recognize the moving object in time, resulting in incorrect focusing and a poor shooting effect.
  • the electronic device needs to cache several videos of the moment at which the moving object enters the trigger area, and consequently relatively large storage space needs to be occupied.
  • This application provides a video shooting method and an electronic device, and the electronic device includes at least two cameras.
  • both of the at least two cameras may be rear-facing cameras of the electronic device.
  • both of the at least two cameras may be front-facing cameras of the electronic device.
  • the at least two cameras may include a front-facing camera of the electronic device and a rear-facing camera of the electronic device.
  • the electronic device invokes one of the at least two cameras to perform preview. After a moving object enters an FOV of the preview camera, the electronic device determines a shooting frame rate based on a moving speed of the moving object.
  • the electronic device invokes the other of the at least two cameras to shoot a video of the moving object at the determined shooting frame rate.
  • In this way, not only is a degree of intelligence high, but also a relatively suitable shooting frame rate can be determined, so that a shooting effect can be optimized, and user experience can be improved.
  • An example electronic device 100 provided in the following embodiments of this application is first described.
  • FIG. 2A is a schematic diagram of a structure of an electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communications module 150, a wireless communications module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structure shown in this application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than those shown in the figure, or some components may be combined, or some components may be split, or different component arrangements may be used.
  • the components shown may be implemented by hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a neural-network processing unit (neural-network processing unit, NPU), and/or the like.
  • Different processing units may be independent components, or may be integrated into one or more processors.
  • the electronic device 100 may also include one or more processors 110.
  • the controller may be a nerve center and a command center of the electronic device 100.
  • the controller may generate an operation control signal based on an instruction operation code and a time sequence signal, to complete control of instruction detection and the like.
  • a memory may be further disposed in the processor 110, and is configured to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data just used or used cyclically by the processor 110. If the processor 110 needs to use the instructions or the data again, the processor 110 may directly invoke the instructions or the data from the memory, to avoid repeated access, and reduce a waiting time of the processor 110, thereby improving efficiency of the electronic device 100.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, a universal serial bus (universal serial bus, USB) interface, and/or the like.
  • the I2C interface is a two-way synchronization serial bus, and includes a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL).
  • the processor 110 may include a plurality of groups of I2C buses.
  • the processor 110 may be separately coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, thereby implementing a touch function of the electronic device 100.
  • the I2S interface may be configured to perform audio communication.
  • the processor 110 may include a plurality of groups of I2S buses.
  • the processor 110 may be coupled to the audio module 170 by using the I2S bus, to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit an audio signal to the wireless communications module 160 through the I2S interface, to implement a function of answering a call by using a Bluetooth headset.
  • the PCM interface may also be configured to: perform audio communication, and sample, quantize, and encode an analog signal.
  • the audio module 170 may be coupled to the wireless communications module 160 through the PCM bus interface.
  • the audio module 170 may also transmit an audio signal to the wireless communications module 160 through the PCM interface, to implement a function of answering a call by using a Bluetooth headset. Both the I2S interface and the PCM interface may be configured to perform audio communication.
  • the UART interface is a universal serial data bus, and is configured to perform asynchronous communication.
  • the bus may be a two-way communications bus, and converts to-be-transmitted data between serial communication and parallel communication.
  • the UART interface is usually configured to connect the processor 110 to the wireless communications module 160.
  • the processor 110 communicates with a Bluetooth module in the wireless communications module 160 through the UART interface, to implement a Bluetooth function.
  • the audio module 170 may transmit an audio signal to the wireless communications module 160 through the UART interface, to implement a function of playing music by using a Bluetooth headset.
  • the MIPI interface may be configured to connect the processor 110 to a peripheral component such as the display 194 and the camera 193.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and the like.
  • the processor 110 communicates with the camera 193 through the CSI interface, to implement a photographing function of the electronic device 100.
  • the processor 110 communicates with the display 194 through the DSI interface, to implement a display function of the electronic device 100.
  • the GPIO interface may be configured by software.
  • the GPIO interface may be configured as a control signal, or may be configured as a data signal.
  • the GPIO interface may be configured to connect the processor 110 to the camera 193, the display 194, the wireless communications module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface may also be configured as the I2C interface, the I2S interface, the UART interface, the MIPI interface, or the like.
  • the USB interface 130 is an interface that conforms to a USB standard specification, and may be specifically a mini USB interface, a micro USB interface, a USB type-C interface, or the like.
  • the USB interface 130 may be configured to connect to the charger to charge the electronic device 100, or may be configured to transmit data between the electronic device 100 and a peripheral device, or may be configured to connect to a headset and play audio by using the headset.
  • the interface may be further configured to connect to another electronic device such as an AR device.
  • an interface connection relationship between the modules that is shown in this application is merely an example for description, and does not constitute a limitation on a structure of the electronic device 100.
  • the electronic device 100 may alternatively use an interface connection manner different from that in the foregoing embodiment, or a combination of a plurality of interface connection manners.
  • the charging management module 140 is configured to receive a charging input from the charger.
  • the charger may be a wireless charger, or may be a wired charger.
  • the charging management module 140 may receive a charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive a wireless charging input by using a wireless charging coil of the electronic device 100.
  • the charging management module 140 may further supply power to the electronic device through the power management module 141 while charging the battery 142.
  • the power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110.
  • the power management module 141 receives an input of the battery 142 and/or an input of the charging management module 140, and supplies power to the processor 110, the internal memory 121, an external memory, the display 194, the camera 193, and the wireless communications module 160.
  • the power management module 141 may further monitor parameters such as a battery capacity, a battery cycle count, and a battery health status (electric leakage or impedance).
  • the power management module 141 may alternatively be disposed in the processor 110.
  • the power management module 141 and the charging management module 140 may alternatively be disposed in a same device.
  • a wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communications module 150, the wireless communications module 160, the modem processor, the baseband processor, and the like.
  • the antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be configured to cover one or more communications frequency bands. Different antennas may be further multiplexed to improve antenna utilization.
  • the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, the antenna may be used in combination with a tuning switch.
  • the mobile communications module 150 may provide a solution, applied to the electronic device 100, to wireless communication including 2G/3G/4G/5G and the like.
  • the mobile communications module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (low noise amplifier, LNA), and the like.
  • the mobile communications module 150 may receive an electromagnetic wave through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and send a processed electromagnetic wave to the modem processor for demodulation.
  • the mobile communications module 150 may further amplify a signal modulated by the modem processor, and convert the signal into an electromagnetic wave for radiation through the antenna 1.
  • at least some function modules in the mobile communications module 150 may be disposed in the processor 110.
  • at least some function modules in the mobile communications module 150 may be disposed in a same device as at least some modules in the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is configured to modulate a to-be-sent low-frequency baseband signal into a medium- or high-frequency signal.
  • the demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. Then, the demodulator transmits the low-frequency baseband signal obtained through demodulation to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is transmitted to the application processor.
  • the application processor outputs a sound signal through an audio device (which is not limited to the speaker 170A, the receiver 170B, or the like), or displays an image or a video through the display 194.
  • the modem processor may be an independent component. In some other embodiments, the modem processor may be independent of the processor 110, and is disposed in a same device as the mobile communications module 150 or another function module.
  • the wireless communications module 160 may provide a solution, applied to the electronic device 100, to wireless communication including a wireless local area network (wireless local area network, WLAN) (such as a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), an infrared (infrared, IR) technology, and the like.
  • the wireless communications module 160 may be one or more components integrating at least one communications processor module.
  • the wireless communications module 160 receives an electromagnetic wave through the antenna 2, performs frequency modulation and filtering on an electromagnetic wave signal, and sends a processed signal to the processor 110.
  • the wireless communications module 160 may further receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into an electromagnetic wave for radiation through the antenna 2.
  • the antenna 1 is coupled to the mobile communications module 150, and the antenna 2 is coupled to the wireless communications module 160, so that the electronic device 100 can communicate with a network and other devices by using a wireless communications technology.
  • the wireless communications technology may include a global system for mobile communications (global system for mobile communications, GSM), a general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-CDMA), long term evolution (long term evolution, LTE), BT, a GNSS, a WLAN, NFC, FM, an IR technology, and/or the like.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation system, SBAS).
  • the solution to wireless communication provided by the mobile communications module 150 may enable the electronic device to communicate with a device (for example, a cloud server) in the network, and the solution to WLAN wireless communication provided by the wireless communications module 160 may also enable the electronic device to communicate with a device (for example, a cloud server) in the network. In this way, the electronic device may perform data transmission with the cloud server.
  • the electronic device 100 may implement a display function through the display 194, the application processor, and the like.
  • the display 194 is configured to display a control, information, a video, an image, and the like.
  • the display 194 may display a camera control, and the camera control is configured to receive an instruction of a user to enable a photographing function of the camera 193.
  • the display 194 includes a display panel.
  • the display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light emitting diode (quantum dot light emitting diode, QLED), or the like.
  • the electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
  • the ISP is configured to process data fed back by the camera 193. For example, during photographing, a shutter is pressed, light is transmitted to a photosensitive element of the camera through a lens, and an optical signal is converted into an electrical signal.
  • the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert the electrical signal into a visible image.
  • the ISP may further perform algorithm optimization on noise, brightness, and complexion of the image.
  • the ISP may further optimize parameters such as exposure and a color temperature of a photographing scenario.
  • the ISP may be disposed in the camera 193.
  • the camera 193 is configured to capture a still image or a video.
  • An object generates an optical image through the lens and projects the optical image to the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor.
  • the photosensitive element converts an optical signal into an electrical signal, and then transmits the electrical signal to the ISP, so that the ISP converts the electrical signal into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 100 may include at least two camera entities.
  • the digital signal processor is configured to process a digital signal. In addition to a digital image signal, the digital signal processor may also process another digital signal. For example, when the electronic device 100 selects a frequency, the digital signal processor is configured to perform Fourier transform on frequency energy.
  • the video codec is configured to compress or decompress a digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a plurality of encoding formats, for example, moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • the NPU is a neural-network (neural-network, NN) computing processor. It quickly processes input information by referring to a structure of a biological neural network, for example, by referring to a transfer mode between human brain neurons, and may further continuously perform self-learning.
  • Applications such as intelligent cognition of the electronic device 100 may be implemented by using the NPU, for example, image recognition, facial recognition, speech recognition, and text understanding.
  • the external memory interface 120 may be configured to connect to an external storage card, such as a micro SD card, to extend a storage capability of the electronic device 100.
  • the external storage card communicates with the processor 110 through the external memory interface 120, to implement a data storage function. For example, data such as music, a photo, and a video is stored in the external storage card.
  • the internal memory 121 may be configured to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 may run the foregoing instructions stored in the internal memory 121, so that the electronic device 100 performs a video shooting method, various functional applications, data processing, and the like provided in some embodiments of this application.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area may store an operating system.
  • the program storage area may further store one or more applications (such as Gallery or Contacts).
  • the data storage area may store data created in a use process of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one disk storage device, a flash memory, or a universal flash storage (universal flash storage, UFS).
  • the electronic device 100 may implement an audio function such as music play or recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, an application processor, and the like.
  • the audio module 170 is configured to convert digital audio information into an analog audio signal output, and is also configured to convert an analog audio input into a digital audio signal.
  • the audio module 170 may be further configured to encode and decode an audio signal.
  • the audio module 170 may be disposed in the processor 110, or some function modules of the audio module 170 may be disposed in the processor 110.
  • the speaker 170A, also referred to as a "horn", is configured to convert an audio electrical signal into a sound signal.
  • the electronic device 100 may be used to listen to music or answer a call in a hands-free mode through the speaker 170A.
  • the receiver 170B, also referred to as an "earpiece", is configured to convert an audio electrical signal into a sound signal.
  • the receiver 170B may be put close to a human ear to listen to a voice.
  • the microphone 170C, also referred to as a "mike" or a "microphone", is configured to convert a sound signal into an electrical signal.
  • the user may make a sound near the microphone 170C through the mouth of the user, to input a sound signal to the microphone 170C.
  • At least one microphone 170C may be disposed in the electronic device 100.
  • two microphones 170C may be disposed in the electronic device 100, to collect a sound signal and implement a noise reduction function.
  • three, four, or more microphones 170C may alternatively be disposed in the electronic device 100, to collect a sound signal, implement noise reduction, and identify a sound source, so as to implement a directional recording function and the like.
  • the headset jack 170D is configured to connect to a wired headset.
  • the headset jack 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
  • the pressure sensor 180A is configured to sense a pressure signal, and may convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be disposed on the display 194.
  • the capacitive pressure sensor may include at least two parallel plates made of conductive materials. When a force is applied to the pressure sensor 180A, capacitance between electrodes changes.
  • the electronic device 100 determines pressure intensity based on the change in the capacitance.
  • when a touch operation is performed on the display 194, the electronic device 100 detects intensity of the touch operation through the pressure sensor 180A.
  • the electronic device 100 may also calculate a touch location based on a detection signal of the pressure sensor 180A.
  • touch operations that are performed at a same touch location but have different touch operation intensity may correspond to different operation instructions. For example, when a touch operation whose touch operation intensity is less than a first pressure threshold is performed on a Messages application icon, an instruction for viewing an SMS message is performed. When a touch operation whose touch operation intensity is greater than or equal to the first pressure threshold is performed on the Messages application icon, an instruction for creating a new SMS message is performed.
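  • For illustration only, the threshold-based dispatch described above may be sketched as follows; the data structure, threshold value, and function names are assumptions for the example and do not limit this application:

    // Illustrative sketch: dispatching different operation instructions for touch
    // operations at the same location based on touch operation intensity.
    data class TouchEvent(val x: Int, val y: Int, val pressure: Float)

    const val FIRST_PRESSURE_THRESHOLD = 0.5f // assumed normalized pressure value

    fun handleMessagesIconTouch(event: TouchEvent) {
        if (event.pressure < FIRST_PRESSURE_THRESHOLD) {
            viewSmsMessage()       // light press: view an SMS message
        } else {
            createNewSmsMessage()  // firm press: create a new SMS message
        }
    }

    fun viewSmsMessage() = println("instruction: view an SMS message")
    fun createNewSmsMessage() = println("instruction: create a new SMS message")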
  • the gyro sensor 180B may be configured to determine a moving posture of the electronic device 100. In some embodiments, an angular velocity of the electronic device 100 around three axes (namely, axes x, y, and z) may be determined through the gyro sensor 180B.
  • the gyro sensor 180B may be configured to implement image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects an angle at which the electronic device 100 jitters, and calculates, based on the angle, a distance for which a lens module needs to be compensated, and allows the lens to cancel the jitter of the electronic device 100 through reverse motion, to implement image stabilization.
  • the gyro sensor 180B may also be used in a navigation scenario and a somatic game scenario.
  • the barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the electronic device 100 calculates an altitude by using the barometric pressure measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may detect opening and closing of a flip cover through the magnetic sensor 180D.
  • the electronic device 100 may detect opening and closing of a flip cover based on the magnetic sensor 180D.
  • a feature such as automatic unlocking upon opening of the flip cover is set based on a detected opening or closing state of the flip cover.
  • the acceleration sensor 180E may detect magnitude of acceleration in various directions (generally on three axes) of the electronic device 100, and may detect magnitude and a direction of gravity when the electronic device 100 is still.
  • the acceleration sensor 180E may be further configured to identify a posture of the electronic device, and is applied to an application such as switching between landscape mode and portrait mode or a pedometer.
  • the distance sensor 180F is configured to measure a distance.
  • the electronic device 100 may measure the distance in an infrared or a laser manner. In some embodiments, in a photographing scenario, the electronic device 100 may measure a distance through the distance sensor 180F to implement quick focusing.
  • the optical proximity sensor 180G may include, for example, a light emitting diode (LED) and an optical detector, for example, a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light by using the light emitting diode.
  • the electronic device 100 detects infrared reflected light from a nearby object by using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100.
  • the electronic device 100 may detect, through the optical proximity sensor 180G, that the user holds the electronic device 100 close to an ear to make a call, to automatically perform screen-off for power saving.
  • the optical proximity sensor 180G may also be used in a smart cover mode or a pocket mode to automatically perform screen unlocking or locking.
  • the ambient light sensor 180L is configured to sense ambient light brightness.
  • the electronic device 100 may adaptively adjust brightness of the display 194 based on the sensed ambient light brightness.
  • the ambient light sensor 180L may also be configured to automatically adjust white balance during photographing.
  • the ambient light sensor 180L may further cooperate with the optical proximity sensor 180G to detect whether the electronic device 100 is in a pocket, to avoid an accidental touch.
  • the fingerprint sensor 180H is configured to collect a fingerprint.
  • the electronic device 100 may use a feature of the collected fingerprint to implement fingerprint-based unlocking, application lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
  • the temperature sensor 180J is configured to detect a temperature.
  • the electronic device 100 executes a temperature processing policy by using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 lowers performance of a processor located near the temperature sensor 180J, to reduce power consumption for thermal protection.
  • in some other embodiments, when the detected temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from being shut down abnormally because of a low temperature.
  • in some other embodiments, when the detected temperature is lower than still another threshold, the electronic device 100 boosts an output voltage of the battery 142 to avoid abnormal shutdown caused by a low temperature.
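  • For illustration only, the foregoing temperature processing policy may be sketched as follows; the threshold values and function names are assumptions for the example and do not limit this application:

    // Illustrative sketch of the temperature processing policy described above.
    // Thresholds and actions are assumed values, not taken from this application.
    const val HIGH_TEMP_THRESHOLD = 45.0      // degrees Celsius, assumed
    const val LOW_TEMP_THRESHOLD = 0.0        // assumed
    const val VERY_LOW_TEMP_THRESHOLD = -10.0 // assumed

    fun applyTemperaturePolicy(reportedTemp: Double) {
        when {
            reportedTemp > HIGH_TEMP_THRESHOLD -> lowerNearbyProcessorPerformance()
            reportedTemp < VERY_LOW_TEMP_THRESHOLD -> boostBatteryOutputVoltage()
            reportedTemp < LOW_TEMP_THRESHOLD -> heatBattery()
        }
    }

    fun lowerNearbyProcessorPerformance() { /* reduce power consumption for thermal protection */ }
    fun heatBattery() { /* prevent abnormal shutdown caused by a low temperature */ }
    fun boostBatteryOutputVoltage() { /* keep the device powered at a very low temperature */ }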
  • the touch sensor 180K may also be referred to as a touch panel or a touch-sensitive surface.
  • the touch sensor 180K may be disposed on the display 194, and the touch sensor 180K and the display 194 form a touchscreen, which is also referred to as a "touchscreen".
  • the touch sensor 180K is configured to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor may transfer the detected touch operation to the application processor to determine a touch event type.
  • a visual output related to the touch operation may be provided by the display 194.
  • the touch sensor 180K may alternatively be disposed on a surface of the electronic device 100 at a location different from that of the display 194.
  • the bone conduction sensor 180M may obtain a vibration signal. In some embodiments, the bone conduction sensor 180M may obtain a vibration signal of a vibration bone of a human vocal-cord part. The bone conduction sensor 180M may also contact a human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may alternatively be disposed in the headset, to obtain a bone conduction headset.
  • the audio module 170 may obtain a speech signal through parsing based on the vibration signal that is of the vibration bone of the human vocal-cord part and that is obtained by the bone conduction sensor 180M, to implement a speech function.
  • the application processor may parse heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, to implement a heart rate detection function.
  • the button 190 includes a power button, a volume button, and the like.
  • the button 190 may be a mechanical button, or may be a touch button.
  • the electronic device 100 may receive a key input, and generate a key signal input related to a user setting and function control of the electronic device 100.
  • the motor 191 may generate a vibration prompt.
  • the motor 191 may be configured to provide an incoming call vibration prompt or a touch vibration feedback.
  • touch operations performed on different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations on different areas of the display 194.
  • different application scenarios (for example, a time reminder, information receiving, an alarm clock, and a game) may also correspond to different vibration feedback effects.
  • a touch vibration feedback effect may be further customized.
  • the indicator 192 may be an indicator light, and may be configured to indicate a charging status and a power change, or may be configured to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is configured to connect to a SIM card.
  • the SIM card may be inserted into the SIM card interface 195 or detached from the SIM card interface 195, to implement contact with or separation from the electronic device 100.
  • the electronic device 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 may support a nano-SIM card, a micro-SIM card, a SIM card, and the like.
  • a plurality of cards may be inserted into a same SIM card interface 195 at the same time.
  • the plurality of cards may be of a same type or different types.
  • the SIM card interface 195 may be compatible with different types of SIM cards.
  • the SIM card interface 195 may be further compatible with an external storage card.
  • the electronic device 100 interacts with a network by using the SIM card, to implement functions such as conversation and data communication.
  • the electronic device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card may be embedded into the electronic device 100, and cannot be separated from the electronic device 100.
  • the electronic device 100 shown in FIG. 2A may implement, by using the camera 193, a video shooting function described in the following embodiments.
  • the electronic device 100 may display, by using the display 194, various shooting areas described in the following embodiments, and the like.
  • a software system of the electronic device 100 may use a layered architecture, an event-driven architecture, a microkernel architecture, a micro service architecture, or a cloud architecture.
  • an Android system of a layered architecture is used as an example to describe a software structure of the electronic device 100.
  • FIG. 2B is a block diagram of the software structure of the electronic device 100 according to this application.
  • in the layered architecture, software is divided into several layers, and each layer has a clear role and task.
  • the layers communicate with each other through a software interface.
  • the Android system is divided into four layers: an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
  • the application layer may include a series of application packages.
  • the application package may include applications such as Camera, Gallery, Calls, Navigation, Bluetooth, Music, Videos, and Messages.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for an application at the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
  • the window manager is configured to manage a window program.
  • the window manager may obtain a size of a display, obtain parameters of various display areas on a display interface, and the like.
  • the content provider is configured to: store and obtain data, and enable the data to be accessed by an application.
  • the data may include a video, an image, audio, calls that are made and received, a browsing history and bookmarks, an address book, and the like.
  • the view system includes visual controls such as a control for displaying text, a control for displaying an image, and the like.
  • the view system may be configured to construct an application.
  • a display interface may include one or more views.
  • the display interface includes a Camera icon.
  • the phone manager is configured to provide a communication function for the electronic device 100, for example, management of a call status (including answering and declining).
  • the resource manager provides an application with various resources such as a localized character string, an icon, an image, a layout file, and a video file.
  • the notification manager enables an application to display notification information in a status bar, and may be configured to convey a notification message.
  • the notification message may automatically disappear after a short pause without requiring a user interaction.
  • the notification manager is configured to notify download completion, and give a message notification.
  • the notification may alternatively appear in a top status bar of the system in a form of a graph or a scroll bar text, for example, a notification of an application running in the background, or may appear on the screen in a form of a dialog window.
  • for example, text information is displayed in the status bar, an alert sound is played, the electronic device vibrates, or an indicator light blinks.
  • the Android runtime includes a core library and a virtual machine.
  • the Android runtime is responsible for scheduling and management of the Android system.
  • the core library includes two parts: a function that needs to be invoked in Java language, and a core library of Android.
  • the application layer and the application framework layer run on the virtual machine.
  • the virtual machine executes Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is configured to implement functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library may include a plurality of function modules, for example, a surface manager (surface manager), a media library (media library), a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the surface manager is configured to manage a display subsystem and provide fusion of 2D and 3D layers for a plurality of applications.
  • the media library supports play and recording in a plurality of commonly used audio and video formats, and static image files.
  • the media library may support a plurality of audio and video coding formats such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
  • the three-dimensional graphics processing library is configured to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is a layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • the software system shown in FIG. 2B relates to a program (such as a core library) for invoking a camera, an application module (such as a window manager) for displaying a video picture, an application framework, a display driver, and the like.
  • the following describes an example of a user operation process in this application with reference to a GUI of the electronic device 100 from a perspective of human-computer interaction.
  • the GUI may be displayed on the display 194 shown in FIG. 2A .
  • a smartphone is used as an example.
  • the smartphone is provided with two cameras, and both of the two cameras are rear-facing cameras of the smartphone. FOVs of the two cameras are different.
  • a Camera application (application, APP) is installed in the smartphone.
  • a main interface GUI of the smartphone displays icon elements such as Camera 31, Calculator 32, Music 33, Clock 34, Contacts 35, Messages 36, Settings 37, and Browser 38, and interface elements such as Navigation Bar and Date and Time.
  • Camera 31 is an interface icon corresponding to the camera APP, and a user taps the camera 31 to trigger running of the camera APP.
  • the smartphone can receive an instruction for running the camera APP corresponding to the tap operation. Further, the smartphone runs the camera APP.
  • the GUI is updated to a shooting interface shown in FIG. 3A-2 .
  • a GUI corresponding to the shooting interface includes interface elements such as a viewfinder 310, Slow Motion 311, Video 312, Photo 313, Panorama 314, and a shooting trigger button 315.
  • the viewfinder 310 is used to display an obtained shot object to the user.
  • the viewfinder 310 presents, for example, an FOV of a camera that is used to shoot a slow motion video.
  • Slow Motion 311, Video 312, Photo 313, and Panorama 314 correspond to shooting modes supported by the smartphone.
  • the user triggers different shooting modes, and the smartphone receives an instruction for performing shooting according to a corresponding mode. After performing shooting, the smartphone can present videos or images with different imaging effects.
  • the smartphone can shoot a slow motion video of a shot object in the shooting mode corresponding to Slow Motion 311.
  • the smartphone can shoot a video of a shot object in the shooting mode corresponding to Video 312.
  • a moving speed of the shot object in the video is the same as an actual moving speed of the shot object.
  • the smartphone can shoot an image of a shot object in the shooting mode corresponding to Photo 313.
  • the smartphone can shoot a panoramic image of a shot environment in the shooting mode corresponding to Panorama 314.
  • the shooting trigger button 315 is used to trigger the smartphone to start a shooting operation.
  • the user may tap Slow Motion 311 in the GUI shown in FIG. 3A-2 , to trigger the smartphone to perform shooting in the shooting mode of shooting the slow motion video.
  • in some scenarios, the target moving object moves from one point to another point, for example, along a moving track of a traveling train or a moving track of a fired bullet.
  • in some other scenarios, the target moving object spreads from a central point in an explosive manner, for example, along moving tracks of fireworks during blooming.
  • the smartphone may set an entrance for shooting scenarios of different nature, to trigger the smartphone to perform different scheduling on the two cameras of the smartphone in different scenarios.
  • the smartphone may display an option of selecting a scenario on the display.
  • the GUI is updated to an interface shown in FIG. 3A-3 .
  • a pull-up menu is displayed near Slow Motion 311 in the GUI, and the pull-up menu includes an option 3111 and an option 3112.
  • Content of the option 3111 is, for example, "scenario 1" and a prompt text of the scenario 1 "shooting a train, a football, a bird, and the like".
  • Content of the option 3112 is, for example, "scenario 2" and a prompt text of the scenario 2 "shooting fireworks".
  • the user selects "option 3111".
  • the smartphone hides the pull-up menu.
  • the GUI is updated to an interface shown in FIG. 3A-4 .
  • the GUI shown in FIG. 3A-3 is merely an example display interface, and this application is not limited thereto.
  • the option 3111 and the option 3112 may alternatively be displayed in a drop-down menu.
  • the GUI shown in FIG. 3A-4 includes a viewfinder 310, a shooting trigger button 315, and an option 3113.
  • Content of the option 3113 is, for example, multi-screen display.
  • Functions of the viewfinder 310 and the shooting trigger button 315 are described in the scenario shown in FIG. 3A-2 . Details are not described herein again.
  • the user can trigger the option 3113 as required. After learning that the user selects the option 3113, the smartphone may control a color of "multi-screen display" to change to a color different from that displayed before the option 3113 is selected.
  • the smartphone shoots a slow motion video in a shooting mode that matches the scenario 1 in FIG. 3A-3 .
  • if the user does not perform the selection operation in FIG. 3A-4, the smartphone may receive a preview instruction or a play instruction that is tapped by the user, and then play a shot slow motion video on the display.
  • if the user has performed the selection operation in FIG. 3A-4, the smartphone may receive a preview instruction or a play instruction that is tapped by the user, and then play two slow motion videos on the display.
  • the two slow motion videos are slow motion videos of a same target moving object shot by the user. Shooting frame rates of the two slow motion videos are different, and therefore presented moving speeds of the moving object are different. For example, a presentation effect of the two slow motion videos is shown in FIG. 3B-1 or FIG. 3B-2 .
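  • As an illustrative calculation (the figures below are assumptions, not values taken from this application): if the two videos are shot at 30 fps and 480 fps respectively and both are played back at 30 fps, the first video presents the actual moving speed, whereas the second presents the motion slowed by a factor of 480/30 = 16, which is why the presented moving speeds of the same moving object differ.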
  • a GUI shown in FIG. 3B-1 includes a play interface 301 of a video 1 and a play interface 302 of a video 2.
  • the play interface 301 and the play interface 302 each occupy half of a screen of the smartphone.
  • a GUI shown in FIG. 3B-2 includes a play interface 301 of a video 1 and a play interface 302 of a video 2.
  • the play interface 301 occupies a full screen of the smartphone.
  • the play interface 302 is embedded in the play interface 301, and is in front of the play interface 301.
  • the user may perform an operation on any video shown in FIG. 3B-1 or any video shown in FIG. 3B-2 .
  • the user may trigger full-screen play of any video through a double-tap operation.
  • the user may further perform an operation such as saving or deletion on any video.
  • the embodiment shown in FIG. 3B-1 is used as an example.
  • the smartphone receives a double-tap operation instruction entered by the user. Further, the smartphone may detect that a video corresponding to the double-tap operation instruction is the video 1. Then, the smartphone hides the play interface of the video 2 to control full-screen play of the video 1.
  • the GUI of the smartphone is updated to an interface shown in FIG. 3B-3 .
  • the embodiment shown in FIG. 3B-1 is used as an example.
  • the smartphone receives a touch-and-hold operation instruction entered by the user.
  • the smartphone displays an operation menu in a play area of the video 1 in response to the touch-and-hold operation instruction.
  • the GUI of the smartphone is updated to an interface shown in FIG. 3B-4 .
  • the GUI shown in FIG. 3B-4 includes a play interface 301 of a video 1, a play interface 302 of a video 2, and an operation menu 303.
  • the operation menu 303 is displayed in the play interface 301 of the video 1.
  • the operation menu 303 includes an option 3031, an option 3032, and an option 3033.
  • content of the option 3031 is, for example, "Save".
  • content of the option 3032 is, for example, "Delete".
  • content of the option 3033 is, for example, "Forward".
  • the smartphone receives a saving instruction, and the saving instruction corresponds to the video 1. Further, the smartphone saves the video 1.
  • the smartphone can execute a corresponding operation in response to a corresponding operation instruction. Details are not described herein.
  • FIG. 3A-1 to FIG. 3B-4 are merely example descriptions, and embodiments of this application are not limited thereto.
  • the electronic device 100 may receive a shooting instruction by using a video APP.
  • display content and a display effect that the user views on the interface may vary with a system run on the smartphone, a device brand, a device model, and the like. Details are not described herein.
  • the user does not need to manually set a shooting frame rate and a shooting area to achieve a relatively good shooting effect, so that user experience is better.
  • the following describes an example of a video shooting method in this application from a perspective of the electronic device 100.
  • the video shooting method in this application is applied to the electronic device 100.
  • the electronic device 100 includes at least two cameras. Each of the at least two cameras is the camera 193 shown in FIG. 2A .
  • the following uses two cameras as examples for description.
  • a relationship between FOV areas of the two cameras may include the following several cases:
  • the two cameras are disposed at a same location of the electronic device 100.
  • both of the two cameras are disposed as front-facing cameras or rear-facing cameras.
  • the shooting directions of the two cameras are the same, but the FOVs of the two cameras are different.
  • an FOV area of a camera with a smaller FOV in the two cameras is completely within a range of an FOV area of a camera with a larger FOV.
  • an FOV area 41 is an FOV area range of one of the two cameras
  • an FOV area 42 is an FOV area range of the other of the two cameras
  • the FOV area 42 is completely within a range of the FOV area 41.
  • the two cameras are disposed at a same location of the electronic device 100, but the shooting directions of the two cameras are different. For example, one of the two cameras faces 60 degrees to the left of the electronic device 100, and the other camera faces 60 degrees to the right of the electronic device 100. In embodiments, regardless of whether the FOVs of the two cameras are the same, FOV areas of the two cameras partially overlap. As shown in FIG. 4B , an FOV area 41 is an FOV area range of one of the two cameras, an FOV area 42 is an FOV area range of the other of the two cameras, and a shadow part 43 is a part in which the FOV area 41 overlaps the FOV area 42.
  • the two cameras are disposed at different locations of the electronic device 100.
  • one of the two cameras is disposed as a front-facing camera, and the other camera is disposed as a rear-facing camera.
  • FOV areas of the two cameras do not overlap.
  • an FOV area 41 is an FOV area range of one of the two cameras
  • an FOV area 42 is an FOV area range of the other of the two cameras
  • the FOV area 41 does not overlap the FOV area 42.
  • FIG. 5 shows a video shooting method 10.
  • the video shooting method 10 (referred to as the method 10 for short hereinafter) includes the following steps.
  • Step S11: Receive a shooting instruction entered by a user.
  • the shooting instruction is used to instruct an electronic device to shoot a slow motion video.
  • the shooting instruction may be a first shooting instruction or a second shooting instruction.
  • the electronic device 100 performs different scheduling on two cameras in different shooting scenarios. Based on this, when the user triggers the option 3111 (that is, the scenario 1 (shooting a train, a football, a bird, and the like)) shown in FIG. 3A-3 , the electronic device receives the first shooting instruction. When the user triggers the option 3112 (that is, the scenario 2 (shooting fireworks and the like)) shown in FIG. 3A-3 , the electronic device receives the second shooting instruction.
  • the shooting instruction in embodiments may be preset, and may be prestored in the internal memory 121 shown in FIG. 2A .
  • Step S12: Invoke a first camera to perform preview.
  • the shooting instruction is the first shooting instruction (that is, an instruction for shooting a train, a football, a bird, and the like).
  • the processor 110 may invoke a camera corresponding to the FOV area 41 as the first camera.
  • the processor 110 may invoke any one of the two cameras as the first camera.
  • the processor 110 may invoke a camera with a largest FOV in the more than two cameras as the first camera.
  • an FOV of the first camera is relatively large.
  • a preview frame rate of the first camera may be a first frame rate, and the first frame rate is, for example, 30 fps.
  • the shooting instruction is the second shooting instruction (that is, an instruction for shooting fireworks and the like).
  • the processor 110 may invoke a camera corresponding to the FOV area 42 in FIG. 4A as the first camera.
  • the processor 110 may invoke a camera with a smallest FOV in the more than two cameras as the first camera.
  • a preview frame rate of the first camera may be a second frame rate.
  • the second frame rate may be a highest frame rate that can be supported by the first camera, and the second frame rate is, for example, 480 fps.
  • the processor 110 may not enable a recording function of the first camera when invoking the first camera to perform preview.
  • a preview picture of the first camera may not be displayed on the display 194.
  • Step S13: If it is detected that a target moving object moves in the FOV of the first camera, determine a target shooting frame rate based on a moving speed of the target moving object.
  • in some embodiments, if the first camera detects only one moving object, the moving object is determined as the target moving object. In some other embodiments, if the first camera detects at least two moving objects at the same time, a moving object with a faster moving speed is determined as the target moving object. In some other embodiments, if the first camera detects at least two moving objects, and the at least two moving objects do not enter the FOV of the first camera at the same time, a moving object that first enters the FOV of the first camera is determined as the target moving object.
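  • For illustration only, the foregoing selection rules may be sketched as follows; the data class, field names, and units are assumptions for the example and do not limit this application:

    // Illustrative sketch of selecting the target moving object among the moving
    // objects detected by the first camera, following the rules described above.
    data class MovingObject(val id: Int, val speedPixelsPerSecond: Double, val enterTimeMs: Long)

    fun selectTargetMovingObject(detected: List<MovingObject>): MovingObject? {
        if (detected.isEmpty()) return null
        if (detected.size == 1) return detected.first()          // only one moving object
        val earliest = detected.minOf { it.enterTimeMs }
        val firstEntrants = detected.filter { it.enterTimeMs == earliest }
        // Objects that entered the FOV at the same time: the faster one wins;
        // otherwise the object that entered the FOV first is the target.
        return firstEntrants.maxByOrNull { it.speedPixelsPerSecond }
    }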
  • the target shooting frame rate is a shooting frame rate used when the electronic device 100 shoots a slow motion video.
  • the target shooting frame rate may be a best frame rate that can be supported by the electronic device 100 and that is obtained through calculation, for example, 900 fps.
  • the target shooting frame rate is a shooting frame rate supported by a to-be-invoked second camera. The to-be-invoked second camera may be determined based on the best frame rate.
  • the processor 110 may determine the best frame rate based on an FOV of a camera with a smallest FOV in all cameras of the electronic device 100.
  • the FOV of the camera with a smallest FOV is referred to as a smallest FOV.
  • an FOV area 61 shown in FIG. 6A is an FOV area of the first camera
  • an FOV area 62 is a smallest FOV area.
  • the processor 110 may calculate the best frame rate.
  • the moving speed of the target moving object may be L1/T1.
  • T1 is first duration.
  • L1 is a quantity of pixels by which the target moving object moves in the FOV area 61 within the first duration
  • L2 is a total quantity of pixels of the area 62 in a moving direction of the target moving object
  • T2 is total duration for which the target moving object passes through the area 62 at a shooting frame rate during preview
  • k is an integer greater than 1.
  • An FOV of the to-be-invoked second camera is not less than the smallest FOV.
  • a shooting frame rate of the to-be-invoked second camera needs to be at least greater than or equal to 1/T1.
  • the target shooting frame rate f may be set to a value that is far greater than 1/T2 by using k, where k is, for example, 300.
  • the first duration may be preset duration.
  • the preset duration may be determined based on the FOV of the first camera. For example, a larger FOV of the first camera may indicate a larger maximum value of the preset duration, and a smaller FOV of the first camera may indicate a smaller maximum value of the preset duration.
  • the preset duration is, for example, 0.05 seconds.
  • the first duration may be a reciprocal of a frame rate used when the first camera performs preview. For example, if the frame rate used when the first camera performs preview is 30 fps, the first duration is 1/30 seconds. For another example, if the frame rate used when the first camera performs preview is 24 fps, the first duration is 1/24 seconds.
  • L1 is a quantity of pixels by which the target moving object moves in the FOV area 61 in one frame.
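  • For illustration only, under one possible reading of the foregoing relations (the moving speed is L1/T1, the transit time through the smallest FOV area is T2 = L2 / (L1/T1), and the best frame rate is far greater than 1/T2 by the factor k), the best frame rate may be sketched as follows; the numeric values are assumptions for the example and do not limit this application:

    // Illustrative sketch of estimating the best frame rate from preview data.
    fun estimateBestFrameRate(
        l1Pixels: Double,   // pixels the target moving object moves within the first duration T1
        t1Seconds: Double,  // first duration, for example 1.0 / 30 when previewing at 30 fps
        l2Pixels: Double,   // total pixels of the smallest FOV area along the moving direction
        k: Int = 300        // multiple by which the best frame rate exceeds 1/T2
    ): Double {
        val speed = l1Pixels / t1Seconds   // pixels per second, that is, L1/T1
        val t2Seconds = l2Pixels / speed   // time to cross the smallest FOV area, that is, T2
        return k / t2Seconds               // best frame rate f
    }

    fun main() {
        // Assumed example: the object moves 100 pixels per preview frame at 30 fps, and the
        // smallest FOV area is 1000 pixels wide along the moving direction: f = 900 fps.
        println(estimateBestFrameRate(100.0, 1.0 / 30, 1000.0))
    }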
  • scenario 1 is used as an example for description in FIG. 6A
  • the embodiment shown in FIG. 6A is also applicable to the scenario 2.
  • the parameters may be flexibly set based on the scenario 2. Details are not described herein again.
  • in some embodiments, after the target moving object moves by a preset frame in the FOV of the first camera, the processor 110 starts to calculate the target shooting frame rate.
  • the preset frame may be determined based on an actual moving speed of the target moving object, and the preset frame is, for example, five frames.
  • in some other embodiments, after the moving track and the moving speed of the target moving object become stable, the processor 110 starts to calculate the target shooting frame rate.
  • in this way, the target shooting frame rate can be calculated after the motion is stable, so as to avoid a case in which the target shooting frame rate is repeatedly calculated because one of the moving track and the moving speed of the target moving object changes, thereby saving resources.
  • the electronic device 100 may determine the target shooting frame rate and the second camera based on the best frame rate.
  • the second camera is the other camera than the first camera.
  • a camera corresponding to the FOV area 41 in FIG. 4A is determined as the first camera
  • a camera corresponding to the FOV area 42 is determined as the second camera.
  • the target shooting frame rate may be a frame rate supported by the second camera.
  • the processor 110 may select, as the target shooting frame rate, a frame rate that is in frame rates supported by the second camera and that is adjacent to the best frame rate and greater than the best frame rate.
  • the best frame rate is 650 fps
  • the shooting frame rates supported by the second camera include 120 fps, 240 fps, 480 fps, and 960 fps
  • the processor 110 may determine 960 fps as the target shooting frame rate.
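  • For illustration only, this selection rule may be sketched as follows; the function name is an assumption for the example:

    // Illustrative sketch: select, from the frame rates supported by the second camera,
    // the frame rate adjacent to and greater than the best frame rate.
    fun selectAdjacentGreaterFrameRate(bestFps: Double, supportedFps: List<Int>): Int? =
        supportedFps.filter { it >= bestFps }.minOrNull()

    // Assumed example: with a best frame rate of 650 fps and supported frame rates of
    // 120, 240, 480, and 960 fps, the selected target shooting frame rate is 960 fps.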
  • the processor 110 may calculate a difference between the best frame rate and each of frame rates supported by other cameras than the first camera, to obtain one or more differences. Then, the processor 110 may determine whether a smallest difference in the one or more differences is less than a first threshold, where the first threshold is a value obtained by multiplying a frame rate corresponding to the smallest difference (referred to as a corresponding frame rate hereinafter) by a preset percentage. If the smallest difference is less than the first threshold, the processor 110 determines the corresponding frame rate as the target shooting frame rate, and determines a camera that supports the corresponding frame rate as the second camera.
  • the preset percentage may be, for example, 25%.
  • the best frame rate is 500 fps
  • the shooting frame rates supported by other cameras than the first camera in the electronic device 100 include 120 fps, 240 fps, 480 fps, and 960 fps.
  • a difference between 480 fps and the best frame rate 500 fps is smallest, and the smallest difference is 20.
  • the first threshold is 25% of 480 fps, that is, 120, and the smallest difference 20 is less than the first threshold 120.
  • the processor 110 determines 480 fps as the target shooting frame rate, and determines a camera that supports 480 fps as the second camera.
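  • For illustration only, the difference-based selection may be sketched as follows; the function name and the default percentage are assumptions for the example:

    // Illustrative sketch: take the supported frame rate with the smallest absolute
    // difference from the best frame rate, but only if that difference is less than
    // the first threshold (the candidate frame rate multiplied by a preset percentage).
    fun selectByDifference(
        bestFps: Double,
        supportedFps: List<Int>,
        presetPercentage: Double = 0.25
    ): Int? {
        val candidate = supportedFps.minByOrNull { kotlin.math.abs(it - bestFps) } ?: return null
        val smallestDifference = kotlin.math.abs(candidate - bestFps)
        val firstThreshold = candidate * presetPercentage
        return if (smallestDifference < firstThreshold) candidate else null
    }

    // Assumed example: with a best frame rate of 500 fps and supported frame rates of
    // 120, 240, 480, and 960 fps, the smallest difference is 20 (for 480 fps), the first
    // threshold is 480 * 0.25 = 120, and 480 fps is therefore selected.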
  • the processor 110 may, for example, determine the target shooting frame rate and the second camera by using another method. Details are not described herein.
  • Step S14: Invoke the second camera to shoot a video of the target moving object at the target shooting frame rate.
  • the processor 110 may invoke the determined second camera to shoot the video of the target moving object at the target shooting frame rate.
  • the processor 110 may invoke the second camera to perform shooting at the target shooting frame rate when the target moving object enters the FOV of the second camera. In some other embodiments, the processor 110 may invoke the second camera to perform shooting at the target shooting frame rate when the target moving object enters the FOV of the second camera for a period of time, or when the target moving object enters a preset area of the FOV of the second camera. In embodiments, the processor 110 may determine, based on total duration of a slow motion video that is to be obtained, a time or an area for starting the second camera.
  • for example, the total duration of shooting the slow motion video is 15 seconds (s).
  • the shooting instruction is the second shooting instruction (that is, the instruction for shooting fireworks and the like)
  • the processor 110 may immediately start the second camera to perform shooting at the target shooting frame rate.
  • the electronic device including the at least two cameras invokes one of the at least two cameras to perform preview.
  • the electronic device calculates, based on the moving speed of the target moving object, the target shooting frame rate for shooting the slow motion video.
  • the electronic device invokes the other of the at least two cameras to shoot the slow motion video of the moving object at the target shooting frame rate. In this way, the electronic device can automatically determine the shooting frame rate.
  • in this way, not only a degree of intelligence and a degree of automation are high, but also a relatively suitable shooting frame rate can be determined based on an actual moving speed of the moving object, so that a shooting effect can be optimized, and user experience can be improved.
  • the electronic device 100 may further calculate a moment for starting the second camera.
  • the electronic device 100 needs to occupy a specific period of time from a moment for starting the second camera to a moment for performing a shooting operation by using the second camera. Based on this, in embodiments of this application, the electronic device 100 may determine the moment for starting the second camera, so that a shooting operation can be performed at a preset shooting moment by using the second camera.
  • an FOV area 61 is an FOV area of the first camera
  • an FOV area 63 is an FOV area of the second camera.
  • a target moving object 60 enters the FOV area 61, and after determining the second camera, the processor 110 may calculate a shooting parameter of the second camera.
  • the electronic device 100 may calculate, according to the following algorithm, a moment T3 for starting the second camera.
  • T3 = (L3/L1) × T1 - T0. Meanings of L1 and T1 are described in the foregoing embodiment. Details are not described herein again.
  • T0 is a moment at which the second camera is started to perform a shooting operation, and T0 is known.
  • L3 is a quantity of pixels between the target moving object and a location 30 at which the second camera starts shooting.
  • the electronic device 100 may start the second camera at the moment T3. Then, because the second camera is started within the duration from the moment T3 to the moment T0, the electronic device 100 may use the second camera to shoot the slow motion video of the target moving object at the moment T0.
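  • For illustration only, the foregoing relation T3 = (L3/L1) × T1 - T0 may be sketched as follows; interpreting T0 as the known delay between starting the second camera and the moment it can actually shoot is an assumption, and the numeric values are examples only:

    // Illustrative sketch of computing the moment T3 (relative to the current moment)
    // for starting the second camera.
    fun computeStartMoment(
        l3Pixels: Double,   // pixels between the target moving object and the shooting location
        l1Pixels: Double,   // pixels the target moving object moves within the first duration T1
        t1Seconds: Double,  // first duration
        t0Seconds: Double   // known start-up delay of the second camera (assumed reading of T0)
    ): Double = (l3Pixels / l1Pixels) * t1Seconds - t0Seconds

    // Assumed example: the object is 600 pixels from the shooting location, moves 100 pixels
    // per 1/30 second, and the camera needs 0.1 s to start, so T3 = 6 * (1/30) - 0.1 = 0.1 s.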
  • the electronic device 100 may further calculate a trigger location for starting the second camera, where the trigger location is a location to which the target moving object moves.
  • Tn is a current moment. Meanings of T1, T3, and L1 are described in the foregoing embodiment. Details are not described herein again.
  • the electronic device 100 may start the second camera when the target moving object moves to the trigger location. After the second camera is started, the electronic device 100 controls the second camera to shoot the slow motion video of the target moving object.
  • the electronic device 100 calculates, by using a centerline of the target moving object as a reference, a pixel, a time, and the like obtained after the target moving object moves by one frame. In this case, in embodiments, the electronic device also determines, by using the centerline of the target moving object as a reference, whether the target moving object arrives at the foregoing trigger location. It is assumed that, after the target moving object enters the FOV of the first camera, the electronic device 100 calculates, by using one edge of the target moving object (for example, a right edge of the target moving object in FIG. 6B) as a reference, the pixel, the time, and the like obtained after the target moving object moves by one frame.
  • in this case, the electronic device also determines, by using the corresponding edge of the target moving object (the right edge of the target moving object in FIG. 6B) as a reference, whether the target moving object arrives at the foregoing trigger location.
  • the electronic device can start the second camera at a proper time based on the moving speed of the target moving object or the like, so that the second camera starts to perform shooting at a preset shooting moment.
  • the second camera can capture the target moving object, so that the electronic device does not need to cache several videos of a moment at which the target moving object enters the trigger area, thereby saving storage space.
  • the processor 110 may further obtain a parameter such as a focal length of the first camera, and further determine a focal length of the second camera based on the parameter such as the focal length of the first camera, so as to perform focusing on the second camera before the second camera is started.
  • the processor 110 may further obtain at least one of an auto white balance parameter, an auto exposure parameter, and an automatic focus parameter of the first camera. Further, before the second camera is started, the processor 110 determines an exposure parameter of the second camera based on the obtained parameter.
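  • For illustration only, the parameter transfer described above may be sketched as follows; the Camera interface and its members are hypothetical placeholders introduced for this example and are not a real camera API:

    // Illustrative sketch of transferring white balance, exposure, and focus parameters
    // from the preview camera to the shooting camera before the latter is started.
    data class ThreeAParams(val whiteBalance: Int, val exposure: Long, val focusDistance: Float)

    interface Camera {
        fun currentThreeAParams(): ThreeAParams
        fun preApply(params: ThreeAParams)
        fun start(frameRateFps: Int)
    }

    fun prepareSecondCamera(first: Camera, second: Camera, targetFps: Int) {
        val params = first.currentThreeAParams() // obtain AWB/AE/AF parameters of the first camera
        second.preApply(params)                  // focus and set exposure before start-up
        second.start(targetFps)                  // shoot at the target shooting frame rate
    }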
  • the foregoing embodiments are all example descriptions, and the technical solutions in embodiments of this application are not limited thereto.
  • the electronic device may be implemented in one manner of the foregoing embodiments, or may be implemented in any combination manner of the foregoing embodiments. This is not limited herein.
  • an electronic device including at least two cameras may use one of the at least two cameras to preview a target moving object. Then, the electronic device determines a shooting camera, a shooting frame rate of the shooting camera, a start moment, a focal length, an exposure parameter, and the like based on a moving parameter of the target moving object during preview, an attribute parameter of a preview camera, and the like, and then starts, based on the determined data, the shooting camera in the at least two cameras to shoot a slow motion video. In this way, not only a best shooting frame rate can be determined based on a moving speed of the target moving object, to achieve a high degree of intelligence, but also storage space can be saved, to optimize a shooting effect.
  • the electronic device 100 receives an instruction for multi-screen display.
  • the electronic device 100 may further enable a recording (that is, shooting) function of the first camera. That is, the electronic device 100 may shoot the target moving object by using both the first camera and the second camera.
  • a shooting frame rate used when the first camera shoots a video may be a third shooting frame rate.
  • the third shooting frame rate is different from the target shooting frame rate.
  • the third shooting frame rate may be the same as a frame rate used when the first camera performs preview.
  • the third shooting frame rate may alternatively be different from a frame rate used when the first camera performs preview. This is not limited herein.
  • the electronic device 100 may view, on the display 194, two video pictures shown in FIG. 3B-1 or FIG. 3B-2 . Then, the electronic device 100 may further receive another instruction entered by the user, and perform an operation corresponding to the instruction, so that the display 194 presents a display effect shown in FIG. 3B-3, FIG. 3B-4 , or the like.
  • the electronic device can simultaneously perform shooting by using at least two cameras, so that videos played at different frame rates can be obtained, thereby improving viewing experience of the user.
  • this application provides, for example, a smartphone 200.
  • the smartphone 200 is provided with a wide-angle camera and a long-focus camera, and both the wide-angle camera and the long-focus camera are rear-facing cameras of the smartphone 200.
  • a shooting direction of the wide-angle camera is the same as a shooting direction of the long-focus camera.
  • An FOV of the wide-angle camera is greater than an FOV of the long-focus camera.
  • the FOV of the long-focus camera is completely within the FOV of the wide-angle camera, and a presentation effect is similar to that shown in FIG. 4A . Details are not described herein again.
  • a Camera app is installed in the smartphone 200.
  • Example 1: Shoot a slow motion video of a football shooting moment.
  • the user may trigger the option 3111 (scenario 1) in the "Slow Motion" option.
  • in the scenario 1, for a process of human-computer interaction between the user and the smartphone 200, and an interface change presented by the GUI in the process of interaction between the user and the smartphone 200, refer to the embodiments shown in FIG. 3A-1 to FIG. 3A-3. Details are not described herein again.
  • the smartphone 200 receives a first shooting instruction. Further, the smartphone 200 starts, in response to the first shooting instruction, the wide-angle camera to perform preview. In embodiments, for example, the wide-angle camera performs preview at 30 fps. In addition, the smartphone 200 does not enable a recording function of the wide-angle camera.
  • an area 71 is an FOV area of the wide-angle camera
  • an area 72 is an FOV area of the long-focus camera.
  • the smartphone 200 may, for example, determine a quantity L1 of pixels by which a football moves within duration T1.
  • T1 is, for example, 1/30 seconds.
  • L1 is a quantity of pixels by which the football moves in one frame in the area 71.
  • the smartphone 200 may, for example, determine a target shooting frame rate of the long-focus camera based on a moving speed of the football. For example, the smartphone 200 may determine a best frame rate f in the manner in the embodiment shown in FIG. 6A.
  • the smartphone 200 includes only two cameras. Therefore, a smaller FOV is the FOV of the long-focus camera.
  • the best frame rate f is, for example, 450 fps.
  • frame rates supported by the long-focus camera include 30 fps, 240 fps, 480 fps, and 960 fps.
  • the smartphone 200 determines the target shooting frame rate of the long-focus camera as 480 fps.
  • the smartphone 200 may further determine a focal length of the long-focus camera, an exposure parameter of the long-focus camera, and the like. Details are not described herein.
  • the smartphone 200 may start to determine the target shooting frame rate of the long-focus camera, the moment T3 at which the long-focus camera is started, the focal length of the long-focus camera, and the exposure parameter of the long-focus camera after the football moves by five frames.
  • the smartphone 200 starts the long-focus camera at the moment T3, so that the long-focus camera shoots a video of a football shooting moment by using the determined focal length, exposure parameter, and target shooting frame rate 480 fps.
  • the user cannot view a picture previewed by the wide-angle camera on the display of the smartphone 200, that is, a picture in the area 71 shown in FIG. 7A .
  • a picture that the user can view on the display is, for example, the slow motion video of the football shooting moment, that is, a picture in the area 72 shown in FIG. 7A .
  • the smartphone 300 includes rear-facing cameras: a wide-angle camera and a long-focus camera, where a shooting direction of the wide-angle camera of the smartphone 300 is different from a shooting direction of the long-focus camera of the smartphone 300, an FOV area of the long-focus camera partially overlaps an FOV area of the wide-angle camera, and a presentation effect is similar to that shown in FIG. 4B .
  • the smartphone 300 may invoke the wide-angle camera to perform preview and invoke the long-focus camera to perform shooting, or may invoke the long-focus camera to perform preview and invoke the wide-angle camera to perform shooting.
  • This application is not limited thereto.
  • an implementation process of shooting a slow motion video by a smartphone 300 and a process of determining a parameter such as a target shooting frame rate by the smartphone 300 are similar to those of the smartphone 200 in the example 1. Details are not described herein.
  • Example 2: Shoot a slow motion video of a fireworks blooming moment.
  • the user may trigger the option 3112 (scenario 2) in the "Slow Motion" option.
  • in the scenario 2, for a process of human-computer interaction between the user and the smartphone 200, and an interface change presented by the GUI in the process of interaction between the user and the smartphone 200, refer to the embodiments shown in FIG. 3A-1 to FIG. 3A-3. Details are not described herein again.
  • the smartphone 200 receives a second shooting instruction. Further, the smartphone 200 may start, in response to the second shooting instruction, the long-focus camera to perform preview.
  • a preview frame rate of the long-focus camera is, for example, 960 fps.
  • the user can, for example, view a viewfinder of the long-focus camera. Further, the user may, for example, align the long-focus camera of the smartphone 200 with a center location of fireworks at the fireworks blooming moment, to trigger the smartphone 200 to calculate a parameter for shooting the slow motion video. In embodiments, after the fireworks move by one frame, the smartphone 200 may start to calculate various parameters.
  • the smartphone 200 calculates a shooting frame rate of the wide-angle camera, a focal length of the wide-angle camera, and an exposure parameter of the wide-angle camera based on a quantity of pixels and duration that the fireworks move by one frame in the area 72, and after determining the foregoing parameters, immediately starts the wide-angle camera to perform shooting based on the foregoing parameters.
  • the shooting frame rate of the wide-angle camera is, for example, 240 fps.
  • although the user can view the viewfinder of the long-focus camera, the user cannot view a picture previewed by the long-focus camera, that is, a picture in the area 72.
  • a picture that the user can view on the display is, for example, the slow motion video of the fireworks blooming moment, that is, a picture in the area 71 shown in FIG. 7B .
  • this application provides, for example, a smartphone 400.
  • the smartphone 400 is provided with a front-facing camera and a rear-facing camera, and a shooting direction of the front-facing camera is completely opposite to a shooting direction of the rear-facing camera.
  • a relationship between an FOV of the front-facing camera and an FOV of the rear-facing camera is shown in FIG. 4C . Details are not described herein again.
  • a Camera app is installed in the smartphone 400.
  • Example 3: Shoot a slow motion video in which a character runs from the back to the front of the user.
  • the smartphone 400 may support shooting of only one scenario.
  • the smartphone 400 may not provide the option 3111 and the option 3112 shown in FIG. 3A-3 .
  • the user triggers the smartphone 400 to shoot the slow motion video.
  • for a process of human-computer interaction between the user and the smartphone 400, and an interface change presented by the GUI in the process of interaction between the user and the smartphone 400, refer to the embodiments shown in FIG. 3A-1 and FIG. 3A-2. Details are not described herein again.
  • an area 73 is, for example, an FOV area of the front-facing camera
  • an area 74 is, for example, an FOV area of the rear-facing camera.
  • the smartphone 400 may start the front-facing camera to perform preview.
  • the front-facing camera performs preview at 30 fps.
  • the smartphone 400 may determine a best frame rate based on a running speed of the shot character and a width of the area 74 in a running direction of the shot character, and further determine a target shooting frame rate of the rear-facing camera based on the best frame rate.
  • the target shooting frame rate of the rear-facing camera is, for example, 240 fps.
  • an implementation process of determining the target shooting frame rate of the rear-facing camera by the smartphone 400 is similar to the embodiment of determining the target shooting frame rate in the example 1. Details are not described herein again.
  • the smartphone 400 may detect a distance between the shot character and the user by using a laser.
  • the smartphone 400 starts the rear-facing camera.
  • the smartphone 400 uses the rear-facing camera to shoot, at the target shooting frame rate 240 fps, a slow motion video in which the shot character is running.
  • FIG. 7A to FIG. 7C are merely example descriptions, and the technical solutions of this application are not limited thereto.
  • the electronic device may be another device with a plurality of cameras. There may be another applicable scenario, and details are not described herein.
  • the user may enter an instruction for "multi-screen display", to trigger the electronic device to display the plurality of pictures. Details are not described herein.
  • this specification does not show all implementation scenarios applicable to this application.
  • another implementation method based on the technical idea of this application also falls within the protection scope of this application.
  • an electronic device including at least two cameras may use one of the at least two cameras to preview a target moving object. Then, the electronic device determines a shooting camera, a shooting frame rate of the shooting camera, a start moment, a focal length, an exposure parameter, and the like based on a moving parameter of the target moving object during preview, an attribute parameter of a preview camera, and the like, and then starts, based on the determined data, the shooting camera in the at least two cameras to shoot a slow motion video. In this way, not only a best shooting frame rate can be determined based on a moving speed of the target moving object, to achieve a high degree of intelligence, but also storage space can be saved, to optimize a shooting effect.
  • an electronic device 80 may include a receiving module 801, an invoking module 802, and a determining module 803.
  • the electronic device 80 may be configured to perform operations of the electronic device in FIG. 3A-1 to FIG. 3B-4 and FIG. 5 to FIG. 7C .
  • the receiving module 801 may be configured to receive a shooting instruction entered by a user.
  • the invoking module 802 may be configured to invoke a first camera to perform preview.
  • the determining module 803 may be configured to: when it is detected that a target moving object moves in an FOV of the first camera, determine a target shooting frame rate based on a moving speed of the target moving object.
  • the invoking module 802 may be further configured to invoke a second camera to shoot a video of the target moving object at the target shooting frame rate.
  • the electronic device may invoke the first camera of the electronic device to perform preview. After previewing the target moving object, the electronic device calculates, based on the moving speed of the target moving object, the target shooting frame rate for shooting the slow motion video, and then invokes the second camera to shoot the slow motion video of the target moving object based on the target shooting frame rate. In this way, the electronic device can automatically determine the shooting frame rate. In this way, not only a degree of intelligence and a degree of automation are high, but also a relatively suitable shooting frame rate can be determined based on an actual moving speed of the moving object, so that a shooting effect can be optimized, and user experience can be improved.
  • the determining module 803 may be configured to: when the shooting instruction is a first shooting instruction, determine a camera with a relatively large FOV as the first camera.
  • the determining module 803 may be further configured to: when the shooting instruction is a second shooting instruction, determine a camera with a relatively small FOV as the first camera.
  • the invoking module 802 may be further configured to: when the shooting instruction is the first shooting instruction, invoke the first camera to perform preview at a first frame rate; or when the shooting instruction is the second shooting instruction, invoke the first camera to perform preview at a second frame rate.
  • the determining module 803 may be further configured to: determine a best frame rate based on the moving speed of the target moving object; and determine, as the target shooting frame rate, a frame rate that is supported by the second camera, is adjacent to the best frame rate, and is greater than the best frame rate.
  • the determining module 803 may be further configured to: calculate a difference between the best frame rate and each frame rate supported by the second camera, to obtain one or more differences; determine whether the smallest of the one or more differences is less than a first threshold, where the first threshold is a value obtained by multiplying the frame rate corresponding to the smallest difference by a preset percentage; and when the smallest difference is less than the first threshold, determine the frame rate corresponding to the smallest difference as the target shooting frame rate (a minimal sketch of this selection logic is given after this list).
  • the determining module 803 may be further configured to determine one or more of the following parameters based on the moving speed of the target moving object: a moment of starting the second camera, a trigger location of starting the second camera, an exposure parameter of the second camera, and a focal length of the second camera, where the trigger location is a location to which the moving object moves (a sketch of estimating the start moment is also given after this list).
  • the invoking module 802 may be further configured to: when invoking the second camera to shoot the video of the target moving object at the target shooting frame rate, invoke the first camera to shoot the video of the target moving object at a third shooting frame rate, where the third shooting frame rate is different from the target shooting frame rate.
  • the electronic device 80 may further include a play module.
  • the play module may be configured to play a first video file shot by the first camera and a second video file shot by the second camera, where a third shooting frame rate corresponding to the first video file is different from the target shooting frame rate corresponding to the second video file.
  • the invoking module 802 and the determining module 803 may be implemented by a processor, and the receiving module 801 may be implemented by a transceiver.
  • the electronic device 81 may include a processor 811, a transceiver 812, and a memory 813.
  • the memory 813 may be configured to: store a program/code pre-installed when the electronic device 81 is delivered from a factory, or store code executed by the processor 811.
  • the electronic device 81 in embodiments of this application may correspond to the electronic device 100 in the foregoing embodiments.
  • the transceiver 812 is configured to receive various instructions entered by a user, and the processor 811 is configured to execute video shooting operations of the electronic devices in FIG. 3A-1 to FIG. 3B-4 and FIG. 5 to FIG. 7C . Details are not described herein again.
  • this application further provides a computer storage medium.
  • a computer storage medium disposed in any device may store a program. When the program is executed, some or all of the steps in the embodiments of the video shooting method provided in FIG. 3A-1 to FIG. 3B-4 and FIG. 5 to FIG. 7C may be implemented.
  • the storage medium in any device may be a magnetic disk, an optical disc, a read-only memory (read-only memory, ROM), a random access memory (random access memory, RAM), or the like.
  • the transceiver may be a wired transceiver.
  • the wired transceiver may be, for example, an optical interface, an electrical interface, or a combination thereof.
  • the transceiver may alternatively be, for example, various sensors.
  • the processor may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP), or a combination of a CPU and an NP.
  • the processor may further include a hardware chip.
  • the foregoing hardware chip may be an application-specific integrated circuit (application-specific integrated circuit, ASIC), a programmable logic device (programmable logic device, PLD), or a combination thereof.
  • the PLD may be a complex programmable logic device (complex programmable logic device, CPLD), a field-programmable gate array (field-programmable gate array, FPGA), a generic array logic (generic array logic, GAL), or any combination thereof.
  • the memory may include a volatile memory (volatile memory), such as a random-access memory (random-access memory, RAM).
  • the memory may also include a non-volatile memory (non-volatile memory), such as a read-only memory (read-only memory, ROM), a flash memory (flash memory), a hard disk drive (hard disk drive, HDD), or a solid-state drive (solid-state drive, SSD).
  • the memory may further include a combination of the foregoing types of memories.
  • the electronic device shown in FIG. 8B may further include a bus interface.
  • the bus interface may include any quantity of interconnected buses and bridges, which link together various circuits of the one or more processors represented by the processor and of the memory represented by the memory. The bus interface may further link together various other circuits such as a peripheral device, a voltage regulator, and a power management circuit. These are all well known in the art, and therefore details are not described in this specification.
  • the bus interface provides an interface.
  • the transceiver provides a unit for communicating with various other devices over a transmission medium.
  • the processor is responsible for managing the bus architecture and common processing, and the memory may store data used when the processor performs an operation.
  • the various illustrative logic units and circuits described in embodiments of this application may implement or operate the described functions by using a design of a general-purpose processor, a digital signal processor, an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic apparatus, a discrete gate or transistor logic, a discrete hardware component, or any combination thereof.
  • the general-purpose processor may be a microprocessor.
  • the general-purpose processor may alternatively be any conventional processor, a controller, a microcontroller, or a state machine.
  • the processor may alternatively be implemented by using a combination of computing apparatuses, such as a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors combined with one digital signal processor core, or any other similar configuration.
  • the steps of the methods or algorithms described in embodiments of this application may be directly embedded into hardware, a software unit executed by the processor, or a combination thereof.
  • the software unit may be stored in a RAM, a flash memory, a ROM, an EPROM, an EEPROM, a register, a hard disk, a removable disk, a CD-ROM, or another form of storage medium in the art.
  • the storage medium may be connected to the processor, so that the processor can read information from the storage medium and write information into the storage medium.
  • the storage medium may alternatively be integrated into the processor.
  • the processor and the storage medium may be disposed in an ASIC, and the ASIC may be disposed in the electronic device.
  • the processor and the storage medium may alternatively be disposed in different components in the electronic device.
  • sequence numbers of the foregoing processes do not imply an execution order.
  • the execution order of the processes should be determined by their functions and internal logic, and shall not constitute any limitation on the implementation processes of the embodiments.
  • All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof.
  • when software is used for implementation, all or some of the foregoing embodiments may be implemented in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • when the computer program instructions are loaded and executed on a computer, all or some of the procedures or functions described in this application are generated.
  • the computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) manner or a wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive (solid-state drive, SSD)).
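
As a minimal, non-authoritative sketch of the frame-rate selection described in the list above, the following Python snippet assumes a hypothetical set of supported frame rates, a factor k, and the helper names PreviewMotion, best_frame_rate, and pick_target_frame_rate, none of which appear in the patent text:

```python
from dataclasses import dataclass

# Hypothetical frame rates assumed to be supported by the second (shooting) camera.
SUPPORTED_RATES = (120, 240, 480, 960)


@dataclass
class PreviewMotion:
    l1_pixels: float   # pixels the object moves in the preview FOV during one preview frame
    t1_seconds: float  # duration of that preview frame (1 / preview frame rate)
    l2_pixels: float   # extent of the second camera's shooting area along the motion direction


def best_frame_rate(m: PreviewMotion, k: int = 64) -> float:
    """Best frame rate f = k / T2, where T2 = T1 * L2 / L1 is the time the object
    needs to cross the second camera's shooting area (k > 1 frames are wanted
    while the object is inside the area)."""
    t2 = m.t1_seconds * m.l2_pixels / m.l1_pixels
    return k / t2


def pick_target_frame_rate(best: float, preset_percentage: float = 0.2) -> int:
    """Pick the supported rate closest to the best rate when the smallest
    difference is below (that rate * preset percentage); otherwise fall back
    to the next supported rate above the best rate."""
    closest = min(SUPPORTED_RATES, key=lambda r: abs(r - best))
    if abs(closest - best) < closest * preset_percentage:
        return closest
    higher = [r for r in SUPPORTED_RATES if r > best]
    return min(higher) if higher else max(SUPPORTED_RATES)


if __name__ == "__main__":
    motion = PreviewMotion(l1_pixels=240, t1_seconds=1 / 30, l2_pixels=1920)
    best = best_frame_rate(motion)                 # T2 ≈ 0.27 s, best = 240.0
    print(best, pick_target_frame_rate(best))      # 240.0 240
```

With a preview running at 30 fps, an object that crosses 240 pixels per preview frame, and a 1920-pixel-wide shooting area, the crossing time is about 0.27 s and the sketch selects 240 fps.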
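Similarly, the start moment of the second camera mentioned in the list above can be estimated from the preview motion. The 50 ms startup latency and the function name below are assumptions for illustration only, not values from the patent text:

```python
def seconds_until_start(distance_to_trigger_px: float,
                        speed_px_per_s: float,
                        startup_latency_s: float = 0.05) -> float:
    """Time to wait before starting the second camera: the object needs
    distance_to_trigger_px / speed_px_per_s seconds to reach the trigger
    location, and the camera needs startup_latency_s to become ready."""
    travel_time = distance_to_trigger_px / speed_px_per_s
    return max(0.0, travel_time - startup_latency_s)


# Example: object moving at 900 px/s, 450 px away from the trigger location.
print(seconds_until_start(450, 900))  # 0.45 s until the second camera should be started
```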

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Claims (8)

  1. A video shooting method (10), applied to an electronic device (80, 81, 100, 110), wherein the electronic device (80, 81, 100, 110) is provided with a first camera and a second camera, the first camera and the second camera differ in at least one of a field of view (61, 62, 63) and a shooting direction, and the method (10) comprises:
    receiving (S11) a shooting instruction entered by a user, wherein the shooting instruction is used to instruct the electronic device to shoot a slow motion video;
    invoking (S12) the first camera to perform preview at a predetermined frame rate;
    if it is detected that a target moving object (60) moves in the field of view (61) of the first camera during preview, determining a moving speed of the target moving object (60) based on a quantity of pixels, L1, by which the target moving object moves in the field of view (61) of the first camera within a first duration, T1, of one frame, and determining (S13) a target shooting frame rate (240, 480, 500) based on the moving speed of the target moving object (60); and
    invoking (S14) the second camera to shoot a video (312) of the target moving object (60) at the target shooting frame rate (240, 480, 500);
    wherein the determining a target shooting frame rate (240, 480, 500) based on the moving speed of the target moving object (60) comprises:
    determining a best frame rate (240, 480, 500) based on the moving speed of the target moving object (60);
    calculating a difference between the best frame rate (240, 480, 500) and each of the frame rates (240, 480, 500) supported by the second camera, to obtain one or more differences;
    determining whether a smallest difference (20) in the one or more differences is less than a first threshold (120), wherein the first threshold (120) is a value obtained by multiplying a frame rate (240, 480, 500) corresponding to the smallest difference (20) by a preset percentage; and
    if the smallest difference (20) is less than the first threshold (120), determining the frame rate (240, 480, 500) corresponding to the smallest difference (20) as the target shooting frame rate (240, 480, 500); and
    wherein the determining a best frame rate (240, 480, 500) based on the moving speed of the target moving object (60) comprises:
    determining, according to T2 = L2 / (L1 / T1) = T1 · L2 / L1, a total duration T2 during which the target moving object (60) passes through a shooting area (62) of the second camera, wherein L1 / T1 is the moving speed of the target moving object (60), T1 is the first duration, L1 is the quantity of pixels by which the target moving object (60) moves in the field of view (61) of the first camera within the first duration, and L2 is a quantity of pixels of the shooting area (62) of the second camera in a moving direction of the target moving object (60); and
    determining the best frame rate (240, 480, 500) f according to f = k · (1 / T2), wherein k is an integer greater than 1.
  2. The method (10) according to claim 1, comprising:
    when the shooting instruction is a first shooting instruction, determining a camera (31, 193) with a relatively large field of view as the first camera; or
    when the shooting instruction is a second shooting instruction, determining a camera (31, 193) with a relatively small field of view as the first camera.
  3. The method (10) according to claim 1 or 2, wherein the invoking the first camera to perform preview comprises:
    when the shooting instruction is the first shooting instruction, invoking the first camera to perform preview at a first frame rate; or
    when the shooting instruction is the second shooting instruction, invoking the first camera to perform preview at a second frame rate.
  4. The method (10) according to any one of claims 1 to 3, wherein if it is detected that the target moving object (60) moves in the field of view of the first camera, the method (10) further comprises:
    determining one or more of the following parameters based on the moving speed of the target moving object (60): a moment of starting the second camera, a trigger location of starting the second camera, an exposure parameter of the second camera, and a focal length of the second camera, wherein the trigger location is a location (30) to which the moving object moves.
  5. The method (10) according to any one of claims 1 to 4, wherein, when invoking the second camera to shoot a video (312) of the target moving object (60) at the target shooting frame rate (240, 480, 500), the method (10) further comprises:
    invoking the first camera to shoot the video (312) of the target moving object (60) at a third shooting frame rate, wherein the third shooting frame rate is different from the target shooting frame rate (240, 480, 500).
  6. The method (10) according to any one of claims 1 to 4, wherein, after invoking the second camera to shoot a video (312) of the target moving object (60) at the target shooting frame rate (240, 480, 500), the method (10) further comprises:
    playing a first video file shot by the first camera and a second video file shot by the second camera.
  7. An electronic device (80, 81, 100, 110), comprising a processor (110, 811) and a memory (121, 813), wherein the processor (110, 811) is coupled to the memory (121, 813), the memory stores program code, and the processor (110, 811) is configured so that, when the processor invokes and executes the program code in the memory (121, 813), the electronic device (80, 81, 100, 110) performs the method (10) according to any one of claims 1 to 6.
  8. A computer-readable storage medium comprising a computer program that causes a computer to perform the method (10) according to any one of claims 1 to 6 when the computer program runs on the computer.
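
For illustration only (not part of the claims), the formulas in claim 1 can be instantiated with hypothetical values, for example a preview frame duration T1 = 1/30 s, L1 = 240 pixels, L2 = 1920 pixels, and k = 64:

```latex
\[
T_2 = \frac{L_2}{L_1 / T_1} = T_1\,\frac{L_2}{L_1}
    = \tfrac{1}{30}\,\mathrm{s}\times\frac{1920}{240} \approx 0.27\,\mathrm{s},
\qquad
f = k\cdot\frac{1}{T_2} = \frac{64}{0.27\,\mathrm{s}} \approx 240\ \mathrm{fps}.
\]
```
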
EP21791909.1A 2020-04-24 2021-04-19 Procédé de photographie vidéo et dispositif électronique Active EP4064689B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010331367.7A CN111526314B (zh) 2020-04-24 2020-04-24 视频拍摄方法及电子设备
PCT/CN2021/088189 WO2021213341A1 (fr) 2020-04-24 2021-04-19 Procédé de photographie vidéo et dispositif électronique

Publications (3)

Publication Number Publication Date
EP4064689A1 EP4064689A1 (fr) 2022-09-28
EP4064689A4 EP4064689A4 (fr) 2023-06-21
EP4064689B1 true EP4064689B1 (fr) 2024-06-05

Family

ID=71911192

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21791909.1A Active EP4064689B1 (fr) 2020-04-24 2021-04-19 Procédé de photographie vidéo et dispositif électronique

Country Status (4)

Country Link
US (1) US20230055623A1 (fr)
EP (1) EP4064689B1 (fr)
CN (1) CN111526314B (fr)
WO (1) WO2021213341A1 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111526314B (zh) * 2020-04-24 2022-04-05 荣耀终端有限公司 视频拍摄方法及电子设备
CN112333382B (zh) * 2020-10-14 2022-06-10 维沃移动通信(杭州)有限公司 拍摄方法、装置及电子设备
CN112399076B (zh) * 2020-10-27 2022-08-02 维沃移动通信有限公司 视频拍摄方法及装置
CN114466238B (zh) * 2020-11-09 2023-09-29 华为技术有限公司 帧解复用方法、电子设备及存储介质
CN112770056B (zh) * 2021-01-20 2022-06-24 维沃移动通信(杭州)有限公司 拍摄方法、拍摄装置和电子设备
CN113329172B (zh) * 2021-05-11 2023-04-07 维沃移动通信(杭州)有限公司 拍摄方法、装置及电子设备
CN113722058B (zh) * 2021-06-16 2022-10-25 荣耀终端有限公司 一种资源调用方法及电子设备
CN114422692B (zh) * 2022-01-12 2023-12-08 西安维沃软件技术有限公司 视频录制方法、装置及电子设备
CN114466232B (zh) * 2022-01-29 2024-07-26 维沃移动通信有限公司 视频处理方法、装置、电子设备和介质
CN116723382B (zh) * 2022-02-28 2024-05-03 荣耀终端有限公司 一种拍摄方法及相关设备
WO2023177245A1 (fr) * 2022-03-17 2023-09-21 Samsung Electronics Co., Ltd. Procédé et système de photographie à exposition longue d'un dispositif à caméras multiples
CN117857911A (zh) * 2022-09-30 2024-04-09 北京小米移动软件有限公司 拍摄方法、装置、电子设备及存储介质
CN116347224B (zh) * 2022-10-31 2023-11-21 荣耀终端有限公司 拍摄帧率控制方法、电子设备、芯片系统及可读存储介质

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8531525B2 (en) * 2009-12-22 2013-09-10 Utc Fire & Security Americas Corporation, Inc. Surveillance system and method for operating same
CN104967803B (zh) * 2015-07-01 2018-01-19 广东欧珀移动通信有限公司 一种视频录制方法及装置
US10547776B2 (en) * 2016-09-23 2020-01-28 Apple Inc. Devices, methods, and graphical user interfaces for capturing and recording media in multiple modes
US10438630B2 (en) * 2017-02-10 2019-10-08 Canon Kabushiki Kaisha Display control apparatus that performs time-line display, method of controlling the same, and storage medium
CN107395972B (zh) * 2017-07-31 2020-03-06 华勤通讯技术有限公司 一种快速移动对象的拍摄方法及终端
CN107396019B (zh) * 2017-08-11 2019-05-17 维沃移动通信有限公司 一种慢动作视频录制方法及移动终端
KR102488410B1 (ko) * 2017-11-07 2023-01-16 삼성전자주식회사 복수의 카메라들을 이용하여 영상을 촬영하는 전자 장치와 이의 동작 방법
CN109743505B (zh) * 2019-01-25 2021-01-19 Oppo广东移动通信有限公司 基于激光测距的视频拍摄方法、装置及电子设备
US11064108B2 (en) * 2019-08-21 2021-07-13 Sony Corporation Frame rate control for media capture based on rendered object speed
CN111526314B (zh) * 2020-04-24 2022-04-05 荣耀终端有限公司 视频拍摄方法及电子设备

Also Published As

Publication number Publication date
EP4064689A4 (fr) 2023-06-21
US20230055623A1 (en) 2023-02-23
EP4064689A1 (fr) 2022-09-28
WO2021213341A1 (fr) 2021-10-28
CN111526314A (zh) 2020-08-11
CN111526314B (zh) 2022-04-05

Similar Documents

Publication Publication Date Title
EP4064689B1 (fr) Procédé de photographie vidéo et dispositif électronique
US11849210B2 (en) Photographing method and terminal
EP3872807B1 (fr) Procédé de commande vocale et dispositif électronique
WO2021017889A1 (fr) Procédé d'affichage d'appel vidéo appliqué à un dispositif électronique et appareil associé
US20230041287A1 (en) Interaction Method for Cross-Device Task Processing, Electronic Device, and Storage Medium
US20220300154A1 (en) Split-Screen Display Processing Method and Apparatus, and Electronic Device
EP4131911A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
EP3674866B1 (fr) Procédé de traitement de notification, et dispositif électronique
CN113885759B (zh) 通知消息处理方法、设备、系统及计算机可读存储介质
US20230216990A1 (en) Device Interaction Method and Electronic Device
EP4199499A1 (fr) Procédé de capture d'image, interface graphique utilisateur et dispositif électronique
US20240276097A1 (en) Quick photographing method, electronic device, and computer-readable storage medium
US20230276125A1 (en) Photographing method and electronic device
CN114356195B (zh) 一种文件传输的方法及相关设备
EP4395290A1 (fr) Procédé de lecture audio bluetooth, dispositif électronique, et support de stockage
US20230335081A1 (en) Display Synchronization Method, Electronic Device, and Readable Storage Medium
US20230125072A1 (en) Photo preview method, electronic device, and storage medium
CN116389884B (zh) 缩略图显示方法及终端设备
CN116301483A (zh) 一种应用卡片的管理方法、电子设备和存储介质
WO2024114212A1 (fr) Procédé de commutation de mise au point inter-dispositifs, dispositif électronique et système
WO2023160224A9 (fr) Procédé pour photographier et dispositif associé
WO2023207799A1 (fr) Procédé de traitement de messages et dispositif électronique
CN118331469A (zh) 一种截图方法
CN118555469A (zh) 一种拍摄方法
CN118626182A (zh) 切换应用程序的方法、电子设备

Legal Events

Code | Title | Description
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
17P | Request for examination filed | Effective date: 20220624
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40077752; Country of ref document: HK
REG | Reference to a national code | Ref country code: DE; Ref legal event code: R079; Ref document number: 602021014165; Country of ref document: DE; Free format text: PREVIOUS MAIN CLASS: H04N0005760000; Ipc: H04N0023667000

A4 | Supplementary search report drawn up and despatched | Effective date: 20230519
RIC1 | Information provided on ipc code assigned before grant | Ipc: H04N 23/698 20230101ALI20230512BHEP; H04N 23/90 20230101ALI20230512BHEP; H04N 23/63 20230101ALI20230512BHEP; H04N 23/62 20230101ALI20230512BHEP; H04N 5/783 20060101ALI20230512BHEP; H04N 5/77 20060101ALI20230512BHEP; H04N 23/667 20230101AFI20230512BHEP
DAV | Request for validation of the european patent (deleted)
DAX | Request for extension of the european patent (deleted)
GRAP | Despatch of communication of intention to grant a patent | Free format text: ORIGINAL CODE: EPIDOSNIGR1
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: GRANT OF PATENT IS INTENDED
INTG | Intention to grant announced | Effective date: 20231219
GRAS | Grant fee paid | Free format text: ORIGINAL CODE: EPIDOSNIGR3
GRAA | (expected) grant | Free format text: ORIGINAL CODE: 0009210
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE PATENT HAS BEEN GRANTED
AK | Designated contracting states | Kind code of ref document: B1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG | Reference to a national code | Ref country code: CH; Ref legal event code: EP
REG | Reference to a national code | Ref country code: DE; Ref legal event code: R096; Ref document number: 602021014165; Country of ref document: DE
REG | Reference to a national code | Ref country code: IE; Ref legal event code: FG4D
REG | Reference to a national code | Ref country code: LT; Ref legal event code: MG9D
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: BG; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20240605
REG | Reference to a national code | Ref country code: NL; Ref legal event code: MP; Effective date: 20240605
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: HR; Effective date: 20240605; Ref country code: FI; Effective date: 20240605; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: GR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20240906
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: ES; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20240605
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: LV; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20240605
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: RS; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20240905