CN111757005A - Shooting control method and device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN111757005A
Authority
CN
China
Prior art keywords
lens module
scene
module
distance
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010642810.2A
Other languages
Chinese (zh)
Inventor
闫鹏飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010642810.2A
Publication of CN111757005A
Legal status: Pending

Classifications

    • H ELECTRICITY — H04 ELECTRIC COMMUNICATION TECHNIQUE — H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Abstract

The disclosure provides a shooting control method, a shooting control device, a computer-readable medium, and an electronic device, and relates to the technical field of data processing. The method comprises the following steps: when a starting instruction of a shooting mode is received, determining a scene distance between a current scene and the electronic equipment according to object distances between target objects in the current scene and the electronic equipment; determining a target lens module in the at least two lens modules based on the scene distance; and opening the target lens module to enter a shooting mode through the target lens module. The disclosure realizes automatic switching of the lens module and, to a certain extent, avoids the cumbersome lens-module switching operations of the related art.

Description

Shooting control method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a shooting control method, a shooting control apparatus, a computer-readable medium, and an electronic device.
Background
With the continuous development of electronic devices, electronic devices capable of shooting are becoming increasingly varied. To capture higher-quality images, manufacturers have begun to mount a plurality of camera modules on a single electronic device: for example, a mobile phone or tablet PC carrying a plurality of lens modules such as a main camera module, a telephoto lens module, and a wide-angle lens module.
Disclosure of Invention
The present disclosure is directed to a shooting control method, a shooting control apparatus, a computer-readable medium, and an electronic device, which implement automatic switching of a lens module and avoid the problem of complicated switching operation of the lens module in the related art to a certain extent.
According to a first aspect of the present disclosure, there is provided a shooting control method applied to an electronic device including at least two lens modules, the method including: when a starting instruction of a shooting mode is received, determining a scene distance between a current scene and the electronic equipment according to object distances between target objects in the current scene and the electronic equipment; determining a target lens module among the at least two lens modules based on the scene distance; and opening the target lens module to enter a shooting mode through the target lens module.
According to a second aspect of the present disclosure, there is provided a photographing control apparatus applied to an electronic device including at least two lens modules, the apparatus including: the distance determining module is used for determining scene distances between the current scene and the electronic equipment according to object distances between target objects in the current scene and the electronic equipment when a starting instruction of a shooting mode is received; a module determining module for determining a target lens module among the at least two lens modules based on the scene distance; and the module control module is used for opening the target lens module so as to enter a shooting mode through the target lens module.
According to a third aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the above-described method.
According to the shooting control method provided by the embodiment of the disclosure, when a starting instruction of a shooting mode is received, a scene distance between a current scene and an electronic device is determined according to object distances between target objects in the current scene and the electronic device, then a target lens module is determined in at least two lens modules included in the electronic device according to the scene distance, and then the target lens module is called and started, so that when the electronic device is started and enters the shooting mode, image acquisition can be directly performed through the target lens module. The shooting mode is entered in this way, the lens module which is adaptive to the scene distance can be directly used when the shooting mode is started, and the purpose of automatic switching in advance is realized; meanwhile, the problem that in the related art, after the shooting mode is started, a user needs to perform complicated switching operation, so that the switching efficiency is low is solved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
fig. 3 schematically illustrates a flowchart of a photographing control method in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of the acquisition of object distance between a target object and an electronic device by a TOF module;
FIG. 5 schematically illustrates an architecture diagram of an electronic device in an exemplary embodiment of the disclosure;
fig. 6 schematically illustrates a flowchart of another photographing control method in an exemplary embodiment of the present disclosure;
fig. 7 schematically illustrates a flowchart of a method of invoking a lens module in an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic diagram illustrating a dependency relationship in an exemplary embodiment of the present disclosure;
fig. 9 schematically shows a composition diagram of the photographing control apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which a photographing control method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be terminal devices comprising at least two lens modules, including but not limited to desktop computers, smart phones, tablets, cameras, etc. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The shooting control method provided by the embodiment of the present disclosure is generally executed by the terminal apparatuses 101, 102, 103, and accordingly, the shooting control means is generally provided in the terminal apparatuses 101, 102, 103. However, it is easily understood by those skilled in the art that the shooting control method provided in the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the shooting control apparatus may also be disposed in the server 105, which is not particularly limited in the present exemplary embodiment.
For example, when the terminal device 101, 102, 103 receives an opening instruction of the shooting mode, a scene distance between the current scene and the terminal device is determined; a target lens module is then determined among the at least two lens modules based on the scene distance; and the target lens module in the terminal device 101, 102, 103 is called and opened, so as to enter the shooting mode through the target lens module.
For another example, the server 105 may establish a connection with the terminal devices 101, 102, and 103; the server then receives a start instruction of the shooting mode from the terminal devices 101, 102, and 103 through the connection, and when receiving the start instruction, controls the terminal devices 101, 102, and 103 to determine the scene distance between the current scene and the terminal devices; the server 105 then determines a target lens module from the at least two lens modules based on the scene distance, and controls the terminal devices 101, 102, and 103 to open the target lens module, so as to control the terminal devices 101, 102, and 103 to enter the shooting mode through the target lens module.
An exemplary embodiment of the present disclosure provides an electronic device for implementing a photographing control method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device includes at least a processor and a memory for storing executable instructions of the processor, the processor being configured to execute the photographing control method via execution of the executable instructions.
The following takes the mobile terminal 200 in fig. 2 as an example to illustrate the configuration of the electronic device. It will be appreciated by those skilled in the art that, apart from components specifically intended for mobile purposes, the configuration in fig. 2 can also be applied to fixed devices. In other embodiments, the mobile terminal 200 may include more or fewer components than shown, may combine or split some components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of the two. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also adopt an interfacing arrangement different from that of fig. 2, or a combination of multiple interfacing arrangements.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a key 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor that processes input information quickly by borrowing from the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also continuously learn by itself. The NPU enables intelligent-recognition applications of the mobile terminal 200, such as image recognition, face recognition, speech recognition, and text understanding.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The camera module 291 may include at least two lens modules, for example, at least two of a main camera lens module, a telephoto lens module, a wide-angle lens module, and a macro lens module.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire distance information of a scene. In some embodiments, the depth sensor may be disposed in the camera module 291. The depth sensor 2801 may be a TOF (Time of Flight) sensor, an infrared distance measurement sensor, or the like, and the distance information it acquires may be used to determine the scene distance between the current scene and the electronic device or the camera module.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine the motion posture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used for shooting anti-shake, navigation, motion-sensing game scenarios, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
In the related art, after the shooting mode is started, image data is usually collected through a fixed main shooting lens module, and other lens modules are triggered only after the user manually zooms in or out to a certain magnification, whereupon image data is collected through the newly enabled lens module. In this way, however, if the current scene the user needs to capture is not suitable for the fixed main shooting lens module, the user must switch the lens module through cumbersome operations after the shooting mode is turned on.
In view of one or more of the above problems, the present exemplary embodiment provides a shooting control method applied to an electronic device including at least two lens modules, for example, a mobile phone, a tablet pc, a video camera, a wearable device, and the like, on which a plurality of lens modules are mounted. Referring to fig. 3, the photographing control method may include the following steps S310 to S330:
in step S310, when a start instruction of a shooting mode is received, the electronic device determines a scene distance between a current scene and the electronic device according to object distances between target objects in the current scene and the electronic device.
In an exemplary embodiment, when the electronic device is in a non-shooting mode, the user may trigger the electronic device to turn on the shooting mode through a certain operation. The shooting mode may include a photographing mode, a video recording mode, and other modes that require image data to be collected through the lens module. For example, for a mobile phone, the shooting mode may be triggered by operations such as opening a camera application or opening the scanning function of an application; as another example, for a camera, the device may be switched to the shooting mode by pressing an image capture button.
In an exemplary embodiment, the scene distance may include a parameter representing the distance between the current scene and the electronic device. The scene distance is related to the object distance between each target object in the current scene and the electronic device. Therefore, the object distance between each target object in the current scene and the electronic device can be acquired first, and the scene distance can then be calculated from the object distances of the target objects. Detection of the object distance can be realized by a depth sensor arranged in the electronic device, such as a TOF module or an infrared distance measurement module.
Fig. 4 is a schematic diagram of the principle of obtaining the object distance between the target object and the electronic device through the TOF module 400. Referring to fig. 4, after an emitter 410 in a TOF module 400 in an electronic device emits an electromagnetic wave, a timer 420 starts timing, the electromagnetic wave is reflected back after contacting a target object, and after a detector 430 in the TOF module detects the returned electromagnetic wave, the timer 420 ends timing, and an object distance between the target object and the electronic device can be calculated according to a timing result of the timer 420 and a propagation speed of the electromagnetic wave.
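The timing relationship described above can be sketched as follows. This is only an illustrative calculation of the TOF principle; the constant and function names are not part of the disclosure.

```python
# Illustrative sketch of the TOF ranging principle: the emitter starts the
# timer, the detector stops it, and the object distance is half the distance
# the electromagnetic wave travelled during the round trip.
SPEED_OF_LIGHT_M_S = 299_792_458  # propagation speed of the electromagnetic wave

def object_distance(round_trip_seconds: float) -> float:
    """Object distance computed from one timing result of the timer."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A round trip of 10 nanoseconds corresponds to roughly 1.5 meters.
print(round(object_distance(10e-9), 2))  # prints 1.5
```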
In an exemplary embodiment, after the object distances between the respective target objects in the current scene and the electronic device are obtained, an average value of the object distances of all the target objects in the current scene may be calculated, and the average value may be determined as the scene distance between the current scene and the electronic device.
It should be noted that the scene distance may also be obtained by performing other processing on the object distance. For example, the scene distance may be a distance between a target object in a particular area in the current scene and the electronic device; as another example, the scene distance may take the maximum of the object distances of all target objects in the current scene. The present disclosure is not limited thereto.
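The aggregation options just mentioned (averaging the object distances, or taking the maximum) can be sketched as follows; the function and strategy names are illustrative assumptions, not part of the disclosure.

```python
from statistics import mean

def scene_distance(object_distances, strategy="mean"):
    """Reduce per-object distances to a single scene distance.

    The disclosure mentions averaging as well as alternatives such as
    taking the maximum over all target objects in the current scene.
    """
    if strategy == "mean":
        return mean(object_distances)
    if strategy == "max":
        return max(object_distances)
    raise ValueError(f"unknown strategy: {strategy}")

distances = [0.8, 1.2, 2.5, 3.5]       # object distances in meters
print(scene_distance(distances))        # prints 2.0
print(scene_distance(distances, "max")) # prints 3.5
```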
In step S320, a target lens module is determined among the at least two lens modules based on the scene distance.
In an exemplary embodiment, different lens modules are suitable for shooting scenes at different distances, i.e., scenes with different object distances. For example, when a user needs to shoot a distant view, the telephoto lens module can collect image data of better quality. Therefore, a preset object distance threshold can be set in advance according to the characteristics of each lens module, and one of the at least two lens modules is then determined as the target lens module according to the determined scene distance and the preset object distance thresholds corresponding to the at least two lens modules. For example, when the scene distance falls within the preset object distance threshold range corresponding to the wide-angle lens module, the wide-angle lens module may be determined to be the target lens module.
Optionally, when the electronic device includes only two lens modules, namely the main shooting lens module and the telephoto lens module, the preset object distance thresholds of the two modules can be set through the maximum object distance corresponding to the main shooting lens module. Specifically, since the telephoto lens module is generally used for shooting distant views, the preset object distance thresholds of the two lens modules may be demarcated by the maximum object distance corresponding to the main shooting lens module. For example, when the scene distance is greater than the maximum object distance of the main shooting lens module, the telephoto lens module is determined to be the target lens module; when the scene distance is less than or equal to the maximum object distance of the main shooting lens module, the main shooting lens module is determined to be the target lens module.
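The two-module threshold comparison described above can be sketched as follows; the module names and function signature are illustrative assumptions.

```python
def select_lens_module(scene_distance: float,
                       main_max_object_distance: float) -> str:
    """Pick the target lens module in the two-module case: the maximum
    object distance of the main shooting lens module demarcates the
    preset thresholds of the two modules."""
    if scene_distance > main_max_object_distance:
        return "telephoto"   # distant view: telephoto lens module
    return "main"            # otherwise: main shooting lens module

print(select_lens_module(12.0, 5.0))  # prints telephoto
print(select_lens_module(3.0, 5.0))   # prints main
```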
In step S330, the target lens module is turned on to enter a shooting mode through the target lens module.
In an exemplary embodiment, after the target lens module is determined, the target lens module is turned on, and the shooting mode is then entered through the target lens module. That is, the target lens module is already started when the shooting mode starts. Because the target lens module is determined before the shooting mode is entered, image data of the current scene can be collected directly through the target lens module, which better matches the scene distance of the current scene; at the same time, because the target lens module matches the scene distance of the current scene, the user is spared manual operations to match the current scene after entering the shooting mode. In addition, when a preview picture exists, this approach reduces the number of lens-module switches after the shooting mode is entered, thereby reducing occurrences of the preview picture disappearing due to lens switching.
In an exemplary embodiment, when the electronic device is provided with the display device, after the target lens module is opened to enter the shooting mode through the target lens module, a preview picture can be generated through image data of a current scene collected by the target lens module and displayed on the display device.
In an exemplary embodiment, the user may need to photograph other scenes after entering the shooting mode and before exiting it; for example, after photographing the current scene, the user may need to photograph another scene. In this case, after the target lens module is started and the shooting mode is entered, the scene distance between the current scene and the electronic device can be determined in real time, so that the currently enabled lens module is switched in the shooting mode according to the scene distance. In this way, lens-module switching can also occur after the shooting mode is entered, and the automatic switching spares the user the cumbersome manual operations otherwise needed to switch the currently enabled lens module after the scene changes.
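One iteration of such a real-time check might look as follows. This is a hedged sketch only: `open_module` is a hypothetical stand-in for the device call that enables a lens module, and the threshold logic mirrors the two-module case described earlier.

```python
def monitor_and_switch(scene_distance: float,
                       current_module: str,
                       main_max_object_distance: float,
                       open_module) -> str:
    """Re-evaluate the scene distance while in shooting mode and switch
    the enabled lens module only when the selection changes."""
    target = ("telephoto" if scene_distance > main_max_object_distance
              else "main")
    if target != current_module:
        open_module(target)  # switch to the newly selected module
    return target
```

A caller would invoke this repeatedly (e.g., per preview frame or on a timer) with freshly measured scene distances.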
The following describes in detail a technical solution of an embodiment of the present disclosure with reference to fig. 5 to 7, taking as an example a TOF module detecting the object distance between a target object and the electronic equipment, and the electronic equipment including a main shooting lens module and a telephoto lens module.
The shooting control method can be executed by a processor in the electronic equipment, and the processor can control each module in the electronic equipment through a configured software framework, so as to implement the shooting control method.
For example, referring to fig. 5, the electronic device 500 includes a processor 510, a TOF module 520, and a lens module 530, where the lens module 530 includes a main shooting lens module 531 and a telephoto lens module 532.
Referring to fig. 6, in step S610, when receiving a start instruction of the shooting mode, the processor 510 invokes the TOF module 520 to detect the object distances between the target objects and the electronic device 500, and after obtaining the object distances, calculates their average value to determine the scene distance of the current scene; in step S620, the processor 510 compares the scene distance with the maximum object distance of the main shooting lens module 531; in step S630, when the scene distance is less than or equal to the maximum object distance, the processor 510 determines that the main shooting lens module 531 is the target lens module, and turns on the main shooting lens module 531; in step S640, when the scene distance is greater than the maximum object distance, the processor 510 determines that the telephoto lens module 532 is the target lens module, and turns on the telephoto lens module 532.
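The startup decision of steps S610 to S640 can be sketched as follows: average the TOF object distances to obtain the scene distance, then choose between the main and telephoto modules by comparing against the main module's maximum object distance. The 3 m threshold is an assumption; the patent leaves the concrete value open:

```python
# Illustrative sketch of steps S610–S640; the threshold value is assumed.
MAIN_MAX_OBJECT_DISTANCE_M = 3.0  # assumed maximum object distance of the main module

def scene_distance(object_distances_m):
    """S610: the scene distance is the average of the per-object distances."""
    return sum(object_distances_m) / len(object_distances_m)

def choose_target_lens(object_distances_m):
    """S620–S640: determine the target lens module from the scene distance."""
    d = scene_distance(object_distances_m)
    if d <= MAIN_MAX_OBJECT_DISTANCE_M:
        return "main"        # S630: scene within the main module's working range
    return "telephoto"       # S640: scene beyond the main module's working range

print(choose_target_lens([1.0, 2.0]))        # main
print(choose_target_lens([8.0, 10.0, 9.0]))  # telephoto
```

Because this decision runs before the shooting mode is entered, the chosen module can begin collecting image data as soon as the mode starts.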
The process of opening the lens module can be shown in fig. 7. In step S710, the processor 510 issues a call instruction carrying an identifier of the target lens module, and the corresponding lens module is opened according to the identifier; in step S720, the processor 510 determines whether the target lens module is successfully opened; in step S730, after the target lens module is successfully opened, the processor 510 sends an instruction to open the preview picture; in step S740, the target lens module collects image data of the current scene according to the instruction to open the preview picture; in step S750, the processor 510 generates a preview picture according to the image data collected by the target lens module, and displays the preview picture on the corresponding display device.
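The call sequence of fig. 7 can be sketched as a small handshake: open the module named by the identifier, verify the open succeeded, then request the preview stream. The classes and method names below are illustrative assumptions, not a real camera API:

```python
# Toy sketch of steps S710–S750; class and method names are assumptions.
class LensModule:
    def __init__(self, identifier):
        self.identifier = identifier
        self.opened = False

    def open(self):
        """S710: open the module named in the call instruction's identifier."""
        self.opened = True
        return self.opened

    def capture(self):
        """S740: collect image data of the current scene for the preview."""
        return f"frame-from-{self.identifier}"

def start_preview(module):
    if not module.open():                      # S710
        raise RuntimeError("failed to open")   # S720: open did not succeed
    # S730: the processor issues the instruction to open the preview picture
    frame = module.capture()                   # S740
    return f"preview({frame})"                 # S750: preview shown on the display

print(start_preview(LensModule("telephoto")))  # preview(frame-from-telephoto)
```

Gating the preview request on a successful open (S720) mirrors the figure's check that the target lens module actually started before image data is requested.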
It should be noted that, as shown in fig. 8, in an actual application process, an application 810 such as a camera application may call a hardware device 830 such as the TOF module and the lens module through a processor 820 and the software framework provided therein; the hardware device may likewise transmit data back to the application through the processor and the software framework. That is, the hardware device and the application interact with each other through the processor.
In summary, the exemplary embodiment provides a new switching manner in which the target lens module is determined before the shooting mode is entered, so as to avoid the problem in the related art that a complicated operation needs to be performed after entering the shooting mode, improve the switching efficiency, and avoid the situation that the preview picture disappears.
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 9, the embodiment of the present example further provides a shooting control apparatus 900, which includes a distance determining module 910, a module determining module 920, and a module control module 930. Wherein:
the distance determining module 910 may be configured to determine, when a start instruction of a shooting mode is received, a scene distance between a current scene and the electronic device according to object distances between target objects in the current scene and the electronic device.
The module determining module 920 may be configured to determine a target lens module among the at least two lens modules based on the scene distance.
The module control module 930 may be configured to turn on the target lens module to enter a shooting mode through the target lens module.
In an exemplary embodiment, the module determining module 920 may be configured to determine a target lens module in at least two lens modules according to the scene distance and a preset object distance threshold corresponding to the at least two lens modules.
In an exemplary embodiment, when the electronic device includes two lens modules, and the two lens modules are a main lens module and a telephoto lens module, the preset object distance thresholds of the main lens module and the telephoto lens module are determined according to the maximum object distance corresponding to the main lens module.
In an exemplary embodiment, the distance determining module 910 may be configured to obtain the object distances between each target object in the current scene and the electronic equipment, and to calculate the scene distance between the current scene and the electronic equipment according to the object distance of each target object.
In an exemplary embodiment, the distance determining module 910 may be configured to calculate an average value of the object distances of the target objects, and determine the average value as the scene distance between the current scene and the electronic equipment.
In an exemplary embodiment, the distance determining module 910 may be configured to determine a scene distance between a current scene and the electronic device in real time; the module control module 930 may be configured to switch a currently enabled lens module in the shooting mode according to the scene distance.
In an exemplary embodiment, the module control module 930 may be configured to generate a preview screen according to the image data of the current scene collected by the target lens module.
The specific details of each module in the above apparatus have been described in detail in the method section; for details not repeated here, reference may be made to the method section, and they are therefore not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code; when the program product runs on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, for example, any one or more of the steps in fig. 3, fig. 6, and fig. 7.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. A shooting control method is applied to electronic equipment comprising at least two lens modules, and comprises the following steps:
when a starting instruction of a shooting mode is received, determining a scene distance between a current scene and the electronic equipment according to object distances between target objects in the current scene and the electronic equipment;
determining a target lens module among the at least two lens modules based on the scene distance;
and opening the target lens module to enter a shooting mode through the target lens module.
2. The method of claim 1, wherein determining a target lens module among the at least two lens modules based on the scene distance comprises:
and determining a target lens module in the at least two lens modules according to the scene distance and a preset object distance threshold corresponding to the at least two lens modules.
3. The method according to claim 2, wherein when the electronic device includes two lens modules, and the two lens modules are a main lens module and a telephoto lens module, the preset object distance threshold values of the main lens module and the telephoto lens module are determined according to the maximum object distance corresponding to the main lens module.
4. The method of claim 1, wherein determining the scene distance between the current scene and the electronic device according to the object distance between each target object in the current scene and the electronic device comprises:
acquiring object distances between each target object in the current scene and the electronic equipment;
and calculating the scene distance between the current scene and the electronic equipment according to the object distance of each target object.
5. The method of claim 4, wherein calculating the scene distance between the current scene and the electronic device according to the object distance of each target object comprises:
and calculating an average value of the object distances of the target objects, and determining the average value as the scene distance between the current scene and the electronic equipment.
6. The method according to claim 1, wherein after entering a shooting mode through the target lens module, the method further comprises:
determining a scene distance between a current scene and the electronic equipment in real time;
and switching the currently started lens module in the shooting mode according to the scene distance.
7. The method according to claim 1, wherein after entering a shooting mode through the target lens module, the method further comprises:
and generating a preview picture according to the image data of the current scene acquired by the target lens module.
8. A shooting control device is applied to an electronic device comprising at least two lens modules, and comprises:
the distance determining module is used for determining scene distances between the current scene and the electronic equipment according to object distances between target objects in the current scene and the electronic equipment when a starting instruction of a shooting mode is received;
a module determining module, configured to determine a target lens module from the at least two lens modules based on the scene distance;
and the module control module is used for opening the target lens module so as to enter a shooting mode through the target lens module.
9. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.
CN202010642810.2A 2020-07-06 2020-07-06 Shooting control method and device, computer readable medium and electronic equipment Pending CN111757005A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010642810.2A CN111757005A (en) 2020-07-06 2020-07-06 Shooting control method and device, computer readable medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN111757005A true CN111757005A (en) 2020-10-09

Family

ID=72679557



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117354624A (en) * 2023-12-06 2024-01-05 荣耀终端有限公司 Camera switching method, device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160301852A1 (en) * 2015-04-10 2016-10-13 Qualcomm Incorporated Methods and apparatus for defocus reduction using laser autofocus
CN108391035A (en) * 2018-03-26 2018-08-10 华为技术有限公司 A kind of image pickup method, device and equipment
CN110602387A (en) * 2019-08-28 2019-12-20 维沃移动通信有限公司 Shooting method and electronic equipment
CN110677621A (en) * 2019-09-03 2020-01-10 RealMe重庆移动通信有限公司 Camera calling method and device, storage medium and electronic equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201009