CN111917980A - Photographing control method and device, storage medium and electronic equipment


Info

Publication number
CN111917980A
Authority
CN
China
Prior art keywords
photographing
target area
motion state
target
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010746376.2A
Other languages
Chinese (zh)
Other versions
CN111917980B (en)
Inventor
李逸超 (Li Yichao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010746376.2A priority Critical patent/CN111917980B/en
Publication of CN111917980A publication Critical patent/CN111917980A/en
Application granted granted Critical
Publication of CN111917980B publication Critical patent/CN111917980B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a photographing control method, a photographing control apparatus, a computer-readable storage medium, and an electronic device, and relates to the technical field of image processing. The photographing control method comprises the following steps: if a trigger operation is detected, matching the trigger operation with a preset trigger operation, the preset trigger operation corresponding to a photographing operation mode; if the matching succeeds, determining, in response to the trigger operation, a target area from a to-be-photographed area containing a plurality of objects to be photographed; acquiring the motion state type of the target area and determining a target object from the target area, wherein the motion state type is a static state or a motion state; and determining photographing parameters for the target object in the target area according to the motion state type of the target area, and photographing the target object with the photographing parameters to obtain a photographed image. The technical scheme of the embodiments of the disclosure can improve the quality of the photographed image.

Description

Photographing control method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a photographing control method, a photographing control apparatus, a computer-readable storage medium, and an electronic device.
Background
When photographing is performed, the target object to be photographed can be selected by means of a manual click by the user, automatic judgment, and the like. After the target object is confirmed, the corresponding photographing parameters can be adjusted in a targeted manner, so that parameters such as the brightness, sharpness, and color of the target object are appropriate, improving the photographing quality.
In the related art, when a plurality of objects are contained in the finder frame at the same time, the camera preview offers only two operations, focusing and exposure, so the shooting parameters are the same for all objects. As a result, the photographed image suffers from loss of detail or blur caused by inappropriate shooting parameters, the image quality is low, and the shooting effect is poor.
Disclosure of Invention
The present disclosure provides a photographing control method, a photographing control apparatus, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem of poor image quality.
According to an aspect of the present disclosure, there is provided a photographing control method including: if the trigger operation is detected, matching the trigger operation with a preset trigger operation; the preset trigger operation corresponds to a photographing operation mode; if the matching is successful, determining a target area from the areas to be shot comprising a plurality of objects to be shot in response to the trigger operation; acquiring the motion state type of the target area, and determining a target object from the target area; wherein the motion state type is a static state or a motion state; and determining photographing parameters for the target object in the target area according to the motion state type of the target area, and photographing the target object by using the photographing parameters to obtain a photographed image.
According to an aspect of the present disclosure, there is provided a photographing control apparatus including: the operation matching module is used for matching the trigger operation with a preset trigger operation if the trigger operation is detected; the preset trigger operation corresponds to a photographing operation mode; the target area determining module is used for responding to the triggering operation to determine a target area from the areas to be shot comprising a plurality of objects to be shot if the matching is successful; the target object determining module is used for acquiring the motion state type of the target area and determining a target object from the target area; wherein the motion state type is a static state or a motion state; and the parameter determining module is used for determining photographing parameters for the target object in the target area according to the motion state type of the target area, and photographing the target object by using the photographing parameters to obtain a photographed image.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a photographing control method as recited in any of the above.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute any one of the above-described photographing control methods via execution of the executable instructions.
In the technical solutions provided in some embodiments of the present disclosure, when the trigger operation is successfully matched with the preset trigger operation corresponding to the photographing operation mode, a target area in a motion state or a static state can be determined, according to the trigger operation, from a to-be-photographed area containing a plurality of objects to be photographed, and the target object can then be selected from the target area. On the one hand, this avoids the problem that the target area cannot be accurately selected when multiple objects to be photographed are present, and improves the accuracy of identifying the target area and the target object to be photographed. On the other hand, the photographing parameters of the target object in the target area can be determined according to the motion state type of the target area, so that target objects in different types of target areas are photographed differentially. This improves pertinence and the degree to which the photographing parameters match the target area, which in turn improves the photographing effect and image attributes such as sharpness and completeness, raising the quality of the photographed image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
Fig. 1 is a schematic view showing a photographed image in the related art;
Fig. 2 is a schematic diagram illustrating an application scenario to which the photographing control method or the photographing control apparatus according to an embodiment of the present disclosure may be applied;
Fig. 3 shows a schematic structural diagram of an electronic device suitable for implementing embodiments of the present disclosure;
Fig. 4 schematically shows a flowchart of a photographing control method according to an exemplary embodiment of the present disclosure;
Fig. 5 is a flowchart illustrating configuration of a preset trigger operation in an embodiment of the present disclosure;
Fig. 6 is a schematic flowchart illustrating the determination of a target object in an embodiment of the present disclosure;
Fig. 7 is a schematic diagram illustrating selection of a target object in a static state in an embodiment of the present disclosure;
Fig. 8 is a schematic diagram illustrating selection of a target object in a motion state in an embodiment of the present disclosure;
Fig. 9 schematically shows a block diagram of a photographing control apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation. In addition, all of the following terms "first" and "second" are used for distinguishing purposes only and should not be construed as limiting the present disclosure.
Referring to fig. 1, when the finder frame contains a flying bee and a flower at the same time, the user may want to shoot the flying bee or the static flower. Taking focusing as an example, the depth of field of a camera is limited by its hardware characteristics: the in-focus area is sharp, while out-of-focus areas are blurred. A moving object changes position quickly and can only be captured sharply with a short exposure (at the cost of more noise), whereas a static object can be exposed for longer, so that both sharpness and noise are taken into account. In panel A of fig. 1, the exposure time is short and the photographed bee is sharp. In panel B of fig. 1, the short exposure time causes partial loss of image detail. Because the subjects cannot be photographed differentially and in a targeted manner, the photographed image suffers from loss of detail or blur caused by inappropriate shooting parameters, the image quality is low, and the shooting effect is poor.
In order to solve the above technical problem, an embodiment of the present disclosure provides a photographing control method. Fig. 2 is a schematic diagram illustrating an application scenario to which the photographing control method or the photographing control apparatus according to the embodiment of the present disclosure may be applied.
The photographing control method can be applied to the image acquisition process. As shown in fig. 2, it can in particular be applied to the process of photographing a first object 202 and a second object 203 with a terminal 201. The terminal 201 may be any of various types of client devices capable of shooting, for example smart phones, tablet computers, desktop computers, vehicle-mounted devices, wearable devices, and the like that can capture and display images or videos. The first object 202 and the second object 203 may be any type of object to be photographed in various scenes, such as a person, an animal, or a landscape, and each may be in a static state or in a motion state. Specifically, a camera or a camera application on the terminal 201 may be used for image acquisition of the first object or the second object. The camera on the terminal may include a plurality of camera modules.
In the embodiment of the disclosure, the terminal may determine a target area in response to a trigger operation of a user, and further select a target object from the target area to take a picture, thereby obtaining a picture taken.
It should be noted that the photographing control method provided by the embodiment of the present disclosure may be completely executed by the terminal. Accordingly, the photographing control device may be provided in the terminal.
Fig. 3 shows a schematic diagram of an electronic device suitable for implementing exemplary embodiments of the present disclosure. It should be noted that the electronic device shown in fig. 3 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
The electronic device of the present disclosure includes at least a processor and a memory for storing one or more programs, which when executed by the processor, cause the processor to implement the photographing control method of the exemplary embodiments of the present disclosure.
Specifically, as shown in fig. 3, the electronic device 300 may include: a processor 310, an internal memory 321, an external memory interface 322, a Universal Serial Bus (USB) interface 330, a charging management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 371, a receiver 372, a microphone 373, an earphone interface 374, a sensor module 380, a display screen 390, a camera module 391, an indicator 392, a motor 393, keys 394, a Subscriber Identity Module (SIM) card interface 395, and the like. The sensor module 380 may include a depth sensor 3801, a pressure sensor 3802, a gyroscope sensor 3803, a barometric pressure sensor 3804, a magnetic sensor 3805, an acceleration sensor 3806, a distance sensor 3807, a proximity light sensor 3808, a fingerprint sensor 3809, a temperature sensor 3810, a touch sensor 3811, an ambient light sensor 3812, a bone conduction sensor 3813, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 300. In other embodiments of the present application, electronic device 300 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 310 may include one or more processing units, such as: the Processor 310 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors. Additionally, a memory may be provided in processor 310 for storing instructions and data.
The USB interface 330 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 330 may be used to connect a charger to charge the electronic device 300, and may also be used to transmit data between the electronic device 300 and peripheral devices. It can also be used to connect earphones and play audio through them, or to connect other electronic devices, such as AR devices.
The charging management module 340 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. The power management module 341 is configured to connect the battery 342, the charging management module 340 and the processor 310. The power management module 341 receives the input from the battery 342 and/or the charging management module 340, and supplies power to the processor 310, the internal memory 321, the display screen 390, the camera module 391, the wireless communication module 360, and the like.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The mobile communication module 350 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 300.
The wireless communication module 360 may provide solutions for wireless communication applied to the electronic device 300, including Wireless Local Area Network (WLAN) (e.g., Wireless Fidelity (Wi-Fi)), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like.
The electronic device 300 implements a display function through the GPU, the display screen 390, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 390 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 310 may include one or more GPUs that execute program instructions to generate or alter display information.
The electronic device 300 may implement a shooting function through the ISP, the camera module 391, the video codec, the GPU, the display screen 390, the application processor, and the like. In some embodiments, the electronic device 300 may include 1 or N camera modules 391, where N is a positive integer greater than 1, and if the electronic device 300 includes N cameras, one of the N cameras is a main camera.
The internal memory 321 may be used to store computer-executable program code, which includes instructions. The internal memory 321 may include a program storage area and a data storage area. The external memory interface 322 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 300.
The electronic device 300 may implement an audio function through the audio module 370, the speaker 371, the receiver 372, the microphone 373, the earphone interface 374, the application processor, and the like. Such as music playing, recording, etc.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some functional modules of the audio module 370 may be disposed in the processor 310.
The speaker 371, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 300 can play music or take a hands-free call through the speaker 371. The receiver 372, also referred to as an "earpiece", is likewise used to convert an audio electrical signal into a sound signal. When the electronic device 300 receives a call or voice information, the receiver 372 can be placed close to the ear to hear the voice. The microphone 373, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, a user can input a sound signal by speaking close to the microphone 373. The electronic device 300 may be provided with at least one microphone 373. The earphone interface 374 is used to connect wired earphones.
As for the sensors included in the electronic device 300: the depth sensor 3801 is used to obtain depth information of a scene. The pressure sensor 3802 is used to sense a pressure signal and can convert it into an electrical signal. The gyroscope sensor 3803 may be used to determine the motion posture of the electronic device 300. The barometric pressure sensor 3804 is used to measure air pressure. The magnetic sensor 3805 includes a Hall sensor; the electronic device 300 may use it to detect the opening and closing of a flip holster. The acceleration sensor 3806 can detect the magnitude of the acceleration of the electronic device 300 in various directions (typically three axes). The distance sensor 3807 is used to measure distance. The proximity light sensor 3808 may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The fingerprint sensor 3809 is used to collect fingerprints. The temperature sensor 3810 is used to detect temperature. The touch sensor 3811 may pass a detected touch operation to the application processor to determine the touch event type, and visual output associated with the touch operation may be provided via the display screen 390. The ambient light sensor 3812 is used to sense ambient light brightness. The bone conduction sensor 3813 may acquire vibration signals.
The keys 394 include a power key, volume keys, and the like, and may be mechanical keys or touch keys. The motor 393 may generate vibration prompts and can be used both for incoming-call vibration prompts and for touch vibration feedback. The indicator 392 may be an indicator light used to indicate the state of charge, a change in charge, a message, a missed call, a notification, and the like. The SIM card interface 395 is used to connect a SIM card. The electronic device 300 interacts with the network through the SIM card to implement functions such as calls and data communication.
The present application also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
Fig. 4 schematically shows a flowchart of a photographing control method according to an exemplary embodiment of the present disclosure. The method may be applied on the capture side of an image processing flow, for example when capturing an image, capturing a video, or previewing an image. Referring to fig. 4, with the terminal as the execution subject, the photographing control method may include steps S410 to S440, described in detail as follows:
in step S410, if a trigger operation is detected, matching the trigger operation with a preset trigger operation; and the preset trigger operation corresponds to a photographing operation mode.
In the embodiment of the present disclosure, the trigger operation refers to an operation for triggering the selection of a target area. The trigger operation may be performed by a user, and the processor of the terminal may detect whether a trigger operation from the user has been received. The trigger operation may be any of various types of operations for selecting an area, for example one of, or a combination of, a click, a key press, voice, an expression, and a body motion; it is not particularly limited here, as long as it can trigger the terminal to select a target area.
When a trigger operation is detected, it may be matched against a stored preset trigger operation. The preset trigger operation is stored in the terminal in advance and is used to accurately trigger the selection of a target area. The preset trigger operation may be determined according to the photographing operation mode of the terminal; that is, different photographing operation modes may correspond to different preset trigger operations. The photographing operation mode refers to the mode in which a photographing operation is performed. One terminal may support one or more photographing operation modes, which can be set according to actual requirements. Specifically, the photographing operation mode may be non-contact framing frame photographing or contact framing frame photographing. Contact framing frame photographing refers to photographing by touching the finder frame in the photographing interface, for example clicking a photographing button on the photographing interface. Non-contact framing frame photographing refers to photographing without touching the photographing interface, such as pressing a key of the terminal, gesture photographing, voice photographing, expression photographing, or photographing through an external device (such as an earphone or a selfie stick) connected to the terminal. In the embodiment of the present disclosure, key-based photographing is taken as the example of non-contact framing frame photographing.
Further, different preset trigger modes can be set based on the photographing operation mode of the terminal. The preset trigger mode in the embodiment of the present disclosure may include, but is not limited to, one of, or a combination of, a key-press mode, a click mode, and a touch trajectory mode. In the touch trajectory mode, different target areas may be represented by different touch trajectories. For example, a first touch trajectory may be used to indicate that the target area is in a motion state, and a second touch trajectory may be used to indicate that the target area is in a static state; for instance, the first touch trajectory may be a u-shaped trajectory and the second an n-shaped trajectory, as in the sketch below.
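As a concrete illustration, the mapping from a recognized touch trajectory to the motion state type of the target area might look like the following minimal Kotlin sketch. The trajectory labels, the enum, and the function are hypothetical names introduced here for illustration only; the disclosure does not specify an implementation.

```kotlin
// Illustrative mapping from a recognized touch trajectory to the motion
// state type of the target area (u-shape -> motion, n-shape -> static,
// per the example above). The trajectory recognizer itself is assumed
// to exist elsewhere; all names here are hypothetical.
enum class AreaType { MOTION, STATIC }

fun areaTypeFromTrajectory(trajectory: String): AreaType? = when (trajectory) {
    "u-shape" -> AreaType.MOTION  // first touch trajectory
    "n-shape" -> AreaType.STATIC  // second touch trajectory
    else -> null                  // no match with any preset trajectory
}
```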
The preset trigger operation can be configured in two ways. Fig. 5 schematically shows a flowchart of configuring the preset trigger operation, which mainly includes the following steps:
in step S510, it is determined whether the photographing operation mode is non-contact framing frame photographing; if yes, go to step S520; if not, go to step S530.
In step S520, in response to a first touch operation, determining a candidate region according to a region corresponding to a touch point of the first touch operation;
in step S521, in response to a second touch operation, the target area is determined from the candidate area according to an operation type of the second touch operation.
The first touch operation and the second touch operation act on different objects: the first touch operation refers to a click operation on the photographing interface, and the second touch operation refers to a press operation on a key. After the first touch operation is detected, the area within a preset range centered on the touch point of the first touch operation may be used as the candidate area. The preset range can be set according to actual requirements. For the second touch operation, different operation types may correspond to different target areas. For example, if the second touch operation is a single key press, the target area is an area in a static state; if the second touch operation is pressing the key multiple times, the target area is an area in a motion state. Whether the target area is in a motion or static state may also be distinguished by the pressure or duration of the key press. That is, if the photographing operation mode is one, such as key-based operation, that does not touch the finder frame, the rough target region is clicked first and the subsequent key operation then confirms whether the target area is a static or a moving area, as sketched below.
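The following is a minimal Kotlin sketch of steps S520 and S521 under these assumptions: the candidate region is a circle of a preset radius around the first touch point, and the key-press count of the second touch operation selects the area type. All names and the preset range value are hypothetical, not taken from the disclosure.

```kotlin
// Hypothetical sketch of steps S520/S521 for non-contact framing frame
// photographing: the first touch operation picks a rough candidate region
// around its touch point, and the operation type of the second touch
// operation (here, the key-press count) decides whether a static or a
// moving area is meant.
data class Point(val x: Float, val y: Float)
data class Region(val center: Point, val radius: Float)

const val PRESET_RANGE_PX = 120f  // assumed radius of the preset range

// Step S520: candidate region centered on the first touch point.
fun candidateRegion(touch: Point): Region = Region(touch, PRESET_RANGE_PX)

// Step S521: the second touch operation's type selects the area type.
fun targetAreaType(keyPressCount: Int): String =
    if (keyPressCount == 1) "static" else "motion"
```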
In step S530, in response to a first touch operation, the target area is determined according to a position of a touch point of the first touch operation.
In this step, if the photographing operation mode does not belong to non-contact framing frame photographing, that is, in the case of contact framing frame photographing, the first touch operation refers to a click operation on the photographing interface. When the first touch operation is detected, the target area may be determined according to the position of its touch point. Specifically, the area within a preset range centered on the touch point of the first touch operation may be taken as the target area, where the preset range can be set according to actual requirements. More specifically, the target area may be determined according to the state of the object to be photographed at the position of the touch point. For example, if the touch point of the first touch operation falls on an object in a motion state, the target area is an area in a motion state; if the touch point falls on a static object, the target area is an area in a static state.
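A hedged Kotlin sketch of step S530 follows. The helper isMovingAt, which reports whether the object under the touch point is in a motion state, is an assumed callback (for example backed by frame differencing), not an API from the disclosure.

```kotlin
// Hypothetical sketch of step S530 for contact framing frame photographing:
// the target area is the preset range around the touch point of the first
// touch operation, and its motion state type follows the state of the
// object to be photographed under that touch point.
data class Point(val x: Float, val y: Float)
data class TargetArea(val center: Point, val radius: Float, val moving: Boolean)

fun targetAreaAt(
    touch: Point,
    presetRangePx: Float,
    isMovingAt: (Point) -> Boolean  // assumed helper, e.g. frame differencing
): TargetArea = TargetArea(touch, presetRangePx, isMovingAt(touch))
```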
It should be noted that, because a terminal may have multiple photographing operation modes, the two kinds of preset trigger operations may exist simultaneously. In the embodiment of the disclosure, a preset trigger operation is configured for each photographing operation mode, so that an appropriate preset trigger operation can be configured for each mode, improving the effectiveness and pertinence of triggering.
After the preset trigger operation is configured, when a trigger operation is detected, the trajectories or the form and content of the two can be matched. For example, the operations or trajectories of the two may be compared to determine whether they are identical: if they are the same, the matching is determined to be successful; if not, the matching fails.
In step S420, if the matching is successful, a target area is determined from the to-be-photographed areas including the plurality of to-be-photographed objects in response to the trigger operation.
In the embodiment of the present disclosure, if the trigger operation and the preset trigger operation are successfully matched, a target area may be selected from areas to be photographed including a plurality of objects to be photographed according to the preset trigger operation corresponding to the trigger operation, and the specific selection manner may be as in step S510 to step S530 in fig. 5, which is not described herein again. The region to be photographed may be a region in the finder frame, in which a plurality of objects to be photographed may be included, and states of the plurality of objects to be photographed may be the same or different. The area to be shot corresponding to one view frame can comprise a plurality of objects to be shot in a static state and/or a plurality of objects to be shot in a moving state.
Specifically, when the camera is opened, the user's actions are monitored in real time; if a detected trigger operation is consistent with any preset trigger operation, selection of the target area is triggered in the manner corresponding to that trigger operation. For example, when the photographing operation mode is non-contact framing frame photographing, if the trigger operation is clicking the photographing interface and then pressing a key multiple times, the target area is an area in a motion state.
Through the triggering operation in the embodiment of the disclosure, a target area in a static state or a motion state can be selected from the areas to be shot including a plurality of objects to be shot, so that the accuracy and pertinence of the selection of the target area can be improved, and the problem of wrong selection of a shooting subject is avoided.
In step S430, obtaining a motion state type of the target area, and determining a target object from the target area; wherein the motion state type is a static state or a motion state.
In the embodiment of the present disclosure, after the target area is determined, the motion state type thereof, that is, whether the area where the object to be photographed by the user is located belongs to the still state or the motion state, may be acquired. After determining the motion state type of the target area, the target object may be determined from the target area. Since the target area is used to describe the type of the object to be photographed, and there may be a plurality of objects to be photographed belonging to one type, the target object to be finally photographed is selected from the target area.
Fig. 6 schematically shows a flow chart of determining a target object, and referring to fig. 6, the method mainly includes the following steps:
in step S610, it is determined whether the motion state type of the target area is a stationary state; if yes, go to step S620; if not, go to step S630.
In step S620, if the target area is in a static state, the object to be photographed that is in a static state and closest to the touch point of the first touch operation is determined as the target object.
In this step, when the target area to be photographed is an area in a static state, the target area may have been selected through the first and second touch operations or through the first touch operation alone, depending on the photographing operation mode. In either case, the target object can be selected from the target area according to the touch point of the operation that touched the photographing interface, that is, according to the distance to the touch point. Specifically, the object to be photographed that is in a static state and closest to the touch point of the first touch operation may be determined as the target object. For example, referring to fig. 7, the objects to be photographed in a static state include an object A and an object B; if the object A is closest to the touch point C of the first touch operation, the target object is the object A.
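A minimal Kotlin sketch of this nearest-static-object rule, with Subject and its fields as illustrative assumptions:

```kotlin
// Step S620 sketch: among the objects to be photographed in a static
// state, the one closest to the touch point of the first touch operation
// becomes the target object (object A in the fig. 7 example).
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
data class Subject(val id: String, val pos: Point, val moving: Boolean)

fun nearestStaticSubject(subjects: List<Subject>, touch: Point): Subject? =
    subjects.filter { !it.moving }
        .minByOrNull { hypot(it.pos.x - touch.x, it.pos.y - touch.y) }
```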
In step S630, if the target area is in a motion state, the object to be photographed in the target area is screened, so as to obtain the target object from the target area.
In this step, when the target area to be photographed is an area in a motion state, the target area may likewise have been selected through the first and second touch operations or through the first touch operation alone, depending on the photographing operation mode. The target object can again be selected from the target area according to the touch point of the operation that touched the photographing interface.
Specifically, it may first be determined whether an object to be photographed in a motion state exists in the target region, for example by checking whether the positions of the objects to be photographed in the target area change, or by other parameters.
If an object to be photographed in a motion state exists, the object to be photographed in a motion state that is closest to the touch point of the first touch operation is determined as the target object; that is, the target object may be selected from the target area according to its distance to the touch point of the operation that touched the photographing interface. For example, referring to fig. 8, the objects to be photographed in a motion state include an object D and an object E; if the object E is closest to the touch point C of the first touch operation, the target object is the object E.
If no object to be photographed in a motion state exists, prompt information is provided on the photographing interface to remind the user to select again.
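A minimal Kotlin sketch of this screening step under an assumed frame-difference motion test; a null result corresponds to the branch where the user is prompted to select again. The heuristic and all names are hypothetical, not the disclosure's implementation.

```kotlin
// Step S630 sketch: detect objects in a motion state (here naively, by
// whether their position changed between two frames) and take the one
// closest to the touch point; null means "no moving object", in which
// case the caller shows the re-select prompt on the photographing interface.
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
data class Tracked(val id: String, val prev: Point, val cur: Point)

fun Tracked.isMoving(epsPx: Float = 2f): Boolean =
    hypot(cur.x - prev.x, cur.y - prev.y) > epsPx

fun pickMovingTarget(objects: List<Tracked>, touch: Point): Tracked? =
    objects.filter { it.isMoving() }
        .minByOrNull { hypot(it.cur.x - touch.x, it.cur.y - touch.y) }
```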
In addition, after the target object is determined, a prompt mark may be provided at a preset position of the target object to remind the user of the selected target object to be photographed.
Continuing to refer to fig. 4, in step S440, determining a photographing parameter for the target object in the target area according to the motion state type of the target area, and photographing the target object by using the photographing parameter to obtain a photographed image.
In the embodiment of the present disclosure, after determining the motion state type of the target area and the target object, in order to avoid the problem of poor image quality caused by using the same photographing parameter for photographing, the photographing parameter may be automatically adjusted according to the motion state type of the target area, so as to obtain the photographing parameter matched with the motion state type.
The photographing parameters may include one or more of a focusing parameter and an exposure time, and may also include other types of parameters, which are not limited here. The focusing parameter refers to a parameter that, through the camera and lens, changes the image distance relative to the object distance so that the photographed object goes from blurred to sharp. The exposure time refers to the light-sensitive time of the sensor: the longer the exposure time, the brighter the resulting picture; the shorter, the darker. A shorter exposure time can freeze a moving object, while a longer exposure time captures the entire motion trajectory of the moving object, so the resulting moving object is blurred.
For the target area and the target object whose motion state type is the stationary state, the automatic parameter may be set as the photographing parameter. The automatic parameters may be, for example, standard parameters that are not adjusted or default parameters. For example, a default focusing parameter and a default exposure time may be used as photographing parameters for a target subject in a stationary state.
For a target area and target object whose motion state type is the motion state, the default parameters can no longer satisfy the photographing requirement, so the photographing parameters may be determined based on the motion parameters of the target object contained in the target area. When the photographing parameter is the focusing parameter, the motion parameter may be the distance of the target object relative to the terminal, and the distance may be positively correlated with focus: the smaller the distance, the smaller the relative focus position; the greater the distance, the greater the relative focus position. When the photographing parameter is the exposure time, the default exposure time can be shortened to obtain a corresponding exposure time that yields a sharp target object. Specifically, the target exposure time may first be set to the shortest exposure time. Next, the moving speed of the target object may be estimated and the longest exposure time determined: taking the moving speed as a reference, the longest exposure time is the time for which the distance the target object moves remains smaller than a preset threshold. The preset threshold can be any value that keeps the image sharp, and can be set according to actual requirements. After the longest exposure time is obtained, it may be taken as the exposure time of the target object in the motion state; this longest exposure time may be smaller than the default exposure time. Of course, the shortest exposure time may also be used directly as the exposure time of the target object in the motion state.
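The longest-exposure rule above reduces to t_max = threshold / speed: the exposure during which the target moves less than the preset threshold. A hedged Kotlin sketch, with all constants as assumptions rather than values from the disclosure:

```kotlin
// Hypothetical sketch of the exposure-time rule above: the longest usable
// exposure is the time during which the target moves less than the preset
// sharpness threshold, i.e. tMax = threshold / speed, clamped between the
// shortest exposure and the default.
const val SHORTEST_EXPOSURE_S = 1.0 / 8000  // assumed sensor minimum
const val DEFAULT_EXPOSURE_S = 1.0 / 60     // assumed default (static) value

fun exposureForMovingTarget(
    speedPxPerS: Double,           // estimated moving speed of the target
    blurThresholdPx: Double = 3.0  // assumed preset sharpness threshold
): Double {
    if (speedPxPerS <= 0.0) return DEFAULT_EXPOSURE_S  // effectively static
    val longest = blurThresholdPx / speedPxPerS        // movement < threshold
    return longest.coerceIn(SHORTEST_EXPOSURE_S, DEFAULT_EXPOSURE_S)
}
```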
In the embodiment of the disclosure, automatic parameters are determined for a target object in a static target area, and motion-based photographing parameters for a target object in a moving target area. Different photographing parameters can thus be determined for different motion state types, realizing a process of dynamically adjusting the photographing parameters according to the motion state type of the target area. The photographing parameters can be determined accurately, pertinence and accuracy are improved, the limitation of only being able to use the same photographing parameters is avoided, and intelligent adjustment is achieved.
After the photographing parameters of the target object are obtained, when a photographing triggering operation is received, the photographing parameters are adopted to photograph the target object, so that a photographed image containing the target object is obtained. The photographing triggering operation may be a voice operation, or an operation of clicking a photographing control or pressing a volume button, and the like. The photographed image may be a clear still image containing a target object in a moving state or a target object in a still state.
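Putting the steps together, the following Kotlin sketch models the whole flow of steps S410 to S440; every type and helper is an illustrative assumption, not the disclosure's implementation.

```kotlin
// Hypothetical end-to-end sketch tying steps S410-S440 together.
enum class StateType { STATIC, MOTION }
data class Photo(val bytes: ByteArray)

fun controlPhotographing(
    trigger: String,
    presetTriggers: Map<String, StateType>,        // preset op -> area type
    selectTargetArea: (StateType) -> List<String>, // S420: object ids in area
    pickTargetObject: (List<String>, StateType) -> String?, // S430
    shootWithParams: (String, StateType) -> Photo  // S440: params per state
): Photo? {
    // S410: match the trigger operation against the preset trigger operations
    val stateType = presetTriggers[trigger] ?: return null // match failed
    // S420: determine the target area from the area to be photographed
    val area = selectTargetArea(stateType)
    // S430: determine the target object from the target area
    val target = pickTargetObject(area, stateType) ?: return null // re-select
    // S440: photograph the target object with matched photographing parameters
    return shootWithParams(target, stateType)
}
```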
According to the technical scheme of the embodiment of the present disclosure, a novel operation mode is provided: through a trigger operation, a user can select the subject to be photographed (namely the target object) within a to-be-photographed area containing a plurality of objects to be photographed. This solves the problem that the camera misidentifies the subject, causing errors in focusing, exposure, and white balance, inappropriate shooting parameters, and a low-quality photographed image. New gestures such as the trigger operation make it possible to distinguish between moving and static objects to be photographed, and the photographing parameters for the target object can be adjusted automatically according to its motion state type. For a moving target object, the exposure time is reduced, so that the photographed image of the moving target object is sharp, improving image quality; for a target object in a static state, the loss of partial detail caused by a too-short exposure time is avoided, improving completeness and image quality. Different target areas are selected through trigger operations corresponding to different photographing operation modes, and appropriate photographing parameters are determined for photographing control according to the motion characteristics (motion state types) of the target area. The limitation in the related art of photographing objects in all states with the same photographing parameters is avoided, so that the photographing parameters better fit the target area, the matching degree and accuracy are improved, and the photographing quality and effect are improved as well.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Fig. 9 schematically shows a block diagram of the photographing control apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 9, the photographing control apparatus 900 may include the following modules:
an operation matching module 901, configured to match a trigger operation with a preset trigger operation if the trigger operation is detected; the preset trigger operation corresponds to a photographing operation mode;
a target area determining module 902, configured to determine, if the matching is successful, a target area from areas to be photographed that include a plurality of objects to be photographed in response to the trigger operation;
a target object determining module 903, configured to obtain a motion state type of the target area, and determine a target object from the target area; wherein the motion state type is a static state or a motion state;
and a parameter determining module 904, configured to determine a photographing parameter for the target object in the target area according to the motion state type of the target area, and shoot the target object by using the photographing parameter to obtain a photographed image.
In an exemplary embodiment of the present disclosure, the photographing operation mode is non-contact framing frame photographing, and the target area determination module includes: a candidate area determining module, configured to determine, in response to a first touch operation, a candidate area according to the area corresponding to the touch point of the first touch operation; and a first determination control module, configured to determine, in response to a second touch operation, the target area from the candidate area according to the operation type of the second touch operation.
In an exemplary embodiment of the present disclosure, the photographing operation mode is contact framing frame photographing, and the target area determination module includes: a second determination control module, configured to determine, in response to a first touch operation, the target area according to the position of the touch point of the first touch operation.
In an exemplary embodiment of the present disclosure, the target object determination module includes: a first determining module, configured to, if the motion state type of the target area is a static state, determine the object to be photographed that is in a static state and closest to the touch point of the first touch operation as the target object.
In an exemplary embodiment of the present disclosure, the target object determination module includes: a second determining module, configured to, if the motion state type of the target area is a motion state, screen the objects to be photographed in the target area to obtain the target object from the target area.
In an exemplary embodiment of the present disclosure, the second determining module includes: an object judging module, configured to judge whether an object to be photographed in a motion state exists in the target area; and an object determining module, configured to, if an object to be photographed in a motion state exists, determine the object to be photographed in a motion state that is closest to the touch point of the first touch operation as the target object.
In an exemplary embodiment of the present disclosure, the parameter determining module includes: a first parameter determining module, configured to use an automatic parameter as the photographing parameter of the target object if the motion state type is a static state; and a second parameter determining module, configured to determine the photographing parameters according to the motion parameters of the target object if the motion state type is a motion state.
Since each functional module of the photographing control apparatus according to the embodiment of the present disclosure is the same as that in the embodiment of the photographing control method, it is not described herein again.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. A photographing control method, comprising:
if a trigger operation is detected, matching the trigger operation with a preset trigger operation, wherein the preset trigger operation corresponds to a photographing operation mode;
if the matching is successful, determining, in response to the trigger operation, a target area from an area to be photographed that includes a plurality of objects to be photographed;
acquiring a motion state type of the target area, and determining a target object from the target area, wherein the motion state type is a static state or a motion state; and
determining photographing parameters for the target object in the target area according to the motion state type of the target area, and photographing the target object by using the photographing parameters to obtain a photographed image.
2. The photographing control method according to claim 1, wherein the photographing operation mode is non-contact viewfinder-frame photographing; and
the determining a target area from an area to be photographed that includes a plurality of objects to be photographed in response to the trigger operation comprises:
in response to a first touch operation, determining a candidate area according to an area corresponding to a touch point of the first touch operation; and
in response to a second touch operation, determining the target area from the candidate area according to an operation type of the second touch operation.
3. The photographing control method according to claim 1, wherein the photographing operation mode is contact viewfinder-frame photographing; and
the determining a target area from an area to be photographed that includes a plurality of objects to be photographed in response to the trigger operation comprises:
in response to a first touch operation, determining the target area according to a position of a touch point of the first touch operation.
4. The photographing control method according to claim 1, wherein the determining a target object from the target area comprises:
if the motion state type of the target area is a static state, determining, as the target object, the object to be photographed that is in the static state and is closest to the touch point of the first touch operation.
5. The photographing control method according to claim 1, wherein the determining a target object from the target area comprises:
if the motion state type of the target area is a motion state, screening the objects to be photographed in the target area to obtain the target object.
6. The photographing control method according to claim 5, wherein the screening the objects to be photographed in the target area to obtain the target object comprises:
determining whether an object to be photographed in a motion state exists in the target area; and
if an object to be photographed in the motion state exists, determining, as the target object, the object to be photographed that is in the motion state and is closest to the touch point of the first touch operation.
7. The photographing control method according to claim 1, wherein the determining photographing parameters for the target object in the target area according to the motion state type of the target area comprises:
if the motion state type is a static state, using automatic parameters as the photographing parameters of the target object; and
if the motion state type is a motion state, determining the photographing parameters according to motion parameters of the target object.
8. A photographing control apparatus, comprising:
an operation matching module configured to match a trigger operation with a preset trigger operation if the trigger operation is detected, wherein the preset trigger operation corresponds to a photographing operation mode;
a target area determining module configured to determine, in response to the trigger operation, a target area from an area to be photographed that includes a plurality of objects to be photographed if the matching is successful;
a target object determining module configured to acquire a motion state type of the target area and determine a target object from the target area, wherein the motion state type is a static state or a motion state; and
a parameter determining module configured to determine photographing parameters for the target object in the target area according to the motion state type of the target area, and to photograph the target object by using the photographing parameters to obtain a photographed image.
9. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the photographing control method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the photographing control method according to any one of claims 1 to 7 via execution of the executable instructions.
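As a non-authoritative illustration of claims 1 to 6, the following self-contained Kotlin sketch strings the claimed steps together: deriving a target area from a touch operation, reading off its motion state type, and selecting as the target object the object matching that state that lies closest to the touch point. Every name here (TouchPoint, SceneObject, TargetArea, areaAround, the 100-pixel radius) is invented for the example, and the trigger-operation matching and the actual photographing step are deliberately left out.

import kotlin.math.hypot

enum class MotionStateType { STATIC, MOTION }

data class TouchPoint(val x: Float, val y: Float)
data class SceneObject(val x: Float, val y: Float, val moving: Boolean)

// The target area holds several objects to be photographed; its motion state
// type is MOTION if any object in it is moving, otherwise STATIC.
class TargetArea(val objects: List<SceneObject>) {
    val motionState: MotionStateType
        get() = if (objects.any { it.moving }) MotionStateType.MOTION else MotionStateType.STATIC
}

fun SceneObject.distanceTo(p: TouchPoint): Float = hypot(x - p.x, y - p.y)

// Claims 2 and 3: derive the target area from the first touch operation, here
// simplified to a fixed-radius region around the touch point.
fun areaAround(touch: TouchPoint, scene: List<SceneObject>, radius: Float = 100f): TargetArea =
    TargetArea(scene.filter { it.distanceTo(touch) <= radius })

// Claims 4 to 6: pick the object whose state matches the area's motion state
// type and that is closest to the touch point of the first touch operation.
fun selectTarget(area: TargetArea, touch: TouchPoint): SceneObject? {
    val wantMoving = area.motionState == MotionStateType.MOTION
    return area.objects
        .filter { it.moving == wantMoving }
        .minByOrNull { it.distanceTo(touch) }
}

fun main() {
    val scene = listOf(
        SceneObject(44f, 46f, moving = false),  // closest overall, but static
        SceneObject(60f, 60f, moving = true),   // selected: nearest moving object
        SceneObject(400f, 400f, moving = true)  // outside the candidate radius
    )
    val touch = TouchPoint(45f, 45f)
    val area = areaAround(touch, scene)
    val target = selectTarget(area, touch)
    println("state=${area.motionState}, target=$target")
}

Under these assumptions the area is in a motion state, so the screening of claims 5 and 6 selects the moving object nearest the touch point rather than the closer static one.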
CN202010746376.2A 2020-07-29 2020-07-29 Photographing control method and device, storage medium and electronic equipment Active CN111917980B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010746376.2A CN111917980B (en) 2020-07-29 2020-07-29 Photographing control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111917980A (en) 2020-11-10
CN111917980B CN111917980B (en) 2021-12-28

Family

ID=73286658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010746376.2A Active CN111917980B (en) 2020-07-29 2020-07-29 Photographing control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111917980B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11215429A (en) * 1998-01-28 1999-08-06 Hitachi Ltd Method for switching photographed dynamic image-still image at photographing and record medium with its execution program recorded therein
JP2011023814A (en) * 2009-07-13 2011-02-03 Sharp Corp Imaging apparatus
CN103856708A (en) * 2012-12-03 2014-06-11 原相科技股份有限公司 Automatic focusing method, photographic device and computer readable storage medium
US20160269624A1 (en) * 2015-03-12 2016-09-15 Samsung Electronics Co., Ltd. Image Photographing Apparatus and Method for Photographing Image Thereof
CN105681654A (en) * 2016-01-12 2016-06-15 努比亚技术有限公司 Photographing method and mobile terminal
CN107239205A (en) * 2017-05-03 2017-10-10 努比亚技术有限公司 A kind of photographic method, mobile terminal and storage medium
CN107395997A (en) * 2017-08-18 2017-11-24 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN108650457A (en) * 2018-05-03 2018-10-12 Oppo广东移动通信有限公司 Automatic photographing method, device, storage medium and mobile terminal
CN110740265A (en) * 2019-10-31 2020-01-31 维沃移动通信有限公司 Image processing method and terminal equipment

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11962889B2 (en) 2016-06-12 2024-04-16 Apple Inc. User interface for camera effects
US11895391B2 (en) 2018-09-28 2024-02-06 Apple Inc. Capturing and displaying images with multiple focal planes
CN115474002A (en) * 2021-04-30 2022-12-13 苹果公司 User interface for altering visual media
CN113747073A (en) * 2021-09-13 2021-12-03 维沃移动通信有限公司 Video shooting method and device and electronic equipment
CN113747073B (en) * 2021-09-13 2024-02-02 维沃移动通信有限公司 Video shooting method and device and electronic equipment
CN114040107A (en) * 2021-11-19 2022-02-11 智己汽车科技有限公司 Intelligent automobile image shooting system, method, vehicle and medium
CN114040107B (en) * 2021-11-19 2024-04-16 智己汽车科技有限公司 Intelligent automobile image shooting system, intelligent automobile image shooting method, intelligent automobile image shooting vehicle and intelligent automobile image shooting medium

Also Published As

Publication number Publication date
CN111917980B (en) 2021-12-28

Similar Documents

Publication Publication Date Title
CN111917980B (en) Photographing control method and device, storage medium and electronic equipment
CN112333380B (en) Shooting method and equipment
US20240205535A1 (en) Photographing method and electronic device
CN108399349B (en) Image recognition method and device
CN111815666B (en) Image processing method and device, computer readable storage medium and electronic equipment
CN111770282B (en) Image processing method and device, computer readable medium and terminal equipment
CN112165575B (en) Image blurring processing method and device, storage medium and electronic equipment
CN111161176B (en) Image processing method and device, storage medium and electronic equipment
CN112929654B (en) Method, device and equipment for detecting sound and picture synchronization and storage medium
CN111212412A (en) Near field communication method and device, computer readable storage medium and electronic equipment
CN111399659B (en) Interface display method and related device
CN111580671A (en) Video image processing method and related device
CN113574525A (en) Media content recommendation method and equipment
CN104702848B (en) Show the method and device of framing information
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN112272191B (en) Data transfer method and related device
CN111930335A (en) Sound adjusting method and device, computer readable medium and terminal equipment
EP3629560A1 (en) Full screen terminal, and operation control method and device based on full screen terminal
CN111314763A (en) Streaming media playing method and device, storage medium and electronic equipment
CN114302063B (en) Shooting method and equipment
CN111400004B (en) Video scanning interrupt processing method and device, storage medium and electronic equipment
CN111757005A (en) Shooting control method and device, computer readable medium and electronic equipment
CN111982293A (en) Body temperature measuring method and device, electronic equipment and storage medium
CN111432156A (en) Image processing method and device, computer readable medium and terminal equipment
CN112291472B (en) Preview image processing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant