CN114567727B - Shooting control system, shooting control method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN114567727B
CN114567727B
Authority
CN
China
Prior art keywords
image
target object
coordinate position
lens
determining
Prior art date
Legal status
Active
Application number
CN202210223652.6A
Other languages
Chinese (zh)
Other versions
CN114567727A (en)
Inventor
黎洪宋
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210223652.6A
Publication of CN114567727A
Application granted
Publication of CN114567727B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/685 Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H ELECTRICITY
    • H02 GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02K DYNAMO-ELECTRIC MACHINES
    • H02K11/00 Structural association of dynamo-electric machines with electric components or with devices for shielding, monitoring or protection
    • H02K11/30 Structural association with control circuits or drive circuits
    • H02K11/33 Drive circuits, e.g. power electronics

Landscapes

  • Engineering & Computer Science (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure provides a shooting control system, a shooting control method, a shooting control device, a computer-readable storage medium, and an electronic device, relating to the field of imaging technology. The shooting control system comprises a processing platform and a camera module, where the camera module comprises a lens, a driving chip, and a motor. The processing platform acquires a first image shot by the camera module and determines the coordinate position of a target object in the first image; when the camera module shakes, it acquires a second image shot by the camera module, determines the coordinate position of the target object in the second image, and calculates the lens offset from the coordinate positions of the target object in the first image and the second image. The driving chip receives the lens offset sent by the processing platform and outputs an electric signal according to the lens offset; the motor receives the electric signal output by the driving chip and controls the lens to move according to the electric signal. This can improve the shooting effect.

Description

Shooting control system, shooting control method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of imaging technology, and in particular, to a photographing control system, a photographing control method, a photographing control apparatus, a computer-readable storage medium, and an electronic device.
Background
With the rapid development of terminal technology, terminal devices such as smartphones and tablet computers have become indispensable tools in daily work and life, and shooting with the camera modules these devices are equipped with is becoming more and more common.
However, in the process of photographing using the camera module of the terminal device, there may be a problem in that the quality of the photographed image is poor due to the shake of the camera module.
Disclosure of Invention
The present disclosure provides a photographing control system, a photographing control method, a photographing control apparatus, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem of a poor photographing effect caused by shake of the camera module.
According to a first aspect of the present disclosure, there is provided a photographing control system including a processing platform and a camera module including a lens, a driving chip, and a motor; the processing platform is used for acquiring a first image shot by the camera module, determining the coordinate position of a target object in the first image, acquiring a second image shot by the camera module when the camera module shakes, determining the coordinate position of the target object in the second image, and calculating the offset of the lens according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image; the driving chip is used for receiving the lens offset sent by the processing platform and outputting an electric signal according to the lens offset; the motor is used for receiving the electric signal output by the driving chip and controlling the lens to move according to the electric signal.
According to a second aspect of the present disclosure, there is provided a photographing control method including: acquiring a first image shot by a camera module, and determining the coordinate position of a target object in the first image; when the camera module shakes, acquiring a second image shot by the camera module, and determining the coordinate position of a target object in the second image; calculating a lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image; the lens offset is sent to the driving chip so that the driving chip outputs an electric signal to the motor according to the lens offset, and the motor controls the lens to move according to the electric signal; the driving chip, the motor and the lens are arranged in the camera module.
According to a third aspect of the present disclosure, there is provided a photographing control apparatus including: the first position determining module is used for acquiring a first image shot by the camera module and determining the coordinate position of a target object in the first image; the second position determining module is used for acquiring a second image shot by the camera module when the camera module shakes, and determining the coordinate position of the target object in the second image; the offset calculation module is used for calculating the lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image; the lens control module is used for sending the lens offset to the driving chip so that the driving chip outputs an electric signal to the motor according to the lens offset, and the motor controls the lens to move according to the electric signal; the driving chip, the motor and the lens are arranged in the camera module.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the shooting control method described above.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising a processor; and a memory for storing one or more programs, which when executed by the processor, cause the processor to implement the photographing control method described above.
In some embodiments of the present disclosure, the processing platform calculates a lens offset based on the change in the coordinate position of a target object in the images captured before and after the shake, and controls the lens movement based on that offset. On one hand, because shooting control is performed through optical image stabilization driven by image content, the target object can be tracked and kept as stable as possible in the image, and the optical stabilization lengthens the effective exposure time of the target object on the photosensitive element, so the captured target object is clearer; in particular, for a target object in motion, image blurring can be effectively avoided. On the other hand, in the disclosed scheme, the process of calculating the lens offset from the position of the target object in the image is executed on the processing platform. Compared with schemes that deploy the algorithm on the driving chip, this reduces the communication interaction between the processing platform and the driving chip, which in turn reduces the number of transmission lines required between them and saves cost. In addition, for the terminal-device manufacturer, since the algorithm is deployed mainly on the processing platform rather than in the camera module, the purchase price of the camera module can be reduced, further saving cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
Fig. 1 shows a schematic diagram of a system architecture of a photographing control system of an embodiment of the present disclosure;
Fig. 2 shows a schematic diagram of a system architecture of a photographing control system according to another embodiment of the present disclosure;
Fig. 3 illustrates a schematic view of a lens stroke of an embodiment of the present disclosure;
Fig. 4 schematically illustrates a flowchart of a photographing control method of an embodiment of the present disclosure;
Fig. 5 shows a schematic diagram of a shooting control process of an embodiment of the present disclosure;
Fig. 6 schematically shows a block diagram of a photographing control apparatus of an embodiment of the present disclosure;
Fig. 7 schematically illustrates a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations. In addition, all of the following terms "first," "second," are used for distinguishing purposes only and should not be taken as a limitation of the present disclosure.
The shooting control system of the embodiment of the disclosure may be configured in a terminal device, where the terminal device is any electronic device with a shooting function, including but not limited to a smart phone, a tablet computer, a smart watch, a video camera, a mobile monitoring device, and the like.
Fig. 1 shows a schematic diagram of a system architecture of a photographing control system of an embodiment of the present disclosure.
Referring to fig. 1, a photographing control system of an embodiment of the present disclosure may include a processing platform 11 and a camera module 12.
The processing platform 11 may be a chip platform of the terminal device, such as a CPU (Central Processing Unit). The processing platform 11 may also be an SoC (System on Chip), an MCU (Microcontroller Unit), a DSP (Digital Signal Processor), or the like.
The camera module 12 may include at least a driving chip 121, a motor 122, and a lens 123. Although not shown, it is understood that the camera module may also include other components such as a photosensitive element (sensor).
The driving chip 121 may be a chip for driving a load such as a motor. The driving chip 121 may be a single-channel driving chip or a multi-channel driving chip, which is not limited in this disclosure. A single-channel driving chip transmits only one path of electric signal to the motor, while a multi-channel driving chip can output multiple paths of electric signals to the motor simultaneously. It will be appreciated that a single-channel driving chip is not limited to having only one output; it may have multiple outputs, as long as only one of them transmits signals to the motor. The electric signals referred to in embodiments of the present disclosure may be current signals, voltage signals, or other types of electric signals, which the present disclosure does not limit.
The Motor 122 may be a Voice Coil Motor (VCM), which is a device for converting electric energy into mechanical energy to drive the lens 123 to move.
The lens 123 may be various types of lenses, such as a fixed focus lens, a zoom lens, a wide angle lens, a telephoto lens, a macro lens, and the like.
In implementing the shooting control process of the embodiment of the present disclosure, first, the processing platform 11 may acquire a first image shot by the camera module 12 and determine the coordinate position of a target object in the first image. Next, when the camera module 12 shakes, the processing platform 11 may acquire a second image captured by the camera module 12 and determine the coordinate position of the target object in the second image. Subsequently, the processing platform 11 may calculate the lens offset based on the coordinate position of the target object in the first image and the coordinate position of the target object in the second image. The processing platform 11 may then send the lens offset to the driving chip 121 in the camera module 12. The processing platform 11 and the driving chip 121 may be connected through an I2C (Inter-Integrated Circuit) bus.
After receiving the lens offset sent by the processing platform 11, the driving chip 121 may output an electric signal to the motor 122 according to the lens offset. The motor 122 may control the lens 123 to move according to the electric signal. The electric signal may carry a direction identifier and a signal strength: the direction in which the lens 123 is pushed is determined by the direction identifier, and the distance it moves is determined by the signal strength.
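The decomposition of a signed per-axis lens offset into the direction identifier and signal strength described above can be sketched as follows. The encoding (sign convention and code units) is an assumption for illustration; the patent does not specify the signal format.

```python
def to_drive_signal(offset_code):
    """Split a signed per-axis lens offset (in code units) into the
    direction identifier and signal strength carried by the electric
    signal. Encoding is illustrative, not taken from the patent."""
    direction = 1 if offset_code >= 0 else -1  # direction identifier
    strength = abs(offset_code)                # strength -> move distance
    return direction, strength
```

For example, an x-axis offset of -12 code would be sent as direction -1 with strength 12, pushing the lens opposite to the shake.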
It can be understood that the lens 123 is controlled to move in the direction opposite to the shake of the camera module 12; combined with a movement distance corresponding to the shake distance, the image shift caused by the shake can be eliminated.
The embodiment of the disclosure does not limit the type of the target object, and the target object can be selected by a user according to the image content, or can be a preset specified object, such as a human face, a football, a moon and the like. In the following exemplary description, the coordinate position of the target object center point may be taken as the coordinate position of the target object. However, it will be appreciated that in embodiments where the target object is defined in a rectangular box, the length and width of the rectangular box and the coordinates of a predetermined point (e.g., vertex, center point, etc.) on the rectangular box may also be used to characterize the coordinate position of the target object.
The process by which the processing platform 11 determines the coordinate position of the target object in the first image will be described below.
According to some embodiments of the present disclosure, in a case where the first image is presented on the touch screen of the terminal device, the processing platform 11 may determine the target object in response to the object selection operation, and further determine the coordinate position of the target object in the first image.
For example, the user may click on an object on the first image through the touch screen, and the processing platform 11 determines the target object in response to the click operation of the user. If the user clicks on a football in the first image, the football may be determined as the target object. Next, the coordinates of the center of the soccer ball may be regarded as the coordinate position of the target object in the first image as referred to in the present disclosure.
According to other embodiments of the present disclosure, after the processing platform 11 acquires the first image, the processing platform 11 may perform image recognition on the first image, calculate a similarity between a result of the image recognition and a predefined target object, and in a case where the similarity is greater than a similarity threshold, determine the result of the image recognition as the target object, and further determine a coordinate position of the target object in the first image. The specific value of the similarity threshold is not limited in this disclosure.
Specifically, the processing platform 11 may extract a set of feature points from the first image according to a pre-configured feature extraction algorithm, compare the feature points in the set with feature points of a pre-configured target object, determine similarity between the feature points and the pre-configured target object, and determine the target object by using the similarity, thereby determining a coordinate position of the target object in the first image.
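The feature-comparison step above can be sketched as follows. This is a minimal, hypothetical illustration: the feature descriptors are plain vectors, similarity is measured as cosine similarity, and the threshold value is illustrative; the actual feature-extraction algorithm and threshold are left open by the patent.

```python
import math

SIMILARITY_THRESHOLD = 0.8  # illustrative; the patent does not fix a value

def cosine_similarity(a, b):
    """Cosine similarity between two feature descriptors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def confirm_target(candidate_desc, target_desc):
    """Accept a recognized region as the target object only when its
    similarity to the predefined target exceeds the threshold."""
    return cosine_similarity(candidate_desc, target_desc) > SIMILARITY_THRESHOLD
```

In practice the descriptors would come from a feature extractor such as ORB or a learned embedding; only the thresholded comparison is shown here.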
It may be appreciated that, in the case that the target object cannot be determined from the image, the photographing control scheme of the embodiment of the disclosure is suspended, and the processing platform 11 may continuously perform the target-object confirmation process; once the target object and its coordinate position in the image are determined, the photographing control scheme of the embodiment of the disclosure continues.
In addition, the process of determining the coordinate position of the target object in the second image by the processing platform 11 is similar to the above, and will not be described again.
The process of determining the shake of the camera module 12 by the processing platform 11 will be described below.
According to some embodiments of the present disclosure, the processing platform 11 may use the data sensed by the gyro sensor to determine whether the camera module 12 shakes.
Referring to fig. 2, the photographing control system of the embodiment of the present disclosure may further include a gyro sensor 21. The gyro sensor 21 is an angular-motion detection device that uses the moment of momentum of a high-speed rotor to sense angular motion relative to inertial space about one or more axes orthogonal to the spin axis; it may be any one of a piezoelectric gyro sensor, a mechanical gyro sensor, a fiber-optic gyro sensor, and a laser gyro sensor. In addition, the gyro sensor 21 and the processing platform 11 may be connected through an SPI (Serial Peripheral Interface) bus.
The gyro sensor 21 may detect angular velocities of the camera module 12 in one or more directions, so that it may be determined whether or not the camera module 12 is subject to shake based on the detected angular velocities.
Specifically, the processing platform 11 may obtain the angular velocity sensed by the gyro sensor 21, compare the sensed angular velocity with the angular velocity threshold, and determine that the camera module 12 shakes when the sensed angular velocity is greater than the angular velocity threshold. In the case where the sensed angular velocity is equal to or less than the angular velocity threshold, the processing platform 11 determines that the camera module 12 does not shake. The specific value of the angular velocity threshold is not limited in this disclosure.
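The threshold comparison above can be sketched as follows. The three-axis reading and the threshold value are assumptions for illustration; the patent deliberately leaves the angular-velocity threshold unspecified.

```python
ANGULAR_VELOCITY_THRESHOLD = 0.05  # rad/s; illustrative value only

def camera_shakes(gyro_xyz, threshold=ANGULAR_VELOCITY_THRESHOLD):
    """Declare shake when the sensed angular velocity on any axis
    exceeds the threshold; otherwise treat the module as steady."""
    return any(abs(w) > threshold for w in gyro_xyz)
```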
According to other embodiments of the present disclosure, in a continuous-shooting scene, the processing platform 11 may determine whether the camera module 12 shakes from the captured images themselves.
Specifically, a fixed object in the scene, such as a table, a tree, or a building, may be identified in the captured images, and the positions of the fixed object in adjacent frames compared to determine whether the camera module 12 shakes. For example, a change in the position of the stationary object between adjacent frames that exceeds a predetermined range indicates that the camera module 12 is shaking.
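This image-based alternative can be sketched as follows. The stationary object's position in each frame is assumed to be already tracked (e.g. the pixel coordinates of a table corner); the tolerance is an illustrative assumption, not a value from the patent.

```python
def shakes_from_frames(prev_xy, curr_xy, tolerance_px=2.0):
    """Flag shake when a stationary reference object moves between
    adjacent frames by more than a pixel tolerance."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return (dx * dx + dy * dy) ** 0.5 > tolerance_px
```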
The process by which the processing platform 11 calculates the lens offset will be described below.
In the case where the camera module 12 shakes, the position of the target object imaged on the photosensitive element in the camera module 12 changes, that is, the coordinate positions of the target object in the first image and the second image change.
First, the processing platform 11 may calculate the pixel offset of the target object from the coordinate position of the target object in the first image and the coordinate position of the target object in the second image.
If the coordinate position of the target object in the first image is denoted as src_local (x, y), the coordinate position of the target object in the second image is denoted as rotate_local (x, y), and the pixel offset is denoted as delta (x, y), the pixel offset delta (x, y) can be calculated using the following equation:
delta(x,y)=rotate_local(x,y)-src_local(x,y)
Next, the processing platform 11 may calculate the lens offset using the pixel offset of the target object. It is understood that the lens offset includes the direction and magnitude of the lens movement compensation.
Specifically, the processing platform 11 may multiply the pixel offset of the target object by the gain threshold to obtain the lens offset.
If the lens offset is denoted as delta_hall (hall_x, hall_y) and the gain threshold is denoted as pixelgain (x, y), the lens offset delta_hall (hall_x, hall_y) can be calculated using the following equation:
delta_hall(hall_x,hall_y)=delta(x,y)*pixelgain(x,y)
by the above equation, the lens shift amount in the x direction and the lens shift amount in the y direction can be calculated, respectively.
It should be understood that the unit of the pixel offset is the pixel, while the unit of the lens offset is the code (the digital step unit used by the driving chip).
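The two equations above can be sketched directly in code: the pixel offset (in pixels) from the two coordinate positions, then the lens offset (in code) via the per-axis gain. The pixel-to-code gain would be calibrated per camera module; the value used in the example below is illustrative only.

```python
def pixel_offset(src_local, rotate_local):
    """delta(x, y) = rotate_local(x, y) - src_local(x, y), in pixels."""
    return (rotate_local[0] - src_local[0],
            rotate_local[1] - src_local[1])

def lens_offset(delta, pixelgain):
    """delta_hall(hall_x, hall_y) = delta(x, y) * pixelgain(x, y), in code.
    The gain is applied per axis, giving the x and y offsets separately."""
    return (delta[0] * pixelgain[0],
            delta[1] * pixelgain[1])
```

For example, if the target moved from (100, 200) to (103, 198) pixels and the (hypothetical) gain is 4.0 code per pixel on both axes, the lens offset is (12.0, -8.0) code.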
It should be noted that the motor 122 may control the movement of the lens 123 in the x-direction and the y-direction, respectively. In an actual scene, the range of travel of the lens 123 is limited by hardware. Fig. 3 shows a schematic view of a lens stroke of an embodiment of the present disclosure. The lens is configured in a central position by default, and when the lens is driven to move, the linear stroke occupies only a part of the full stroke, and the rest is the nonlinear stroke.
The exemplary photographing control scheme of the present disclosure has been described above using the analysis of the first image and the second image as an example; it should be understood, however, that in a scene where images are continuously acquired, the photographing control process described above may be performed continuously. The frequency at which the movement of the lens 123 is controlled is generally high; during continuous line-by-line exposure and imaging, the movement of the lens 123 is continuously controlled to improve the shooting effect.
Further, the embodiment of the disclosure also provides a shooting control method. The photographing control method may be implemented by a terminal device, and specifically, each step of the photographing control method may be performed by the processing platform 11. In this case, a shooting control apparatus described below may be disposed in the processing platform 11.
Fig. 4 schematically shows a flowchart of a photographing control method of an exemplary embodiment of the present disclosure. Referring to fig. 4, the photographing control method may include the steps of:
S42, acquiring a first image shot by the camera module, and determining the coordinate position of the target object in the first image.
The embodiment of the disclosure does not limit the type of the target object, and the target object can be selected by a user according to the image content, or can be a preset specified object, such as a human face, a football, a moon and the like. In the following exemplary description, the coordinate position of the target object center point may be taken as the coordinate position of the target object. However, it will be appreciated that in embodiments where the target object is defined in a rectangular box, the length and width of the rectangular box and the coordinates of a predetermined point (e.g., vertex, center point, etc.) on the rectangular box may also be used to characterize the coordinate position of the target object.
According to some embodiments of the present disclosure, in a case where the first image is presented on a touch screen of the terminal device, the processing platform may determine the target object in response to the object selection operation, and further determine a coordinate position of the target object in the first image.
For example, the user may click on an object on the first image through the touch screen, and the processing platform determines the target object in response to the click operation of the user. If the user clicks on a football in the first image, the football may be determined as the target object. Next, the coordinates of the center of the soccer ball may be regarded as the coordinate position of the target object in the first image as referred to in the present disclosure.
According to other embodiments of the present disclosure, after the processing platform acquires the first image, the processing platform may perform image recognition on the first image, calculate a similarity between a result of the image recognition and a predefined target object, and in a case where the similarity is greater than a similarity threshold, determine the result of the image recognition as the target object, and further determine a coordinate position of the target object in the first image. The specific value of the similarity threshold is not limited in this disclosure.
Specifically, the processing platform may extract a set of feature points from the first image according to a pre-configured feature extraction algorithm, compare the feature points in the set with feature points of a pre-configured target object, determine similarity between the feature points and the pre-configured target object, and determine the target object by using the similarity, thereby determining a coordinate position of the target object in the first image.
It is understood that the first image is an image containing the target object. If a captured image does not include the target object, the photographing control scheme of the embodiment of the disclosure is suspended, and the processing platform may continuously perform the target-object confirmation process on captured images; once the target object and its coordinate position in the image are determined, the photographing control scheme of the embodiment of the disclosure continues.
S44, when the camera module shakes, a second image shot by the camera module is obtained, and the coordinate position of the target object in the second image is determined.
According to some embodiments of the present disclosure, the processing platform may determine whether the camera module shakes using the data sensed by the gyro sensor.
The gyro sensor can detect the angular velocity of the camera module in one or more directions, so that whether the camera module shakes or not can be judged according to the detected angular velocity.
Specifically, the processing platform can acquire the angular velocity sensed by the gyroscope sensor, compare the sensed angular velocity with an angular velocity threshold, and determine that the camera module shakes under the condition that the sensed angular velocity is greater than the angular velocity threshold. And under the condition that the sensed angular speed is smaller than or equal to the angular speed threshold value, the processing platform determines that the camera module does not shake. The specific value of the angular velocity threshold is not limited in this disclosure.
According to other embodiments of the present disclosure, in a continuous-shooting scene, the processing platform may determine whether the camera module shakes from the captured images themselves.
Specifically, a fixed object in the scene, such as a table, a tree, or a building, can be identified from the captured images, and the positions of the fixed object in adjacent frames can then be compared to determine whether the camera module shakes. For example, if the position of the fixed object changes between adjacent frames within a predetermined range, this indicates that the camera module has shaken.
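The adjacent-frame comparison can be sketched as below. Note the stated convention: a position change that falls within the predetermined range indicates shake (a larger change would suggest deliberate camera movement such as panning rather than jitter). The pixel range used here is an illustrative assumption.

```python
# Sketch of the image-based shake check for continuous shooting:
# compare a fixed object's positions in adjacent frames.
def shakes_between_frames(prev_pos, curr_pos, max_range=20.0):
    """prev_pos / curr_pos: (x, y) of a fixed object in adjacent frames.
    max_range: the predetermined range, in pixels (illustrative value).
    Returns True when the displacement is nonzero but within the range,
    per the convention described in the text."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    displacement = (dx * dx + dy * dy) ** 0.5
    return 0 < displacement <= max_range
```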
The process of determining the coordinate position of the target object in the second image is similar to that of the first image in step S42, and will not be described again.
S46, calculating the lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image.
When the camera module shakes, the imaging position of the target object on the photosensitive element in the camera module changes; that is, the coordinate position of the target object differs between the first image and the second image.
First, the processing platform may calculate the pixel offset of the target object from the coordinate position of the target object in the first image and the coordinate position of the target object in the second image.
If the coordinate position of the target object in the first image is denoted as src_local (x, y), the coordinate position of the target object in the second image is denoted as rotate_local (x, y), and the pixel offset is denoted as delta (x, y), the pixel offset delta (x, y) can be calculated using the following equation:
delta(x,y)=rotate_local(x,y)-src_local(x,y)
Next, the processing platform may calculate the lens offset using the pixel offset of the target object. It is understood that the lens offset includes both the direction and the magnitude of the lens movement compensation.
Specifically, the processing platform may multiply the pixel offset of the target object by a gain threshold to obtain a lens offset.
If the lens offset is denoted as delta_hall (hall_x, hall_y) and the gain threshold is denoted as pixelgain (x, y), the lens offset delta_hall (hall_x, hall_y) can be calculated using the following equation:
delta_hall(hall_x,hall_y)=delta(x,y)*pixelgain(x,y)
Using the above equation, the lens shift amount in the x direction and the lens shift amount in the y direction can each be calculated.
It should be understood that the unit of pixel offset is pixel and the unit of lens offset is code.
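The two equations of step S46 can be expressed per axis as follows; the example coordinate values and the value of the gain threshold pixelgain are illustrative assumptions.

```python
# Sketch of step S46: pixel offset (in pixels) and lens offset (in code),
# computed per axis. The gain threshold pixelgain maps pixels to motor
# code; its value below is assumed for illustration.
def pixel_offset(src_local, rotate_local):
    """delta(x, y) = rotate_local(x, y) - src_local(x, y)"""
    return (rotate_local[0] - src_local[0], rotate_local[1] - src_local[1])

def lens_offset(delta, pixelgain):
    """delta_hall(hall_x, hall_y) = delta(x, y) * pixelgain(x, y)"""
    return (delta[0] * pixelgain[0], delta[1] * pixelgain[1])

# Example: the target moved from (120, 80) in the first image to
# (125, 77) in the second; assume a gain of 4 code/pixel on each axis.
d = pixel_offset((120, 80), (125, 77))  # (5, -3) pixels
h = lens_offset(d, (4, 4))              # (20, -12) code
```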
S48, the lens offset is sent to the driving chip, so that the driving chip outputs an electric signal to the motor according to the lens offset, and the motor controls the lens to move according to the electric signal.
In an exemplary embodiment of the present disclosure, a driving chip, a motor, and a lens are configured in a camera module.
The process of controlling the lens movement is described below in one embodiment.
Taking the center of the lens at its initial position as the origin, an xy coordinate system is established on the plane where the lens is located. Suppose the current coordinate position of the lens is (-1, +5). If the lens offset calculated by the processing platform is +2 along the x axis and -3 along the y axis, the driving chip outputs an electrical signal according to the lens offset, and the motor moves the lens 2 unit lengths in the positive x direction and 3 unit lengths in the negative y direction.
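Under the coordinate convention of this embodiment, the target lens position is simply the current position plus the lens offset, as this small sketch shows.

```python
# The xy coordinate system has its origin at the lens center in the
# initial position; the new position is current position + offset.
def move_lens(current, offset):
    """current: (x, y) lens position; offset: (dx, dy) lens offset."""
    return (current[0] + offset[0], current[1] + offset[1])

# Current position (-1, +5) with offset (+2, -3): the motor moves the
# lens 2 unit lengths in the positive x direction and 3 unit lengths
# in the negative y direction, reaching (1, 2).
new_pos = move_lens((-1, 5), (2, -3))  # (1, 2)
```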
In addition, the lens shift amount may also be expressed in the form of a vector, that is, the lens shift amount may include a shift direction and a shift amount of the lens, which is not limited by the present disclosure.
Fig. 5 shows a schematic diagram of a one-shot control process of an embodiment of the present disclosure.
As shown in fig. 5, in the stationary state of the camera module, the target object may be imaged on the photosensitive element to generate the first image.
When the lens and the photosensitive element rotate as a whole, the imaging position of the target object shifts relative to its position in the stationary state; at this time, the second image is generated.
With the shooting control scheme of the embodiment of the disclosure, the lens offset can be calculated from the change in the coordinate position of the target object and used to push the lens. Thereby, the imaging of the target object on the photosensitive element is compensated back toward the position it occupied in the stationary state.
The exemplary shooting control scheme of the present disclosure has been described above in terms of a single control pass; it should be understood, however, that in a scene where images are continuously acquired, the shooting control process described above may be performed repeatedly. The lens movement is generally controlled at a relatively high frequency, with the lens adjusted continuously during line-by-line exposure and imaging to improve the shooting effect.
It should be noted that although the steps of the methods in the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
Further, the embodiment of the disclosure also provides a shooting control device.
Fig. 6 schematically shows a block diagram of the photographing control apparatus 6 of the exemplary embodiment of the present disclosure. Referring to fig. 6, the photographing control apparatus 6 according to an exemplary embodiment of the present disclosure may include a first position determining module 61, a second position determining module 63, an offset amount calculating module 65, and a lens control module 67.
Specifically, the first position determining module 61 may be configured to obtain a first image captured by the camera module, and determine a coordinate position of the target object in the first image; the second position determining module 63 may be configured to obtain a second image captured by the camera module when the camera module shakes, and determine a coordinate position of the target object in the second image; the offset calculation module 65 may be configured to calculate a lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image; the lens control module 67 may be configured to send the lens offset to the driving chip so that the driving chip outputs an electrical signal to the motor according to the lens offset, and the motor controls the lens to move according to the electrical signal. The driving chip, the motor and the lens are arranged in the camera module.
Since each functional module of the photographing control apparatus according to the embodiment of the present disclosure is the same as that in the above-described system and method embodiments, the description thereof will not be repeated here.
Fig. 7 shows a schematic diagram of an electronic device suitable for implementing exemplary embodiments of the present disclosure. The terminal device of the exemplary embodiments of the present disclosure may be configured in the form shown in fig. 7. It should be noted that the electronic device shown in fig. 7 is only an example and should not impose any limitation on the functions and the application scope of the embodiments of the present disclosure.
The electronic device of the present disclosure includes at least a processor and a memory for storing one or more programs, which when executed by the processor, enable the processor to implement the photographing control method of the exemplary embodiments of the present disclosure.
Specifically, as shown in fig. 7, the electronic device 70 may include: processor 710, internal memory 721, external memory interface 722, universal serial bus (Universal Serial Bus, USB) interface 730, charge management module 740, power management module 741, battery 742, antenna 1, antenna 2, mobile communication module 750, wireless communication module 760, audio module 770, sensor module 780, display 790, camera module 791, indicator 792, motor 793, keys 794, and subscriber identity module (Subscriber Identification Module, SIM) card interface 795, among others. The sensor module 780 may include a depth sensor, a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It is to be understood that the illustrated structure of the presently disclosed embodiments does not constitute a particular limitation of the electronic device 70. In other embodiments of the present disclosure, the electronic device 70 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 710 may include one or more processing units. For example, the processor 710 may include an application processor (Application Processor, AP), a modem processor, a graphics processing unit (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural-network processing unit (Neural-network Processing Unit, NPU). The different processing units may be separate devices or may be integrated in one or more processors. In addition, a memory may be provided in the processor 710 for storing instructions and data.
The electronic device 70 may implement a shooting function through the ISP, the camera module 791, the video codec, the GPU, the display 790, the application processor, and the like. In some embodiments, the electronic device 70 may include 1 or N camera modules 791, where N is a positive integer greater than 1. If the electronic device 70 includes N camera modules, one of them serves as the master camera.
Internal memory 721 may be used to store computer-executable program code, including instructions. The internal memory 721 may include a storage program area and a storage data area. External memory interface 722 may be used to interface an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of electronic device 70.
The present disclosure also provides a computer-readable storage medium that may be included in the electronic device described in the above embodiments; or may exist alone without being incorporated into the electronic device.
The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by such an electronic device, cause the electronic device to implement the methods described in the embodiments of the present disclosure.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (7)

1. The shooting control system is characterized by comprising a processing platform and a camera module, wherein the camera module comprises a driving chip, a motor and a lens;
the processing platform is used for acquiring a first image shot by the camera module, determining the coordinate position of a target object in the first image, acquiring a second image shot by the camera module when the camera module shakes, determining the coordinate position of the target object in the second image, and calculating the lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image; in a scene of continuous shooting, if the position change of a fixed object in adjacent frames is within a preset range, the camera module shakes;
The driving chip is used for receiving the lens offset sent by the processing platform and outputting an electric signal according to the lens offset;
The motor is used for receiving the electric signal output by the driving chip and controlling the lens to move according to the electric signal;
Wherein the process of the processing platform determining the coordinate position of the target object in the first image is configured to:
Under the condition that the first image is displayed on the touch screen, responding to an object selection operation, determining a target object, and determining the coordinate position of the target object in the first image; or
And carrying out image recognition on the first image, calculating the similarity between the image recognition result and a target object, determining the image recognition result as the target object under the condition that the similarity is larger than a similarity threshold value, and determining the coordinate position of the target object in the first image.
2. The photographing control system of claim 1, wherein the process of the processing platform calculating the lens offset from the coordinate position of the target object in the first image and the coordinate position of the target object in the second image is configured to: and calculating the pixel offset of the target object according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image, and calculating the lens offset by using the pixel offset of the target object.
3. The photographing control system of claim 2, wherein the process of the processing platform calculating the lens offset using the pixel offset of the target object is configured to: and multiplying the pixel offset of the target object by a gain threshold to obtain the lens offset.
4. A photographing control method, characterized by comprising:
acquiring a first image shot by a camera module, and determining the coordinate position of a target object in the first image;
when the camera module shakes, acquiring a second image shot by the camera module, and determining the coordinate position of the target object in the second image; in a scene of continuous shooting, if the position change of a fixed object in adjacent frames is within a preset range, the camera module shakes;
calculating a lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image;
The lens offset is sent to a driving chip, so that the driving chip outputs an electric signal to a motor according to the lens offset, the motor controls the lens to move according to the electric signal, and the driving chip, the motor and the lens are configured in the camera module;
wherein the process of determining the coordinate position of the target object in the first image is configured to:
Under the condition that the first image is displayed on the touch screen, responding to an object selection operation, determining a target object, and determining the coordinate position of the target object in the first image; or
And carrying out image recognition on the first image, calculating the similarity between the image recognition result and a target object, determining the image recognition result as the target object under the condition that the similarity is larger than a similarity threshold value, and determining the coordinate position of the target object in the first image.
5. A photographing control apparatus, comprising:
the first position determining module is used for acquiring a first image shot by the camera module and determining the coordinate position of a target object in the first image;
The second position determining module is used for acquiring a second image shot by the camera module when the camera module shakes, and determining the coordinate position of the target object in the second image; in a scene of continuous shooting, if the position change of a fixed object in adjacent frames is within a preset range, the camera module shakes;
The offset calculation module is used for calculating the lens offset according to the coordinate position of the target object in the first image and the coordinate position of the target object in the second image;
The lens control module is used for sending the lens offset to the driving chip so that the driving chip outputs an electric signal to the motor according to the lens offset, the motor controls the lens to move according to the electric signal, and the driving chip, the motor and the lens are configured in the camera module;
wherein the process of the first position determination module determining the coordinate position of the target object in the first image is configured to:
Under the condition that the first image is displayed on the touch screen, responding to an object selection operation, determining a target object, and determining the coordinate position of the target object in the first image; or
And carrying out image recognition on the first image, calculating the similarity between the image recognition result and a target object, determining the image recognition result as the target object under the condition that the similarity is larger than a similarity threshold value, and determining the coordinate position of the target object in the first image.
6. A computer-readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the photographing control method according to claim 4.
7. An electronic device, comprising:
A processor;
A memory for storing one or more programs that, when executed by the processor, cause the processor to implement the photographing control method of claim 4.
CN202210223652.6A 2022-03-07 2022-03-07 Shooting control system, shooting control method and device, storage medium and electronic equipment Active CN114567727B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210223652.6A CN114567727B (en) 2022-03-07 2022-03-07 Shooting control system, shooting control method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210223652.6A CN114567727B (en) 2022-03-07 2022-03-07 Shooting control system, shooting control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN114567727A CN114567727A (en) 2022-05-31
CN114567727B true CN114567727B (en) 2024-07-05

Family

ID=81717028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210223652.6A Active CN114567727B (en) 2022-03-07 2022-03-07 Shooting control system, shooting control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114567727B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115278062A (en) * 2022-07-05 2022-11-01 Oppo广东移动通信有限公司 Control method, control device, terminal, and computer-readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003101863A (en) * 2001-09-25 2003-04-04 Fuji Photo Optical Co Ltd Image jitter prevention device
CN110012224A (en) * 2019-03-26 2019-07-12 Oppo广东移动通信有限公司 Camera stabilization system, method, electronic equipment and computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010100677A1 (en) * 2009-03-05 2010-09-10 富士通株式会社 Image processing device and shake amount calculation method
CN106161930A (en) * 2016-06-27 2016-11-23 乐视控股(北京)有限公司 Camera control method and device
CN110233969B (en) * 2019-06-26 2021-03-30 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN111246089B (en) * 2020-01-14 2021-09-28 Oppo广东移动通信有限公司 Jitter compensation method and apparatus, electronic device, computer-readable storage medium

Also Published As

Publication number Publication date
CN114567727A (en) 2022-05-31

Similar Documents

Publication Publication Date Title
CN111641835B (en) Video processing method, video processing device and electronic equipment
CN110035228B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
CN113454982B (en) Electronic device for stabilizing image and method of operating the same
US9479709B2 (en) Method and apparatus for long term image exposure with image stabilization on a mobile device
CN109951638B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
KR20200048609A (en) Method for processing image using artificial neural network and electronic device for supporting the same
KR20130037746A (en) Photographing apparatus, motion estimation apparatus, method for image compensation, method for motion estimation, computer-readable recording medium
CN112414400B (en) Information processing method and device, electronic equipment and storage medium
CN112840634A (en) Electronic device and method for obtaining image
CN115701125B (en) Image anti-shake method and electronic equipment
US20220360714A1 (en) Camera movement control method and device
CN114567727B (en) Shooting control system, shooting control method and device, storage medium and electronic equipment
KR20150085919A (en) Method for processing image and electronic device thereof
CN116437206A (en) Anti-shake method, apparatus, electronic device, and computer-readable storage medium
US11159725B2 (en) Image processing apparatus, image processing method, and recording medium
WO2020190008A1 (en) Electronic device for auto focusing function and operating method thereof
CN116012262B (en) Image processing method, model training method and electronic equipment
CN114390186A (en) Video shooting method and electronic equipment
EP4395357A1 (en) Electronic device including camera, and moving image generating method for photographing moving object
JP2020136774A (en) Image processing apparatus for detecting motion vector, control method of the same, and program
CN115134532A (en) Image processing method, image processing device, storage medium and electronic equipment
CN113362260A (en) Image optimization method and device, storage medium and electronic equipment
CN115086541A (en) Shooting position determining method, device, equipment and medium
CN111757005A (en) Shooting control method and device, computer readable medium and electronic equipment
CN115022540B (en) Anti-shake control method, device and system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant