CN115278056A - Shooting method, shooting device, electronic equipment and medium


Info

Publication number
CN115278056A
CN115278056A
Authority
CN
China
Prior art keywords
image sensor
image data
information
image
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210730500.5A
Other languages
Chinese (zh)
Inventor
裴珺
吴旭邦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd


Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses a shooting method, a shooting device, an electronic device and a medium, and belongs to the technical field of shooting. The method comprises the following steps: in the case of controlling an image sensor to perform exposure, acquiring motion information of a photographic subject through sensing pixels of the image sensor, wherein the motion information is information of movement of the photographic subject within a field of view of the image sensor; determining position compensation information of the image sensor based on the motion information; controlling the image sensor to move based on the position compensation information, so that the image sensor is stationary relative to the photographic subject during exposure; and outputting first image data after the exposure is completed.

Description

Shooting method, shooting device, electronic equipment and medium
Technical Field
The application belongs to the technical field of shooting, and particularly relates to a shooting method, a shooting device, an electronic device and a medium.
Background
In general, when a user photographs a photographic subject using an electronic device, the electronic device may control a rolling shutter to expose the rows of pixels of an image sensor line by line, so that a high-resolution image of the photographic subject can be obtained.
However, the photographic subject may be moving, and the exposure time for the pixels corresponding to a moving subject is relatively long, so the moving subject may appear deformed in the captured image.
As a result, the shooting effect of the electronic device is poor.
Disclosure of Invention
An object of the embodiments of the present application is to provide a shooting method, an apparatus, an electronic device, and a medium, which can reduce deformation of an image area where a moving object is located when the electronic device exposes a portion of pixels corresponding to the moving object, so as to improve a shooting effect of the electronic device.
In a first aspect, an embodiment of the present application provides a shooting method applied to a shooting device including an image sensor. The method includes: in the case of controlling the image sensor to perform exposure, acquiring motion information of a photographic subject through sensing pixels of the image sensor, wherein the motion information is information of movement of the photographic subject within a field of view of the image sensor; determining position compensation information of the image sensor based on the motion information; controlling the image sensor to move based on the position compensation information, so that the image sensor is stationary with respect to the photographic subject during the exposure; and outputting first image data after the exposure is completed.
In a second aspect, an embodiment of the present application provides a shooting device, which includes an image sensor and further includes an acquisition module, a determination module, a control module and a processing module. The acquisition module is configured to acquire motion information of a photographic subject through the sensing pixels of the image sensor in the case of controlling the image sensor to perform exposure, wherein the motion information is information of movement of the photographic subject within the field of view of the image sensor. The determination module is configured to determine position compensation information of the image sensor based on the motion information acquired by the acquisition module. The control module is configured to control the image sensor to move based on the position compensation information determined by the determination module, so that the image sensor is stationary relative to the photographic subject during exposure. The processing module is configured to output first image data after the exposure is completed.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the steps of the method according to the first aspect.
In the embodiment of the present application, in the case of controlling an image sensor of a photographing device to perform exposure, the photographing device may acquire motion information of a photographic subject (i.e., information that the photographic subject moves within a field of view of the image sensor) through a sensing pixel of the image sensor, then determine position compensation information of the image sensor based on the motion information, and further control the image sensor to move according to the position compensation information, so that the image sensor is stationary with respect to the photographic subject during the exposure, and output first image data after the exposure is completed. The shooting device can firstly acquire the motion information of the motion of the shooting object in the field of view of the image sensor, and then control the image sensor of the shooting device to move based on the position compensation information determined by the motion information, so that the image sensor is static relative to the shooting object in the exposure process. Therefore, the relative position change between the shooting object and the image sensor can be reduced, and the phenomenon that the image area where the moving object is located is deformed can be reduced, so that the shooting effect of the shooting device can be improved.
Drawings
Fig. 1 is a schematic flowchart of a shooting method provided in an embodiment of the present application;
fig. 2 is a second schematic flowchart of a shooting method provided in an embodiment of the present application;
fig. 3 is a third schematic flowchart of a shooting method according to an embodiment of the present application;
fig. 4 is a circuit connection diagram of a sensing pixel according to an embodiment of the present application;
fig. 5 is a fourth schematic flowchart of a shooting method provided in the embodiment of the present application;
fig. 6 is a fifth schematic flowchart of a shooting method provided in the embodiment of the present application;
fig. 7 is a schematic structural diagram of a shooting device provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 9 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The shooting method, the shooting device, the electronic device, and the shooting medium provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 shows a flowchart of a shooting method provided in an embodiment of the present application. As shown in fig. 1, the photographing method provided by the embodiment of the present application may include steps 101 to 104 described below.
Step 101, when the shooting device controls the image sensor to expose, the shooting device acquires the motion information of the shooting object through the sensing pixels of the image sensor.
In the embodiment of the application, a rolling shutter can be arranged in the shooting device.
Optionally, in this embodiment of the application, when the user triggers the shooting device to start the shooting class application, the shooting device may display a preview interface of the shooting class application, start its camera, and control the rolling shutter to expose the image sensor (sensor).
Alternatively, in the embodiment of the present application, the photographic subject may be a moving person, an animal, an object, or the like.
In the embodiment of the present application, the motion information is information of a motion of a photographic subject in a field of view of an image sensor.
Optionally, in an embodiment of the present application, the motion information includes at least one of: speed (e.g., speed of movement in the embodiments described below), direction (e.g., direction of movement in the embodiments described below).
In one example, the photographing apparatus may directly acquire motion information of the photographing object through a motion sensor. Wherein the motion sensor may be any one of: infrared sensors, laser sensors, dynamic vision sensors, and the like.
In another example, the photographing apparatus may acquire at least one preview screen while controlling the image sensor to perform exposure, and compare the preview screen to determine motion information of the photographing object.
In still another example, at least two high-sensitivity pixels (for example, pixels provided with a white filter, or the sensing pixels in the following embodiments) may be provided in the image sensor of the photographing device. In the case where the photographing device controls the image sensor to perform exposure, the at least two high-sensitivity pixels may detect the contour of the photographic subject. When some of the high-sensitivity pixels detect the contour, those pixels may transmit their position information and the corresponding time information to the photographing device, so that the photographing device may derive the motion information from this position information and time information.
Step 102, the shooting device determines position compensation information of the image sensor based on the motion information.
It is understood that the position compensation information indicates the target position to which the image sensor of the photographing device should move next.
Optionally, in this embodiment of the application, in a case that the motion information includes a speed and a direction, the shooting device may first acquire a reading time of each row of pixels of the image sensor, and then determine one target position according to the speed, the direction and the reading time of each row of pixels, so as to determine a plurality of target positions, and obtain the position compensation information.
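The computation described above can be sketched as follows. The function name, the pixel-based units, and the representation of the direction as an angle are illustrative assumptions, not details from the patent.

```python
import math

def position_compensation(speed, direction, row_read_times):
    """For each row readout time, compute the target offset that cancels the
    subject's displacement: moving the sensor by the same amount the subject
    moves keeps the two stationary relative to each other.

    speed          -- subject speed in pixels per second (assumed unit)
    direction      -- direction of motion in radians, relative to the x-axis
    row_read_times -- readout time of each pixel row, seconds since exposure start
    """
    targets = []
    for t in row_read_times:
        # displacement of the subject since exposure began, resolved on x/y
        dx = speed * math.cos(direction) * t
        dy = speed * math.sin(direction) * t
        targets.append((dx, dy))
    return targets
```

Each entry of the returned list is one target position, and the list as a whole plays the role of the position compensation information.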
Step 103, the shooting device controls the image sensor to move based on the position compensation information, so that the image sensor is stationary relative to the photographic subject during exposure.
Alternatively, in this embodiment of the application, after the image sensor is controlled by the photographing device to move according to the position compensation information, the photographing device may perform the above-mentioned steps 101 to 103 again, that is, obtain the motion information of the photographing object again, and determine one piece of position compensation information again based on the motion information obtained again, so that the photographing device may control the image sensor to move again according to the one piece of position compensation information.
Further alternatively, the photographing device may expose the first row of pixels of the photosensitive area of the image sensor and read that row after its exposure ends; while reading the first row, the photographing device may control the image sensor to move according to the determined position compensation information. The photographing device may then acquire the motion information of the photographic subject again, determine position compensation information again based on it, expose the second row of pixels, and read that row after its exposure ends, moving the image sensor according to the newly determined position compensation information while the second row is read, and so on.
It is understood that the position of the image sensor with respect to the photographic subject does not change when different rows of pixels are exposed.
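The row-by-row procedure of steps 101 to 104 can be sketched as the loop below. The sensor interface (`num_rows`, `expose_row`, `read_row`, `move`) is a placeholder invented for illustration, not an API from the patent.

```python
def rolling_shutter_capture(sensor, estimate_motion, compute_compensation):
    """Per-row capture loop: after each row is read, re-estimate the subject's
    motion and move the sensor so it stays stationary relative to the subject."""
    rows = []
    for row_index in range(sensor.num_rows):
        sensor.expose_row(row_index)             # expose the current row
        rows.append(sensor.read_row(row_index))  # read it after exposure ends
        motion = estimate_motion(sensor)         # step 101: re-acquire motion info
        offset = compute_compensation(motion)    # step 102: position compensation
        sensor.move(offset)                      # step 103: move during readout
    return rows                                  # step 104: rows form the first image data
```

The key design point, as described above, is that the compensation is recomputed per row, so the sensor tracks the subject even if its speed or direction changes mid-exposure.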
Step 104, after the exposure is finished, the shooting device outputs the first image data.
Alternatively, in this embodiment of the application, the photographing device may read each row of pixels of the entire photosensitive area through the image sensor to obtain the first image data, so that the photographing device may process the first image data through the image signal processor to obtain the photographed image.
After obtaining the shot image, the shooting device may display the shot image in a preview interface of the shooting class application, or the shooting device may directly store the shot image in a preset storage area (e.g., a storage area corresponding to an "album").
Optionally, in this embodiment, with reference to fig. 1, as shown in fig. 2, after the step 104, the shooting method provided in this embodiment may further include the following steps 201 and 202.
Step 201, the shooting device controls the image sensor to perform exposure and outputs second image data.
Optionally, in this embodiment of the application, the photographing device may read each row of pixels of the entire photosensitive area of the image sensor without moving the image sensor, so as to obtain the second image data, and may then process the second image data through the image signal processor to obtain a photographed image.
Step 202, the shooting device generates a target image according to the first image data and the second image data.
In an embodiment of the present application, the first image data is the image data obtained by imaging the photographic subject after the image sensor has moved according to the position compensation information during exposure, and the second image data is the image data obtained by imaging the photographic subject while the image sensor of the photographing device remains stationary during exposure.
It can be understood that the first image data corresponds to an image in which the deformation of the photographic subject is compensated but the background is displaced, while the second image data corresponds to an image in which the deformation of the photographic subject is not compensated and the background is not displaced.
Optionally, in this embodiment of the application, the shooting device may perform fusion processing on the first image data and the second image data to obtain target image data, and then process the target image data through the image signal processor, and compress the target image data to generate the target image.
Optionally, in this embodiment of the application, after obtaining the target image, the shooting device may display the target image in a preview interface of the shooting application, or the shooting device may directly store the target image in a preset storage area (for example, a storage area corresponding to an "album").
Therefore, the shooting device can fuse the first image data and the second image data, so that the phenomenon that a moving object in the target image is deformed can be reduced.
Optionally, in this embodiment of the application, with reference to fig. 1, as shown in fig. 3, the step 202 described above may be specifically implemented by the following steps 202a to 202c.
Step 202a, the shooting device obtains foreground image data corresponding to the photographic subject based on the first image data.
In this embodiment, the foreground image data is the image data corresponding to the photographic subject.
Further optionally, in this embodiment of the application, the shooting device may determine the image area corresponding to the photographic subject (i.e., the foreground image area) from the first image data, and then obtain the foreground image data corresponding to that area from the first image data.
In one example, the shooting device may obtain at least one further piece of image data and compare it with the first image data to determine the foreground image area, thereby determining the foreground image data corresponding to it.
Step 202b, the shooting device obtains background image data corresponding to the background based on the second image data.
In an embodiment of the present application, the background image data is the image data corresponding to the image area other than the image area corresponding to the photographic subject.
Further optionally, in this embodiment of the application, the shooting device may determine a first image area from the second image data (the first image area being the image area corresponding to the photographic subject), then determine the background image area according to the first image area, and thereby obtain the background image data corresponding to the background image area from the second image data.
It should be noted that, for the description of how the shooting device obtains the background image data corresponding to the background, reference may be made to the description of how the shooting device obtains the foreground image data corresponding to the photographic subject in the foregoing embodiment, and details are not repeated here.
Step 202c, the shooting device performs image fusion on the foreground image data and the background image data to generate the target image.
Therefore, the shooting device can obtain, from the first image data, the foreground image data of the moving subject whose deformation has been compensated, obtain, from the second image data, the background image data of the background that is not displaced, and then perform image fusion on the foreground image data and the background image data to generate the target image. In this way, deformation of the subject in the target image can be avoided.
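A minimal sketch of the per-pixel fusion described above, using plain Python lists for single-channel images; the function name and the 0/1 mask representation of the foreground area are assumptions for illustration.

```python
def fuse_images(first_image, second_image, foreground_mask):
    """Compose the target image pixel by pixel: subject pixels come from the
    motion-compensated first image, background pixels from the stationary
    second image. All three inputs are 2-D lists of the same size, and the
    mask holds 1 where the photographic subject is and 0 elsewhere."""
    return [
        [f if m else s for f, m, s in zip(frow, mrow, srow)]
        for frow, mrow, srow in zip(first_image, foreground_mask, second_image)
    ]
```

In practice the foreground mask would come from the subject-area determination of the first image data, and the fusion would typically blend the mask boundary rather than switch hard between the two sources.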
According to the shooting method provided by the embodiment of the application, under the condition that the image sensor of the shooting device is controlled to perform exposure, the shooting device can acquire the motion information of a shooting object (namely the information that the shooting object moves in the field of view of the image sensor) through the sensing pixels of the image sensor, then determine the position compensation information of the image sensor based on the motion information, and further control the image sensor to move according to the position compensation information, so that the image sensor is static relative to the shooting object in the exposure process, and after the exposure is completed, first image data is output. The shooting device can firstly acquire the motion information of the shooting object moving in the field of view of the image sensor and then control the image sensor to move based on the position compensation information determined by the motion information, so that the image sensor is static relative to the shooting object in the exposure process. Therefore, the relative position change between the photographic subject and the image sensor can be reduced, and therefore, the deformation of the image area where the moving subject is located can be reduced, and thus, the photographic effect of the photographic device can be improved.
How the image sensor of the photographing device acquires the motion information will be described below, taking as an example an image sensor in which at least two high-sensitivity pixels (i.e., the sensing pixels) are disposed.
Fig. 4 shows a schematic circuit diagram of a sensing pixel. As shown in fig. 4, a sensing pixel circuit may include: a first current amplification module 10, a second current amplification module 11, an analog-to-digital conversion module 12, a logic judgment module 13 and a signal control module 14.
The first current amplification module 10 includes: a photodiode PD1, a first end of the photodiode PD1 being grounded, a second end of the photodiode PD1 being connected to a first end of a first capacitor C1; an NPN-type transistor 15, a base of the NPN-type transistor 15 being connected to the second end of the first capacitor C1, the base being further connected to a first power source V0 through a first resistor R1, a collector of the NPN-type transistor 15 being connected to a first end of a second capacitor C2, the collector being further connected to the first power source V0 through a second resistor R2, and an emitter of the NPN-type transistor 15 being grounded; a second end of the second capacitor C2 is grounded through a third resistor R3 and is further connected to a first switch tube T1.
The analog-to-digital conversion module 12 includes an analog-to-digital converter (ADC) 16; a first end of the ADC 16 is connected to the second end of the second capacitor C2, and a second end of the ADC 16 is connected to a first end of the logic judgment module 13;
a first end of the signal control module 14 is connected with a second end of the logic judgment module 13, and a second end of the signal control module 14 is connected with the first switch tube T1;
The second current amplification module 11 includes: a photodiode PD2, a first end of the photodiode PD2 being grounded, a second end of the photodiode PD2 being connected to a first end of a third capacitor C3; an NPN-type transistor 17, a base of the NPN-type transistor 17 being connected to the second end of the third capacitor C3, the base being further connected to a second power source V1 through a fourth resistor R4, a collector of the NPN-type transistor 17 being connected to a first end of a fourth capacitor C4, the collector being further connected to the second power source V1 through a fifth resistor R5, and an emitter of the NPN-type transistor 17 being grounded; a second end of the fourth capacitor C4 is grounded through a sixth resistor R6 and is further connected to a second switch tube T2; the second end of the fourth capacitor C4 is further connected to the analog-to-digital converter ADC, and the second end of the signal control module 14 is further connected to the second switch tube T2.
The first current amplification module 10 is configured to convert an optical signal into a current signal by using the photodiode PD1, amplify the current signal by using the resistor R1, the resistor R2 and the transistor 15, convert the amplified current signal into a voltage signal 1 by using the resistor R3, and output the voltage signal 1 to the analog-to-digital conversion module 12.
In the analog-to-digital conversion module 12, V_ADC and Vref are the operating voltage and the reference voltage of the analog-to-digital converter (ADC), respectively. The analog-to-digital conversion module is used for converting analog quantities such as voltage and current into digital quantities to facilitate subsequent processing. After the voltage signal 1 output by the first current amplification module 10 is input into the analog-to-digital conversion module 12, it is compared with the reference voltage Vref and converted into a photosensitive digital signal 1', and the photosensitive digital signal 1' is output to the logic judgment module 13.
In the logic judgment module 13, VH and VL are the maximum value and the minimum value of the photosensitive digital signal of the photodiode PD1 at the previous moment; that is, VH and VL indicate the voltage range in which the photosensitive digital signal of the photodiode PD1 was located at the previous moment. The logic judgment module 13 compares the photosensitive digital signal 1' with VH and VL. If the photosensitive digital signal 1' is outside the range between VL and VH, the logic judgment module 13 may output a request signal to the signal control module 14, where the request signal is used for requesting to output the voltage signal output by the first current amplification module 10, i.e., the voltage signal 1.
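The comparison performed by the logic judgment module can be sketched as a single range check; the function name is an assumption for illustration.

```python
def readout_requested(sample, v_low, v_high):
    """Sketch of the logic judgment: a readout request is raised only when
    the newly digitized sample leaves the range [VL, VH] recorded for the
    pixel at the previous moment; samples inside the range are ignored."""
    return sample < v_low or sample > v_high
```

This is the event-driven behavior that lets each sensing pixel stay silent until the local brightness actually changes.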
The signal control module 14, after receiving the request signal output by the logic determining module 13, may arbitrate when to output the voltage signal 1, and output a control signal to the multiplexing switch module, where the control signal is used to control the output of the voltage signal 1 through a first path, where the first path includes: the first switch tube T1, the first analog signal output module and the multiplexing switch module are connected in sequence.
It can be understood that the functions of the components of the second current amplification module 11 are similar to those of the first current amplification module 10, and the functions of the second analog signal output module and the second switch tube T2 are similar to those of the first analog signal output module and the first switch tube T1, and therefore are not repeated here.
The second analog signal output module and the second switch tube T2 form a second path for outputting a voltage signal (analog) of the second current amplifying module 11.
It should be noted that, in practical implementation, the signal control module 14 may control the voltage signal 1 of the first current amplification module 10 and the voltage signal 2 of the second current amplification module 11 to be output simultaneously, and control the multiplexing switch module to process the two voltage signals (i.e., the voltage signal 1 and the voltage signal 2), so that the multiplexing switch module finally outputs the voltage signal 1, the voltage signal 2, and the sum of the voltage signal 1 and the voltage signal 2.
Therefore, the voltage signal 1 and the voltage signal 2 can be used for determining the phase difference and measuring distance or speed, while the sum of the voltage signal 1 and the voltage signal 2 is the photosensitive signal value of the sensing pixel (e.g., the luminance information captured by the sensing pixel) used for subsequent picture processing.
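The three multiplexed outputs described above can be summarized as follows; the function name and dictionary keys are assumptions introduced for illustration.

```python
def pixel_outputs(voltage_1, voltage_2):
    """Sketch of the sensing pixel's multiplexed outputs: the individual
    signals support phase-difference ranging or speed measurement, and
    their sum is the pixel's photosensitive value for picture processing."""
    return {
        "phase_pair": (voltage_1, voltage_2),
        "photosensitive_value": voltage_1 + voltage_2,
    }
```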
In the embodiment of the application, each of the at least two sensing pixels can be independent of the others, and each sensing pixel can sense the brightness change of the external environment in real time at its own clock frequency, so that a change in ambient brightness is converted into a change in current, and further into a change in the digital signal.
If the variation of the digital signal of some of the at least two sensing pixels exceeds a preset threshold, those sensing pixels report to the system for reading, and output a data packet carrying their position information, brightness information and time information.
It can be understood that a change in the position of the moving subject changes the local brightness, which in turn changes the digital signals of the corresponding sensing pixels; therefore, the electronic device can obtain the contour of the moving subject from the digital signal changes and the coordinate information of those sensing pixels, and thereby determine the motion information of the moving subject within the field of view of the image sensor.
Optionally, in this embodiment of the application, the at least two sensing pixels are distributed in the image sensor with a predetermined density, and may be arranged in the image sensor of the photographing device in an array.
Further optionally, in this embodiment of the application, the size and density of the at least two sensing pixels may be flexibly adjusted according to the actual application scenario, which is not limited in this embodiment of the application.
Optionally, in this embodiment of the application, the coordinate information may be a coordinate position in a target coordinate system, where the target coordinate system is a rectangular coordinate system whose origin is a specific pixel of the image sensor. The specific pixel may be either a pixel located at the center of the image sensor or a pixel located at an edge of the image sensor.
Optionally, in this embodiment of the present application, the motion information includes the movement speed and movement direction of the photographic subject. Specifically, with reference to fig. 1 and as shown in fig. 5, step 101 may be implemented by the following steps 101a to 101b.
In step 101a, the photographing device determines N pieces of time information and N pieces of position information according to the luminance change information of the real-sensing pixels in the image sensor.
In this embodiment of the application, each of the N pieces of time information corresponds to one of the N pieces of location information, and N is a positive integer greater than 1.
Further optionally, in this embodiment of the application, each of the N pieces of time information is the time at which the real-sensing pixels detected the contour of the photographic subject, and each of the N pieces of position information is the position at which the contour was detected at the corresponding time; that is, each piece of position information is the position of the contour of the photographic subject in the field of view of the image sensor.
Further alternatively, in this embodiment of the application, the photographing device may select N real-sensing pixels from the real-sensing pixels in the image sensor according to the luminance change information, and then receive the N pieces of time information and the N pieces of position information reported by those N pixels.
In step 101b, the photographing device determines the movement speed and the movement direction based on the N pieces of time information and the N pieces of position information, respectively.
Further optionally, in this embodiment of the application, the shooting device may determine the difference between the maximum and the minimum of the N pieces of time information, and treat this difference as the movement time of the moving object.
Further optionally, in this embodiment of the application, the shooting device may determine the position information corresponding to the maximum time information and the position information corresponding to the minimum time information, and take the distance between these two positions as the movement distance of the moving object. The shooting device may then determine the ratio of the movement distance to the movement time as the movement speed of the moving object.
Further optionally, in this embodiment of the application, the shooting device may compare, in the target coordinate system, the coordinate position corresponding to the maximum time information with the coordinate position corresponding to the minimum time information, and thereby determine the movement direction of the moving object.
In this way, because the real-sensing pixels respond in real time better than conventional pixels, with less signal redundancy and higher precision, the image sensor can capture the contour of the moving object with high precision through the real-sensing pixels and locate it accurately. The motion information of the moving object can therefore be obtained precisely from the change of its position across successive contour maps and the corresponding image acquisition times.
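Steps 101a and 101b can be sketched as follows. Using the minimum- and maximum-time samples, the straight-line distance between their positions, and an angle for the direction are assumptions consistent with the scheme above, not a definitive implementation:

```python
import math

def motion_from_samples(samples):
    """Estimate movement speed and direction from N (time, (x, y)) pairs
    reported by the real-sensing pixels, N > 1.

    The samples with the minimum and maximum time stamps bound the movement
    time; the distance between their positions is the movement distance;
    speed is distance over time, and the displacement vector gives the
    direction, returned here as an angle in the target coordinate system.
    """
    t_min, (x0, y0) = min(samples, key=lambda s: s[0])
    t_max, (x1, y1) = max(samples, key=lambda s: s[0])
    movement_time = t_max - t_min
    dx, dy = x1 - x0, y1 - y0
    movement_distance = math.hypot(dx, dy)
    speed = movement_distance / movement_time
    direction = math.atan2(dy, dx)  # radians; 0 means horizontally to the right
    return speed, direction
```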
Alternatively, in this embodiment of the application, with reference to fig. 1 and as shown in fig. 6, step 102 may be specifically implemented by step 102a described below, and step 103 by step 103a described below.
In step 102a, the photographing device determines X target positions of the image sensor according to the movement speed, the movement direction, and the reading time.
In the embodiment of the application, X is a positive integer.
In the embodiment of the present application, the reading time is a reading duration of a single row of pixels of the image sensor.
Further optionally, in this embodiment of the application, the shooting device may first obtain the reading time, and then determine X target distances according to the reading time and X movement speeds.
Further alternatively, in this embodiment of the present application, the shooting device may determine the X target distances as the products of the reading time and the X movement speeds.
Further optionally, in this embodiment of the application, the shooting device determines X target positions according to X movement directions and X target distances.
Further optionally, in this embodiment of the application, the shooting device may obtain the X target positions by offsetting the current position of the image sensor along the X movement directions by the X target distances.
For example, assuming that the movement speed is 0.1 mm/ms, the movement direction is horizontally to the right, and the reading time of each row of pixels of the image sensor is 3 ms, the photographing device may take the product of 0.1 mm/ms and 3 ms, i.e., 0.3 mm, as the target distance, and then take the position obtained by moving the image sensor 0.3 mm horizontally to the right from its current position as the target position.
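The example above can be written out as a small helper. The units, the 2-D position tuples, and the unit-vector encoding of the direction are assumptions made for illustration:

```python
def target_positions(current_pos, speed, direction, row_read_time, num_rows):
    """Compute one target position of the image sensor per pixel row.

    current_pos: (x, y) position of the image sensor (assumed units: mm).
    speed: movement speed (mm/ms); direction: unit vector (dx, dy).
    row_read_time: readout duration of a single pixel row (ms).
    """
    step = speed * row_read_time  # target distance per row, e.g. 0.1 mm/ms * 3 ms = 0.3 mm
    dx, dy = direction
    x, y = current_pos
    positions = []
    for _ in range(num_rows):
        x += step * dx
        y += step * dy
        positions.append((x, y))
    return positions
```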
In step 103a, the shooting device controls the image sensor to move to the X target positions in sequence.
It can be understood that the photographing device may expose one row of pixels of the image sensor; after the exposure of that row ends, the image sensor is controlled to move to the next target position within the row's reading time; the next row of pixels is then exposed, and so on, so that the image sensor moves to the X target positions in sequence. Throughout this process, the relative position of the moving object and the image sensor is unchanged while each row of pixels is being exposed.
In this way, while the shooting device exposes, row by row, the pixels corresponding to the moving object, the relative position of the moving object and the image sensor does not change, which reduces deformation of the image area where the moving object is located.
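The row-by-row flow described above can be sketched as follows; `expose_row` and `move_sensor` are hypothetical stand-ins for the hardware interfaces, which the patent does not specify:

```python
def rolling_exposure_with_compensation(num_rows, positions, expose_row, move_sensor):
    """Expose one pixel row, then move the image sensor to the next target
    position during that row's readout window, so the moving object stays
    at a fixed position relative to the sensor while each row is exposed.
    """
    for row in range(num_rows):
        expose_row(row)
        if row < len(positions):
            move_sensor(positions[row])  # shift happens within the readout time
```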
Of course, the moving object may also be displaced in the vertical direction relative to the camera.
It should be noted that the real-sensing pixels not only capture the contour information of the moving object well; because they operate at a very high frequency with very good real-time performance and are also capable of phase detection, phase information can be obtained. For example, the phase difference can be determined from voltage signal 1 and voltage signal 2, enabling distance and speed measurement, specifically the vertical distance and vertical speed of a moving object moving away from or toward the sensor. The at least two real-sensing pixels can therefore provide, in real time, both the position of the moving object in the field of view of the image sensor and the distance between the moving object and the shooting device.
In the embodiment of the application, the shooting device can obtain, through the at least two real-sensing pixels, the speed at which the moving object moves away from or toward the shooting device in the vertical direction. The shooting device can then control the image sensor to move to a plurality of target positions in sequence, adjusting the distance between the lens and the image sensor and thereby reducing deformation of the image area where the moving object is located due to its vertical motion relative to the shooting device.
The execution subject of the shooting method provided in the embodiments of this application may be a shooting device. In the embodiments of this application, a shooting device executing the shooting method is taken as an example to describe the shooting device provided herein.
Fig. 7 shows a schematic diagram of a possible structure of the shooting device according to the embodiment of the present application. As shown in fig. 7, the photographing device 60 may include: an acquisition module 61, a determination module 62, a control module 63, and a processing module 64. The acquiring module 61 is configured to acquire motion information of a photographic subject through a real sensing pixel of the image sensor when the image sensor is controlled to perform exposure, where the motion information is information of a motion of the photographic subject in a field of view of the image sensor. A determining module 62, configured to determine position compensation information of the image sensor based on the motion information acquired by the acquiring module. A control module 63 for controlling the image sensor to move based on the position compensation information determined by the determination module so that the image sensor is stationary with respect to the photographic subject during exposure. And a processing module 64, configured to output the first image data after the exposure is completed.
In one possible implementation manner, the motion information includes the movement speed and movement direction of the photographic subject. The acquisition module 61 is specifically configured to determine N pieces of time information and N pieces of position information according to the luminance change information of the real-sensing pixels in the image sensor, where each piece of time information corresponds to one piece of position information, and N is a positive integer greater than 1; and to determine the movement speed and the movement direction based on the N pieces of time information and the N pieces of position information, respectively.
In a possible implementation manner, the determining module 62 is specifically configured to determine X target positions of the image sensor according to the moving speed, the moving direction, and the reading time. The control module 63 is specifically configured to control the image sensor to sequentially move to the X target positions determined by the determining module 62. Wherein X is a positive integer.
In a possible implementation manner, the processing module 64 is further configured to control the image sensor to perform exposure and output second image data; and generating a target image according to the first image data and the second image data.
In a possible implementation manner, the processing module 64 is specifically configured to obtain foreground image data corresponding to the captured object based on the first image data; obtaining background image data corresponding to the background based on the second image data; and carrying out image fusion on the foreground image data and the background image data to generate the target image.
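A minimal sketch of this fusion step for single-channel images (the boolean foreground mask is assumed to be available already, e.g. derived from the contour captured by the real-sensing pixels; how it is obtained is not shown here):

```python
import numpy as np

def fuse_images(first_image, second_image, foreground_mask):
    """Take foreground pixels (the photographic subject) from the first,
    motion-compensated exposure and background pixels from the second
    exposure, and fuse them into the target image.

    foreground_mask: boolean array, True where the subject is.
    For multi-channel images, pass foreground_mask[..., None] instead.
    """
    return np.where(foreground_mask, first_image, second_image)
```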
According to the shooting device provided by the embodiment of the application, the shooting device can firstly acquire the motion information of the shooting object moving in the field of view of the image sensor, and then the image sensor of the shooting device is controlled to move based on the position compensation information determined by the motion information, so that the image sensor is static relative to the shooting object in the exposure process. Therefore, the relative position change between the shooting object and the image sensor can be reduced, and the phenomenon that the image area where the moving object is located is deformed can be reduced, so that the shooting effect of the shooting device can be improved.
The shooting device in the embodiment of the present application may be an electronic device, or may be a component in the electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an Ultra-Mobile Personal Computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), an automated teller machine, or a self-service machine, and the like; the embodiments of the present application are not limited in this respect.
The photographing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The shooting device provided by the embodiment of the application can realize each process realized by the method embodiments of fig. 1 to fig. 7, achieves the same technical effect, and is not repeated here to avoid repetition.
Optionally, as shown in fig. 8, an embodiment of this application further provides an electronic device 80, including a processor 81 and a memory 82, where the memory 82 stores a program or instructions executable on the processor 81. When the program or instructions are executed by the processor 81, the steps of the above shooting method embodiment are implemented with the same technical effect, which is not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing the embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 110 through a power management system, which implements charging, discharging, and power-consumption management. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, which may include more or fewer components than shown, combine certain components, or arrange components differently; details are not repeated here.
In an embodiment of the present application, the electronic device further includes an image sensor.
The processor 110, under the condition of controlling the image sensor to perform exposure, obtains motion information of a photographic subject through the real sensing pixels of the image sensor, wherein the motion information is information of the movement of the photographic subject in the field of view of the image sensor; determining position compensation information of the image sensor based on the motion information; controlling the image sensor to move based on the position compensation information so that the image sensor is stationary relative to the photographic subject during exposure; and outputting the first image data after the exposure is finished.
According to the shooting method provided by the embodiment of the application, the electronic equipment can firstly acquire the motion information of the shooting object moving in the field of view of the image sensor, and then the image sensor is controlled to move based on the position compensation information determined by the motion information, so that the image sensor is static relative to the shooting object in the exposure process. Therefore, the relative position change between the shooting object and the image sensor can be reduced, and the phenomenon that the image area where the moving object is located is deformed can be reduced, so that the shooting effect of the electronic equipment can be improved.
Optionally, in this embodiment of the application, the motion information includes a motion speed and a motion direction.
The processor 110 is specifically configured to determine N pieces of time information and N pieces of position information through the luminance change information of the real-sensing pixels in the image sensor, where each piece of time information corresponds to one piece of position information, and N is a positive integer greater than 1; and to determine the movement speed and the movement direction based on the N pieces of time information and the N pieces of position information, respectively.
Therefore, the electronic equipment can acquire N pieces of time information and N pieces of position information through the real sensing pixels in the image sensor and determine the movement speed and the movement direction, so that a high-precision contour map of the moving object can be captured, the high-precision positioning of the moving object is realized, and the movement information of the moving object is accurately obtained according to the position change of the moving object in different contour maps and the image acquisition time.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to determine X target positions of the image sensor according to the moving speed, the moving direction, and the reading time; controlling the image sensor to move to X target positions in sequence; wherein X is a positive integer.
Therefore, the electronic equipment can determine X target positions according to the motion speed and the reading time, and control the image sensor to sequentially move to the X target positions, so that when part of pixels corresponding to the moving object are exposed line by line, the relative positions of the moving object and the image sensor are unchanged, the phenomenon that an image area where the moving object is located is deformed can be reduced, and the shooting effect of the electronic equipment can be improved.
Optionally, in this embodiment of the application, the processor 110 is further configured to control the image sensor to perform exposure, and output second image data; and generating a target image from the first image data and the second image data.
As can be seen, since the electronic device can generate the target image from the first image data and the second image data, the deformation of the moving object in the captured image can be reduced, and thus, the capturing effect of the electronic device can be improved.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to obtain foreground image data corresponding to the shooting object based on the first image data; obtaining background image data corresponding to the background based on the second image data; and carrying out image fusion on the foreground image data and the background image data to generate a target image.
Therefore, the electronic equipment can obtain the foreground image data from the first image data, obtain the background image data from the second image data, and perform image fusion on the foreground image data and the background image data to generate the target image, so that the deformation of the moving object in the shot image can be reduced, and the shooting effect of the electronic equipment can be improved.
It should be understood that, in the embodiment of the present application, the input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processing unit 1041 processes image data of a still picture or a video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing a program or instructions and a second storage area storing data, where the first storage area may store an operating system, and an application program or instructions required for at least one function (such as a sound playing function, an image playing function, etc.). Further, the memory 109 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a SyncLink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 110 may include one or more processing units; optionally, the processor 110 integrates an application processor, which primarily handles operations involving the operating system, user interface, and applications, etc., and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above shooting method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing shooting method embodiments, and achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions recited, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A photographing method applied to a photographing apparatus including an image sensor, the method comprising:
under the condition of controlling the image sensor to carry out exposure, acquiring motion information of a shooting object through real sensing pixels of the image sensor, wherein the motion information is information of the motion of the shooting object in a field of view of the image sensor;
determining position compensation information of the image sensor based on the motion information;
controlling the image sensor to move based on the position compensation information so that the image sensor is stationary relative to the photographic subject during exposure;
after the exposure is completed, the first image data is output.
2. The method according to claim 1, wherein the motion information includes a motion speed and a motion direction of the photographic subject;
the acquiring motion information of the photographic object through the sensory pixels of the image sensor comprises the following steps:
determining N pieces of time information and N pieces of position information through the brightness change information of the sensory pixels in the image sensor, wherein each piece of time information corresponds to one piece of position information, and N is a positive integer greater than 1;
determining the moving speed and the moving direction based on the N pieces of time information and the N pieces of position information, respectively.
3. The method of claim 2, wherein determining the position compensation information for the image sensor based on the motion information comprises:
determining X target positions of the image sensor according to the movement speed, the movement direction and the reading time;
the controlling the image sensor to move based on the position compensation information includes:
controlling the image sensor to move to the X target positions in sequence;
wherein X is a positive integer.
4. The method of claim 1, wherein after the outputting the first image data after the exposing is completed, the method further comprises:
controlling the image sensor to perform exposure and outputting second image data;
and generating a target image according to the first image data and the second image data.
5. The method of claim 4, wherein generating a target image from the first image data and the second image data comprises:
obtaining foreground image data corresponding to the shooting object based on the first image data;
obtaining background image data corresponding to a background based on the second image data;
and carrying out image fusion on the foreground image data and the background image data to generate the target image.
6. A camera, characterized in that the camera comprises an image sensor, the camera further comprising: the device comprises an acquisition module, a determination module, a control module and a processing module;
the acquisition module is used for acquiring motion information of a shooting object through the real-sensing pixels of the image sensor under the condition of controlling the image sensor to perform exposure, wherein the motion information is information of the motion of the shooting object in a field of view of the image sensor;
the determining module is used for determining the position compensation information of the image sensor based on the motion information acquired by the acquiring module;
the control module is used for controlling the image sensor to move based on the position compensation information determined by the determination module so that the image sensor is static relative to the shooting object in the exposure process;
and the processing module is used for outputting first image data after exposure is finished.
7. The photographing apparatus according to claim 6, wherein the motion information includes a motion speed and a motion direction of the photographic subject;
the acquisition module is specifically configured to determine N pieces of time information and N pieces of position information through luminance change information of the real-sensing pixels in the image sensor, where each piece of time information corresponds to one piece of position information, and N is a positive integer greater than 1; and to determine the movement speed and the movement direction based on the N pieces of time information and the N pieces of position information, respectively.
8. The shooting device according to claim 6, wherein the determination module is specifically configured to determine X target positions of the image sensor according to the motion speed, the motion direction, and a readout time;
the control module is specifically configured to control the image sensor to move sequentially to the X target positions determined by the determination module;
wherein X is a positive integer.
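The determination step of claim 8 — X target positions computed from the motion speed, motion direction, and readout time — might look like the following sketch. The uniform subdivision of the readout interval and the function signature are assumptions, not the claimed method.

```python
import math

def target_positions(speed, direction, readout_time, x_steps):
    """Compute X intermediate sensor positions so that, at step i, the
    sensor has shifted by speed * t_i along the motion direction and
    thus tracks the subject during exposure.

    speed: subject speed in pixels/s; direction: angle in radians;
    readout_time: total interval in seconds; x_steps: X (positive int).
    Returns a list of (dx, dy) offsets in pixels.
    """
    positions = []
    for i in range(1, x_steps + 1):
        t = i * readout_time / x_steps          # uniform time subdivision
        dx = speed * t * math.cos(direction)
        dy = speed * t * math.sin(direction)
        positions.append((dx, dy))
    return positions
```

For a subject moving at 10 px/s along the x-axis over a 1 s readout split into 4 steps, the sensor would visit offsets 2.5, 5, 7.5, and 10 px.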
9. The shooting device according to claim 6, wherein the processing module is further configured to control the image sensor to perform exposure and output second image data, and to generate a target image from the first image data and the second image data.
10. The shooting device according to claim 9, wherein the processing module is specifically configured to obtain foreground image data corresponding to the photographic subject based on the first image data, obtain background image data corresponding to a background based on the second image data, and fuse the foreground image data with the background image data to generate the target image.
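The fusion step of claim 10 can be illustrated with a per-pixel mask blend: where the mask marks the subject, take the pixel from the motion-compensated (first) exposure; elsewhere, take it from the ordinary (second) exposure. The binary mask and the plain nested-list image representation are assumptions of this sketch; a real implementation would operate on sensor image arrays.

```python
def fuse(foreground, background, mask):
    """Fuse a sharp-subject exposure with an ordinary exposure.

    foreground: first image data (subject sharp, background blurred);
    background: second image data (background sharp);
    mask: same-sized grid with 1 where the subject is, 0 elsewhere.
    All three are H x W nested lists of pixel values.
    """
    h, w = len(mask), len(mask[0])
    return [
        [foreground[r][c] if mask[r][c] else background[r][c]
         for c in range(w)]
        for r in range(h)
    ]
```

In practice the foreground mask would come from segmenting the subject in the first image data, which claim 10 leaves unspecified.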
11. An electronic device, characterized by comprising a processor and a memory, said memory storing a program or instructions executable on said processor, said program or instructions, when executed by said processor, implementing the steps of the shooting method according to any one of claims 1 to 5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the shooting method according to any one of claims 1 to 5.
CN202210730500.5A 2022-06-24 2022-06-24 Shooting method, shooting device, electronic equipment and medium Pending CN115278056A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210730500.5A CN115278056A (en) 2022-06-24 2022-06-24 Shooting method, shooting device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN115278056A 2022-11-01

Family

ID=83761384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210730500.5A Pending CN115278056A (en) 2022-06-24 2022-06-24 Shooting method, shooting device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN115278056A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161942A (en) * 2016-07-29 2016-11-23 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and apparatus for shooting a moving object, and mobile terminal
CN106791281A (en) * 2017-01-06 2017-05-31 Xi'an Zhongke Feitu Photoelectric Technology Co., Ltd. Image motion compensation method, image motion compensation device, and imaging device
CN108540725A (en) * 2018-05-14 2018-09-14 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Anti-shake method, electronic device, imaging system, storage medium, and computer equipment
CN115118890A (en) * 2022-06-24 2022-09-27 Vivo Mobile Communication Co., Ltd. Camera module, shooting method, shooting device, and electronic equipment
CN115278097A (en) * 2022-06-24 2022-11-01 Vivo Mobile Communication Co., Ltd. Image generation method, image generation device, electronic device, and medium
CN115278058A (en) * 2022-06-24 2022-11-01 Vivo Mobile Communication Co., Ltd. Image acquisition method and device, electronic equipment, and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination