CN109803079B - Mobile terminal, photographing method thereof and computer storage medium - Google Patents


Info

Publication number
CN109803079B
Authority
CN
China
Prior art keywords
target
shot
mobile terminal
laser
camera module
Prior art date
Legal status
Active
Application number
CN201910120670.XA
Other languages
Chinese (zh)
Other versions
CN109803079A (en)
Inventor
马美雪
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910120670.XA priority Critical patent/CN109803079B/en
Publication of CN109803079A publication Critical patent/CN109803079A/en
Application granted granted Critical
Publication of CN109803079B publication Critical patent/CN109803079B/en

Abstract

The application discloses a mobile terminal, a photographing method thereof, and a computer storage medium. The mobile terminal comprises: a camera module; a first sensing assembly for acquiring the position change of a target to be shot; and a controller for determining the motion state of the target to be shot according to the position change, and for performing motion compensation, according to that motion state, when the camera module shoots. In this way, the sharpness of a moving target can be guaranteed, improving the quality of shooting moving objects.

Description

Mobile terminal, photographing method thereof and computer storage medium
Technical Field
The present application relates to the field of mobile terminal technologies, and in particular, to a mobile terminal, a photographing method thereof, and a computer storage medium.
Background
In human-computer interaction, existing touchless (in-air) gesture operation generally relies on a camera on the mobile phone, such as a depth camera, binocular camera, or monocular camera. The user's gesture is detected by the camera, image recognition is performed with algorithms such as machine learning, and the result is compared with preset gesture images to realize the touchless gesture operation. There are also solutions that use an infrared laser emitter and determine the gesture state from the received infrared reflection.
Gesture recognition by camera or infrared device is limited by the angle of the camera or infrared receiver and only works within a restricted range; it also requires complex image algorithms, which consume considerable system resources. In addition, the power consumption of the camera and infrared emitter is high, which is unfavorable for mobile portable devices.
Disclosure of Invention
The technical scheme adopted by the application is as follows: there is provided a mobile terminal including: a camera module; the first sensing assembly is used for acquiring the position change condition of a target to be shot; and the controller is used for determining the motion state of the target to be shot according to the position change condition of the target to be shot and performing motion compensation when the camera module performs shooting according to the motion state of the target to be shot.
Another technical scheme adopted by the application is as follows: a photographing method of a mobile terminal is provided, the method comprising: acquiring the position change condition of a target to be shot; determining the motion state of the target to be shot according to the position change condition of the target to be shot; when the camera module is used for shooting, motion compensation is carried out according to the motion state of the target to be shot.
Another technical scheme adopted by the application is as follows: there is provided a computer storage medium for storing a computer program which, when executed by a processor, implements the method as described above.
The application provides a mobile terminal comprising: a camera module; a first sensing assembly for acquiring the position change of a target to be shot; and a controller for determining the motion state of the target to be shot according to the position change, and for performing motion compensation, according to that motion state, when the camera module shoots. In this way, motion compensation can be performed during shooting based on the moving speed of the object to be shot, guaranteeing the sharpness of the moving object and improving the shooting quality of moving objects.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort. Wherein:
fig. 1 is a schematic structural diagram of a first embodiment of a mobile terminal provided in the present application;
FIG. 2 is a schematic diagram of a first sensor assembly in a first embodiment of a mobile terminal provided herein;
FIG. 3 is a schematic diagram of the movement of an object to be photographed;
fig. 4 is a schematic diagram of the moving distance of the object to be photographed;
fig. 5 is a schematic structural diagram of a second embodiment of a mobile terminal provided in the present application;
fig. 6 is a schematic structural diagram of a third embodiment of a mobile terminal provided in the present application;
fig. 7 is a schematic structural diagram of a camera module in a third embodiment of the mobile terminal provided in the present application;
fig. 8 is a schematic structural diagram of a camera motor in a third embodiment of the mobile terminal provided in the present application;
fig. 9 is a flowchart illustrating a first embodiment of a photographing method of a mobile terminal according to the present application;
fig. 10 is a flowchart illustrating a second embodiment of a photographing method of a mobile terminal according to the present application;
FIG. 11 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
Referring to fig. 1, fig. 1 is a schematic structural diagram of a first embodiment of a mobile terminal provided in the present application, where the mobile terminal 10 includes a camera module 11, a controller 12, and a first sensor assembly 13.
The camera module 11 is used for shooting, which specifically includes taking photos or recording video. The controller 12 may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and is used for processing data or issuing control instructions to control the operation of other components.
The first sensor assembly 13 is configured to obtain a position of a target to be photographed and a variation of the position of the target, and the controller 12 is configured to determine a motion state of the target to be photographed according to the variation of the position of the target to be photographed, and perform motion compensation when the camera module performs photographing according to the motion state of the target to be photographed.
The motion state may include the moving direction, speed, acceleration, and so on of the object to be photographed, with speed being the main factor. It will be appreciated that a feasible way to detect the speed of the object to be photographed is to calculate it indirectly, by having an optical sensor detect the change in its position over time. A change in the object's position is detected as a change in the distance and direction between the object and the photographing device: in one case, the direction of the line connecting the object and the mobile terminal is unchanged but their distance changes; in another case, the distance is unchanged but the direction of the connecting line changes. Several ways of detecting the distance between the object to be photographed and the photographing device are described below.
Optionally, the first sensor component 13 may be a laser sensor component, comprising a laser transmitter and a laser receiver, and using TOF (time of flight) for distance detection. In TOF 3D imaging, the laser transmitter continuously sends light pulses to the object to be detected, the laser receiver receives the light returning from the object, and the object distance is obtained from the (round-trip) flight time of the light pulses.
The ToF ranging method is a two-way ranging technique; it mainly uses the flight time of a signal between two asynchronous transceivers (or reflecting surfaces) to measure the distance between nodes. Conventional ranging techniques are classified into two-way and one-way techniques. With good signal modulation or in a non-line-of-sight environment, the estimate from RSSI (Received Signal Strength Indication) based ranging is more satisfactory; in a line-of-sight environment, ToF-based ranging can make up for the shortcomings of the RSSI-based method.
Optionally, the first sensor component 13 may be a 3D structured light sensor component, and includes a projector and a camera, and the projector projects specific light information to the surface and the background of the target to be measured, and then the light information is collected by the camera. And calculating information such as the position and the depth of the object according to the change of the optical signal caused by the object, and further restoring the whole three-dimensional space to obtain the distance of the target to be measured.
Optionally, in other embodiments, the distance detection may be implemented by using an ultrasonic sensor and a laser radar, and in addition, the distance sensor may be used in combination with an inertial sensor, an accelerometer, a gyroscope, and the like to further correct the detected distance.
The detection of the distance is described below by means of a specific embodiment.
As shown in fig. 2, fig. 2 is a schematic structural diagram of a first sensor assembly in a first embodiment of a mobile terminal provided in the present application. The first sensing assembly 13 is a multi-point laser ranging module, and includes a laser transmitter 131 and a corresponding laser receiver 132.
The laser emitter 131 is configured to emit laser light toward the target to be photographed; the laser receiver 132 is configured to receive the reflected laser light; and the controller (not shown in fig. 2) is configured to measure the distance between the first sensor assembly 13 and the object A to be photographed from the time difference between emission at the laser transmitter 131 and reception at the corresponding laser receiver 132. It will be appreciated that, since in a practical configuration the distance between the laser emitter 131 and the laser receiver 132 is small enough to be negligible, the emitted and reflected beams can be treated as approximately coincident, and the distance between the first sensor assembly 13 and the object A can be calculated with the following formula:
L = (c × t) / 2
where L is a distance between the first sensor assembly 13 and the object a to be photographed, c is a speed of light, and t is a time difference between a time when the laser emitter 131 emits laser light and a time when the laser receiver 132 receives the laser light.
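As an illustrative sketch only (the patent gives no code), the round-trip calculation above can be written as follows; the function name and SI units are assumptions:

```python
# Sketch of the TOF range formula L = c * t / 2, where t is the
# round-trip time of the laser pulse. Names and units are illustrative.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the target from emission and reception timestamps (seconds)."""
    round_trip = receive_time_s - emit_time_s
    if round_trip < 0:
        raise ValueError("reception must not precede emission")
    # The pulse travels to the target and back, hence the division by two.
    return SPEED_OF_LIGHT * round_trip / 2.0
```

For example, a pulse that returns after about 6.67 nanoseconds corresponds to a target roughly one metre away.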
Alternatively, the laser emitters 131 and corresponding laser receivers 132 may be distributed in an array, for example, a 4 × 4 array may be used.
As shown in fig. 3, fig. 3 is a schematic diagram of the movement of the object to be photographed, which shows the movement of the object to be photographed at three times t0, t1, and t2, respectively. It can be understood that, since the position of the mobile terminal generally does not change, the position of the sensor can be considered to be fixed, and then, if the distances detected by different sensors change, the object to be photographed can be considered to have moved. Further, the moving condition of the object to be shot can be judged according to the change condition of the distance acquired by each sensor.
As shown in fig. 4, fig. 4 is a schematic diagram of the moving distance of the target to be photographed. Within a time interval t, two adjacent sensors measure distances a and b respectively, and since the included angle θ between the sensors can be accurately set during design, manufacture, and calibration, the moving distance s of the target to be photographed can be calculated:

s = √(a² + b² − 2ab·cos θ)
Further, the speed v of the object to be photographed is:

v = s / t
the method comprises the steps of acquiring a speed of a target to be shot, acquiring a distance between a first laser emitter and a target point on the target to be shot at a first moment, acquiring a distance between a second laser emitter and the target point at a second moment, acquiring a time length between the first moment and the second moment, and acquiring a time length between the first moment and the second moment.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a second embodiment of the mobile terminal provided in the present application, where the mobile terminal 50 includes a camera module 51, a controller 52, a first sensor element 53 and a second sensor element 54.
The camera module 51 is used for shooting, which specifically includes taking photos or recording video. The controller 52 may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and is used for processing data or issuing control instructions to control the operation of other components.
The first sensor assembly 53 is configured to obtain a position change of the object to be photographed, and the controller 52 is configured to determine a motion state of the object to be photographed according to the position change of the object to be photographed. The second sensor component 54 is used for acquiring a motion state of the mobile terminal 50, and the controller 52 is used for performing motion compensation when the camera module 51 performs shooting according to the motion state of the object to be shot and the motion state of the mobile terminal 50.
The motion state acquired by the first sensor component 53 is the moving direction, speed, acceleration, etc. of the object to be photographed, and the motion state acquired by the second sensor component 54 is the moving direction, speed, acceleration, etc. of the mobile terminal 50, and may also be the shaking frequency, shaking amplitude, etc. of the mobile terminal 50.
Specifically, the second sensor assembly 54 may include two gyroscopes (or accelerometers) that detect the left-right and front-back tilt angles respectively and transmit the two angle readings to the controller 52, which derives the motion state of the mobile terminal 50 from them.
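A minimal sketch of how two tilt-angle streams might be turned into a shake measure; the sampling scheme, names, and combination rule are assumptions for illustration, not the patent's implementation:

```python
import math

def shake_rates(pitch_deg, roll_deg, dt_s):
    """Combined angular rate (deg/s) per sample from two tilt streams.

    pitch_deg: front-back tilt samples; roll_deg: left-right tilt samples;
    dt_s: sampling interval. Differencing successive samples approximates
    the angular velocity, a simple proxy for the terminal's shaking.
    """
    rates = []
    for i in range(1, len(pitch_deg)):
        dp = (pitch_deg[i] - pitch_deg[i - 1]) / dt_s
        dr = (roll_deg[i] - roll_deg[i - 1]) / dt_s
        rates.append(math.hypot(dp, dr))  # magnitude of the two-axis rate
    return rates
```

A controller could compare these rates against a threshold to decide how much compensation the shot needs.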
Referring to fig. 6, fig. 6 is a schematic structural diagram of a third embodiment of the mobile terminal provided in the present application, where the mobile terminal 60 includes a camera module 61, a controller 62, and a first sensor assembly 63.
The camera module 61 is used for shooting, which specifically includes taking photos or recording video. The controller 62 may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and is used for processing data or issuing control instructions to control the operation of other components.
The first sensor assembly 63 is configured to obtain a position change situation of the target to be photographed, and the controller 62 is configured to determine a motion state of the target to be photographed according to the position change situation of the target to be photographed, and send a control instruction to the camera module 61 for motion compensation when the camera module performs photographing according to the motion state of the target to be photographed.
As shown in fig. 7, fig. 7 is a schematic structural diagram of a camera module in a third embodiment of the mobile terminal provided in the present application, and the camera module 61 specifically includes a camera motor 611, a lens 612, and a driver 613.
The camera module 61 may be disposed on a circuit board 60a of the mobile terminal 60; the circuit board 60a may be a Flexible Printed Circuit (FPC) connected to the main board of the electronic device through a BTB (board-to-board) connector. In addition, the camera module 61 may further include other components such as a light sensor and a flash, or may be combined with other modules such as a receiver into a multifunctional module, which is not described further here.
The camera motor 611 may be a mechanical motor, an electronically actuated motor, an annular ultrasonic motor, or the like, which is not limited here. Alternatively, in one embodiment, a voice coil motor may be used to adjust the position of the lens 612. A Voice Coil Motor (also called a Voice Coil Actuator) is a device that converts electrical energy into mechanical energy, realizing linear motion and limited swing motion.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a camera motor in a third embodiment of the mobile terminal provided in the present application. The camera motor 611 includes an anti-magnetic cover 611a, a magnet 611b, an upper elastic sheet 611c, a lens carrier 611d, a motor coil 611e, a lower elastic sheet 611f, and a motor base 611g. The lens 612 is fixed on the lens carrier 611d, and the driver 613 is configured to receive a driving instruction and control the current intensity in the motor coil 611e, thereby controlling the movement of the lens carrier 611d within the motor 611.
Optionally, the camera motor 611 may be a voice coil motor, whose working principle is as follows: in the permanent magnetic field formed by the magnet 611b, the extension position of the elastic sheets (i.e., the upper elastic sheet 611c and the lower elastic sheet 611f) is controlled by changing the direct current in the motor coil 611e, which drives the lens carrier 611d, and hence the lens 612, back and forth.
With reference to the foregoing embodiment, after acquiring the motion state (e.g., the moving speed) of the target to be photographed, the controller 62 in this embodiment may perform motion compensation in two ways:
The first way may also be called optical anti-shake. Optical anti-shake suspends a floating lens by magnetic force, effectively overcoming the image blur caused by camera vibration, and the effect is more noticeable on digital cameras with large zoom lenses. Usually, a gyroscope in the lens detects tiny movements and transmits a signal to the microprocessor, which immediately calculates the displacement to be compensated; a compensation lens group then adjusts its position and angle according to the shake direction and displacement of the lens, keeping the light path stable and effectively overcoming the image blur produced by camera vibration.
In this embodiment, the optical anti-shake technique is applied to the movement of the object to be photographed: when the laser array detects that the target is moving, the microprocessor determines the target's moving speed from the change in its distance, and the compensation lens group then adjusts its position and angle according to the target's moving direction and speed, keeping the light path stable and effectively overcoming the image blur caused by the target's movement.
Specifically, the moving speed of the lens may be scaled in a fixed proportion. For example, using the distance to the object acquired in the foregoing embodiment, and assuming the sensor and the lens are at approximately the same distance from the object to be photographed, the moving speed of the lens may be set according to the following formula:
v2 = v1 × s2 / s1
where v1 is the moving speed of the object to be photographed, v2 is the moving speed of the lens, s1 is the distance between the object to be photographed and the CCD (image sensor), and s2 is the distance between the lens and the CCD.
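The proportional relation above reduces to a one-line helper. This is a sketch; the function and parameter names are assumed, and the distances must share one unit:

```python
def lens_speed(target_speed: float, target_to_ccd: float,
               lens_to_ccd: float) -> float:
    """Lens compensation speed v2 = v1 * s2 / s1.

    target_speed: v1, speed of the object to be photographed;
    target_to_ccd: s1, distance from the object to the CCD;
    lens_to_ccd:   s2, distance from the lens to the CCD.
    """
    return target_speed * lens_to_ccd / target_to_ccd
```

So an object moving at 1 m/s two metres from the CCD, with the lens 4 mm from the CCD, calls for a lens speed of 2 mm/s.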
The second way is also called electronic anti-shake. It mainly refers to anti-shake that forcibly raises the CCD's sensitivity parameters while shortening the shutter time, analyzes the image obtained on the CCD, and then compensates using the pixels at the image edges.
Electronic anti-shake uses digital circuitry to process the picture and produce the anti-shake effect. When the anti-shake circuit works, the captured picture is only about 90% of the full frame; the digital circuit then analyzes the blur caused by the movement of the target to be shot and uses the remaining ~10% of the frame for shake compensation. This approach is low-cost, but it reduces the utilization of the CCD and costs some picture sharpness.
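A rough sketch of the crop-and-shift idea described above; the margin value, names, and clamping behavior are assumptions for illustration, not the patent's circuit:

```python
def stabilized_crop(frame_w: int, frame_h: int, dx: int, dy: int,
                    margin: float = 0.05):
    """Crop window (x0, y0, w, h): ~90% of each frame edge, shifted
    opposite the detected shake (dx, dy) in pixels.

    The unused border absorbs the shake; the shift is clamped so the
    window never leaves the frame.
    """
    crop_w = int(frame_w * (1 - 2 * margin))
    crop_h = int(frame_h * (1 - 2 * margin))
    center_x = (frame_w - crop_w) // 2
    center_y = (frame_h - crop_h) // 2
    # Move the window against the shake, staying inside the frame.
    x0 = max(0, min(frame_w - crop_w, center_x - dx))
    y0 = max(0, min(frame_h - crop_h, center_y - dy))
    return x0, y0, crop_w, crop_h
```

On a 1920×1080 frame with a 10-pixel rightward shake, the window shifts 10 pixels left of centre while keeping its 90% size.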
Unlike the prior art, the mobile terminal in this embodiment includes: a camera module; a first sensing assembly for acquiring the distance to a target to be shot; and a controller for determining the motion state of the target from the change in that distance and performing motion compensation when the camera module shoots. In this way, motion compensation can be performed during shooting based on the moving speed of the object to be shot, guaranteeing the sharpness of the moving object and improving the shooting quality of moving objects.
Referring to fig. 9, fig. 9 is a schematic flowchart of a first embodiment of a photographing method of a mobile terminal provided in the present application, where the method includes:
step 91: and acquiring the position change condition of the target to be shot.
In one case, the direction of the line connecting the object to be photographed and the mobile terminal is constant but the distance between them is changing; in another case, the distance is constant but the direction of the connecting line is changing.
Optionally, the position change of the target to be photographed may be obtained with a laser sensor assembly comprising a laser transmitter and a laser receiver, using TOF (time of flight) for distance detection: the laser transmitter continuously sends light pulses to the target, the laser receiver receives the light returning from it, and the target distance is obtained from the (round-trip) flight time of the light pulses.
The ToF ranging method is a two-way ranging technique; it mainly uses the flight time of a signal between two asynchronous transceivers (or reflecting surfaces) to measure the distance between nodes. Conventional ranging techniques are classified into two-way and one-way techniques. With good signal modulation or in a non-line-of-sight environment, the estimate from RSSI (Received Signal Strength Indication) based ranging is more satisfactory; in a line-of-sight environment, ToF-based ranging can make up for the shortcomings of the RSSI-based method.
Optionally, the distance to the target to be photographed may also be obtained by using a structured light sensor assembly, specifically, the structured light sensor assembly includes a projector and a camera, and the projector is used to project specific light information to the surface and the background of the target to be photographed, and then the light information is collected by the camera. And calculating information such as the position and the depth of the object according to the change of the optical signal caused by the object, and further restoring the whole three-dimensional space to obtain the distance of the target to be measured.
And step 92: and determining the motion state of the target to be shot according to the position change condition of the target to be shot.
Alternatively, a multi-spot laser matrix may be used for distance detection. Alternatively, the multi-spot laser matrix may be distributed in an array of 4 × 4, i.e. 16 laser emitters and corresponding 16 laser receivers. Then the following formula can be specifically used to calculate the distance of the object to be photographed:
L = (c × t) / 2
wherein, L is the distance between the mobile terminal and the object to be shot, c is the speed of light, and t is the time difference between the moment when the laser transmitter sends laser and the moment when the laser receiver receives laser.
It can be understood that, since the position of the mobile terminal generally does not change, the position of the sensor can be considered to be fixed, and then, if the distances detected by different sensors change, the object to be photographed can be considered to have moved. Further, the moving condition of the object to be shot can be judged according to the change condition of the distance acquired by each sensor.
With reference to fig. 4, within a time interval t, two adjacent sensors measure distances a and b respectively, and since the included angle θ between the sensors can be accurately set during design, manufacture, and calibration, the moving distance s of the target to be photographed can be calculated:
s = √(a² + b² − 2ab·cos θ)
Further, the speed v of the object to be photographed is:

v = s / t
the method comprises the steps of acquiring a speed of a target to be shot, acquiring a distance between a first laser emitter and a target point on the target to be shot at a first moment, acquiring a distance between a second laser emitter and the target point at a second moment, acquiring a time length between the first moment and the second moment, and acquiring a time length between the first moment and the second moment.
Step 93: when the camera module is used for shooting, motion compensation is carried out according to the motion state of the target to be shot.
Optionally, step 93 may specifically be: when the camera module shoots, the position of a lens in the camera module is adjusted according to the motion state of a target to be shot, so that motion compensation is performed when the camera module shoots.
This way may also be called optical anti-shake. Optical anti-shake suspends a floating lens by magnetic force, effectively overcoming the image blur caused by camera vibration, and the effect is more noticeable on digital cameras with large zoom lenses. Usually, a gyroscope in the lens detects tiny movements and transmits a signal to the microprocessor, which immediately calculates the displacement to be compensated; a compensation lens group then adjusts its position and angle according to the shake direction and displacement of the lens, keeping the light path stable and effectively overcoming the image blur produced by camera vibration.
In this embodiment, the optical anti-shake technique is applied to the movement of the object to be photographed: when the laser array detects that the target is moving, the microprocessor determines the target's moving speed from the change in its distance, and the compensation lens group then adjusts its position and angle according to the target's moving direction and speed, keeping the light path stable and effectively overcoming the image blur caused by the target's movement.
Specifically, the moving speed of the lens may be scaled in a fixed proportion. For example, using the distance to the object acquired in the foregoing embodiment, and assuming the sensor and the lens are at approximately the same distance from the object to be photographed, the moving speed of the lens may be set according to the following formula:
v2 = v1 × s2 / s1
where v1 is the moving speed of the object to be photographed, v2 is the moving speed of the lens, s1 is the distance between the object to be photographed and the CCD (image sensor), and s2 is the distance between the lens and the CCD.
Optionally, step 93 may specifically be: when the camera module shoots, the exposure time of the camera module is adjusted, so that when the camera module shoots, motion compensation is carried out.
This way may also be called electronic anti-shake. It mainly refers to anti-shake that forcibly raises the CCD's sensitivity parameters while shortening the shutter time, analyzes the image obtained on the CCD, and then compensates using the pixels at the image edges. It is, in effect, a technique that trades some image quality for reduced shake, seeking a balance between picture quality and picture blur.
Electronic anti-shake uses digital circuitry to process the picture and produce the anti-shake effect. When the anti-shake circuit works, the shot picture is only about 90% of the actual picture; the digital circuit then makes a fuzzy judgment on the shake of the target to be shot and uses the remaining roughly 10% of the picture for shake compensation. The method is low in cost, but it reduces the utilization rate of the CCD and causes some loss of picture sharpness.
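A highly simplified sketch of this crop-and-shift idea (the function name, margin handling, and shift clamping are illustrative assumptions, not the patent's circuit design):

```python
def stabilized_crop(frame, shake_dx, shake_dy, margin_ratio=0.05):
    """Electronic anti-shake sketch: the output picture is ~90% of the
    sensor picture; the ~10% margin lets the crop window shift opposite
    to the detected shake, clamped to the available margin."""
    h, w = len(frame), len(frame[0])
    mx, my = int(w * margin_ratio), int(h * margin_ratio)
    # move the crop origin against the shake, but never off the sensor
    ox = max(0, min(2 * mx, mx - shake_dx))
    oy = max(0, min(2 * my, my - shake_dy))
    return [row[ox:ox + w - 2 * mx] for row in frame[oy:oy + h - 2 * my]]

# 100x100 "sensor" frame whose pixels record their own coordinates
frame = [[(y, x) for x in range(100)] for y in range(100)]
out = stabilized_crop(frame, shake_dx=3, shake_dy=-2)  # 90x90 crop
```

The 90% crop is exactly why this method lowers CCD utilization: ten percent of the sensor area is reserved as compensation margin rather than contributing to the output picture.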
Referring to fig. 10, fig. 10 is a schematic flowchart of a second embodiment of a photographing method of a mobile terminal provided in the present application, where the method includes:
step 101: and acquiring the position change condition of the target to be shot.
Step 102: and determining the motion state of the target to be shot according to the position change condition of the target to be shot.
Step 103: and acquiring the motion state of the mobile terminal.
Usually, a gyroscope in the lens detects a small movement to acquire the motion state of the mobile terminal.
It can be understood that steps 101 and 102 above detect the motion state of the object to be photographed, while step 103 detects the motion state of the mobile terminal; the two may be exchanged in order or executed simultaneously, and no particular order is required here.
Step 104: when the camera module shoots, motion compensation is performed according to the motion state of the target to be shot and the motion state of the mobile terminal.
In this way, compared with compensating for a single motion, the motion states of both the object to be photographed and the mobile terminal are acquired and compensated for simultaneously.
For example, if the moving directions of the object to be photographed and the mobile terminal are the same, the compensation value may be smaller with respect to the individual movement of the object to be photographed, and if the moving directions of the object to be photographed and the mobile terminal are opposite, the compensation value may be larger with respect to the individual movement of the object to be photographed.
Specifically, a compensation value may be acquired separately according to different types. For example, a first compensation value is obtained for the speed of the object to be photographed, and a second compensation value is obtained for the speed of the mobile terminal. And then judging whether the moving direction of the target to be shot is the same as the moving direction of the mobile terminal, if so, adding the first compensation value to the second compensation value to obtain a final compensation value, and if not, taking the difference value between the first compensation value and the second compensation value as the final compensation value.
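The combination rule in the paragraph above can be sketched as follows (the function name and the use of an absolute difference are assumptions; the patent only states "difference value"):

```python
def final_compensation(comp1, comp2, same_direction):
    """Combine the two per-source compensation values:
    comp1 -- first compensation value, from the target's speed
    comp2 -- second compensation value, from the mobile terminal's speed
    same_direction -- True if target and terminal move the same way
    """
    if same_direction:
        return comp1 + comp2          # motions compound: sum the values
    return abs(comp1 - comp2)         # motions partly cancel: take the difference

# e.g. target needs 3 units of compensation, terminal 2 units
combined = final_compensation(3.0, 2.0, same_direction=False)
```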
Referring to fig. 11, fig. 11 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application, a computer program 111 is stored in the computer storage medium 110, and when being executed by a processor, the computer program 111 implements the following method:
acquiring the position change condition of a target to be shot; determining the motion state of the target to be shot according to the position change condition of the target to be shot; when the camera module is used for shooting, motion compensation is carried out according to the motion state of the target to be shot.
Optionally, the computer program 111, when executed by the processor, is further configured to implement the method of: emitting laser to a target to be shot by using a multipoint laser ranging module; receiving reflected laser by using a multipoint laser ranging module; and determining the position change condition of the target to be shot based on the time difference between the emitted laser and the received laser.
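The time-of-flight step above (distance from the delay between emitted and received laser) can be sketched as follows; the function name and sample timestamps are illustrative assumptions:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(t_emit, t_receive):
    """Round-trip time-of-flight distance: d = c * (t_receive - t_emit) / 2.
    The factor of 2 accounts for the laser travelling out and back."""
    return C * (t_receive - t_emit) / 2.0

# position change of the target between two successive measurements
d1 = tof_distance(0.0, 20.0e-9)   # about 3.0 m
d2 = tof_distance(0.0, 18.0e-9)   # about 2.7 m
delta = d2 - d1                    # negative: the target moved closer
```

Each emitter/receiver pair in the array yields such a distance for its own target point, and the per-point deltas together describe the position change of the target to be shot.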
Optionally, the computer program 111, when executed by the processor, is further configured to implement the method of: calculating the speed of the object to be shot by adopting the following formula:
v = √(a² + b² − 2ab·cos θ) / t
the method comprises the steps of acquiring a speed of a target to be shot, acquiring a distance between a first laser emitter and a target point on the target to be shot at a first moment, acquiring a distance between a second laser emitter and the target point at a second moment, acquiring a time length between the first moment and the second moment, and acquiring a time length between the first moment and the second moment.
Optionally, the computer program 111, when executed by the processor, is further configured to implement the method of: when the camera module shoots, motion compensation is carried out according to the motion state of the target to be shot and the motion state of the mobile terminal when the camera module shoots.
Optionally, the computer program 111, when executed by the processor, is further configured to implement the method of: when the camera module shoots, the position of a lens in the camera module is adjusted according to the motion state of a target to be shot, so that motion compensation is performed when the camera module shoots.
Optionally, the computer program 111, when executed by the processor, is further configured to implement the method of: when the camera module shoots, the exposure time of the camera module is adjusted, so that when the camera module shoots, motion compensation is carried out.
Embodiments of the present application may be implemented as software functional units and, when sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (11)

1. A mobile terminal, comprising:
a camera module;
multipoint laser ranging module includes: the laser shooting device comprises a plurality of laser transmitters distributed in an array and used for transmitting laser to a target to be shot; the laser receivers are arranged corresponding to the laser transmitters and used for receiving the reflected laser;
the second sensing assembly is used for acquiring the motion state of the mobile terminal;
the controller is used for determining the speed of the target to be shot according to the position change condition of the target point on the target to be shot, which is measured by each laser transmitter and the corresponding laser receiver, and performing motion compensation when the camera module performs shooting according to the speed of the target to be shot and the motion state of the mobile terminal;
the controller specifically calculates the speed of the target to be shot by using the following formula:
v = √(a² + b² − 2ab·cos θ) / t
the method comprises the steps that v is the speed of a target to be shot, a is the distance between a first laser emitter and a target point on the target to be shot at a first moment, b is the distance between a second laser emitter and the target point at a second moment, theta is the included angle between the first laser emitter and the second laser emitter, and t is the time length between the first moment and the second moment.
2. The mobile terminal of claim 1,
the camera module is a rear camera of the mobile terminal, and the multipoint laser ranging module and the camera module are arranged on the same side of the mobile terminal.
3. The mobile terminal of claim 1,
the second sensing component is an acceleration sensor.
4. The mobile terminal of claim 1,
the controller is specifically configured to adjust a position of a lens in the camera module according to a speed of a target to be photographed, so as to perform motion compensation when the camera module performs photographing.
5. The mobile terminal of claim 4,
the camera module specifically includes:
the camera motor comprises a motor base and a lens carrier;
a lens mounted on the lens carrier;
and the driver is used for acquiring a driving instruction of the controller so as to control the movement of the lens carrier, thereby adjusting the position of the lens.
6. The mobile terminal of claim 5, wherein
The camera motor further comprises a motor coil, and the lens carrier is arranged in the motor coil;
the driver is specifically configured to obtain a driving instruction of the controller to control a current of the motor coil, so as to control movement of the lens carrier, and to perform position adjustment on the lens.
7. The mobile terminal of claim 1,
the controller is specifically used for adjusting the exposure time of the camera module according to the speed of the target to be shot so as to perform motion compensation when the camera module shoots.
8. A photographing method of a mobile terminal is characterized by comprising the following steps:
emitting laser to a target to be shot by using a multipoint laser ranging module;
receiving reflected laser by using a multipoint laser ranging module;
determining the position change condition of a target point on a target to be shot based on the time difference between the emitted laser and the received laser;
according to the position change condition of a target point on the target to be shot, which is measured by each laser transmitter and the corresponding laser receiver, the speed of the target to be shot is calculated by adopting the following formula:
v = √(a² + b² − 2ab·cos θ) / t
the method comprises the following steps that v is the speed of a target to be shot, a is the distance between a first laser emitter and a target point on the target to be shot at a first moment, b is the distance between a second laser emitter and the target point at a second moment, theta is the included angle between the first laser emitter and the second laser emitter, and t is the time length between the first moment and the second moment;
acquiring the motion state of the mobile terminal by utilizing a second sensing assembly;
when the camera module shoots, motion compensation is carried out when the camera module shoots according to the speed of the target to be shot and the motion state of the mobile terminal.
9. The method of claim 8,
when the camera module shoots, according to the speed of the target to be shot, the step of motion compensation is carried out, including:
when the camera module shoots, the position of a lens in the camera module is adjusted according to the speed of a target to be shot, so that motion compensation is performed when the camera module shoots.
10. The method of claim 8,
when the camera module shoots, according to the speed of the target to be shot, the step of motion compensation is carried out, including:
when the camera module shoots, the exposure time of the camera module is adjusted, so that the camera module can carry out motion compensation when shooting.
11. A computer storage medium, characterized in that the computer storage medium is used to store a computer program which, when executed by a processor, implements the method according to any one of claims 8-10.
CN201910120670.XA 2019-02-18 2019-02-18 Mobile terminal, photographing method thereof and computer storage medium Active CN109803079B (en)

Publications (2)

Publication Number Publication Date
CN109803079A CN109803079A (en) 2019-05-24
CN109803079B (en) 2021-04-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant