CN113225478A - Shooting method and device - Google Patents


Info

Publication number
CN113225478A
CN113225478A (application CN202110465941.2A)
Authority
CN
China
Prior art keywords
target object
distance
ultrasonic
position information
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110465941.2A
Other languages
Chinese (zh)
Inventor
彭武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd filed Critical Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN202110465941.2A
Publication of CN113225478A
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/20Arrangements for obtaining desired frequency or directional characteristics
    • H04R1/32Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
    • H04R1/34Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by using a single transducer with sound reflecting, diffracting, directing or guiding means
    • H04R1/342Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by using a single transducer with sound reflecting, diffracting, directing or guiding means for microphones

Abstract

The embodiment of the application discloses a shooting method and a shooting device, wherein the method comprises the following steps: transmitting a first ultrasonic signal by an ultrasonic transmitting device; receiving at least two second ultrasonic signals through at least two microphones respectively, wherein the second ultrasonic signals are reflected wave signals of the first ultrasonic signal after it is reflected by a target object; determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is the distance between the ultrasonic transmitting device and the camera; and controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information. According to the embodiment of the application, the convenience of shooting the target object can be improved.

Description

Shooting method and device
Technical Field
The embodiment of the application relates to the field of information processing, in particular to a shooting method and a shooting device.
Background
As electronic devices become more and more popular, the shooting function of the electronic devices is widely used. In an actual use process, a user needs to manually touch a display screen of the electronic device to realize a function of focusing a camera of the electronic device on a target object.
In carrying out the present application, the applicant has found that there are at least the following problems in the related art:
the user needs to manually focus the target object to be shot, and the operation is complicated.
Disclosure of Invention
The embodiment of the application provides a shooting method and a shooting device, and aims to solve the problem that a user needs to manually focus a target object to be shot and the operation is complex.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a shooting method applied to an electronic device, where the electronic device includes an ultrasonic wave transmitting device, at least two microphones, and a camera, where the ultrasonic wave transmitting device is configured to transmit an ultrasonic wave signal, and the method may include:
transmitting a first ultrasonic signal by an ultrasonic transmitting device;
receiving at least two second ultrasonic signals through at least two microphones respectively, wherein the second ultrasonic signals are reflected wave signals of the first ultrasonic signals after the first ultrasonic signals are reflected by the target object;
determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is the distance between the ultrasonic transmitting device and the camera;
and controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information.
In a second aspect, an embodiment of the present application provides a shooting device applied to an electronic device, where the electronic device includes an ultrasonic wave emitting device, at least two microphones, and a camera, and the ultrasonic wave emitting device is configured to emit an ultrasonic wave signal, and the shooting device may include:
the transmitting module is used for transmitting a first ultrasonic signal through the ultrasonic transmitting device;
the receiving module is used for receiving at least two second ultrasonic signals through at least two microphones respectively, wherein the second ultrasonic signals are reflected wave signals of the first ultrasonic signals after the first ultrasonic signals are reflected by a target object;
the determining module is used for determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is the distance between the ultrasonic transmitting device and the camera;
and the control module is used for controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, a first ultrasonic signal is transmitted by an ultrasonic transmitting device, and at least two second ultrasonic signals, namely reflected wave signals of the first ultrasonic signal from a target object, are respectively received by at least two microphones; then, first relative position information of the target object and the camera is determined according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; finally, according to the first relative position information, the camera is controlled to focus on the target object and the microphone is controlled to record towards the target object. Therefore, the camera can be quickly and accurately focused on the target object and the microphone can be controlled to record towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
Drawings
The present application may be better understood from the following description of specific embodiments of the application taken in conjunction with the accompanying drawings, in which like or similar reference numerals identify like or similar features.
Fig. 1 is a schematic view of an application scenario of a shooting method according to an embodiment of the present application;
fig. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a positional relationship between a target object and an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating a position relationship between another target object and an electronic device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a shooting device according to an embodiment of the present disclosure;
fig. 6 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The shooting method provided by the embodiment of the application can be applied to at least the following application scenarios, which are explained below.
As shown in fig. 1, if the user 10 needs to focus on the target object 30 during shooting, the user has to touch the display screen of the electronic device 20 to focus, which is cumbersome; moreover, because the user touches the display screen while holding the electronic device, shaking inevitably occurs during shooting and blurs the picture.
In view of the problems in the related art, embodiments of the present application provide a shooting method, an apparatus, an electronic device, and a storage medium, so as to solve the problem in the related art that a user needs to manually focus a target object to be shot, which is relatively complicated to operate.
The method provided by the embodiment of the application can be applied to any scene that a user needs to manually focus on a target object to be shot besides the application scene.
According to the method provided by the embodiment of the application, a first ultrasonic signal is transmitted through the ultrasonic transmitting device, and at least two second ultrasonic signals, namely reflected wave signals of the first ultrasonic signal from the target object, are respectively received through at least two microphones; then, first relative position information of the target object and the camera is determined according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; finally, according to the first relative position information, the camera is controlled to focus on the target object and the microphone is controlled to record towards the target object. Therefore, the camera can be quickly and accurately focused on the target object and the microphone can be controlled to record towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
Based on the application scenario, the following describes the shooting method provided in the embodiment of the present application in detail.
Fig. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure.
As shown in fig. 2, the photographing method may include steps 210 to 240, and the method is applied to a photographing apparatus, and specifically as follows:
step 210, a first ultrasonic signal is transmitted by an ultrasonic transmitting device.
Step 220, receiving at least two second ultrasonic signals through at least two microphones, respectively, where the second ultrasonic signals are reflected wave signals of the first ultrasonic signals reflected by the target object.
And step 230, determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is the distance between the ultrasonic transmitting device and the camera.
And step 240, controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information.
According to the shooting method, a first ultrasonic signal is transmitted through the ultrasonic transmitting device, and at least two second ultrasonic signals, namely reflected wave signals of the first ultrasonic signal from a target object, are respectively received through at least two microphones; then, first relative position information of the target object and the camera is determined according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; finally, according to the first relative position information, the camera is controlled to focus on the target object and the microphone is controlled to record towards the target object. Therefore, the camera can be quickly and accurately focused on the target object and the microphone can be controlled to record towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
The contents of steps 210-240 are described below:
first, step 210 is involved.
The first ultrasonic signal is transmitted by the ultrasonic transmitting device of the electronic device, is reflected by the target object, and the resulting reflected wave signals are respectively received by the microphones of the electronic device.
Next, step 220 is involved.
Each second ultrasonic signal carries time information: the time from the emission of the first ultrasonic signal to the reception of that second ultrasonic signal. The second ultrasonic signal is a reflected wave signal of the first ultrasonic signal reflected by the target object.
Next, step 230 is involved.
The step 230 may specifically include the following steps:
determining second relative position information of the ultrasonic transmitting device and the target object according to the at least two pieces of time information; and determining first relative position information according to the second relative position information and the first distance.
The step of determining the second relative position information of the ultrasound transmitting apparatus and the target object according to the at least two pieces of time information may specifically include the following steps:
acquiring a second distance between each microphone and the ultrasonic wave transmitting device;
and determining a third distance and a first angle according to the time information and the second distance, wherein the third distance is the distance between the target object and the ultrasonic transmitting device, the first angle is an included angle between a first connecting line and a normal of a display screen of the electronic equipment, and the first connecting line is the connecting line between the target object and the ultrasonic transmitting device.
Specifically, as shown in fig. 3, the electronic device includes one ultrasonic wave emitting device and two microphones, and the distance between the two microphones is l. The target object lies at a first angle (included angle θ) relative to the normal direction of the display screen of the electronic device, and the distance between the target object and the ultrasonic wave transmitting device is a third distance (r). The following relations hold between the first time t_A (from emitting the first ultrasonic signal to microphone A receiving the second ultrasonic signal), the second time t_B (likewise for microphone B), the third distance, and the distances of the target object from each microphone:
$$\begin{cases} c_0 t_A = r + r_1,\qquad c_0 t_B = r + r_2 \\ r_1^2 = r^2 + (l/2)^2 - r\,l\sin\theta \\ r_2^2 = r^2 + (l/2)^2 + r\,l\sin\theta \end{cases}\tag{1}$$
where c_0 is the speed of sound; t_A and t_B are the times from the start of transmission until the reflected wave is received by microphone A and microphone B, respectively; r_1 is the distance between the target object and microphone A; and r_2 is the distance between the target object and microphone B.
The path length from the emission of the transmitted wave to the reception of the reflected wave by microphone A is the sum of the distance of the target object from the ultrasonic wave emitting device and the distance of the target object from microphone A. Since the ultrasonic transmission distance equals the product of the transmission speed and the transmission time, c_0 t_A = r + r_1.
Similarly, the path length from the emission of the transmitted wave to the reception of the reflected wave by microphone B is the sum of the distance of the target object from the ultrasonic wave transmitting device and the distance of the target object from microphone B. Therefore, c_0 t_B = r + r_2.
In addition, based on the law of sines and cosines in the triangle formed by the target object, the ultrasonic transmitting device, and each microphone, a relationship between the first angle, the third distance, and the second distance can be derived. As shown in fig. 3, microphones A and B are equidistant from the ultrasonic transmitting device, so each second distance may be l/2; of course, the distances between microphones A and B and the ultrasonic transmitting device may also be unequal, e.g., the second distances may be l/3 and 2l/3 respectively; the calculation principle is the same and is not described in detail herein.
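Assuming microphones placed symmetrically at l/2 on either side of the emitter, the system in formula (1) admits a closed-form solution. The following Python sketch is an illustration only; the function name, the sign convention (microphone A on the target's side), and the default speed of sound are our assumptions, not the patent's:

```python
import math

def locate_target(t_a, t_b, l, c0=343.0):
    """Recover the third distance r (target to emitter) and the first angle
    theta (target vs. the display-screen normal) from round-trip times t_a
    and t_b, with microphones A and B placed l/2 on either side of the
    ultrasonic transmitting device (A on the target's side)."""
    # Summing the two law-of-cosines relations of formula (1) eliminates theta:
    #   (c0*t_a - r)^2 + (c0*t_b - r)^2 = 2*r^2 + l^2/2
    r = (c0**2 * (t_a**2 + t_b**2) - l**2 / 2) / (2 * c0 * (t_a + t_b))
    r1 = c0 * t_a - r  # distance from target to microphone A
    r2 = c0 * t_b - r  # distance from target to microphone B
    # Subtracting the same two relations isolates sin(theta):
    sin_theta = (r2**2 - r1**2) / (2 * r * l)
    return r, math.asin(sin_theta)
```

For example, simulating a target 1 m away at 30° off the normal with l = 0.1 m and feeding the resulting round-trip times back in recovers r and θ to floating-point precision.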
Therefore, based on the above formula (1), the distance r and the angle θ of the target object with respect to the ultrasonic wave emitting device can be derived. The step of determining the first relative position information according to the second relative position information and the first distance may specifically use the following relations:
$$\alpha = 90^\circ - \theta,\qquad s=\sqrt{d^2+r^2-2dr\cos\alpha}\tag{2}$$
wherein s is the distance between the target object and the camera; d is a first distance; r is a third distance; θ is a first angle; alpha is the complement of the first angle;
$$\cos\beta=\frac{s^2+d^2-r^2}{2sd},\qquad \varphi=\left|90^\circ-\beta\right|$$
phi is an included angle between a second connecting line and the normal of the display screen, and the second connecting line is a connecting line between the target object and the camera; beta is the included angle between the second connecting line and the display screen.
When the ultrasonic wave emitting device and the camera coincide, the obtained third distance (r) and first angle (θ) can be used directly as the relative position information between the target object and the camera.
Since a certain distance (i.e., the first distance) still exists between the ultrasonic wave emitting device and the camera, after the third distance between the target object and the ultrasonic wave emitting device is calculated, the distance between the target object and the camera needs to be calculated continuously according to the third distance.
Specifically, as shown in fig. 4, there is a first distance (d) between the ultrasonic wave emitting device and the camera, and the following equation is given:
$$\alpha = 90^\circ - \theta,\qquad s=\sqrt{d^2+r^2-2dr\cos\alpha},\qquad \cos\beta=\frac{s^2+d^2-r^2}{2sd},\qquad \varphi=\left|90^\circ-\beta\right|$$
firstly, determining a complementary angle of the first angle (theta) as a second angle (alpha);
Then, according to the law of cosines (in any triangle, the square of one side equals the sum of the squares of the other two sides minus twice the product of those sides and the cosine of their included angle), this relation holds in the triangle enclosed by the first distance (d), the third distance (r), and the distance (s) between the target object and the camera. The distance between the target object and the camera can therefore be calculated from the first distance, the third distance, and the second angle.
Then, the law of cosines is applied to the first distance, the third distance, and the distance between the target object and the camera to determine a third angle (β), i.e., the angle between the connecting line between the target object and the camera (the second connecting line) and the display screen.
Finally, the fourth angle (φ) can be determined from the third angle, since the third angle is the angle between the second connecting line and the display screen while the fourth angle is the angle between the second connecting line and the normal of the display screen. When the third angle is acute, the sum of the fourth angle and the third angle is 90°; when the third angle is obtuse, the difference between the third angle and the fourth angle is 90°.
Thus, after determining the second relative position information including the third distance and the first angle, the first relative position information may be determined based on the first distance and the second relative position information.
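The law-of-cosines conversion described above can be sketched in Python as follows; this is a minimal illustration, and the function name and the assumption that the camera lies on the target's side of the emitter are ours, not the patent's:

```python
import math

def to_camera_frame(r, theta, d):
    """From the third distance r and first angle theta (target relative to
    the ultrasonic emitter, theta in radians) and the first distance d
    (emitter to camera, camera on the target's side), compute the
    target-to-camera distance s and the fourth angle phi between the
    second connecting line and the display-screen normal."""
    alpha = math.pi / 2 - theta  # second angle: complement of the first angle
    s = math.sqrt(d * d + r * r - 2 * d * r * math.cos(alpha))  # law of cosines
    # third angle beta: between the second connecting line and the screen
    beta = math.acos((s * s + d * d - r * r) / (2 * s * d))
    phi = abs(math.pi / 2 - beta)  # fourth angle, vs. the normal
    return s, phi
```

With r = 1 m, θ = 30°, and d = 2 cm this matches the result of placing the target and camera in Cartesian coordinates directly.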
Finally, step 240 is involved.
In a possible embodiment, the step of controlling the microphone to record towards the target object may specifically include the following steps:
acquiring position information of the target object relative to the at least two microphones; determining a preset sound reception angle according to the position information; and recording the target object based on the preset sound reception angle.
The at least two microphones may form a microphone array, i.e., a group of omnidirectional microphones arranged at different spatial positions according to a certain geometric rule. A microphone array spatially samples a propagating sound signal, and the acquired signal contains the spatial position information of that signal. According to the distance between the sound source and the array, models can be divided into near-field and far-field; according to the topology of the array, microphone arrays can be divided into linear arrays, planar arrays, volumetric arrays, and so on.
In a scene of a real life, the position of a sound source (a target object) is constantly changed, a microphone array can perform sound source positioning based on a sound source positioning technology, and the sound source positioning technology is to use the microphone array to calculate the angle and the distance of the target object, so that the target object is tracked and then the voice is directionally picked up.
The sound source positioning technology based on the microphone array technology has the characteristics of flexible beam control, higher spatial resolution, high signal gain, stronger anti-interference capability and the like, and can quickly and accurately carry out automatic directional recording on a target object.
The above step of recording the target object based on the preset sound reception angle may specifically include the following steps:
acquiring the spherical coordinate information of the target object relative to at least one microphone; determining the preset sound reception angle according to the spherical coordinate information; and adjusting the main lobe direction of at least one microphone based on the preset sound reception angle.
Specifically, the directional pattern of the microphone array at a certain point in the far field is:
$$H(\theta,\varphi)=\sum_{n=0}^{N-1}\omega_n\, e^{\,j\frac{2\pi}{\lambda}nd\sin\theta\cos\varphi}\tag{3}$$
where N is the number of microphones, d is the spacing between adjacent microphones, λ is the wavelength, and θ and φ are respectively the pitch angle and the azimuth angle of a point in space in the spherical coordinate system. θ, φ, and the distance between the target object and the microphone together constitute the spherical coordinate information of the target object relative to the microphone.
ω_n in formula (3) is a weighting parameter, which can be expressed in terms of amplitude and phase components as

$$\omega_n = a_n e^{j\psi_n}\tag{4}$$
Modifying the amplitude weights a_n changes the shape of the directional pattern; likewise, modifying the phase weights ψ_n controls the angular position of the main lobe.
For example, with the amplitude weights a_n set to 1 and only the phase weights ψ_n considered, the directional pattern is

$$H(\theta,\varphi)=\sum_{n=0}^{N-1} e^{\,j\left(\frac{2\pi}{\lambda}nd\sin\theta\cos\varphi+\psi_n\right)}\tag{5}$$

Letting

$$\psi_n=-\frac{2\pi}{\lambda}nd\,\alpha'$$

formula (5) can be rewritten as

$$H(\theta,\varphi)=\sum_{n=0}^{N-1} e^{\,j\frac{2\pi}{\lambda}nd\left(\sin\theta\cos\varphi-\alpha'\right)}\tag{6}$$
Denoting

$$\alpha=\sin\theta\cos\varphi$$

equation (6) can be written as

$$H(\alpha)=\sum_{n=0}^{N-1} e^{\,j\frac{2\pi}{\lambda}nd\left(\alpha-\alpha'\right)}\tag{7}$$
As can be seen from equation (7), the effect of the phase weights ψ_n on the directional pattern is to steer the main lobe of the beam towards the direction whose direction cosine is α'. The horizontal directional pattern is shown in fig. 6, where the beam direction has been steered to 45° (i.e., the preset sound reception angle).
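Equation (7) can be checked numerically. The Python sketch below (the function name and the array parameters, 8 elements at half-wavelength spacing, are illustrative assumptions) evaluates |H| in the horizontal plane and confirms that the main lobe lands at the steering angle:

```python
import cmath
import math

def pattern(theta_deg, steer_deg, n_mics=8, d_over_lambda=0.5):
    """|H| of equation (7) for a uniform linear array in the horizontal
    plane (phi = 0, so the direction cosine is alpha = sin(theta)); the
    phase weights steer the main lobe to alpha' = sin(steer_deg)."""
    alpha = math.sin(math.radians(theta_deg))
    alpha_prime = math.sin(math.radians(steer_deg))
    h = sum(cmath.exp(1j * 2 * math.pi * d_over_lambda * n * (alpha - alpha_prime))
            for n in range(n_mics))
    return abs(h)
```

Scanning θ from -90° to 90° with the beam steered to 45° shows the maximum of |H| at 45°, where all N terms add coherently to |H| = N.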
According to Fourier transform theory, a negative phase shift in the frequency domain corresponds to a time delay in the time domain. Considering the horizontal plane, the delay of the n-th microphone is (c is the speed of sound)

$$\tau_n=\frac{nd\,\alpha'}{c}\tag{8}$$
As can be seen from equation (8), applying a time delay τ_n to the signal output by the n-th microphone yields the corresponding phase weight ψ_n = -2πf τ_n, thereby controlling the main lobe direction in the directional pattern and realizing directional recording.
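Formula (8) is straightforward to compute. The sketch below (function name and parameter values are illustrative assumptions) returns the per-microphone delays for a given preset sound reception angle:

```python
import math

def mic_delays(steer_deg, n_mics, d, c=343.0):
    """Formula (8): the time delay applied to the n-th microphone's output
    so that the summed signals form a main lobe at steer_deg in the
    horizontal plane. Equivalent to the phase weight psi_n = -2*pi*f*tau_n."""
    alpha_prime = math.sin(math.radians(steer_deg))  # direction cosine alpha'
    return [n * d * alpha_prime / c for n in range(n_mics)]
```

For a 4-element array with 2 cm spacing steered to 45°, the delays grow linearly with the microphone index, as equation (8) requires.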
In summary, in the embodiment of the present application, a first ultrasonic signal is transmitted by the ultrasonic transmitting device, and at least two second ultrasonic signals, namely reflected wave signals of the first ultrasonic signal from the target object, are respectively received by at least two microphones; then, first relative position information of the target object and the camera is determined according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; finally, according to the first relative position information, the camera is controlled to focus on the target object and the microphone is controlled to record towards the target object. Therefore, the camera can be quickly and accurately focused on the target object and the microphone can be controlled to record towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
It should be noted that, in the shooting method provided in the embodiment of the present application, the execution subject may be a shooting device, or a control module in the shooting device for executing the loading shooting method. In the embodiment of the present application, a shooting device executes a loading shooting method as an example, and the shooting method provided in the embodiment of the present application is described.
In addition, based on the shooting method, an embodiment of the present application further provides a shooting device, which is specifically described in detail with reference to fig. 5.
Fig. 5 is a schematic structural diagram of a shooting device according to an embodiment of the present application.
As shown in fig. 5, the photographing apparatus 500 may include:
the transmitting module 510 is configured to transmit a first ultrasonic signal through an ultrasonic transmitting device.
A receiving module 520, configured to receive at least two second ultrasonic signals through the at least two microphones, respectively, where the second ultrasonic signals are reflected wave signals of the first ultrasonic signals reflected by the target object.
A determining module 530, configured to determine first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, where the first distance is a distance between the ultrasonic transmitting device and the camera.
The control module 540 is configured to control the camera to focus on the target object and control the microphone to record sound towards the target object according to the first relative position information.
In one possible embodiment, the second ultrasonic signal includes time information from the emission of the first ultrasonic signal to the reception of the second ultrasonic signal, and the determining module 530 is specifically configured to:
and determining second relative position information of the ultrasonic transmitting device and the target object according to the at least two pieces of time information.
And determining first relative position information according to the second relative position information and the first distance.
In one possible embodiment, the determining module 530 includes:
the first acquisition module is used for acquiring a second distance between each microphone and the ultrasonic wave transmitting device;
the determining module 530 is specifically configured to: and determining a third distance and a first angle according to the time information and the second distance, wherein the third distance is the distance between the target object and the ultrasonic transmitting device, the first angle is an included angle between a first connecting line and a normal of a display screen of the electronic equipment, and the first connecting line is the connecting line between the target object and the ultrasonic transmitting device.
In a possible embodiment, the determining module 530 is specifically configured to:
s = √(r² + d² − 2·r·d·cos α)
wherein s is the distance between the target object and the camera; d is a first distance; r is a third distance; θ is a first angle; alpha is the complement of the first angle;
φ = 90° − β, where sin β = (r·sin α)/s
phi is an included angle between a second connecting line and the normal of the display screen, and the second connecting line is a connecting line between the target object and the camera; beta is the included angle between the second connecting line and the display screen.
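The formulas referenced in this embodiment amount to solving the transmitter-camera-target triangle. The sketch below evaluates one consistent reading of them, using the law of cosines for s and the law of sines for β; the function name and the planar geometry (transmitter and camera lying in the screen plane) are assumptions of this example, not a verbatim reproduction of the patent's figures.

```python
import math

def camera_relative_position(r, d, theta):
    """Solve the transmitter-camera-target triangle.

    r     : third distance, target to ultrasonic transmitter (m)
    d     : first distance, transmitter to camera along the screen (m)
    theta : first angle, first connecting line vs. screen normal (rad)

    Returns (s, phi): the target-to-camera distance and the included angle
    between the second connecting line and the screen normal.
    """
    alpha = math.pi / 2.0 - theta                  # complement of the first angle
    s = math.sqrt(r * r + d * d - 2.0 * r * d * math.cos(alpha))  # law of cosines
    beta = math.asin(r * math.sin(alpha) / s)      # law of sines: angle to the screen
    phi = math.pi / 2.0 - beta                     # angle to the screen normal
    return s, phi
```

Because β is the angle between a line and a plane, it always lies in [0°, 90°], so the arcsine branch is the correct one even when the interior triangle angle at the camera is obtuse.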
In one possible embodiment, the control module 540 includes:
and the second acquisition module is used for acquiring the position information of the target object relative to the at least two microphones.
The determining module 530 is further configured to determine a preset sound reception angle according to the position information of the microphone.
And the recording module is used for recording the target object based on the preset reception angle.
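The patent does not spell out how recording "towards" the target is realized; one common realization with two or more microphones is delay-and-sum beamforming steered to the preset sound reception angle. The sketch below is such a realization under assumed names and a fixed speed of sound, not the patent's own algorithm.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed)

def steer_delays(mic_positions, angle):
    """Per-microphone delays (seconds) that align a plane wave arriving
    from `angle` (radians from the screen normal, positive toward +x).

    mic_positions: microphone x-offsets (metres) along the screen edge.
    A wavefront from `angle` reaches a mic at offset x earlier by
    x*sin(angle)/c, so delaying that channel by the same amount aligns it.
    """
    delays = [x * math.sin(angle) / SPEED_OF_SOUND for x in mic_positions]
    base = min(delays)
    return [t - base for t in delays]  # shift so every delay is >= 0

def delay_and_sum(channels, delays, fs):
    """Steer the recording: delay each channel by the rounded sample
    count for its steering delay, then average the aligned channels."""
    out_len = min(len(ch) for ch in channels)
    shifted = []
    for ch, t in zip(channels, delays):
        n = round(t * fs)
        shifted.append([0.0] * n + list(ch[: out_len - n]))
    return [sum(s[i] for s in shifted) / len(shifted) for i in range(out_len)]
```

Signals arriving from the preset angle add coherently while off-axis sound is attenuated, which is what recording towards the target object asks for; a product implementation would use fractional-delay filters rather than whole-sample rounding.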
In summary, the shooting device provided in the embodiments of the present application transmits a first ultrasonic signal through an ultrasonic transmitting device, and receives, through at least two microphones, at least two second ultrasonic signals reflected by a target object; then determines first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; and finally, according to the first relative position information, controls the camera to focus on the target object and controls the microphones to record towards the target object. In this way, the camera can be focused on the target object quickly and accurately and the microphones can be steered towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
The shooting device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited thereto.
The shooting device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The shooting device provided in the embodiment of the present application can implement each process implemented by the shooting device in the method embodiments of fig. 2 to 4, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 6, an electronic device 600 is further provided in this embodiment of the present application, and includes a processor 601, a memory 602, and a program or an instruction stored in the memory 602 and executable on the processor 601, where the program or the instruction, when executed by the processor 601, implements each process of the above shooting method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 7 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Among them, the input unit 704 may include a graphic processor 7041 and a microphone 7042; the display unit 706 may include a display panel 7061; the user input unit 707 may include a touch panel 7071 and other input devices 7072; memory 709 may include applications and an operating system.
Those skilled in the art will appreciate that the electronic device 700 may further comprise a power supply 711 (e.g., a battery) for supplying power to the various components; the power supply 711 may be logically connected to the processor 710 via a power management system, so as to implement charging, discharging, and power-consumption management functions via the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or use a different arrangement of components, which is not described here again.
A radio frequency unit 701 for transmitting a first ultrasonic signal through the ultrasonic transmitting device.
The radio frequency unit 701 is configured to receive at least two second ultrasonic signals through the at least two microphones, respectively, where the second ultrasonic signals are reflected waves of the first ultrasonic signals reflected by the target object.
And a processor 710 for determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is a distance between the ultrasonic wave transmitting device and the camera.
And the processor 710 is configured to control the camera to focus on the target object and control the microphone to record the sound towards the target object according to the first relative position information.
Optionally, the second ultrasonic signal includes time information from the emission of the first ultrasonic signal to the reception of the second ultrasonic signal, and the processor 710 is further configured to:
and determining second relative position information of the ultrasonic transmitting device and the target object according to the at least two pieces of time information.
And determining first relative position information according to the second relative position information and the first distance.
Optionally, the network module 702 is configured to obtain a second distance between each microphone and the ultrasonic wave emitting device.
The processor 710 is further configured to determine a third distance and a first angle according to the time information and the second distance, where the third distance is the distance between the target object and the ultrasonic transmitting device, the first angle is the included angle between a first connecting line and a normal of the display screen of the electronic device, and the first connecting line is the connecting line between the target object and the ultrasonic transmitting device.
Optionally, the processor 710 is further configured to:
s = √(r² + d² − 2·r·d·cos α)
wherein s is the distance between the target object and the camera; d is a first distance; r is a third distance; θ is a first angle; alpha is the complement of the first angle;
φ = 90° − β, where sin β = (r·sin α)/s
phi is an included angle between a second connecting line and the normal of the display screen, and the second connecting line is a connecting line between the target object and the camera; beta is the included angle between the second connecting line and the display screen.
Optionally, the network module 702 is configured to obtain position information of the target object with respect to at least two microphones.
The processor 710 is further configured to determine a preset sound reception angle according to the position information of the microphone.
The microphone 7042 is configured to record the target object based on a preset sound reception angle.
In the embodiments of the present application, a first ultrasonic signal is transmitted by an ultrasonic transmitting device, and at least two second ultrasonic signals reflected by a target object are received by at least two microphones, respectively; then, first relative position information of the target object and the camera is determined according to the at least two second ultrasonic signals and a first distance between the ultrasonic transmitting device and the camera; and finally, according to the first relative position information, the camera is controlled to focus on the target object and the microphones are controlled to record towards the target object. In this way, the camera can be focused on the target object quickly and accurately and the microphones can be steered towards the target object without manual focusing by the user, which improves the convenience of shooting the target object.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above shooting method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed in a substantially simultaneous manner or in a reverse order, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A shooting method, applied to an electronic device, characterized in that the electronic device comprises an ultrasonic transmitting device, at least two microphones, and a camera, the ultrasonic transmitting device is configured to transmit ultrasonic signals, and the method comprises the following steps:
transmitting a first ultrasonic signal by the ultrasonic transmitting device;
receiving at least two second ultrasonic signals through the at least two microphones respectively, wherein the second ultrasonic signals are reflected wave signals of the first ultrasonic signals after the first ultrasonic signals are reflected by a target object;
determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, wherein the first distance is a distance between the ultrasonic transmitting device and the camera;
and controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information.
2. The method of claim 1, wherein the second ultrasonic signal comprises time information from the emission of the first ultrasonic signal to the reception of the second ultrasonic signal, and the determining first relative position information of the target object and the camera according to the at least two second ultrasonic signals and the first distance comprises:
determining second relative position information of the ultrasonic transmitting device and the target object according to at least two pieces of time information;
and determining the first relative position information according to the second relative position information and the first distance.
3. The method according to claim 2, wherein said determining second relative position information of the ultrasound emitting device and the target object from the at least two pieces of time information comprises:
acquiring a second distance between each microphone and the ultrasonic wave transmitting device;
determining a third distance and a first angle according to the time information and the second distance, wherein the third distance is the distance between the target object and the ultrasonic transmitting device, the first angle is an included angle between a first connecting line and a normal of a display screen of the electronic device, and the first connecting line is a connecting line between the target object and the ultrasonic transmitting device.
4. The method according to claim 3, wherein the determining the first relative position information according to the second relative position information and the first distance comprises determining the first relative position information according to the following formulas:
s = √(r² + d² − 2·r·d·cos α)
wherein s is the distance between the target object and the camera; d is the first distance; r is the third distance; θ is the first angle; α is the complement of the first angle;
φ = 90° − β, where sin β = (r·sin α)/s
phi is an included angle between a second connecting line and the normal of the display screen, and the second connecting line is a connecting line between the target object and the camera; beta is the included angle between the second connecting line and the display screen.
5. The method of claim 1, wherein controlling the microphone to record towards the target object comprises:
acquiring position information of the target object relative to the at least two microphones;
determining a preset sound reception angle according to the position information of the at least two microphones;
and recording the target object based on the preset sound reception angle.
6. A shooting device, applied to an electronic device, characterized in that the electronic device comprises an ultrasonic transmitting device, at least two microphones, and a camera, the ultrasonic transmitting device is configured to transmit ultrasonic signals, and the shooting device comprises:
the transmitting module is used for transmitting a first ultrasonic signal through the ultrasonic transmitting device;
a receiving module, configured to receive at least two second ultrasonic signals through the at least two microphones, respectively, where the second ultrasonic signals are reflected wave signals of the first ultrasonic signals reflected by a target object;
a determining module, configured to determine first relative position information of the target object and the camera according to the at least two second ultrasonic signals and a first distance, where the first distance is a distance between the ultrasonic transmitting device and the camera;
and the control module is used for controlling the camera to focus on the target object and controlling the microphone to record towards the target object according to the first relative position information.
7. The apparatus of claim 6, wherein the second ultrasonic signal comprises time information from the sending of the first ultrasonic signal to the receiving of the second ultrasonic signal, and the determining module is specifically configured to:
determining second relative position information of the ultrasonic transmitting device and the target object according to at least two pieces of time information;
and determining the first relative position information according to the second relative position information and the first distance.
8. The apparatus of claim 7, wherein the determining module comprises:
the first acquisition module is used for acquiring a second distance between each microphone and the ultrasonic wave transmitting device;
the determining module is specifically configured to determine a third distance and a first angle according to the time information and the second distance, wherein the third distance is the distance between the target object and the ultrasonic transmitting device, the first angle is an included angle between a first connecting line and a normal of a display screen of the electronic device, and the first connecting line is a connecting line between the target object and the ultrasonic transmitting device.
9. The apparatus of claim 8, wherein the determining module is specifically configured to determine the first relative position information according to the following formulas:
s = √(r² + d² − 2·r·d·cos α)
wherein s is the distance between the target object and the camera; d is the first distance; r is the third distance; θ is the first angle; α is the complement of the first angle;
φ = 90° − β, where sin β = (r·sin α)/s
phi is an included angle between a second connecting line and the normal of the display screen, and the second connecting line is a connecting line between the target object and the camera; beta is the included angle between the second connecting line and the display screen.
10. The apparatus of claim 6, wherein the control module comprises:
the second acquisition module is used for acquiring the position information of the target object relative to the at least two microphones;
the determining module is further configured to determine a preset sound reception angle according to the position information of the microphones; and
the recording module is configured to record the target object based on the preset sound reception angle.
CN202110465941.2A 2021-04-28 2021-04-28 Shooting method and device Pending CN113225478A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110465941.2A CN113225478A (en) 2021-04-28 2021-04-28 Shooting method and device


Publications (1)

Publication Number Publication Date
CN113225478A true CN113225478A (en) 2021-08-06

Family

ID=77089584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110465941.2A Pending CN113225478A (en) 2021-04-28 2021-04-28 Shooting method and device

Country Status (1)

Country Link
CN (1) CN113225478A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013108921A (en) * 2011-11-24 2013-06-06 Panasonic Corp Ultrasonic sensor system
CN105208287A (en) * 2015-10-15 2015-12-30 广东欧珀移动通信有限公司 Photographing method and device
CN105827937A (en) * 2015-08-28 2016-08-03 维沃移动通信有限公司 Imaging device and imaging method
CN107809596A (en) * 2017-11-15 2018-03-16 重庆科技学院 Video conference tracking system and method based on microphone array
CN108989687A (en) * 2018-09-07 2018-12-11 北京小米移动软件有限公司 camera focusing method and device
CN109932054A (en) * 2019-04-24 2019-06-25 北京耘科科技有限公司 Wearable Acoustic detection identifying system
CN110389597A (en) * 2018-04-17 2019-10-29 北京京东尚科信息技术有限公司 Camera method of adjustment, device and system based on auditory localization
CN110495185A (en) * 2018-03-09 2019-11-22 深圳市汇顶科技股份有限公司 Audio signal processing method and device
CN111641794A (en) * 2020-05-25 2020-09-08 维沃移动通信有限公司 Sound signal acquisition method and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210806