CN115623331A - Focusing control method and device, electronic equipment and storage medium - Google Patents

Focusing control method and device, electronic equipment and storage medium

Info

Publication number
CN115623331A
CN115623331A (application CN202211249628.6A)
Authority
CN
China
Prior art keywords
focusing
focus
electronic device
message
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211249628.6A
Other languages
Chinese (zh)
Inventor
刘先亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202211249628.6A priority Critical patent/CN115623331A/en
Publication of CN115623331A publication Critical patent/CN115623331A/en
Pending legal-status Critical Current

Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses a focusing control method and device, electronic equipment and a storage medium, and belongs to the technical field of communication. The method comprises the following steps: receiving a first input of a user, the first input being used to determine a first focus object of a second electronic device; in response to the first input, sending a first message to the second electronic device; the first message includes information of the first focus object.

Description

Focusing control method and device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of communication, and particularly relates to a focusing control method and device, electronic equipment and a storage medium.
Background
At present, when a user shoots with multiple camera positions (multiple shooting devices placed at different positions) and needs to control the shooting devices to focus, the user generally has to set focusing on the shooting device at the main position and then control the shooting devices at the other auxiliary positions to focus. Alternatively, the shooting devices at the main and auxiliary positions are all set to auto focus, and each device automatically focuses on the person in its preview area according to its own focusing capability.
Such multi-position focusing control schemes suffer from complex operation and low shooting efficiency.
Disclosure of Invention
The embodiment of the application aims to provide a focusing control method, a focusing control device, electronic equipment and a storage medium, which can simplify multi-camera focusing operation and improve shooting efficiency.
In a first aspect, an embodiment of the present application provides a focus control method, where the method includes:
receiving a first input of a user, the first input being used to determine a first focus object of a second electronic device;
in response to the first input, sending a first message to the second electronic device; the first message includes information of the first focus object.
In a second aspect, an embodiment of the present application provides a focus control method, including:
receiving a first message sent by first electronic equipment; the first message contains information of a first focusing object;
focusing is performed based on the information of the first focusing object.
In a third aspect, an embodiment of the present application provides a focus control apparatus, including:
a first receiving module for receiving a first input of a user, the first input being used for determining a first focus object of a second electronic device;
a first response module for sending a first message to the second electronic device in response to the first input; the first message includes information of the first focus object.
In a fourth aspect, an embodiment of the present application provides a focus control apparatus, including:
the second receiving module is used for receiving a first message sent by the first electronic equipment; the first message contains information of a first focusing object;
and the first control module is used for focusing based on the information of the first focusing object.
In a fifth aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the method according to the first aspect or the second aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the method according to the first aspect or the second aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect or the second aspect.
In an eighth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement a method according to the first or second aspect.
In the embodiment of the application, the plurality of shooting devices are connected, and the shooting device of the auxiliary machine position is controlled to focus through the shooting device of the main machine position, so that the multi-machine position focusing operation is simplified, and the shooting efficiency is improved.
Drawings
Fig. 1 is a schematic flowchart of a focus control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a focus control logic provided in an embodiment of the present application;
FIG. 3 is a schematic interface diagram of a focus control method according to an embodiment of the present disclosure;
FIG. 4 is a second schematic interface diagram of a focus control method according to an embodiment of the present disclosure;
FIG. 5 is a third schematic interface diagram of a focus control method according to an embodiment of the present disclosure;
FIG. 6 is a fourth schematic interface diagram of a focus control method according to an embodiment of the present disclosure;
FIG. 7 is a fifth schematic view of an interface of a focus control method according to an embodiment of the present disclosure;
FIG. 8 is a sixth schematic view of an interface of a focusing control method according to an embodiment of the present disclosure;
FIG. 9 is a seventh schematic interface diagram illustrating a focus control method according to an embodiment of the present disclosure;
FIG. 10 is an eighth schematic interface diagram of a focusing control method according to an embodiment of the present application;
FIG. 11 is a second flowchart illustrating a focus control method according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of a focus control device according to an embodiment of the present disclosure;
FIG. 13 is a second schematic structural diagram of a focus control apparatus according to an embodiment of the present disclosure;
fig. 14 is a schematic hardware structure diagram of a first electronic device provided in an embodiment of the present application;
fig. 15 is a hardware structure diagram of a second electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
At present, when a user shoots with multiple camera positions and needs multiple shooting devices to focus on the main subject in the preview, the user generally has to set focusing on the device at the main position and then control the devices at the other auxiliary positions to focus. Alternatively, the devices at the main and auxiliary positions are all set to auto focus, and each device automatically focuses on the person in its preview area according to its own focusing capability.
Both multi-position focusing control schemes have drawbacks. In the first scheme, after the main device is focused, the focusing parameters of each auxiliary device must be adjusted individually; especially when there are many auxiliary devices or they are far apart, this is inconvenient for the user, the operation is complex, and shooting efficiency is low. In the second scheme, each device automatically focuses on the person in its own preview area according to its own capability, which may produce inconsistent focusing effects: for example, the main device focuses on a person while an auxiliary device focuses on the background.
Based on the technical problem, the application provides a focusing control method, a focusing control device, an electronic device and a storage medium, which are applied to a shooting device.
The following describes in detail a focus control method, a focus control device, an electronic apparatus, and a storage medium according to embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a focus control method provided in an embodiment of the present application. As shown in fig. 1, the embodiment provides a focus control method applied to a first electronic device. The first electronic device may be the shooting device at the main position in a multi-position shooting scene, such as a smartphone, a camera, or a video camera; the following description takes a video camera as an example. The method includes:
step 101, receiving a first input of a user, wherein the first input is used for determining a first focus object of a second electronic device.
The second electronic device may be a shooting device at an auxiliary position in a multi-position shooting scene, such as a smartphone, a camera, or a video camera; a video camera is taken as an example below. There may be one or more shooting devices at auxiliary positions.
Specifically, fig. 2 is a schematic diagram of the focus control logic provided in an embodiment of the present application. As shown in fig. 2, after the user establishes connections among the shooting devices at the multiple positions, the main camera A is aimed at the scene to be shot, its preview area being P_A; the secondary cameras B, C, etc. are placed in sequence, with respective preview areas P_B, P_C, and so on.
The connection between the multi-camera shooting devices can be through wired connection or wireless connection.
On the main camera a, the user can focus the photographic subject as needed.
The user finds the first focusing object (an object or a person) a in the preview area P_A and performs a focusing operation. For example, fig. 3 is one of the interface schematic diagrams of the focusing control method provided in the embodiment of the present application. As shown in fig. 3, focusing on the main camera A may be triggered by clicking the first focusing object a; after focusing is completed, the focusing frame displays the main camera's focusing-complete state. When the user needs to control a secondary camera to focus, the user may perform the first input on the main camera A to determine the focusing object of the secondary camera; for example, the user may click, double-click, or tap the first focusing object a.
Step 102, responding to the first input, and sending a first message to the second electronic equipment; the first message includes information of the first focus object.
Specifically, after the main camera A receives the first input of the user, it sends a first message containing information of the first focusing object to the secondary cameras. For example, fig. 4 is a second interface schematic diagram of the focus control method provided in the embodiment of the present application. As shown in fig. 4, double-clicking the first focusing object a may trigger the main camera A to send a synchronous focusing (also called "focus tracking") request message (the first message) to the secondary camera B and the secondary camera C, respectively.
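As an illustrative sketch only (the field names and the JSON encoding are assumptions, not part of this disclosure), the first message carrying the focusing object's information might be serialized as follows:

```python
import json

def build_focus_request(object_id: str, feature_points: list[float]) -> str:
    """Build the 'first message' carrying the focusing object's info.

    The disclosure only states that the message contains an identification
    and/or feature points of the first focusing object; everything else
    here is an assumed encoding.
    """
    payload = {
        "type": "sync_focus_request",
        "focus_object": {
            "id": object_id,             # identification of the focus object
            "features": feature_points,  # feature points of the focus object
        },
    }
    return json.dumps(payload)

def parse_focus_request(message: str) -> dict:
    """Parse the message on the secondary camera's side."""
    return json.loads(message)
```

A main camera would call `build_focus_request` once per secondary camera; each secondary camera would call `parse_focus_request` on receipt.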
According to the embodiment of the application, the plurality of shooting devices are connected, and the shooting device of the auxiliary machine position is controlled to focus through the shooting device of the main machine position, so that the multi-machine position focusing operation is simplified, and the shooting efficiency is improved.
Optionally, the information of the first focus object comprises at least one of:
an identification of a first focus object;
characteristic points of the first object of focus.
Specifically, when there are multiple persons in the preview area of the main camera A and the user determines multiple focusing objects (first focusing objects) through the main camera A, they can be distinguished by the identification of each first focusing object.
The feature points of the first focusing object record the features of the first focusing object; by comparing the feature points of two persons, it can be determined whether they are the same person.
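The feature-point comparison above could, assuming vector-valued feature points, be sketched as a simple similarity test. Cosine similarity here is a hypothetical stand-in for whatever matching method the devices actually implement:

```python
import math

def same_person(features_a: list[float], features_b: list[float],
                threshold: float = 0.9) -> bool:
    """Decide whether two feature-point vectors describe the same person.

    Cosine similarity with an assumed threshold; purely illustrative.
    """
    dot = sum(x * y for x, y in zip(features_a, features_b))
    na = math.sqrt(sum(x * x for x in features_a))
    nb = math.sqrt(sum(y * y for y in features_b))
    if na == 0 or nb == 0:
        return False  # a zero vector matches nothing
    return dot / (na * nb) >= threshold
```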
According to the embodiment of the application, the user can distinguish a plurality of focusing objects determined by the main camera through the identification of the focusing object, and control the auxiliary camera to focus, so that the focusing efficiency is further improved.
Optionally, after the first electronic device sends the first message to the second electronic device, the second electronic device receives the first message sent by the first electronic device; the first message contains information of a first focusing object; and the second electronic device focuses based on the first message.
Optionally, the focusing based on the information of the first focusing object includes:
determining feature points of the first focus object based on the information of the first focus object;
judging whether the first focusing object exists in a preview area based on the characteristic points of the first focusing object;
focusing the first focus object when the first focus object exists in the preview area.
Optionally, focusing the first focused object includes:
determining position information of the first focus object in a preview area;
determining a focal distance of the first in-focus object based on the location information.
For example, fig. 5 is a third schematic interface diagram of the focusing control method provided in the embodiment of the present application. As shown in fig. 5, after receiving the first message, the secondary camera B obtains the feature point information of the first focusing object a and compares it against its corresponding preview area P_B to determine whether the feature points of some person in the preview area P_B match the feature points of the first focusing object a.
If the feature points of a person in the preview area P_B match the feature points of the first focusing object a, it is determined that this person is the first focusing object a; that is, the first focusing object a exists in the preview area P_B.
When the first focusing object a exists in the preview area P_B, the secondary camera B performs synchronous focusing on the first focusing object a.
First, the secondary camera B calculates, from the area position of its preview area P_B, the position coordinates F_Ba of the first focusing object a within P_B; then, based on the position coordinates F_Ba, the focal distance to the first focusing object a is determined.
For example, after the secondary camera B determines the position information of the first focusing object a in the preview area P_B, a focus tracking operation can be triggered: the focus tracking process is executed at the corresponding coordinate position F_Ba, realizing synchronous focusing on the first focusing object a.
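The two steps above, locating the matched object in the preview area and deriving coordinates at which to run focus tracking, might be sketched as follows. The bounding-box input, the normalized-coordinate output, and all names are assumptions for illustration:

```python
def locate_in_preview(feature_match_bbox: tuple, preview_size: tuple) -> tuple:
    """Return the normalized centre coordinates (an F_Ba-like value) of a
    matched object inside the secondary camera's preview area.

    feature_match_bbox: (x, y, w, h) of the match, in pixels (assumed).
    preview_size: (width, height) of the preview area, in pixels (assumed).
    """
    x, y, w, h = feature_match_bbox
    pw, ph = preview_size
    # Centre of the bounding box, normalized to [0, 1] in each axis.
    return ((x + w / 2) / pw, (y + h / 2) / ph)
```

The normalized coordinates would then be handed to the device's focus-tracking routine, which determines the focal distance at that position.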
According to the embodiment of the application, the auxiliary camera can be focused quickly by utilizing the focus tracking technology, and the focusing efficiency is further improved.
Optionally, focusing the first focused object, further includes:
and when the first focusing object does not exist in the preview area, performing automatic focusing.
Specifically, as shown in fig. 5, after receiving the first message, the secondary camera C obtains the feature point information of the first focusing object a and compares it against its corresponding preview area P_C to determine whether the feature points of any person in the preview area P_C match the feature points of the first focusing object a.
If no person in the preview area P_C has feature points matching those of the first focusing object a, it is determined that the first focusing object a does not exist in the preview area P_C.
When the first focusing object a does not exist in the preview area P_C, the secondary camera C performs auto focusing.
According to the embodiment of the application, the auxiliary camera performs automatic focusing under the condition that the focusing is unsuccessful, so that the picture shot by the auxiliary camera is prevented from being fuzzy.
Optionally, after the first electronic device sends the first message to the second electronic device, the method further includes:
receiving a second message sent by the second electronic equipment;
and displaying a focusing result of the second electronic equipment according to the second message.
Optionally, after the second electronic device receives the first message sent by the first electronic device, the method further includes:
sending a second message to the first electronic device; the second message includes a focusing result of the second electronic device.
Specifically, as shown in fig. 5, after the secondary camera B achieves synchronous focusing on the first focusing object a, it sends a focusing result/focusing status message to the main camera A; this message contains information that synchronous focusing on the first focusing object a was achieved successfully.
After the secondary camera C completes auto focusing, it sends a focusing result message to the main camera A; this message contains information that synchronous focusing on the first focusing object a was not achieved.
After the main camera A receives the second messages, it displays the focusing results of the secondary camera B and the secondary camera C based on them.
The focusing result may be displayed via the focusing frame of the first focusing object a. For example, the identifiers of the secondary cameras that achieved synchronous focusing are displayed while those that did not are hidden; or the identifiers of successfully synchronized secondary cameras are displayed in a first pattern (color, shape, etc.) and the unsuccessful ones in a second pattern.
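A minimal sketch of mapping per-camera focusing results to display styles on the main camera's focusing frame (the style names are assumptions; the disclosure only requires that successful and unsuccessful cameras be displayed differently):

```python
def focus_frame_styles(results: dict) -> dict:
    """Map each secondary camera's focusing result (True = synchronous
    focusing succeeded) to an assumed display style for its identifier
    on the main camera's focusing frame.
    """
    return {
        cam: ("highlighted" if ok else "dimmed")
        for cam, ok in results.items()
    }
```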
According to the embodiment of the application, the auxiliary camera feeds back the focusing result to the main camera in time, so that a user can know whether the auxiliary camera is successfully focused or not in time, and the focusing efficiency is further improved.
Optionally, after sending the first message to the second electronic device, the method further includes:
receiving a second input of the user; the second input is used to determine an updated second focus object of the second electronic device;
sending a third message to the second electronic device; the third message includes information of the second object of focus.
Optionally, after receiving the first message sent by the first electronic device, the method further includes:
receiving a third message sent by the first electronic equipment; the third message comprises information of a second focusing object;
focusing is performed based on the information of the second focusing object.
Specifically, fig. 6 is a fourth schematic interface diagram of the focusing control method provided in the embodiment of the present application. As shown in fig. 6, after the secondary camera B achieves synchronous focusing on the first focusing object a, if the user wishes to cancel this result, the user may perform on the main camera A an input indicating that the focusing result of the secondary camera B should be updated. After receiving the user's input, the main camera A sends a message to the secondary camera B instructing it to update its focusing result; after receiving the message, the secondary camera B updates its own focusing result accordingly. The focusing result may be updated to auto focusing or to not focusing.
According to the embodiment of the application, the user can update the focusing object of the auxiliary camera through the main camera, and the focusing efficiency is further improved.
Optionally, the second input is an operation of a user dragging the identifier associated with the second electronic device to the second focus object.
Specifically, on the main camera A the user can long-press the focusing frame of the first focusing object a to wake up the focusing control, hold the identifier of the corresponding secondary camera B on that focusing frame, and (quickly) slide it outside the preview area P_A; this sends a cancel-focusing command to the secondary camera B, instructing it to cancel the specific focusing and switch to auto focusing. After the secondary camera B completes auto focusing, it sends an updated focusing result message to the main camera A, and the main camera A displays the updated focusing result of the secondary camera B.
Optionally, after the user updates the focusing result of the secondary camera B to auto focusing through the above input, if the auto focusing area of the secondary camera B lies within the preview area of the main camera A, the updated focusing result message sent by the secondary camera B includes the feature points of the auto focusing area, or of the auto focusing object within it. After receiving this message, the main camera A displays in its preview interface the focusing frame of the secondary camera B's auto focusing area (or of the auto focusing object within it), and this focusing frame moves as the auto focusing object moves. If the secondary camera B needs to focus synchronously again, the user only needs to drag this focusing frame on the main camera A onto the second focusing object to be focused synchronously.
If the auto focusing area of the secondary camera B is not within the preview area of the main camera A, the auto focusing frame of the secondary camera B is not displayed in the main camera A's preview interface. If the secondary camera B needs to be controlled to focus synchronously again, the steps of the above embodiment can be repeated; that is, double-clicking the target object triggers the main camera A to send a synchronous focusing request message to the secondary camera B and the secondary camera C, respectively.
After all the cameras finish focusing, photographing or video recording can be carried out.
Alternatively, the respective sub-cameras may also be controlled to focus on different target objects, respectively, through the first input.
Fig. 7 is a fifth schematic interface diagram of a focus control method provided in an embodiment of the present application. As shown in fig. 7, on the main camera A the user can focus on a target object as required, for example by finding the first focusing object a in the preview area P_A and performing a focusing operation. Then, long-pressing the first focusing object a wakes up the focusing control (focusing frame), with the identifiers of the cameras displayed on its right side. Dragging the identifier corresponding to the main camera A controls the focusing area of the main camera A; the identifier corresponding to the secondary camera B can also be dragged, in which case the main camera A, after receiving the user's input, sends a message to the secondary camera B to control the focusing area of the secondary camera B.
Fig. 8 is a sixth schematic interface diagram of the focusing control method according to an embodiment of the present application. As shown in fig. 8, the user drags the identifier corresponding to the secondary camera B onto the second focusing object b. After receiving this input, the main camera A sends a message to the secondary camera B that includes the feature points of the second focusing object b. After receiving the message, the secondary camera B obtains the feature point information of the second focusing object b and compares it against its corresponding preview area P_B. If the second focusing object b does not exist in the preview area P_B, auto focusing is performed; if the second focusing object b does exist, the position coordinates F_Bb are calculated from the preview area P_B accordingly. For the specific implementation, reference may be made to the synchronous focusing on the first focusing object a in the foregoing embodiment, which is not repeated here.
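On the secondary camera's side, handling such an update message might be sketched as a match-or-fallback decision. All names are assumptions, and the matcher is supplied externally rather than specified by the disclosure:

```python
def handle_update_message(new_features, preview_objects, matcher):
    """Decide the secondary camera's action after receiving a message
    naming a new focusing object.

    new_features: feature points from the message (assumed representation).
    preview_objects: {object_id: feature_vector} currently visible in the
        secondary camera's preview area (assumed representation).
    matcher: callable deciding whether two feature vectors match.

    Returns ("sync_focus", object_id) on a match, else ("auto_focus", None).
    """
    for obj_id, feats in preview_objects.items():
        if matcher(new_features, feats):
            return ("sync_focus", obj_id)
    return ("auto_focus", None)
```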
According to the embodiment of the application, the user can drag the focusing object in the main camera to update the focusing object of the auxiliary camera, so that the focusing efficiency is further improved.
Alternatively, the size of the in-focus area of each sub camera may also be controlled by the first input.
Optionally, after sending the first message to the second electronic device, the method further includes:
receiving a third input of a user, wherein the third input is used for determining an updated focusing area of the second electronic device;
sending a fourth message to the second electronic device; the fourth message includes information of a focusing area of the second electronic device.
Fig. 9 is a seventh interface schematic diagram of the focusing control method according to an embodiment of the present application. As shown in fig. 9, the identifier corresponding to the secondary camera B on the main camera A's preview interface can be moved around the focusing frame: for example, clockwise movement may linearly enlarge the focusing area of the secondary camera B, while counterclockwise movement may linearly shrink it. Modifying the size of the focusing area changes the definition of the focused object.
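The clockwise/counterclockwise resizing could be sketched as a linear scaling of the focus-area size. The gain value, the degree units, and the linear model are all assumptions; the disclosure only states that the change is linear in the rotation:

```python
def resize_focus_area(size: float, degrees: float, gain: float = 0.01) -> float:
    """Scale a focus-area size linearly with rotation around the focusing
    frame: positive degrees (clockwise, by assumption) enlarge the area,
    negative degrees (counterclockwise) shrink it.
    """
    scaled = size * (1.0 + gain * degrees)
    return max(scaled, 0.0)  # the area can never go negative
```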
According to the embodiment of the application, the user can adjust the focusing area of the auxiliary camera in the main camera, and the focusing efficiency is further improved.
Alternatively, the linkage between the respective sub-cameras may also be controlled by the first input.
Optionally, after performing the auto-focusing, the method further includes:
focusing the first focus object when the first focus object appears in the preview area.
Fig. 10 is an eighth interface schematic diagram of the focusing control method according to an embodiment of the present application. As shown in fig. 10, the identifiers corresponding to the secondary camera B and the secondary camera C are displayed in the main camera A's preview interface; the secondary camera B is focused on the first focusing object a, and the secondary camera C is focused on the second focusing object b. The user can drag the identifier corresponding to the secondary camera B onto the identifier corresponding to the secondary camera C to form an implicit link. After receiving this input, the main camera A sends a message to the secondary camera B that includes the feature points of the second focusing object b; after receiving it, the secondary camera B records these feature points. If the user then moves the position or orientation of the secondary camera B, the secondary camera B automatically compares the feature points of the second focusing object b recorded before the move, and if the second focusing object b is matched in the new area, synchronous focusing is performed automatically without the secondary camera B requiring a fresh focusing operation.
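The record-and-rematch behaviour after moving the secondary camera might be sketched as a small tracker that remembers the focusing object's feature points and re-checks them on every new preview frame. All names and representations are assumptions:

```python
class SecondaryCameraTracker:
    """Illustrative sketch: a secondary camera remembers the feature
    points of its focusing object and, on each new preview frame after a
    move, re-matches them so synchronous focusing resumes automatically.
    """

    def __init__(self, matcher):
        self.matcher = matcher   # callable: (features, features) -> bool
        self.recorded = None     # feature points of the last focus object

    def record(self, features):
        """Store the feature points received from the main camera."""
        self.recorded = features

    def on_new_frame(self, visible_objects):
        """visible_objects: {object_id: feature_vector} in the new area."""
        if self.recorded is None:
            return ("auto_focus", None)
        for obj_id, feats in visible_objects.items():
            if self.matcher(self.recorded, feats):
                return ("sync_focus", obj_id)
        return ("auto_focus", None)
```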
In the embodiment of the application, after being moved, the auxiliary camera automatically compares the previously recorded feature points of the focusing object; if the focusing object is matched in the moved preview area, synchronous focusing is performed automatically, without the auxiliary camera performing a focusing operation again, which further improves focusing efficiency.
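The feature-point comparison performed by the sub-camera after it is moved can be sketched with binary descriptors compared by Hamming distance, a common representation for image feature points. This is a minimal illustration, not the patented implementation; the descriptor format and the distance and ratio thresholds are assumptions:

```python
def hamming(d1, d2):
    """Hamming distance between two equal-length binary descriptors (bytes)."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def object_present(stored, candidates, max_dist=20, min_ratio=0.6):
    """Return True if enough of the descriptors recorded before the camera
    moved find a close match among the descriptors extracted from the
    moved preview frame, i.e. the focused object is still in view."""
    if not stored:
        return False
    hits = 0
    for s in stored:
        if any(hamming(s, c) <= max_dist for c in candidates):
            hits += 1
    return hits / len(stored) >= min_ratio
```

When `object_present` returns True, the sub-camera would proceed directly to synchronous focusing instead of refocusing from scratch.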
Fig. 11 is a second flowchart of a focus control method according to an embodiment of the present application. As shown in fig. 11, the embodiment of the present application provides a focus control method applied to a second electronic device, where the second electronic device may be a secondary shooting device in a multi-device shooting scene, for example a smart phone, a camera, a video camera, and the like. The method includes:
step 1101, receiving a first message sent by a first electronic device; the first message contains information of a first focusing object;
step 1102, focusing is carried out based on the information of the first focusing object.
Optionally, focusing based on the information of the first focusing object includes:
determining feature points of the first focus object based on the information of the first focus object;
judging whether the first focusing object exists in a preview area based on the characteristic point of the first focusing object;
focusing the first focus object when the first focus object exists in the preview area.
Optionally, focusing the first focused object includes:
determining position information of the first focus object in a preview area;
determining a focal distance of the first in-focus object based on the location information.
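The patent does not specify how the position information is converted into a focal distance. One common approach, sketched below under the assumption that a per-pixel depth estimate (e.g. from phase-detection or time-of-flight data) is available for the preview frame, is to take a robust statistic of the depth inside the object's position box:

```python
def focal_distance_at(depth_map, box):
    """Median depth inside the focused object's bounding box in the preview.

    depth_map: 2-D list of per-pixel depth estimates (row-major);
    box: (x0, y0, x1, y1) position of the first focused object, with the
         usual half-open convention (x1, y1 excluded).
    The median resists outliers such as background pixels at the box edge.
    """
    x0, y0, x1, y1 = box
    samples = [depth_map[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    samples.sort()
    return samples[len(samples) // 2]
```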
Optionally, focusing the first focused object, further comprises:
and when the first focusing object does not exist in the preview area, performing automatic focusing.
Optionally, after performing the auto-focusing, the method further includes:
focusing the first in-focus object when the first in-focus object appears in the preview area.
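Steps 1101-1102 together with the optional branches above (focus when the object is present, auto-focus otherwise, and focus once the object appears in a later preview frame) can be sketched as a small controller. All names, the message shape, and the simple set-membership matcher are illustrative assumptions, not the claimed implementation:

```python
class SubCameraFocusController:
    """Sketch of the second electronic device's handling of the first message."""

    def __init__(self, extract_preview_features, focus_at, auto_focus):
        self.extract_preview_features = extract_preview_features
        self.focus_at = focus_at
        self.auto_focus = auto_focus
        self.pending = None  # feature points still awaiting a match

    def on_first_message(self, message):
        points = message["feature_points"]  # information of the focus object
        match = self._find(points)
        if match is not None:
            self.focus_at(match)            # object exists in the preview area
            return "focused"
        self.pending = points               # object absent: fall back
        self.auto_focus()
        return "auto"

    def on_new_frame(self):
        """Re-check each preview frame; focus once the object appears."""
        if self.pending is None:
            return None
        match = self._find(self.pending)
        if match is not None:
            self.pending = None
            self.focus_at(match)
            return "focused"
        return None

    def _find(self, points):
        frame = set(self.extract_preview_features())
        hits = [p for p in points if p in frame]
        return hits if len(hits) >= max(1, len(points) // 2) else None
```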
Optionally, after receiving the first message sent by the first electronic device, the method further includes:
sending a second message to the first electronic device; the second message includes a focusing result of the second electronic device.
Optionally, after receiving the first message sent by the first electronic device, the method further includes:
receiving a third message sent by the first electronic equipment; the third message comprises information of a second focusing object;
focusing is performed based on the information of the second focusing object.
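The first, second, and third messages exchanged between the two devices can be sketched as simple message types with a dispatch function. The field names and payloads are illustrative assumptions; the patent only requires that the first and third messages carry information of a focus object and that the second carry a focusing result:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FirstMessage:    # first device -> second device: object to focus on
    object_id: str
    feature_points: List[bytes] = field(default_factory=list)

@dataclass
class SecondMessage:   # second device -> first device: focusing result
    device_id: str
    focused: bool

@dataclass
class ThirdMessage:    # first device -> second device: updated focus object
    object_id: str
    feature_points: List[bytes] = field(default_factory=list)

def handle(message):
    """Dispatch an incoming message to the appropriate action (sketch)."""
    if isinstance(message, (FirstMessage, ThirdMessage)):
        return f"focus:{message.object_id}"    # trigger (re)focusing
    if isinstance(message, SecondMessage):
        return f"result:{message.focused}"     # display focusing result
    raise TypeError(type(message))
```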
For the focusing control method provided in this embodiment of the present application, reference may be made to the embodiment of the focusing control method in which the execution subject is the first electronic device; the same technical effects can be achieved, and the parts and beneficial effects identical to those of the corresponding method embodiment are not described again here.
In the focusing control method provided by the embodiments of the present application, the execution subject may be a focusing control device. In the embodiments of the present application, a focusing control device executing the focusing control method is taken as an example to describe the focusing control device provided in the embodiments of the present application.
Fig. 12 is a schematic structural diagram of a focus control device provided in an embodiment of the present application, and as shown in fig. 12, the embodiment of the present application provides a focus control device including a first receiving module 1201 and a first responding module 1202, where:
the first receiving module 1201 is configured to receive a first input of a user, where the first input is used to determine a first focus object of a second electronic device; a first response module 1202 for sending a first message to the second electronic device in response to the first input; the first message includes information of the first focus object.
Optionally, the system further comprises a third receiving module and a display module;
the third receiving module is used for receiving a second message sent by the second electronic device;
the display module is used for displaying the focusing result of the second electronic equipment according to the second message.
Optionally, the system further comprises a fourth receiving module and a first sending module;
the fourth receiving module is used for receiving a second input of the user; the second input is used to determine an updated second focus object of the second electronic device;
the first sending module is used for sending a third message to the second electronic equipment; the third message includes information of the second object of focus.
Optionally, the second input is an operation of a user dragging the identifier associated with the second electronic device to the second focus object.
Optionally, the system further comprises a fifth receiving module and a second sending module;
the fifth receiving module is configured to receive a third input of the user, where the third input is used to determine an updated focusing area of the second electronic device;
the second sending module is used for sending a fourth message to the second electronic device; the fourth message includes information of a focusing area of the second electronic device.
Optionally, the information of the first focus object includes at least one of the following information:
an identification of a first focus object;
characteristic points of the first object of focus.
Specifically, the focus control device provided in the embodiment of the present application can implement all the method steps of the method embodiment in which the execution subject is the first electronic device, and can achieve the same technical effect; the parts and beneficial effects identical to those of that method embodiment are not described again here.
Fig. 13 is a second schematic structural diagram of a focusing control apparatus provided in the embodiment of the present application. As shown in fig. 13, the embodiment of the present application provides a focusing control apparatus, which includes a second receiving module 1301 and a first control module 1302;
the second receiving module 1301 is configured to receive a first message sent by the first electronic device; the first message contains information of a first focusing object; the first control module 1302 is configured to focus based on the information of the first focusing object.
Optionally, the first control module includes a first determining unit, a first judging unit and a first control unit;
the first determination unit is used for determining characteristic points of the first focusing object based on the information of the first focusing object;
the first judging unit is used for judging whether the first focusing object exists in a preview area or not based on the characteristic point of the first focusing object;
the first control unit is configured to focus the first focus object when the first focus object exists in the preview area.
Optionally, the first control unit comprises a first determining subunit and a second determining subunit;
the first determining subunit is configured to determine position information of the first focus object in a preview area;
the second determining subunit is configured to determine a focal distance of the first in-focus object based on the position information.
Optionally, the first control module further comprises a second control unit;
the second control unit is configured to perform auto-focusing when the first focusing object is not present in the preview area.
Optionally, the first control unit is further configured to focus the first focusing object when the first focusing object appears in the preview area.
Optionally, the apparatus further comprises a third sending module;
the third sending module is used for sending a second message to the first electronic device; the second message includes a focusing result of the second electronic device.
Optionally, the apparatus further comprises a sixth receiving module and a second control module;
the sixth receiving module is configured to receive a third message sent by the first electronic device; the third message comprises information of a second focusing object;
the second control module is used for focusing based on the information of the second focusing object.
Specifically, the focusing control apparatus provided in the embodiment of the present application can implement all the method steps of the method embodiment in which the execution subject is the second electronic device, and can achieve the same technical effect; the parts and beneficial effects identical to those of that method embodiment are not described again here.
The focus control device in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like; the embodiments of the present application are not specifically limited thereto.
The focus control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an ios operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present application.
Optionally, an electronic device is further provided in an embodiment of the present application, and includes a processor and a memory, where the memory stores a program or an instruction that can be executed on the processor, and the program or the instruction, when executed by the processor, implements the steps of the foregoing focusing control method embodiment, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 14 is a schematic hardware structure diagram of a first electronic device provided in an embodiment of the present application, and as shown in fig. 14, the electronic device 1400 includes, but is not limited to: radio unit 1401, network module 1402, audio output unit 1403, input unit 1404, sensor 1405, display unit 1406, user input unit 1407, interface unit 1408, memory 1409, and processor 1410.
Those skilled in the art will appreciate that the electronic device 1400 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1410 via a power management system, so as to implement functions such as managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 14 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently; the description is not repeated here.
It should be understood that in the embodiment of the present application, the input unit 1404 may include a Graphics Processing Unit (GPU) 14041 and a microphone 14042, and the graphics processor 14041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1406 may include a display panel 14061, and the display panel 14061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1407 includes at least one of a touch panel 14071 and other input devices 14072. The touch panel 14071 is also referred to as a touch screen. The touch panel 14071 may include two parts: a touch detection device and a touch controller. Other input devices 14072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1409 may be used to store software programs as well as various data. The memory 1409 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. Further, the memory 1409 may comprise volatile memory or non-volatile memory, or the memory 1409 may comprise both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory.
The volatile Memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, ddr SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), and a Direct Memory bus RAM (DRRAM). The memory 1409 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1410 may include one or more processing units. Optionally, the processor 1410 integrates an application processor, which mainly handles operations related to the operating system, user interface, application programs, and the like, and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may alternatively not be integrated into the processor 1410.
Wherein the user input unit 1407 is for receiving a first input of a user, the first input being for determining a first object of focus of the second electronic device;
the radio frequency unit 1401 is configured to send a first message to the second electronic device in response to the first input; the first message includes information of the first focus object.
Optionally, the radio frequency unit 1401 is further configured to receive a second message sent by the second electronic device;
the display unit 1406 is configured to display a focusing result of the second electronic device according to the second message.
Optionally, the user input unit 1407 is further configured to receive a second input by the user; the second input is used to determine an updated second focus object of the second electronic device;
the radio frequency unit 1401 is further configured to send a third message to the second electronic device; the third message includes information of the second object of focus.
Optionally, the user input unit 1407 is further configured to receive a third input of the user, where the third input is used to determine an updated focusing area of the second electronic device;
the radio frequency unit 1401 is further configured to send a fourth message to the second electronic device; the fourth message includes information of a focusing area of the second electronic device.
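The mapping from the three user inputs on the first electronic device to its outgoing messages can be sketched as follows. The event and message shapes are illustrative assumptions (the second message is incoming on this device, so it does not appear here):

```python
def primary_on_input(input_event):
    """Map a user input on the first electronic device to the outgoing
    message it triggers (sketch; field names are hypothetical)."""
    kind = input_event["kind"]
    if kind == "select_object":   # first input  -> first message
        return {"type": "first", "object": input_event["object"]}
    if kind == "drag_to_object":  # second input -> third message
        return {"type": "third", "object": input_event["object"]}
    if kind == "resize_area":     # third input  -> fourth message
        return {"type": "fourth", "area": input_event["area"]}
    raise ValueError(kind)
```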
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium; when executed by a processor, the program or the instruction implements each process of the focusing control method embodiment in which the execution subject is the first electronic device, and can achieve the same technical effect. To avoid repetition, details are not repeated here.
Fig. 15 is a schematic diagram of a hardware structure of a second electronic device provided in an embodiment of the present application, and as shown in fig. 15, the electronic device 1500 includes, but is not limited to: a radio frequency unit 1501, a network module 1502, an audio output unit 1503, an input unit 1504, a sensor 1505, a display unit 1506, a user input unit 1507, an interface unit 1508, a memory 1509, and a processor 1510.
Those skilled in the art will appreciate that the electronic device 1500 may also include a power supply (e.g., a battery) for powering the various components, which may be logically coupled to the processor 1510 via a power management system to perform functions such as managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 15 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently; the description is omitted here.
It should be understood that, in the embodiment of the present application, the input Unit 1504 may include a Graphics Processing Unit (GPU) 15041 and a microphone 15042, and the Graphics processor 15041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1506 may include a display panel 15061, and the display panel 15061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1507 includes at least one of a touch panel 15071 and other input devices 15072. A touch panel 15071 is also referred to as a touch screen. The touch panel 15071 may include two parts of a touch detection device and a touch controller. Other input devices 15072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), track controls, a mouse, and a joystick, which are not described in detail herein.
The memory 1509 may be used to store software programs as well as various data. The memory 1509 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system, application programs or instructions required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. Further, the memory 1509 may include volatile memory or nonvolatile memory, or the memory 1509 may include both volatile and nonvolatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. The volatile Memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, ddr SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), and a Direct Memory bus RAM (DRRAM). The memory 1509 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1510 may include one or more processing units. Optionally, the processor 1510 integrates an application processor, which mainly handles operations related to the operating system, user interface, application programs, and the like, and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may alternatively not be integrated into the processor 1510.
The radio frequency unit 1501 is configured to receive a first message sent by a first electronic device; the first message contains information of a first focusing object;
the processor 1510 is configured to perform focusing based on the information of the first focusing object.
Optionally, the processor 1510 is specifically configured to determine feature points of the first focus object based on the information of the first focus object;
judging whether the first focusing object exists in a preview area based on the characteristic point of the first focusing object;
focusing the first focus object when the first focus object exists in the preview area.
Optionally, processor 1510 is specifically configured to determine position information of the first focus object in the preview area;
determining a focal distance of the first in-focus object based on the location information.
Optionally, the processor 1510 is further configured to perform autofocus in case the first focusing object is not present in the preview area.
Optionally, processor 1510 is further configured to focus the first in-focus object if the first in-focus object appears in the preview area.
Optionally, the radio frequency unit 1501 is further configured to send a second message to the first electronic device; the second message includes a focusing result of the second electronic device.
Optionally, the radio frequency unit 1501 is further configured to receive a third message sent by the first electronic device; the third message comprises information of a second focusing object;
the processor 1510 is further configured to focus based on the information of the second object of focus.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium; when executed by a processor, the program or the instruction implements each process of the focusing control method embodiment in which the execution subject is the second electronic device, and can achieve the same technical effect. To avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing focusing control method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing focusing control method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (16)

1. A focusing control method, applied to a first electronic device, comprising:
receiving a first input of a user, the first input being used to determine a first focus object of a second electronic device;
in response to the first input, sending a first message to the second electronic device; the first message includes information of the first focus object.
2. The focus control method of claim 1, wherein after sending the first message to the second electronic device, further comprising:
receiving a second message sent by the second electronic equipment;
and displaying the focusing result of the second electronic equipment according to the second message.
3. The focus control method of claim 1, wherein after sending the first message to the second electronic device, further comprising:
receiving a second input of the user; the second input is used to determine an updated second focus object of the second electronic device;
sending a third message to the second electronic device; the third message includes information of the second object of focus.
4. The focus control method of claim 3, wherein the second input is a user operation to drag the identification associated with the second electronic device to the second focus object.
5. The focus control method of claim 1, wherein after sending the first message to the second electronic device, further comprising:
receiving a third input of a user, wherein the third input is used for determining an updated focusing area of the second electronic device;
sending a fourth message to the second electronic device; the fourth message includes information of a focusing area of the second electronic device.
6. The focus control method according to any one of claims 1 to 5, wherein the information of the first focus object includes at least one of:
an identification of a first focus object;
characteristic points of the first object of focus.
7. A focusing control method, applied to a second electronic device, comprising:
receiving a first message sent by first electronic equipment; the first message comprises information of a first focusing object;
focusing is performed based on the information of the first focusing object.
8. The focus control method according to claim 7, wherein focusing based on the information of the first focusing target includes:
determining feature points of the first focus object based on the information of the first focus object;
judging whether the first focusing object exists in a preview area based on the characteristic points of the first focusing object;
focusing the first in-focus object when the first in-focus object is present in the preview area.
9. The focus control method according to claim 8, wherein focusing the first focus object includes:
determining position information of the first focus object in a preview area;
determining a focal distance of the first in-focus object based on the location information.
10. The focus control method according to claim 8, wherein focusing the first focused object, further comprising:
and performing automatic focusing when the first focusing object does not exist in the preview area.
11. The focus control method according to claim 10, further comprising, after performing auto-focusing:
focusing the first focus object when the first focus object appears in the preview area.
12. The focus control method of claim 7, wherein after receiving the first message sent by the first electronic device, the method further comprises:
sending a second message to the first electronic device; the second message includes a focusing result of the second electronic device.
13. The focus control method of claim 7, wherein after receiving the first message sent by the first electronic device, the method further comprises:
receiving a third message sent by the first electronic equipment; the third message comprises information of a second focusing object;
focusing is performed based on the information of the second focusing object.
14. A focus control apparatus, comprising:
a first receiving module for receiving a first input of a user, the first input being used for determining a first focus object of a second electronic device;
a first response module for sending a first message to the second electronic device in response to the first input; the first message includes information of the first focus object.
15. A focus control apparatus, comprising:
the second receiving module is used for receiving a first message sent by the first electronic equipment; the first message comprises information of a first focusing object;
and the first control module is used for focusing based on the information of the first focusing object.
16. An electronic device comprising a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions when executed by the processor implementing the method of any of claims 1 to 6 or the method of any of claims 7 to 13.
CN202211249628.6A 2022-10-12 2022-10-12 Focusing control method and device, electronic equipment and storage medium Pending CN115623331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211249628.6A CN115623331A (en) 2022-10-12 2022-10-12 Focusing control method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN115623331A true CN115623331A (en) 2023-01-17

Family

ID=84862809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211249628.6A Pending CN115623331A (en) 2022-10-12 2022-10-12 Focusing control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115623331A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination