CN112261300B - Focusing method and device and electronic equipment - Google Patents

Focusing method and device and electronic equipment

Info

Publication number
CN112261300B
CN112261300B (application CN202011142848.XA)
Authority
CN
China
Prior art keywords
light
emitting point
determining
shooting
target device
Prior art date
Legal status
Active
Application number
CN202011142848.XA
Other languages
Chinese (zh)
Other versions
CN112261300A (en)
Inventor
Xiao Min (肖敏)
Current Assignee
Vivo Mobile Communication Shenzhen Co Ltd
Original Assignee
Vivo Mobile Communication Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Shenzhen Co Ltd filed Critical Vivo Mobile Communication Shenzhen Co Ltd
Priority to CN202011142848.XA priority Critical patent/CN112261300B/en
Publication of CN112261300A publication Critical patent/CN112261300A/en
Application granted granted Critical
Publication of CN112261300B publication Critical patent/CN112261300B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/67 Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a focusing method, a focusing device and electronic equipment, which belong to the technical field of communication. The method comprises the following steps: when it is detected that a shooting preview picture includes a light-emitting point, determining the position of the light-emitting point, wherein the light-emitting point is formed by light emitted by a target device acting on a shooting object; determining the shooting object at that position as the focusing object; and adjusting the shooting focus to the focusing object. With the focusing method disclosed in the application, a user can specify the focusing object in the shooting preview picture through the light emitted by the target device, without needing to touch the screen of the electronic equipment to adjust the focusing object, so the operation is convenient and fast.

Description

Focusing method and device and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a focusing method, a focusing device and electronic equipment.
Background
At present, when a user takes a picture with electronic equipment and needs the camera to focus on a different object, the user can tap the image of that object displayed on the screen of the electronic equipment, thereby controlling the camera to change the focusing object.
However, when the user takes selfies or records video with a selfie stick or a tripod, the user is usually at some distance from the electronic equipment and cannot change the focusing object directly by touching the screen, which makes the operation inconvenient.
Disclosure of Invention
The embodiment of the application aims to provide a focusing method, which can solve the problem that the existing focusing mode is inconvenient for a user to operate.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present application provides a focusing method, where the method includes: under the condition that a light-emitting point is detected to be included in a shooting preview picture, determining the position of the light-emitting point, wherein the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object.
In a second aspect, an embodiment of the present application provides a focusing apparatus, where the apparatus includes: a position determining module, used for determining the position of a light-emitting point under the condition that it is detected that the shooting preview picture includes the light-emitting point, wherein the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object; an object determining module, used for determining the shooting object at the position as a focusing object; and an adjusting module, used for adjusting the shooting focus to the focusing object.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, the position of a light-emitting point is determined under the condition that the shooting preview picture includes the light-emitting point, wherein the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object. A user can specify a focusing object in the shooting preview picture through light rays emitted by the target device, and does not need to touch a screen of the electronic equipment with a hand to adjust the focusing object, so that the operation is convenient and fast.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments of the present application will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a flowchart illustrating steps of a focusing method according to an embodiment of the present application;
FIG. 2 is a schematic view illustrating a focusing operation principle of the embodiment of the present application;
FIG. 3 is a schematic view illustrating still another focusing operation principle of the embodiment of the present application;
FIG. 4 is a block diagram illustrating a focusing device according to an embodiment of the present application;
fig. 5 is a block diagram showing a configuration of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic diagram showing a hardware configuration of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be appreciated that the data so used may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second" and the like generally denote a class of objects and do not limit their number; for example, a first object may be one object or more than one. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates that the preceding and succeeding objects are in an "or" relationship.
The focusing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, a flowchart illustrating steps of a focusing method according to an embodiment of the present application is shown.
The focusing method of the embodiment of the application comprises the following steps:
step 101: and in the case of detecting that the light-emitting point is included in the shooting preview picture, determining the position of the light-emitting point.
Wherein, the luminous point is formed by the light ray emitted by the target device acting on the shooting object.
The target device may be a light-emitting device included in the electronic device, or a light-emitting intelligent device that has a connection relationship with the electronic device. A light-emitting device can be integrated in the target device; this light-emitting device can emit light of a preset color, and that light forms the light-emitting point in the shooting preview picture. Furthermore, an infrared laser emitting and receiving device may be integrated in the target device, in which case the light-emitting point in the shooting preview picture is formed by the infrared light it emits.
The light-emitting point included in the shooting preview picture can be a newly added light-emitting point, and can also be a light-emitting point which moves from other positions in the shooting preview picture to the current position.
Step 102: and determining the shooting object at the position as a focusing object.
As shown in fig. 2, the shooting preview picture includes two objects, an oval object A and a hexagonal object B. The user controls the target device to emit light and aims it at the real shooting object corresponding to hexagonal object B, so the light-emitting point appears synchronously on the image of hexagonal object B in the shooting preview picture, and hexagonal object B is determined as the focusing object.
Step 103: and adjusting the shooting focus to the focusing object.
In this step, the system automatically controls the camera to adjust focus; the user does not need to touch the screen of the electronic equipment or move the body of the electronic equipment to adjust the focusing object.
According to the focusing method provided by the embodiment of the invention, the position of the luminous point is determined under the condition that the shooting preview picture is detected to include the luminous point, wherein the luminous point is formed by the action of light rays emitted by a target device on a shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object. The user can specify the focusing object in the shooting preview picture through the light emitted by the target device, and does not need to touch the screen of the electronic equipment or move the body of the electronic equipment to adjust the focusing object, so that the operation is convenient and fast.
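The overall flow of steps 101 to 103 can be sketched in code. The following is only a minimal illustration of the idea; all names (detect_light_emitting_point, set_focus, LightPoint, etc.) are hypothetical and are not part of the disclosed embodiments.

```python
# Minimal sketch of the focusing flow in steps 101-103 (all names are hypothetical).
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class LightPoint:
    position: Tuple[int, int]  # pixel coordinates in the shooting preview picture
    color: str                 # detected color of the light-emitting point


def detect_light_emitting_point(preview_frame) -> Optional[LightPoint]:
    """Return the light-emitting point found in the preview frame, or None (detection details omitted)."""
    ...


def focus_on_light_point(camera, preview_frame) -> None:
    point = detect_light_emitting_point(preview_frame)  # step 101: detect the point and its position
    if point is None:
        return                                          # no light-emitting point, keep current focus
    focus_object_position = point.position              # step 102: the shooting object at that position
    camera.set_focus(focus_object_position)             # step 103: adjust the shooting focus (hypothetical API)
```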
In an optional embodiment, determining the position of the light-emitting point, when it is detected that the shooting preview picture includes the light-emitting point, includes the following steps:
the method comprises the following steps: determining a first color of a light-emitting point in the case of detecting that the light-emitting point is included in the shooting preview picture;
step two: and determining the position of the light-emitting point in the shooting preview picture under the condition that the light-emitting color of the target device comprises the first color.
For example: if the target device can emit red light and green light, and the detected first color of the light-emitting point in the shooting preview picture is blue, the light-emitting point is determined not to be used for triggering focusing, so the process returns to step one to continue detecting the shooting preview picture.
For another example: if the target device can emit red light and green light, and the detected first color of the light-emitting point in the shooting preview picture is red, the light-emitting point is determined to be used for triggering focusing, so the position of the light-emitting point in the shooting preview picture is determined and the subsequent focusing process is executed.
If the light colors that the target device can emit do not include the first color, the light-emitting point is determined not to be formed by light that the user emitted through the target device, but by light from some other external light source, so the subsequent focusing process does not need to be executed.
Optionally checking whether the color of the light-emitting point matches a color the target device can emit effectively avoids blindly adjusting the focusing object, and thus avoids disturbing the user and performing useless work on the electronic equipment.
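As a sketch, the color check in steps one and two amounts to a simple membership test. The names below (TARGET_DEVICE_COLORS, classify_color) and the example color set are assumptions used only for illustration.

```python
# Hypothetical sketch of the optional color check (steps one and two above).
TARGET_DEVICE_COLORS = {"red", "green"}   # colors the target device can emit (example values)


def classify_color(light_point_patch) -> str:
    """Map the pixels of the light-emitting point to a color name (details omitted)."""
    ...


def should_trigger_focus(light_point_patch) -> bool:
    first_color = classify_color(light_point_patch)  # step one: first color of the light-emitting point
    # step two: only treat the point as a focus trigger if the target device
    # can actually emit light of that color
    return first_color in TARGET_DEVICE_COLORS
```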
In an optional embodiment, in a case where the light-emitting color of the target device includes a first color, the step of determining the position of the light-emitting point in the shooting preview screen includes the following sub-steps:
the first substep: comparing whether the first color and a second color of the shooting object at the position of the luminous point belong to the same color system or not under the condition that the luminous color of the target device comprises the first color;
in order to improve the focusing accuracy, in the case that the focusing object is red, the user can control the target device to emit green light to form a light-emitting point on the focusing object. In the case where the object in focus is green, the user may control the target device to emit red light to form a light emitting point on the object in focus.
And a second substep: when the first color and the second color do not belong to the same color system, the light-emitting point is regarded as an effective point.
In the case of belonging to the same color system, the light-emitting point is regarded as an invalid point. When the light-emitting point is recorded as an effective point, the subsequent focusing-related operations are executed; when the light-emitting point is recorded as an invalid point, the subsequent focusing-related operations are not executed.
For example: if the target device can emit red light and green light, the detected first color of the light-emitting point in the shooting preview picture is red, and the shooting object at the position of the light-emitting point is green, the two colors are determined to belong to different color systems and the light-emitting point is marked as an effective point. An effective point refers to a light-emitting point formed by light that the user intentionally emitted through the target device; an invalid point refers to a light-emitting point formed by light that the user did not intend to emit through the target device, or by light that was not emitted by the target device at all.
Optionally judging whether the light-emitting point is an effective point improves the accuracy of identifying the focusing object, and thus further avoids disturbing the user and performing useless work on the electronic equipment through blind adjustment of the focusing object.
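A minimal sketch of the effective-point check follows: a point is effective only when the first color (the light-emitting point) and the second color (the shooting object at that position) fall in different color systems. The grouping of colors into color systems below is an assumption made for illustration only.

```python
# Hypothetical sketch of the effective-point check described above.
COLOR_SYSTEM = {  # example grouping of colors into color systems (assumed values)
    "red": "warm", "orange": "warm", "yellow": "warm",
    "green": "cool", "blue": "cool", "cyan": "cool",
}


def is_effective_point(first_color: str, second_color: str) -> bool:
    # Effective only when the two colors do NOT belong to the same color system.
    return COLOR_SYSTEM.get(first_color) != COLOR_SYSTEM.get(second_color)


# Example from the text: a red light-emitting point on a green shooting object is effective.
assert is_effective_point("red", "green") is True
```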
In an optional embodiment, the target device includes an infrared laser transceiver, the target device emits infrared laser, and the step of determining the position of the light-emitting point when detecting that the light-emitting point is included in the shooting preview picture includes the following sub-steps:
the first substep: under the condition that the shooting preview picture is detected to include the luminous point, determining the time difference between the first time when the target device emits the infrared laser and the second time when the shooting preview picture is detected to have the luminous point;
The electronic equipment is also provided with an infrared laser receiver, which can receive the infrared laser reflected by the focusing object at the position of the light-emitting point. The time difference between the first time, when the target device emits the infrared laser, and the second time, when the infrared laser receiver in the electronic equipment receives the reflected infrared laser, can be denoted Tb.
And a second substep: determining a first distance between the target device and a shooting object at the position of the light-emitting point;
An infrared laser emitting and receiving device in the target device emits infrared laser; the emitted infrared laser is reflected after it meets an obstacle and is then received again by the target device. The target device can therefore determine the first distance between itself and the shooting object at the position of the light-emitting point from the time difference between emitting and receiving the infrared laser.
Specifically, if the time difference between emitting and receiving the infrared laser is denoted Ta, c denotes the speed of light, and a denotes the first distance, then the first distance is a = (Ta / 2) × c.
And a third substep: determining a second distance between the camera and the shooting object according to the time difference and the first distance;
As shown in fig. 3, the first distance is distance a and the second distance is distance b; the second distance can be determined by the formula b = Tb × c - a.
And a fourth substep: and determining the position of the light-emitting point according to the second distance.
After the second distance between the camera of the electronic equipment and the shooting object is determined, the electronic equipment can control the camera to adjust the focusing distance so that the focus is aligned with the shooting object, i.e., the focusing object, and focusing is completed automatically.
Optionally selecting the focusing object through infrared laser in this way is both convenient and accurate.
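The two distance relations in the sub-steps above, a = (Ta / 2) × c and b = Tb × c - a, can be sketched as follows. The function names are hypothetical, and the timestamps are assumed to be supplied by the target device and the electronic equipment respectively.

```python
# Sketch of the time-of-flight computation in the sub-steps above (names hypothetical).
SPEED_OF_LIGHT = 299_792_458.0  # c, in metres per second


def first_distance(ta: float) -> float:
    """Distance a between the target device and the shooting object.

    ta is the round-trip time (seconds) from the target device emitting the
    infrared laser to the target device receiving its reflection, so a = (Ta / 2) * c.
    """
    return (ta / 2.0) * SPEED_OF_LIGHT


def second_distance(tb: float, a: float) -> float:
    """Distance b between the camera and the shooting object.

    tb is the time (seconds) from the target device emitting the infrared laser
    to the receiver in the electronic equipment detecting it, so b = Tb * c - a.
    """
    return tb * SPEED_OF_LIGHT - a


# Example: Ta = 20 ns gives a of roughly 3.0 m; with Tb = 30 ns, b is roughly 6.0 m.
a = first_distance(20e-9)
b = second_distance(30e-9, a)
```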
In an optional embodiment, before determining the position of the light-emitting point when it is detected that the shooting preview picture includes the light-emitting point, the method may further establish a data transmission connection with the target device when it is detected that the shooting function is turned on.
After the data transmission connection between the electronic equipment and the target device is established, the electronic equipment can conveniently communicate with the target device in real time.
It should be noted that, in the focusing method provided in the embodiments of the present application, the executing body may be a focusing device, or a control module in the focusing device for executing the focusing method. In the embodiments of the present application, a focusing device executing the focusing method is taken as an example to describe the focusing device provided in the embodiments of the present application.
Fig. 4 is a block diagram of a focusing device according to an embodiment of the present disclosure.
The focusing apparatus 400 of the embodiment of the present application is applied to an electronic device including a touch apparatus, wherein the apparatus 400 includes: a position determining module 401, configured to determine, when it is detected that a shooting preview picture includes a light-emitting point, a position where the light-emitting point is located, where the light-emitting point is formed by light emitted by a target device acting on a shooting object; an object determination module 402, configured to determine the shooting object at the position as a focusing object; an adjusting module 403, configured to adjust a shooting focus to the focusing object.
Optionally, the position determining module includes:
the first submodule is used for determining a first color of a light-emitting point under the condition that the light-emitting point is detected to be included in a shooting preview picture;
and the second submodule is used for determining the position of the light-emitting point in the shooting preview picture under the condition that the light color which can be emitted by the target device contains the first color.
Optionally, the second sub-module includes:
a first unit, configured to compare, when the color of light emitted by the target device includes the first color, whether the first color and a second color of the shooting object at the position of the light-emitting point belong to the same color system;
and a second unit, configured to mark the light-emitting point as an effective point when the first color and the second color do not belong to the same color system.
Optionally, the position determining module includes:
the third sub-module is used for determining the time difference between the first time when the target device emits the infrared laser and the second time when the light-emitting point appears in the shooting preview picture under the condition that the shooting preview picture is detected to include the light-emitting point;
the fourth submodule is used for determining a first distance between the target device and a shooting object at the position of the light-emitting point;
the fifth submodule is used for determining a second distance between the camera and the shooting object according to the time difference and the first distance;
and the sixth submodule is used for determining the position of the luminous point according to the second distance.
Optionally, the apparatus further comprises:
and the connection establishing module is used for establishing data transmission connection with the target device under the condition that the shooting function is detected to be started before the position of the luminous point is determined under the condition that the position determining module detects that the shooting preview picture comprises the luminous point.
The focusing device provided by the embodiment of the application determines the position of a light-emitting point under the condition that the shooting preview picture comprises the light-emitting point, wherein the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object. A user can specify a focusing object in the shooting preview picture through light rays emitted by the target device, and does not need to touch a screen of the electronic equipment with a hand to adjust the focusing object, so that the operation is convenient and fast.
The focusing device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The focusing device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The focusing device provided in the embodiment of the present application can implement each process implemented in the method embodiments of fig. 1 to fig. 3, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 5, an electronic device 500 is further provided in the embodiment of the present application, and includes a processor 501, a memory 502, and a program or an instruction stored in the memory 502 and executable on the processor 501, where the program or the instruction is executed by the processor 501 to implement each process of the foregoing focusing method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 610 is configured to determine a position of a light-emitting point when the light-emitting point is detected to be included in the shooting preview picture, where the light-emitting point is formed by light emitted by the target device acting on the shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object. The target device may be a light-emitting device included in the electronic device, or may be a light-emitting intelligent device having a connection relationship with the electronic device. In the embodiment of the application, the electronic equipment determines the position of a luminous point under the condition that the shooting preview picture comprises the luminous point, wherein the luminous point is formed by the action of light rays emitted by a target device on a shooting object; determining the shooting object at the position as a focusing object; and adjusting the shooting focus to the focusing object. A user can specify a focusing object in the shooting preview picture through light rays emitted by the target device, and does not need to touch a screen of the electronic equipment with a hand to adjust the focusing object, so that the operation is convenient and fast.
Optionally, when the processor 610 determines that the position of the light-emitting point is located when detecting that the light-emitting point is included in the shooting preview screen, specifically, the processor is configured to: determining a first color of a light-emitting point under the condition that the light-emitting point is detected to be included in a shooting preview picture; and under the condition that the light color which can be emitted by the target device comprises the first color, determining the position of the light-emitting point in the shooting preview picture.
Optionally, when determining the position of the light-emitting point in the shooting preview screen when the light color emitted by the target device includes the first color, the processor 610 is specifically configured to: compare whether the first color and a second color of the shooting object at the position of the light-emitting point belong to the same color system when the light color emitted by the target device includes the first color; and, in the case of not belonging to the same color system, regard the light-emitting point as an effective point.
Optionally, the target device includes an infrared laser transceiver, the light emitted by the target device is an infrared laser, and when the processor 610 determines that the position of the light-emitting point is determined when detecting that the shooting preview picture includes the light-emitting point, the processor is specifically configured to: under the condition that a light-emitting point is detected to be included in a shooting preview picture, determining a time difference between a first time when the target device emits infrared laser and a second time when the light-emitting point is detected to appear in the shooting preview picture; determining a first distance between the target device and a shooting object at the position of the light-emitting point; determining a second distance between the camera and the shooting object according to the time difference and the first distance; and determining the position of the luminous point according to the second distance.
Optionally, the processor 610 is further configured to establish a data transmission connection with the target device when it is detected that the shooting function is turned on before determining a position of the light-emitting point when it is detected that the shooting preview screen includes the light-emitting point.
It should be understood that in the embodiment of the present application, the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042, and the graphics processor 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also known as a touch screen, may include two parts, a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 609 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 610 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor may not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the foregoing focusing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above focusing method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. A focusing method, the method comprising:
under the condition that a light-emitting point is detected to be included in a shooting preview picture, determining the position of the light-emitting point, wherein the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object;
determining the shooting object at the position as a focusing object;
adjusting a shooting focus to the focusing object;
the target device emits light rays with preset colors.
2. The method according to claim 1, wherein the step of determining the position of the light-emitting point when the light-emitting point is detected to be included in the shooting preview picture comprises:
determining a first color of a light-emitting point under the condition that the light-emitting point is detected to be included in a shooting preview picture;
and under the condition that the light color which can be emitted by the target device comprises the first color, determining the position of the light-emitting point in the shooting preview picture.
3. The method according to claim 2, wherein the step of determining the position of the light-emitting point in the shooting preview screen when the target device-emittable light color includes the first color comprises:
comparing whether the first color and a second color of the shooting object at the position of the light-emitting point belong to the same color system or not under the condition that the light color emitted by the target device comprises the first color;
in the case of not belonging to the same color system, the light-emitting point is regarded as an effective point.
4. The method according to claim 1, wherein the target device includes an infrared laser transceiver, the target device emits infrared laser, and the step of determining the position of the light-emitting point when detecting that the light-emitting point is included in the shooting preview picture includes:
under the condition that a light-emitting point is detected to be included in a shooting preview picture, determining a time difference between a first time when the target device emits infrared laser and a second time when the light-emitting point is detected to appear in the shooting preview picture;
determining a first distance between the target device and a shooting object at the position of the light-emitting point;
determining a second distance between the camera and the shooting object according to the time difference and the first distance;
and determining the position of the luminous point according to the second distance.
5. The method according to claim 1, wherein, in a case where it is detected that a light-emitting point is included in the shooting preview screen, before the step of determining a position where the light-emitting point is located, the method further comprises:
and under the condition that the shooting function is detected to be started, establishing data transmission connection with the target device.
6. A focusing device, comprising:
the device comprises a position determining module, an object determining module and an adjusting module, wherein the position determining module is used for determining the position of a light-emitting point under the condition that it is detected that the shooting preview picture comprises the light-emitting point, and the light-emitting point is formed by the action of light rays emitted by a target device on a shooting object;
the object determining module is used for determining the shooting object at the position as a focusing object;
the adjusting module is used for adjusting the shooting focus to the focusing object;
the target device emits light rays with preset colors.
7. The apparatus of claim 6, wherein the location determination module comprises:
the first submodule is used for determining a first color of a light-emitting point under the condition that the light-emitting point is detected to be included in a shooting preview picture;
and the second submodule is used for determining the position of the light-emitting point in the shooting preview picture under the condition that the light color which can be emitted by the target device contains the first color.
8. The apparatus of claim 7, wherein the second sub-module comprises:
a first unit, configured to compare, when the color of light emitted by the target device includes the first color, whether the first color and a second color of the shooting object at the position where the light-emitting point is located belong to the same color system;
and a second unit, configured to mark the light-emitting point as an effective point when the first color and the second color do not belong to the same color system.
9. The apparatus of claim 6, wherein the location determination module comprises:
the third sub-module is used for determining the time difference between the first time when the target device emits the infrared laser and the second time when the light-emitting point appears in the shooting preview picture under the condition that the shooting preview picture is detected to include the light-emitting point;
the fourth submodule is used for determining a first distance between the target device and a shooting object at the position of the light-emitting point;
the fifth submodule is used for determining a second distance between the camera and the shooting object according to the time difference and the first distance;
and the sixth submodule is used for determining the position of the luminous point according to the second distance.
10. The apparatus of claim 6, further comprising:
and the connection establishing module is used for establishing data transmission connection with the target device under the condition that the shooting function is detected to be started before the position of the luminous point is determined under the condition that the position determining module detects that the shooting preview picture comprises the luminous point.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, which when executed by the processor, implement the steps of the focusing method as claimed in any one of claims 1 to 5.
CN202011142848.XA 2020-10-22 2020-10-22 Focusing method and device and electronic equipment Active CN112261300B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011142848.XA CN112261300B (en) 2020-10-22 2020-10-22 Focusing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011142848.XA CN112261300B (en) 2020-10-22 2020-10-22 Focusing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN112261300A CN112261300A (en) 2021-01-22
CN112261300B true CN112261300B (en) 2021-12-24

Family

ID=74264812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011142848.XA Active CN112261300B (en) 2020-10-22 2020-10-22 Focusing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112261300B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572956A (en) * 2021-06-25 2021-10-29 荣耀终端有限公司 Focusing method and related equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009223580A (en) * 2008-03-14 2009-10-01 Omron Corp Priority target determination device, electronic apparatus, priority target determination method, program, and recording medium
CN103685907B (en) * 2012-09-26 2019-04-05 联想(北京)有限公司 A kind of method and electronic equipment of Image Acquisition
CN103024199A (en) * 2012-12-28 2013-04-03 天津三星光电子有限公司 Smart camera cellphone
US10498976B2 (en) * 2014-12-05 2019-12-03 Microsoft Technology Licensing, Llc Virtual focus feedback
CN104767938A (en) * 2015-03-27 2015-07-08 广东欧珀移动通信有限公司 Photo shooting method and device
CN105657278A (en) * 2016-02-29 2016-06-08 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN107727616B (en) * 2017-10-16 2020-07-28 山东大学 Auxiliary focusing method and device
CN109257537B (en) * 2018-09-05 2020-12-18 广东小天才科技有限公司 Photographing method and device based on intelligent pen, intelligent pen and storage medium

Also Published As

Publication number Publication date
CN112261300A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN109639970B (en) Shooting method and terminal equipment
CN111601066B (en) Information acquisition method and device, electronic equipment and storage medium
CN112738402B (en) Shooting method, shooting device, electronic equipment and medium
CN112672069B (en) Exposure method and apparatus
CN109495616B (en) Photographing method and terminal equipment
CN113473007B (en) Shooting method and device
CN112291473B (en) Focusing method and device and electronic equipment
US11574415B2 (en) Method and apparatus for determining an icon position
CN113194253A (en) Shooting method and device for removing image reflection and electronic equipment
CN112261300B (en) Focusing method and device and electronic equipment
CN112543284B (en) Focusing system, method and device
CN112929734B (en) Screen projection method and device and electronic equipment
CN112672051B (en) Shooting method and device and electronic equipment
CN112948048A (en) Information processing method, information processing device, electronic equipment and storage medium
CN112416172A (en) Electronic equipment control method and device and electronic equipment
CN113473008B (en) Shooting method and device
CN113286085B (en) Display control method and device and electronic equipment
CN112584110B (en) White balance adjusting method and device, electronic equipment and storage medium
CN114745505A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN113794833A (en) Shooting method and device and electronic equipment
CN113747076A (en) Shooting method and device and electronic equipment
CN112532879B (en) Image processing method and device
US20110285624A1 (en) Screen positioning system and method based on light source type
CN112399076B (en) Video shooting method and device
CN112286429B (en) Control method and control device of electronic equipment and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant