CN112511743B - Video shooting method and device - Google Patents

Video shooting method and device

Info

Publication number
CN112511743B
Authority
CN
China
Prior art keywords
target
shooting
determining
shooting object
wearing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011340850.8A
Other languages
Chinese (zh)
Other versions
CN112511743A (en)
Inventor
梁浩 (Liang Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weiwo Software Technology Co ltd
Original Assignee
Nanjing Weiwo Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Weiwo Software Technology Co ltd filed Critical Nanjing Weiwo Software Technology Co ltd
Priority to CN202011340850.8A priority Critical patent/CN112511743B/en
Publication of CN112511743A publication Critical patent/CN112511743A/en
Application granted granted Critical
Publication of CN112511743B publication Critical patent/CN112511743B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/61 - Control of cameras or camera modules based on recognised objects
    • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/67 - Focus control based on electronic image sensor signals
    • H04N 23/675 - Focus control based on electronic image sensor signals comprising setting of focusing regions

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a video shooting method and device, wherein the method comprises the following steps: determining a target wearable device connected to the electronic device via Bluetooth; acquiring position information of the target wearable device; determining, according to the position information, a target shooting object wearing the target wearable device in a target scene; determining a target part on the target shooting object; and performing video shooting on the target scene with the target part as the shooting focus to obtain a target video. By implementing the method, the target wearable device can be used for positioning, and the shooting object wearing it in the shooting scene is kept in focus, so that the shooting object remains the visual focus of the video even in a crowded scene, the shooting expectation is met, and the video shooting effect is improved.

Description

Video shooting method and device
Technical Field
The application belongs to the technical field of video processing, and particularly relates to a video shooting method and device.
Background
In recent years, with the rapid development of Internet technology and the upgrading of device hardware, the functions of electronic devices have become increasingly rich, and more and more users use electronic devices for entertainment, for example to shoot videos. At present, when a user shoots a video with an electronic device in a crowded place such as a park, a school or a hospital, the shooting focus is often pulled away by passers-by, so the video loses focus, the intended subject becomes blurred, the user's shooting expectation is not met, and the video shooting effect is poor.
Disclosure of Invention
The embodiments of the present application aim to provide a video shooting method and a video shooting apparatus, so as to solve the technical problem of poor video shooting effect in the prior art.
In a first aspect, an embodiment of the present application provides a video shooting method, where the method includes:
determining a target wearable device connected to the electronic device via Bluetooth;
acquiring position information of the target wearable device;
determining, according to the position information, a target shooting object wearing the target wearable device in a target scene;
determining a target part on the target shooting object;
and performing video shooting on the target scene with the target part as the shooting focus to obtain a target video.
Optionally, as an embodiment, there are a plurality of target wearable devices;
the determining, according to the position information, a target shooting object wearing the target wearable device in a target scene includes:
determining, according to the position information, each shooting object wearing a target wearable device in the target scene;
acquiring contour information of each shooting object;
and determining a target shooting object from the shooting objects according to the contour information of the shooting objects.
Optionally, as an embodiment, the determining a target shooting object from the shooting objects according to the contour information of the shooting objects includes:
determining, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
if there is exactly one shooting object whose face is toward the electronic device, determining that shooting object as the target shooting object;
if there are a plurality of shooting objects whose faces are toward the electronic device, determining, among them, the shooting object closest to the electronic device;
if there is exactly one shooting object closest to the electronic device, determining that shooting object as the target shooting object;
and if there are a plurality of shooting objects closest to the electronic device, determining, among them, the shooting object at the center of the lens as the target shooting object.
Optionally, as an embodiment, the determining a target part on the target shooting object includes:
determining a face orientation of the target shooting object;
and determining a target part on the target shooting object according to the face orientation of the target shooting object.
Optionally, as an embodiment, the determining a target part on the target shooting object according to the face orientation of the target shooting object includes:
if the face of the target shooting object is toward the electronic device and the facial expression is a preset expression, determining the face of the target shooting object as the target part;
and if the face of the target shooting object is not toward the electronic device, or the face is toward the electronic device but the facial expression is not a preset expression, determining the center-of-gravity position of the target shooting object as the target part.
Optionally, as an embodiment, the target wearable device is a watch or a bracelet; the method further includes:
acquiring the heart rate of the target shooting object through the target wearable device;
and displaying the heart rate of the target shooting object in the picture of the target video.
Optionally, as an embodiment, the method further includes:
acquiring the surrounding scene and weather information of the target scene according to the position information;
acquiring target music matched with the surrounding scene and the weather information;
and adding the target music as background music of the target video.
In a second aspect, an embodiment of the present application provides a video shooting apparatus, including:
a first determining module, configured to determine a target wearable device connected to the electronic device via Bluetooth;
a first acquiring module, configured to acquire position information of the target wearable device;
a second determining module, configured to determine, according to the position information, a target shooting object wearing the target wearable device in a target scene;
a third determining module, configured to determine a target part on the target shooting object;
and a shooting module, configured to take the target part as the shooting focus and perform video shooting on the target scene to obtain a target video.
Optionally, as an embodiment, there are a plurality of target wearable devices;
the second determining module includes:
a first determining sub-module, configured to determine, according to the position information, each shooting object wearing a target wearable device in the target scene;
an acquiring sub-module, configured to acquire contour information of each shooting object;
and a second determining sub-module, configured to determine a target shooting object from the shooting objects according to the contour information of the shooting objects.
Optionally, as an embodiment, the second determining sub-module includes:
a first determining unit, configured to determine, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
a second determining unit, configured to, if there is exactly one shooting object whose face is toward the electronic device, determine that shooting object as the target shooting object;
a third determining unit, configured to, if there are a plurality of shooting objects whose faces are toward the electronic device, determine, among them, the shooting object closest to the electronic device;
a fourth determining unit, configured to, if there is exactly one shooting object closest to the electronic device, determine that shooting object as the target shooting object;
and a fifth determining unit, configured to, if there are a plurality of shooting objects closest to the electronic device, determine, among them, the shooting object at the center of the lens as the target shooting object.
Optionally, as an embodiment, the third determining module includes:
a third determining sub-module, configured to determine a face orientation of the target shooting object;
and a fourth determining sub-module, configured to determine a target part on the target shooting object according to the face orientation of the target shooting object.
Optionally, as an embodiment, the fourth determining sub-module includes:
a sixth determining unit, configured to, if the face of the target shooting object is toward the electronic device and the facial expression is a preset expression, determine the face of the target shooting object as the target part;
and a seventh determining unit, configured to, if the face of the target shooting object is not toward the electronic device, or the face is toward the electronic device but the facial expression is not a preset expression, determine the center-of-gravity position of the target shooting object as the target part.
Optionally, as an embodiment, the target wearable device is a watch or a bracelet; the apparatus further includes:
a second acquiring module, configured to acquire the heart rate of the target shooting object through the target wearable device;
and a display module, configured to display the heart rate of the target shooting object in the picture of the target video.
Optionally, as an embodiment, the apparatus further includes:
a third acquiring module, configured to acquire the surrounding scene and weather information of the target scene according to the position information;
a fourth acquiring module, configured to acquire target music matched with the surrounding scene and the weather information;
and an adding module, configured to add the target music as background music of the target video.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored in the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the video shooting method according to the first aspect.
In a fourth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the video shooting method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the video shooting method according to the first aspect.
In the embodiments of the present application, a target wearable device connected to the electronic device via Bluetooth is determined, position information of the target wearable device is acquired, a target shooting object wearing the target wearable device in a target scene is then determined according to the position information, a target part on the target shooting object is determined, and finally the target scene is video-shot with the target part as the shooting focus to obtain a target video. Compared with the prior art, the target wearable device can be used for positioning, and the shooting object wearing it in the shooting scene is kept in focus, so that the shooting object remains the visual focus of the video even in a crowded scene, the shooting expectation is met, and the video shooting effect is improved.
Drawings
Fig. 1 is a flowchart of a video shooting method provided in an embodiment of the present application;
fig. 2 is an exemplary diagram of a video shooting method provided in an embodiment of the present application;
fig. 3 is a block diagram of a video shooting apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a hardware structure diagram of an electronic device implementing various embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practised in sequences other than those illustrated or described herein. Moreover, the terms "first", "second" and the like are generally used in a generic sense and do not limit the number of the objects they qualify; for example, a first object may be one object or more than one object. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The video shooting method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
It should be noted that the video shooting method provided in the embodiments of the present application is applicable to an electronic device. In practical applications, the electronic device may include a smart phone, a tablet computer, a personal digital assistant, and the like, which is not limited in the embodiments of the present application.
Fig. 1 is a flowchart of a video shooting method provided in an embodiment of the present application, and as shown in fig. 1, the method may include the following steps: step 101, step 102, step 103, step 104 and step 105, wherein,
in step 101, a target wearable device connected with an electronic device through bluetooth is determined.
In this application embodiment, wearing equipment can include: intelligent wrist-watch, intelligent bracelet and intelligent glasses etc..
In this application embodiment, the target wearing device may include: any wearable device connected with the electronic device through Bluetooth; alternatively, the target-wearing device may include: wearing equipment that passes through bluetooth connection with electronic equipment and name for specific name, this application embodiment does not do the restriction to this.
In the embodiment of the application, the electronic device may be triggered to determine the target wearable device when the shooting input by the user is received, or the electronic device may automatically trigger to determine the target wearable device.
In step 102, location information of the target wearable device is obtained.
In the embodiment of the application, the electronic device can indicate the target wearing device to position, and indicate the target wearing device to send the positioned position information to the electronic device, so that the electronic device determines a target shooting object wearing the target wearing device in a current shooting scene according to the position information.
In step 103, a target shooting object wearing the target wearable device in the target scene is determined according to the position information of the target wearable device.
In the embodiment of the present application, the electronic device may determine the distance from the target wearable device to the electronic device according to the position information of the target wearable device, determine, according to that distance, the shooting object wearing the target wearable device in the shooting scene, and then determine the target shooting object from among the shooting objects wearing target wearable devices.
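As an illustration of this matching step, the sketch below assumes the wearable reports GPS-style coordinates and that the camera supplies a rough depth estimate for each detected person; the class and function names are illustrative and not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class GeoPosition:
    latitude: float   # degrees
    longitude: float  # degrees

@dataclass
class DetectedSubject:
    subject_id: int
    estimated_distance_m: float  # rough depth estimate from the camera

def distance_m(a: GeoPosition, b: GeoPosition) -> float:
    """Approximate ground distance between two positions (haversine formula)."""
    r = 6_371_000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(a.latitude), math.radians(b.latitude)
    dphi = math.radians(b.latitude - a.latitude)
    dlmb = math.radians(b.longitude - a.longitude)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def match_subject_to_wearable(phone_pos: GeoPosition,
                              wearable_pos: GeoPosition,
                              subjects: list[DetectedSubject]) -> DetectedSubject:
    """Pick the detected person whose camera-estimated distance is closest
    to the phone-to-wearable distance."""
    target_range = distance_m(phone_pos, wearable_pos)
    return min(subjects, key=lambda s: abs(s.estimated_distance_m - target_range))
```

Consumer positioning accuracy is on the order of metres, so a real implementation would likely combine this distance cue with the contour-based checks described below rather than rely on it alone.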
In an embodiment provided by the present application, when there is only one target wearable device, there is also only one shooting object wearing the target wearable device in the target scene, and step 103 may specifically include the following steps:
determining, according to the position information of the target wearable device, the shooting object wearing the target wearable device in the target scene; and determining that shooting object as the target shooting object.
In another embodiment provided by the present application, when there are a plurality of target wearable devices, there are a plurality of shooting objects wearing target wearable devices in the target scene, and step 103 may specifically include the following steps (not shown in the figure): step 1031, step 1032 and step 1033, wherein,
in step 1031, each shooting object wearing a target wearable device in the target scene is determined according to the position information of the target wearable devices.
In step 1032, contour information of each shooting object is acquired.
In the embodiment of the present application, the contour information of a shooting object is used to determine whether the shooting object faces the lens of the electronic device or has its back to the lens.
In step 1033, a target shooting object is determined from the shooting objects according to the contour information of the shooting objects.
In the embodiment of the present application, when a plurality of wearable devices are connected to the electronic device via Bluetooth, the person whose face is toward the lens is focused on preferentially; if several faces are toward the lens, the person closer to the lens is focused on preferentially; and if their distances to the lens are similar, the person at the center of the lens is focused on preferentially (a code sketch of this selection order is given after the steps below).
In this case, step 1033 may specifically include the following steps:
determining, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
if there is exactly one shooting object whose face is toward the electronic device, determining that shooting object as the target shooting object;
if there are a plurality of shooting objects whose faces are toward the electronic device, determining, among them, the shooting object closest to the electronic device;
if there is exactly one shooting object closest to the electronic device, determining that shooting object as the target shooting object;
and if there are a plurality of shooting objects closest to the electronic device, determining, among them, the shooting object at the center of the lens as the target shooting object.
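A minimal sketch of this selection order follows; it assumes the per-subject attributes (face orientation, distance, offset from the lens centre) have already been estimated, and it treats subjects within a small distance tolerance as "equally near", since exact ties are unlikely with real measurements. All names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    subject_id: int
    faces_camera: bool          # derived from the contour information
    distance_m: float           # distance to the electronic device
    offset_from_center: float   # horizontal offset from the lens centre (0 = centred)

def select_target(candidates: list[Candidate],
                  distance_tolerance_m: float = 0.2) -> Candidate:
    """Priority: (1) face toward the camera, (2) nearest to the device,
    (3) closest to the lens centre when several are equally near."""
    # Fall back to all candidates if no face is toward the camera
    # (this case is not specified in the text above).
    facing = [c for c in candidates if c.faces_camera] or candidates
    nearest = min(c.distance_m for c in facing)
    near_group = [c for c in facing if c.distance_m - nearest <= distance_tolerance_m]
    if len(near_group) == 1:
        return near_group[0]
    return min(near_group, key=lambda c: abs(c.offset_from_center))
```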
In step 104, a target part on the target shooting object is determined.
In the embodiment of the present application, the target part is the shooting focus used when the electronic device shoots.
In the embodiment of the present application, the target part may be determined according to the face orientation of the target shooting object. In this case, step 104 may specifically include the following steps (not shown in the figure): step 1041 and step 1042, wherein,
in step 1041, the face orientation of the target shooting object is determined;
in step 1042, a target part on the target shooting object is determined according to the face orientation of the target shooting object.
In an embodiment provided by the present application, the center-of-gravity position of the shooting object can be calculated in real time by an algorithm; when the face of the shooting object shows an expression, for example a smile or a grimace, the focus is placed on the face, and otherwise the focus is placed on the center of gravity, so that the shooting object is always the visual focus of the video even in a crowded scene, meeting the photographer's expectation. In this case, step 1042 may specifically include the following steps:
if the face of the target shooting object is toward the electronic device and the facial expression is a preset expression, determining the face of the target shooting object as the target part;
and if the face of the target shooting object is not toward the electronic device, or the face is toward the electronic device but the facial expression is not a preset expression, determining the center-of-gravity position of the target shooting object as the target part.
In another embodiment provided by the present application, step 1042 may specifically include the following steps:
if the face of the target shooting object is toward the electronic device, the face of the target shooting object is determined as the target part;
and if the back of the target shooting object is toward the electronic device, the center-of-gravity position of the target shooting object is determined as the target part.
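The two variants of step 1042 can be folded into one small helper, sketched below under the assumption that face analysis supplies an expression label and that the centre of gravity is available from the contour; the preset expression set here is purely illustrative.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float]  # (x, y) in image coordinates

PRESET_EXPRESSIONS = {"smile", "grimace"}  # illustrative set, not from the patent

@dataclass
class SubjectState:
    face_toward_camera: bool
    expression: Optional[str]    # e.g. "smile", "grimace", or None
    face_center: Point
    center_of_gravity: Point     # body centroid computed from the contour

def target_part(state: SubjectState, require_expression: bool = True) -> Point:
    """Return the point to focus on: the face when it is toward the camera
    (and, if required, shows a preset expression), otherwise the centre of gravity."""
    if state.face_toward_camera and (
            not require_expression or state.expression in PRESET_EXPRESSIONS):
        return state.face_center
    return state.center_of_gravity
```

Setting require_expression to False reproduces the second variant, in which the face is used whenever it is toward the device.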
In step 105, the target scene is video-shot with the target part as the shooting focus to obtain a target video.
In the embodiment of the present application, when the target scene is shot, the target part of the target shooting object is kept as the shooting focus until shooting ends, and the target video is obtained.
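A per-frame sketch of step 105 is shown below, reusing the target_part helper above; camera and tracker are assumed objects exposing capture_frame / set_focus_point and update respectively, since the patent does not name a concrete camera API.

```python
def record_with_target_focus(camera, tracker, num_frames: int = 300) -> list:
    """Capture frames while re-estimating the target part each frame and
    keeping it as the focus point until recording ends (assumed interfaces)."""
    frames = []
    for _ in range(num_frames):
        frame = camera.capture_frame()          # assumed camera interface
        state = tracker.update(frame)           # re-detect / track the target subject
        camera.set_focus_point(target_part(state))
        frames.append(frame)
    return frames
```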
As can be seen from the above, in this embodiment, a target wearable device connected to the electronic device via Bluetooth is first determined and its position information is acquired; the target shooting object wearing the target wearable device in the target scene is then determined according to the position information; a target part on the target shooting object is determined; and finally the target scene is video-shot with the target part as the shooting focus to obtain a target video. Compared with the prior art, the target wearable device can be used for positioning, and the shooting object wearing it is kept in focus, so that the shooting object remains the visual focus of the video even in a crowded scene, the shooting expectation is met, and the video shooting effect is improved.
In another embodiment provided by the present application, when the target wearable device is a watch or a bracelet, the heart rate of the user can additionally be shown during video shooting to improve the overall shooting experience. In this case, the video shooting method of this embodiment may further include the following steps on the basis of the embodiment shown in fig. 1:
acquiring the heart rate of the target shooting object through the target wearable device; and displaying the heart rate of the target shooting object in the picture of the target video.
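One way to render the heart rate onto each frame, assuming frames are OpenCV-style BGR arrays (a sketch; the overlay position, colour and font are arbitrary choices):

```python
import cv2
import numpy as np

def overlay_heart_rate(frame: np.ndarray, heart_rate_bpm: int) -> np.ndarray:
    """Draw the heart rate reported by the wearable onto a video frame."""
    annotated = frame.copy()
    cv2.putText(annotated,
                f"HR {heart_rate_bpm} bpm",
                (20, 40),                       # text anchor near the top-left corner
                cv2.FONT_HERSHEY_SIMPLEX,
                1.0,                            # font scale
                (0, 0, 255),                    # red in BGR
                2)                              # line thickness
    return annotated
```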
In one example, as shown in fig. 2, the shooting scene is a target scene 20 that includes a shooting object 21, a shooting object 22 and a shooting object 23, where the shooting object 22 wears a target wearable device 221 and its face is toward the lens. When the target scene 20 is shot, the face of the shooting object 22 is taken as the shooting focus, and the heart rate of the shooting object 22 is displayed.
In another embodiment provided by the present application, the scene and weather around the shooting scene may also be identified, and background music corresponding to matching tags may be intelligently selected and added to the video according to the scene and weather conditions, so as to improve the overall shooting experience. In this case, the video shooting method of this embodiment may further include the following steps on the basis of the embodiment shown in fig. 1:
according to the position information of the target wearable device, acquiring the surrounding scene and weather information of the target scene; acquiring target music matched with the surrounding scene and the weather information; and adding the target music as background music of the target video.
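A toy illustration of the tag-based matching, assuming the scene and weather have already been reduced to text tags; the library table and track names are made up for the example:

```python
from typing import Dict, List, Tuple

# Illustrative tag-to-track table; a real implementation would query a music service.
MUSIC_LIBRARY: Dict[Tuple[str, str], List[str]] = {
    ("park", "sunny"): ["upbeat_acoustic.mp3"],
    ("park", "rainy"): ["soft_piano.mp3"],
    ("school", "sunny"): ["bright_pop.mp3"],
}

def pick_background_music(scene_tag: str, weather_tag: str,
                          default: str = "neutral_ambient.mp3") -> str:
    """Return the first track matching the (scene, weather) tag pair,
    falling back to a default track when there is no exact match."""
    tracks = MUSIC_LIBRARY.get((scene_tag, weather_tag), [])
    return tracks[0] if tracks else default
```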
It should be noted that, in the video shooting method provided in the embodiments of the present application, the execution subject may be a video shooting apparatus, or a control module in the video shooting apparatus for executing the video shooting method. In the embodiments of the present application, the video shooting apparatus is described by taking a video shooting apparatus executing the video shooting method as an example.
Fig. 3 is a block diagram of a video shooting apparatus according to an embodiment of the present application. As shown in fig. 3, the video shooting apparatus 300 may include: a first determining module 301, a first acquiring module 302, a second determining module 303, a third determining module 304, and a shooting module 305, wherein,
the first determining module 301 is configured to determine a target wearable device connected to the electronic device via Bluetooth;
the first acquiring module 302 is configured to acquire position information of the target wearable device;
the second determining module 303 is configured to determine, according to the position information, a target shooting object wearing the target wearable device in a target scene;
the third determining module 304 is configured to determine a target part on the target shooting object;
and the shooting module 305 is configured to take the target part as the shooting focus and perform video shooting on the target scene to obtain a target video.
As can be seen from the above, in this embodiment, a target wearable device connected to the electronic device via Bluetooth is first determined and its position information is acquired; the target shooting object wearing the target wearable device in the target scene is then determined according to the position information; a target part on the target shooting object is determined; and finally the target scene is video-shot with the target part as the shooting focus to obtain a target video. Compared with the prior art, the target wearable device can be used for positioning, and the shooting object wearing it is kept in focus, so that the shooting object remains the visual focus of the video even in a crowded scene, the shooting expectation is met, and the video shooting effect is improved.
Optionally, as an embodiment, there are a plurality of target wearable devices; the second determining module 303 may include:
a first determining sub-module, configured to determine, according to the position information, each shooting object wearing a target wearable device in the target scene;
an acquiring sub-module, configured to acquire contour information of each shooting object;
and a second determining sub-module, configured to determine a target shooting object from the shooting objects according to the contour information of the shooting objects.
Optionally, as an embodiment, the second determining sub-module may include:
a first determining unit, configured to determine, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
a second determining unit, configured to, if there is exactly one shooting object whose face is toward the electronic device, determine that shooting object as the target shooting object;
a third determining unit, configured to, if there are a plurality of shooting objects whose faces are toward the electronic device, determine, among them, the shooting object closest to the electronic device;
a fourth determining unit, configured to, if there is exactly one shooting object closest to the electronic device, determine that shooting object as the target shooting object;
and a fifth determining unit, configured to, if there are a plurality of shooting objects closest to the electronic device, determine, among them, the shooting object at the center of the lens as the target shooting object.
Optionally, as an embodiment, the third determining module 304 may include:
a third determining sub-module, configured to determine a face orientation of the target shooting object;
and a fourth determining sub-module, configured to determine a target part on the target shooting object according to the face orientation of the target shooting object.
Optionally, as an embodiment, the fourth determining sub-module may include:
a sixth determining unit, configured to, if the face of the target shooting object is toward the electronic device and the facial expression is a preset expression, determine the face of the target shooting object as the target part;
and a seventh determining unit, configured to, if the face of the target shooting object is not toward the electronic device, or the face is toward the electronic device but the facial expression is not a preset expression, determine the center-of-gravity position of the target shooting object as the target part.
Optionally, as an embodiment, the target wearable device is a watch or a bracelet; the video shooting apparatus 300 may further include:
a second acquiring module, configured to acquire the heart rate of the target shooting object through the target wearable device;
and a display module, configured to display the heart rate of the target shooting object in the picture of the target video.
Optionally, as an embodiment, the video shooting apparatus 300 may further include:
a third acquiring module, configured to acquire the surrounding scene and weather information of the target scene according to the position information;
a fourth acquiring module, configured to acquire target music matched with the surrounding scene and the weather information;
and an adding module, configured to add the target music as background music of the target video.
The video shooting apparatus in the embodiments of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, which is not specifically limited in the embodiments of the present application.
The video shooting apparatus in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The video shooting device provided in the embodiment of the present application can implement each process implemented in the embodiment of the method in fig. 1, and is not described here again to avoid repetition.
Optionally, as shown in fig. 4, an electronic device 400 is further provided in this embodiment of the present application, and includes a processor 401, a memory 402, and a program or an instruction stored in the memory 402 and executable on the processor 401, where the program or the instruction is executed by the processor 401 to implement each process of the above-mentioned video shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application. The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 5 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 510 is configured to: determine a target wearable device connected to the electronic device via Bluetooth; acquire position information of the target wearable device; determine, according to the position information, a target shooting object wearing the target wearable device in a target scene; determine a target part on the target shooting object; and perform video shooting on the target scene with the target part as the shooting focus to obtain a target video.
In this way, in the embodiments of the present application, the target wearable device can be used for positioning, and the shooting object wearing it in the shooting scene is kept in focus, so that the shooting object remains the visual focus of the video even in a crowded scene, the shooting expectation is met, and the video shooting effect is improved.
Optionally, as an embodiment, there are a plurality of target wearable devices; the processor 510 is further configured to: determine, according to the position information, each shooting object wearing a target wearable device in the target scene; acquire contour information of each shooting object; and determine a target shooting object from the shooting objects according to the contour information of the shooting objects.
Optionally, as an embodiment, the processor 510 is further configured to determine, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
if there is exactly one shooting object whose face is toward the electronic device, determine that shooting object as the target shooting object;
if there are a plurality of shooting objects whose faces are toward the electronic device, determine, among them, the shooting object closest to the electronic device;
if there is exactly one shooting object closest to the electronic device, determine that shooting object as the target shooting object;
and if there are a plurality of shooting objects closest to the electronic device, determine, among them, the shooting object at the center of the lens as the target shooting object.
Optionally, as an embodiment, the processor 510 is further configured to determine a face orientation of the target shooting object, and determine a target part on the target shooting object according to the face orientation of the target shooting object.
Optionally, as an embodiment, the processor 510 is further configured to determine the face of the target shooting object as the target part if the face of the target shooting object is toward the electronic device and the facial expression is a preset expression;
and if the face of the target shooting object is not toward the electronic device, or the face is toward the electronic device but the facial expression is not a preset expression, determine the center-of-gravity position of the target shooting object as the target part.
Optionally, as an embodiment, the target wearable device is a watch or a bracelet; the processor 510 is further configured to acquire the heart rate of the target shooting object through the target wearable device, and display the heart rate of the target shooting object in the picture of the target video.
Optionally, as an embodiment, the processor 510 is further configured to acquire the surrounding scene and weather information of the target scene according to the position information, acquire target music matched with the surrounding scene and the weather information, and add the target music as background music of the target video.
It should be understood that in the embodiment of the present application, the input Unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042, and the Graphics processor 5041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 507 includes a touch panel 5071 and other input devices 5072. A touch panel 5071, also referred to as a touch screen. The touch panel 5071 may include two parts of a touch detection device and a touch controller. Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 509 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems. The processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the video shooting method embodiment, and can achieve the same technical effects, and in order to avoid repetition, the description is omitted here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above video shooting method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions recited, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the present embodiments are not limited to those precise embodiments, which are intended to be illustrative rather than restrictive, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope of the appended claims.

Claims (8)

1. A video shooting method, the method comprising:
determining a target wearable device connected to the electronic device via Bluetooth;
acquiring position information of the target wearable device;
determining, according to the position information, a target shooting object wearing the target wearable device in a target scene;
determining a target part on the target shooting object;
and performing video shooting on the target scene with the target part as the shooting focus to obtain a target video;
wherein the determining, according to the position information, a target shooting object wearing the target wearable device in a target scene includes:
determining, according to the position information, each shooting object wearing a target wearable device in the target scene;
acquiring contour information of each shooting object, wherein the contour information of each shooting object is used to determine whether the shooting object faces the lens of the electronic device or has its back to the lens of the electronic device;
and determining a target shooting object from the shooting objects according to the contour information of the shooting objects.
2. The method according to claim 1, wherein the determining a target shooting object from the shooting objects according to the contour information of the shooting objects comprises:
determining, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
if there is exactly one shooting object whose face is toward the electronic device, determining that shooting object as the target shooting object;
if there are a plurality of shooting objects whose faces are toward the electronic device, determining, among them, the shooting object closest to the electronic device;
if there is exactly one shooting object closest to the electronic device, determining that shooting object as the target shooting object;
and if there are a plurality of shooting objects closest to the electronic device, determining, among them, the shooting object at the center of the lens as the target shooting object.
3. The method of claim 1, wherein the determining a target part on the target shooting object comprises:
determining a face orientation of the target shooting object;
and determining a target part on the target shooting object according to the face orientation of the target shooting object.
4. A video shooting apparatus, the apparatus comprising:
a first determining module, configured to determine a target wearable device connected to the electronic device via Bluetooth;
a first acquiring module, configured to acquire position information of the target wearable device;
a second determining module, configured to determine, according to the position information, a target shooting object wearing the target wearable device in a target scene;
a third determining module, configured to determine a target part on the target shooting object;
a shooting module, configured to take the target part as the shooting focus and perform video shooting on the target scene to obtain a target video;
wherein there are a plurality of target wearable devices;
the second determining module comprises:
a first determining sub-module, configured to determine, according to the position information, each shooting object wearing a target wearable device in the target scene;
an acquiring sub-module, configured to acquire contour information of each shooting object, wherein the contour information of each shooting object is used to determine whether the shooting object faces the lens of the electronic device or has its back to the lens of the electronic device;
and a second determining sub-module, configured to determine a target shooting object from the shooting objects according to the contour information of the shooting objects.
5. The apparatus of claim 4, wherein the second determining sub-module comprises:
a first determining unit, configured to determine, according to the contour information of each shooting object, the shooting objects whose faces are toward the electronic device;
a second determining unit, configured to, if there is exactly one shooting object whose face is toward the electronic device, determine that shooting object as the target shooting object;
a third determining unit, configured to, if there are a plurality of shooting objects whose faces are toward the electronic device, determine, among them, the shooting object closest to the electronic device;
a fourth determining unit, configured to, if there is exactly one shooting object closest to the electronic device, determine that shooting object as the target shooting object;
and a fifth determining unit, configured to, if there are a plurality of shooting objects closest to the electronic device, determine, among them, the shooting object at the center of the lens as the target shooting object.
6. The apparatus of claim 4, wherein the third determining module comprises:
a third determining sub-module, configured to determine a face orientation of the target shooting object;
and a fourth determining sub-module, configured to determine a target part on the target shooting object according to the face orientation of the target shooting object.
7. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the video capture method of any of claims 1 to 3.
8. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the video shooting method according to any one of claims 1 to 3.
CN202011340850.8A 2020-11-25 2020-11-25 Video shooting method and device Active CN112511743B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011340850.8A CN112511743B (en) 2020-11-25 2020-11-25 Video shooting method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011340850.8A CN112511743B (en) 2020-11-25 2020-11-25 Video shooting method and device

Publications (2)

Publication Number Publication Date
CN112511743A CN112511743A (en) 2021-03-16
CN112511743B true CN112511743B (en) 2022-07-22

Family

ID=74959857

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011340850.8A Active CN112511743B (en) 2020-11-25 2020-11-25 Video shooting method and device

Country Status (1)

Country Link
CN (1) CN112511743B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115209027A (en) * 2021-03-25 2022-10-18 华为技术有限公司 Camera focusing method and electronic equipment
CN114500826B (en) * 2021-12-09 2023-06-27 成都市喜爱科技有限公司 Intelligent shooting method and device and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106657781A (en) * 2016-12-19 2017-05-10 北京小米移动软件有限公司 Target object photographing method and target object photographing device
CN110213493A (en) * 2019-06-28 2019-09-06 Oppo广东移动通信有限公司 Equipment imaging method, device, storage medium and electronic equipment
CN110337806A (en) * 2018-05-30 2019-10-15 深圳市大疆创新科技有限公司 Group picture image pickup method and device
CN110868536A (en) * 2019-11-05 2020-03-06 珠海格力电器股份有限公司 Access control system control method and access control system
CN111543047A (en) * 2017-09-30 2020-08-14 深圳传音通讯有限公司 Video shooting method and device and computer readable storage medium
CN111601066A (en) * 2020-05-26 2020-08-28 维沃移动通信有限公司 Information acquisition method and device and electronic equipment
CN111726531A (en) * 2020-06-29 2020-09-29 北京小米移动软件有限公司 Image shooting method, processing method, device, electronic equipment and storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7376250B2 (en) * 2004-01-05 2008-05-20 Honda Motor Co., Ltd. Apparatus, method and program for moving object detection
JP2007243634A (en) * 2006-03-09 2007-09-20 Seiko Epson Corp Image generating apparatus and photographing method of same
JP4382117B2 (en) * 2007-07-25 2009-12-09 株式会社スクウェア・エニックス Image generating apparatus and method, program, and recording medium
JP2009260630A (en) * 2008-04-16 2009-11-05 Olympus Corp Image processor and image processing program
JP5385032B2 (en) * 2009-07-08 2014-01-08 ソニーモバイルコミュニケーションズ株式会社 Imaging apparatus and imaging control method
CN103196429B (en) * 2013-03-25 2015-03-04 东南大学 Method for quickly obtaining and measuring orthophotoquad of city skyline contour line facade
CN103780841A (en) * 2014-01-23 2014-05-07 深圳市金立通信设备有限公司 Shooting method and shooting device
CN105872363A (en) * 2016-03-28 2016-08-17 广东欧珀移动通信有限公司 Adjusting method and adjusting device of human face focusing definition
CN110312069A (en) * 2018-03-20 2019-10-08 青岛海信移动通信技术股份有限公司 Focusing method and device in shooting process
CN111385460A (en) * 2018-12-28 2020-07-07 北京字节跳动网络技术有限公司 Image processing method and device
CN110264765A (en) * 2019-06-26 2019-09-20 广州小鹏汽车科技有限公司 Detection method, device, computer equipment and the storage medium of vehicle parking state
CN211401101U (en) * 2020-03-06 2020-09-01 深圳市九丞技术有限公司 High-precision 3D contour modeling equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106657781A (en) * 2016-12-19 2017-05-10 北京小米移动软件有限公司 Target object photographing method and target object photographing device
CN111543047A (en) * 2017-09-30 2020-08-14 深圳传音通讯有限公司 Video shooting method and device and computer readable storage medium
CN110337806A (en) * 2018-05-30 2019-10-15 深圳市大疆创新科技有限公司 Group picture image pickup method and device
CN110213493A (en) * 2019-06-28 2019-09-06 Oppo广东移动通信有限公司 Equipment imaging method, device, storage medium and electronic equipment
CN110868536A (en) * 2019-11-05 2020-03-06 珠海格力电器股份有限公司 Access control system control method and access control system
CN111601066A (en) * 2020-05-26 2020-08-28 维沃移动通信有限公司 Information acquisition method and device and electronic equipment
CN111726531A (en) * 2020-06-29 2020-09-29 北京小米移动软件有限公司 Image shooting method, processing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112511743A (en) 2021-03-16

Similar Documents

Publication Publication Date Title
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
CN112637500B (en) Image processing method and device
CN112511743B (en) Video shooting method and device
CN112954212B (en) Video generation method, device and equipment
CN112532885B (en) Anti-shake method and device and electronic equipment
CN113794834B (en) Image processing method and device and electronic equipment
CN112333382B (en) Shooting method and device and electronic equipment
CN112788244B (en) Shooting method, shooting device and electronic equipment
CN112437231B (en) Image shooting method and device, electronic equipment and storage medium
CN113709368A (en) Image display method, device and equipment
CN114466140B (en) Image shooting method and device
CN113891002B (en) Shooting method and device
CN112653841B (en) Shooting method and device and electronic equipment
CN115499589A (en) Shooting method, shooting device, electronic equipment and medium
CN112738398B (en) Image anti-shake method and device and electronic equipment
CN114339051A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN112383708B (en) Shooting method and device, electronic equipment and readable storage medium
CN114125226A (en) Image shooting method and device, electronic equipment and readable storage medium
CN114093005A (en) Image processing method and device, electronic equipment and readable storage medium
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN113794833A (en) Shooting method and device and electronic equipment
CN112367464A (en) Image output method and device and electronic equipment
CN117097982B (en) Target detection method and system
CN112367470B (en) Image processing method and device and electronic equipment
CN114071016B (en) Image processing method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant