CN109218612B - Tracking shooting system and shooting method - Google Patents

Tracking shooting system and shooting method

Info

Publication number
CN109218612B
Authority
CN
China
Prior art keywords
shooting
tracking
terminal
target object
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811088412.XA
Other languages
Chinese (zh)
Other versions
CN109218612A (en)
Inventor
叶志申
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Fengzhan Electronic Technology Co ltd
Original Assignee
Dongguan Fengzhan Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Fengzhan Electronic Technology Co ltd filed Critical Dongguan Fengzhan Electronic Technology Co ltd
Priority to CN201811088412.XA priority Critical patent/CN109218612B/en
Publication of CN109218612A publication Critical patent/CN109218612A/en
Application granted granted Critical
Publication of CN109218612B publication Critical patent/CN109218612B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/56 Accessories
    • G03B17/561 Support related camera accessories
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Abstract

The invention discloses a tracking shooting method. A shooting terminal establishes a communication connection with a shooting auxiliary carrier, receives a tracking shooting instruction input by a user, executes the instruction to enter a tracking shooting mode, and performs shooting in that mode to obtain tracking video information. The invention also discloses a tracking shooting system comprising a shooting terminal, a shooting auxiliary carrier, a server and a viewing terminal. The shooting terminal and the shooting auxiliary carrier are small and light and are detachably connected, so the system is not limited to a fixed site, is easy to carry, and broadens the range of settings in which tracking shooting can be used. The server is in communication connection with the shooting terminal, the shooting terminal is in communication connection with the shooting auxiliary carrier, and the shooting terminal is carried on the shooting auxiliary carrier and rotates with it to track and shoot the target object from all directions, giving good flexibility and a good tracking effect.

Description

Tracking shooting system and shooting method
[ technical field ]
The invention relates to the technical field of video shooting, in particular to a tracking shooting system and a shooting method.
[ background of the invention ]
Existing tracking shooting equipment is large and heavy, so it is generally installed in a fixed place and cannot meet a user's need to track and shoot at any time and in any location. In addition, most shooting devices on the market track objects within a specific area rather than locking onto a particular object for tracking shooting, and therefore cannot satisfy differing shooting requirements.
In view of the above, it is desirable to provide a tracking shooting apparatus that can solve the above technical problems.
[ summary of the invention ]
The invention aims to overcome the defects of the prior art by providing a tracking shooting system comprising a shooting terminal, a shooting auxiliary carrier, a server and a viewing terminal. The shooting terminal is detachably connected with, and in communication connection with, the shooting auxiliary carrier and is provided with a camera; the server is in communication connection with the shooting terminal and the viewing terminal respectively; the shooting terminal comprises a processor coupled with a memory, the memory stores a computer program executable on the processor, and a face contour template is prestored in the memory.
Preferably, the photographing terminal includes a processor and a memory, the processor being coupled to the memory, the memory having stored thereon a computer program executable on the processor.
Preferably, the shooting terminal comprises a first bluetooth communication module, the shooting auxiliary carrier comprises a second bluetooth communication module, and the shooting terminal establishes communication connection with the shooting auxiliary carrier through the first bluetooth communication module and the second bluetooth communication module.
Preferably, a communication port is arranged on the shooting terminal, the shooting auxiliary carrier comprises a data line, a terminal of the data line is inserted into the communication port, and the shooting terminal is in communication connection with the shooting auxiliary carrier through the communication port and the data line.
Preferably, the shooting auxiliary carrier comprises a rotary seat whose lower end is provided with a 360-degree rotating mechanism, and the rotary seat is provided with a shooting terminal placement groove matched with the shooting terminal.
Preferably, the shooting terminal comprises a connection establishing module, a mode entering module and a shooting module; the connection establishing module is detachably connected to, and in communication connection with, the shooting auxiliary carrier; the mode entering module is used to receive a tracking shooting instruction input by a user and execute it to enter a tracking shooting mode; and the shooting module is used to shoot in the tracking shooting mode to obtain tracking video information.
Preferably, the connection establishing module includes a starting unit and a communication connection establishing unit, and the shooting module includes a confirming unit and an omnidirectional shooting unit; the confirming unit is used to confirm the shooting area of the shooting module and to confirm at least one target object in that area, and the omnidirectional shooting unit is used to control the shooting auxiliary carrier to rotate when it detects that the at least one target object moves, so that the target object remains within the shooting area.
The invention also provides a tracking shooting method, which comprises the following steps:
step S1, the shooting terminal establishes communication connection with the shooting auxiliary carrier through wired communication or wireless communication;
step S2, a user inputs a tracking shooting instruction to the viewing terminal, and the viewing terminal receives the tracking shooting instruction input by the user and feeds the tracking shooting instruction back to the server;
and step S3, the server transmits the tracking shooting instruction to the shooting terminal, and the shooting terminal receives the tracking shooting instruction, enters a tracking shooting mode to carry out shooting operation and obtains tracking video information.
Preferably, the tracking video information in step S3 includes:
when at least two target objects exist in a shooting area, the shooting terminal collects face contour information of each target object;
when the collected face contour information is consistent with the pre-stored face contour template, the target object corresponding to that face contour information continues to be tracked and shot, and the tracking and shooting of the remaining target objects is stopped.
The invention has the following beneficial effects. Compared with the prior art, in the tracking shooting system and shooting method provided by the invention the shooting terminal and the shooting auxiliary carrier are small and light and are detachably connected, so the system is not limited to a fixed site, is easy to carry, and broadens the range of settings in which tracking shooting can be used. The server is in communication connection with the shooting terminal, the shooting terminal is in communication connection with the shooting auxiliary carrier, and the shooting terminal is carried on the shooting auxiliary carrier and is rotated through 360 degrees by it to track and shoot the target object, giving good flexibility and a good tracking effect and allowing a specific object to be locked for tracking shooting.
The features and advantages of the present invention will be described in detail by embodiments in conjunction with the accompanying drawings.
[ description of the drawings ]
FIG. 1 is a schematic diagram of a frame structure of a tracking shooting system according to the present invention;
FIG. 2 is a flowchart illustrating a tracking shooting method according to a first embodiment of the present invention;
FIG. 3 is a schematic flow chart of a modification of the first embodiment of the tracking shooting method of the present invention;
FIG. 4 is a schematic flow chart of a modified embodiment of the tracking shooting method based on FIG. 3;
FIG. 5 is a flowchart illustrating a second embodiment of the tracking shooting method of the present invention;
FIG. 6 is a schematic diagram of a photographing terminal of a tracking photographing system according to the present invention;
FIG. 7 is a schematic diagram of a connection establishing module of a tracking shooting system according to the present invention;
FIG. 8 is a schematic diagram of a camera module of a tracking camera system according to the present invention;
FIG. 9 is a schematic diagram of a frame structure of a photographing terminal of a tracking photographing system according to the present invention;
fig. 10 is a schematic diagram of a frame structure of another embodiment of a photographing terminal of a tracking photographing system according to the present invention.
In the figures: 1 - shooting terminal; 2 - shooting auxiliary carrier; 3 - server; 4 - viewing terminal; 20 - rotary seat; 21 - shooting terminal placement groove.
[ detailed description of the embodiments ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and examples. It should be understood, however, that the description herein of specific embodiments is only intended to illustrate the invention and not to limit the scope of the invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
An embodiment of the invention provides a tracking shooting system. Referring to fig. 1, the system comprises a shooting terminal 1, a shooting auxiliary carrier 2, a server 3 and a viewing terminal 4. The shooting terminal 1 is detachably connected with, and in communication connection with, the shooting auxiliary carrier 2, and a camera is arranged on the shooting terminal 1. The server 3 is in communication connection with the shooting terminal 1 and the viewing terminal 4 respectively. The shooting terminal 1 comprises a processor coupled with a memory; a computer program executable on the processor is stored in the memory, and a face contour template is prestored in the memory.
Specifically, the shooting terminal 1 is used to perform tracking shooting and obtain tracking video information. The shooting terminal 1 in this embodiment may be a camera, a mobile terminal or a tablet computer; preferably, it is a mobile terminal. The shooting terminal 1 performs the shooting itself and tracks the target with the aid of the omnidirectional shooting auxiliary carrier 2, so any terminal having a camera and a communication module (e.g., a Bluetooth module) for communicating with the shooting auxiliary carrier 2 falls within the scope of the invention.
Further, the shooting auxiliary carrier 2 is used to assist the shooting terminal 1 in omnidirectional shooting. It may be a panoramic shooting pan-tilt, an unmanned aerial vehicle, a robot, a toy car or the like, provided it has a communication module (such as a Bluetooth module) for communicating with the shooting terminal 1. In this embodiment, taking a panoramic shooting pan-tilt as an example, the shooting auxiliary carrier 2 comprises a rotary seat 20 and a shooting terminal placement groove 21; the lower end of the rotary seat 20 is provided with a 360-degree rotating mechanism, and the rotary seat 20 is provided with the shooting terminal placement groove 21 matched to the shooting terminal 1. When tracking shooting is needed, the shooting terminal 1 is clamped in the shooting terminal placement groove 21 and the rotary seat 20 drives it through 360 degrees of rotation.
The server 3 in this embodiment is configured to store tracking video information collected by the shooting terminal 1 in real time.
The viewing terminal 4 in this embodiment is configured to acquire tracking video information from the server 3 in real time, and output and display the tracking video information. The viewing terminal 4 in this embodiment includes a mobile terminal, a desktop computer, a tablet computer, a notebook computer, or the like.
Referring to fig. 6, the photographing terminal 1 in this embodiment includes a connection establishing module 10, a mode entering module 11, and a photographing module 12.
The connection establishing module 10 is used to detachably connect to the shooting auxiliary carrier 2 and establish a communication connection with it; the mode entering module 11 is used to receive a tracking shooting instruction input by a user and execute it to enter a tracking shooting mode; and the shooting module 12 is used to shoot in the tracking shooting mode and obtain tracking video information.
In addition to this embodiment, in other embodiments, referring to fig. 7, the connection establishing module 10 includes an initiating unit 100 and a communication connection establishing unit 101.
The starting unit 100 is configured to start the first Bluetooth communication module of the shooting terminal 1 and to start the second Bluetooth communication module of the shooting auxiliary carrier 2; the communication connection establishing unit 101 is configured to establish a communication connection through the first and second Bluetooth communication modules.
In addition to the present embodiment, in other embodiments, referring to fig. 8, the photographing module 12 includes a confirmation unit 120 and an omnidirectional photographing unit 121.
The confirming unit 120 is configured to confirm the shooting area of the shooting terminal itself and to confirm at least one target object within that area; the omnidirectional shooting unit 121 is configured to control the shooting auxiliary carrier 2 to rotate when it detects that the at least one target object moves, so that the target object remains within the shooting area.
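The following minimal sketch illustrates one possible way the confirming unit 120 and the omnidirectional shooting unit 121 could cooperate, assuming OpenCV's KCF tracker is available; the carrier.rotate() call is a hypothetical command of the shooting auxiliary carrier, not an interface defined in the patent.

```python
# Sketch: confirm a target in the shooting area, then rotate the carrier to keep it in frame.
# Assumptions: OpenCV (with a KCF tracker) is installed; `carrier` is a hypothetical object
# exposing rotate(degrees=...) -- the patent does not define such an API.
import cv2

def track_and_rotate(carrier, camera_index=0, deadband=0.1):
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    if not ok:
        return
    # Confirm the target object: here the user draws a box on the first frame.
    bbox = cv2.selectROI("confirm target", frame, showCrosshair=False)
    tracker = cv2.TrackerKCF_create()  # cv2.legacy.TrackerKCF_create() on newer OpenCV builds
    tracker.init(frame, bbox)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, box = tracker.update(frame)
        if not found:
            continue
        x, y, w, h = box
        # Horizontal offset of the target centre from the frame centre, roughly in [-0.5, 0.5].
        offset = (x + w / 2) / frame.shape[1] - 0.5
        if abs(offset) > deadband:
            # Rotate the carrier a small step toward the target so it stays in the shooting area.
            carrier.rotate(degrees=offset * 30)   # hypothetical carrier command
```

The deadband simply keeps the carrier from jittering when the target is already near the centre of the frame.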
In addition to the present embodiment, in other embodiments, referring to fig. 8, the shooting module 12 further includes a motion information collecting unit 130 and a tracking object switching unit 131.
The motion information collecting unit 130 is configured to collect motion information of each target object when at least two target objects exist in the shooting area; the tracking object switching unit 131 is configured, when the motion information matches preset motion information, to continue tracking and shooting the target object corresponding to that motion information and to stop tracking and shooting the remaining target objects.
In addition to this embodiment, in other embodiments, referring to fig. 9, the shooting terminal 1 further includes a detection module 20, a storage module 21, an export module 30, an identifier adding module 40, an uploading module 41, and a compression processing module 50.
The detection module 20 is configured to detect whether the environment in which the shooting terminal is located has a wireless network, and the storage module 21 is used to store the tracking video information when that environment has no wireless network.
Further, on the basis of this embodiment, in other embodiments, referring to fig. 9, the export module 30 is configured to export the tracking video information to a preset application when a call request from that application is received, so that an editing operation can be performed on the tracking video information in the preset application; the editing operation includes one or a combination of cutting, image rendering and text labeling.
Further, on the basis of this embodiment, in other embodiments, referring to fig. 9, the identifier adding module 40 is configured, when the environment in which the shooting terminal is located has a wireless network, to generate an identifier and add it to the tracking video information to obtain shared video information; the uploading module 41 is configured to upload the shared video information to the server, and when the server 3 receives a sharing request sent by at least one viewing terminal 4 and determines that the target identifier of the sharing request matches the identifier of the shared video information, the server feeds the shared video information back to the viewing terminal 4 that initiated the request.
Further, on the basis of the present embodiment, in another embodiment, referring to fig. 9, a compression processing module 50 is configured to compress the shared video information.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the shooting terminal is divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the above-mentioned apparatus may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 10 is a schematic block diagram of a shooting terminal according to still another embodiment of the present application, and referring to fig. 10, the shooting terminal in this embodiment includes: at least one processor 80, a memory 81, and a computer program 810 stored in the memory 81 and operable on the processor 80. When the processor 80 executes the computer program 810, the steps in the tracking video capturing and processing method described in the above embodiments are implemented, for example: step S1-step S3 shown in fig. 2. Alternatively, when the processor 80 executes the computer program 810, the functions of the modules/units in the above-described shooting terminal embodiment are implemented, for example: the functions of modules 10-12 shown in fig. 6.
The computer program 810 may be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 810 in the photographing terminal.
The shooting terminal includes, but is not limited to, the processor 80 and the memory 81. Those skilled in the art will appreciate that fig. 10 is only an example of a shooting terminal and does not constitute a limitation; it may include more or fewer components than shown, combine some components, or use different components. For example, the shooting terminal may further include an input device, an output device, a network access device, a bus, and the like.
The processor 80 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 81 may be a read-only memory, a static storage device that may store static information and instructions, a random access memory, or a dynamic storage device that may store information and instructions, or may be an electrically erasable programmable read-only memory, a read-only optical disk, or other optical disk storage, magnetic disk storage media, or other magnetic storage devices. The memory 81 may be connected to the processor 80 via a communication bus or may be integrated with the processor 80.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed photographing terminal and method may be implemented in other ways. For example, the above-described embodiment of the shooting terminal is only illustrative, and for example, the division of a module or a unit is only a logical division, and another division may be implemented in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The embodiment of the present application further provides a storage medium for storing a computer program, which includes program data for executing the embodiment of the tracking video capturing and processing method of the present application. By executing the computer program stored in the storage medium, the tracking video shooting and processing method provided by the application can be realized.
If the integrated modules/units are implemented in the form of software functional units and sold or used as separate products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above can be implemented by the computer program 810, which can be stored in a computer-readable storage medium and, when executed by the processor 80, implements the steps of those methods. The computer program 810 comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be adjusted as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions the computer-readable medium does not include electrical carrier signals and telecommunications signals.
First embodiment of the tracking shooting method:
Referring to fig. 2, an embodiment of the invention provides a tracking shooting method comprising the following steps:
step S1, the shooting terminal 1 establishes communication connection with the shooting auxiliary carrier 2 through wired communication or wireless communication;
specifically, wired communication: the shooting terminal 1 is provided with a communication port, the shooting auxiliary carrier 2 is provided with a data line, and when the shooting terminal 1 and the shooting auxiliary carrier 2 need communication connection, a terminal of the data line is inserted into the communication port, so that the communication connection between the shooting terminal 1 and the shooting auxiliary carrier 2 can be established.
Wireless communication: the shooting auxiliary carrier 2 is provided with a wireless communication module, which in this embodiment is a second Bluetooth communication module, and the shooting terminal 1 includes a first Bluetooth communication module. When the shooting terminal 1 needs to be connected to the shooting auxiliary carrier 2, the second Bluetooth communication module is started and establishes a communication connection with the first Bluetooth communication module of the shooting terminal 1, thereby establishing the communication connection between the shooting terminal 1 and the shooting auxiliary carrier 2. Optionally, when the shooting terminal 1 and the shooting auxiliary carrier 2 are on the same wireless network (for example, WiFi), they may also be connected via that wireless network.
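As a rough illustration of the wireless branch of step S1, the sketch below pairs with a Bluetooth LE carrier using the `bleak` library; the device name "FZ-CARRIER" is an invented placeholder, not a value specified in the patent.

```python
# Sketch of step S1 (wireless branch): discover the shooting auxiliary carrier over
# Bluetooth LE and establish the communication connection. Assumes the carrier advertises
# a BLE device name; "FZ-CARRIER" is made up for the example.
import asyncio
from bleak import BleakScanner, BleakClient

async def connect_to_carrier(name="FZ-CARRIER"):
    devices = await BleakScanner.discover(timeout=5.0)
    carrier = next((d for d in devices if d.name == name), None)
    if carrier is None:
        raise RuntimeError("shooting auxiliary carrier not found")
    client = BleakClient(carrier.address)
    await client.connect()          # communication connection established (step S1)
    return client

# Usage: asyncio.run(connect_to_carrier())
```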
Step S2, the user inputs a tracking shooting instruction to the viewing terminal 4, and the viewing terminal 4 receives the tracking shooting instruction input by the user and feeds the tracking shooting instruction back to the server 3;
in step S3, the server 3 transmits a tracking shooting instruction to the shooting terminal 1, and the shooting terminal 1 enters a tracking shooting mode to perform a shooting operation upon receiving the tracking shooting instruction, and obtains tracking video information.
Preferably, the tracking video information in step S3 includes: when at least two target objects exist in a shooting area, the shooting terminal 1 collects face contour information of each target object;
when the collected face contour information is consistent with the pre-stored face contour template, the target object corresponding to that face contour information continues to be tracked and shot, and the tracking and shooting of the remaining target objects is stopped.
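The patent does not specify how the face contour comparison is performed; the sketch below is one plausible reading, using OpenCV face detection plus normalized template matching against a prestored grayscale face template, with an arbitrary 0.6 threshold.

```python
# Sketch: among the detected faces, keep only the target whose face best matches the
# prestored template. The matching algorithm and threshold are illustrative assumptions.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def select_target_by_face(frame, template, threshold=0.6):
    """template: grayscale face crop standing in for the prestored face contour template."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    template = cv2.resize(template, (96, 96))
    best_score, best_box = 0.0, None
    for (x, y, w, h) in FACE_CASCADE.detectMultiScale(gray, 1.1, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (96, 96))
        score = cv2.matchTemplate(face, template, cv2.TM_CCOEFF_NORMED)[0][0]
        if score > best_score:
            best_score, best_box = score, (x, y, w, h)
    # Only the target consistent with the prestored template keeps being tracked.
    return best_box if best_score >= threshold else None
```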
As an improvement to this embodiment, in another embodiment, obtaining the tracking video information in step S3 includes step S30: confirming the shooting area of the shooting terminal 1 itself and confirming at least one target object within that area. In this embodiment the current shooting area of the shooting terminal 1 is taken as the shooting area. The at least one target object in the shooting area may be confirmed in any of the following ways: outputting and displaying the current shot image, receiving at least one touch operation input by the user on the display screen, and confirming that the object corresponding to each touch operation is a target object; or shooting an image before tracking shooting starts and analyzing it to obtain at least one target object; or, during tracking shooting, shooting a first frame image and analyzing it to obtain at least one target object.
Further, in this embodiment, the priority of the at least one target object may be determined according to the order in which the touch operations were input. Specifically, when there are four target objects A, B, C and D, A has the highest priority, the priority decreases in turn, and D has the lowest priority; during shooting, when A and D cannot both be in the shooting area at the same time, A is tracked and D is discarded. Preferably, in this embodiment one target object is confirmed within the shooting area. In step S31, when the shooting terminal 1 detects that the at least one target object moves, the motion of the shooting auxiliary carrier 2 is controlled by the server 3 so that the at least one target object remains in the shooting area. This embodiment automatically detects the target object through the shooting terminal 1 and automatically controls the motion of the shooting auxiliary carrier 2 according to the detection result, so the tracking video is shot automatically and the degree of automation of tracking video shooting is improved.
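A small sketch of the touch-order priority rule described above; the data shapes (timestamped touch events, target identifiers) are illustrative assumptions.

```python
# Sketch: rank targets by the order of the user's touch operations, then pick the
# highest-priority target that is still inside the shooting area.
def rank_targets(touch_events):
    """touch_events: list of (timestamp, target_id) tuples from the display screen."""
    ordered = sorted(touch_events)                    # earlier touch => higher priority
    return [target_id for _, target_id in ordered]

def choose_target(priority_list, targets_in_area):
    """Return the highest-priority target currently inside the shooting area."""
    for target_id in priority_list:
        if target_id in targets_in_area:
            return target_id
    return None

# With A touched first and D last: choose_target(["A", "B", "C", "D"], {"A", "D"}) -> "A",
# but if only D is visible, choose_target(["A", "B", "C", "D"], {"D"}) -> "D".
```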
Further, after step S31 the method includes step S40, in which the shooting terminal 1 collects motion information of each target object when at least two target objects exist in the shooting area, and step S41, in which, when the motion information matches preset motion information, the target object corresponding to that motion information continues to be tracked and shot and the tracking shooting of the remaining target objects is stopped. In this embodiment the motion information includes hand motions (e.g., a V-sign gesture, an OK gesture, an arm swing), head motions (nodding, shaking the head), leg motions, and so on. To describe the technical solution in more detail, a hand gesture is taken as an example: assume there are two target objects, A and B, within the shooting area; at a certain moment the V-sign gesture made by A is captured, so tracking shooting of A continues and shooting of B is stopped. By capturing and analyzing motion information, this embodiment can quickly and automatically switch from tracking several target objects to tracking a single one, making tracking shooting more intelligent. It should be noted that this embodiment may store the several target objects currently being tracked, and when motion information (for example, an OK gesture) of the currently tracked target object is received, tracking shooting of the several target objects may be resumed from tracking shooting of the single one. Specifically, on the basis of the above example, in other embodiments tracking shooting of A and B is resumed when A's OK gesture is captured.
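Steps S40 and S41 can be pictured as a small state machine, sketched below; the gesture recognition itself (for example with a hand-landmark model) is assumed to happen elsewhere and to deliver (target, gesture) events, which is an assumption of this sketch rather than something the patent prescribes.

```python
# Sketch of the tracking object switching logic: a "V" gesture locks tracking onto the
# target that made it, and an "OK" gesture from the currently tracked target restores
# tracking of all stored targets.
class TrackingSwitcher:
    def __init__(self, targets):
        self.all_targets = list(targets)     # e.g. ["A", "B"]
        self.tracked = list(targets)         # targets currently being tracked and shot

    def on_motion_info(self, target_id, gesture):
        if gesture == "V" and target_id in self.tracked:
            self.tracked = [target_id]                  # keep this target, stop the rest
        elif gesture == "OK" and self.tracked == [target_id]:
            self.tracked = list(self.all_targets)       # resume tracking all stored targets
        return self.tracked

switcher = TrackingSwitcher(["A", "B"])
switcher.on_motion_info("A", "V")    # -> ["A"]
switcher.on_motion_info("A", "OK")   # -> ["A", "B"]
```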
The shooting terminal 1 and the shooting auxiliary carrier 2 in this embodiment are small and light, so they are not limited to a fixed place and are easy to carry, which broadens the range of settings in which tracking shooting can be used. In addition, once the shooting terminal 1 is in communication connection with the shooting auxiliary carrier 2, tracking shooting is performed automatically, which improves the degree of automation of tracking shooting.
Second embodiment of the tracking shooting method:
As an improvement to the first embodiment, referring to fig. 5, an embodiment of the invention provides a tracking shooting method comprising the following steps. In step S10, the shooting terminal 1 is detachably connected to the shooting auxiliary carrier 2 and establishes a communication connection with it.
The step S10 is similar to the step S1, and therefore, will not be described herein.
In step S11, a tracking shooting command input by the user is received, and the tracking shooting command is executed to enter a tracking shooting mode. The step S11 is similar to the step S2, and therefore, will not be described herein.
In step S12, a shooting operation is performed in the tracking shooting mode to obtain tracking video information. The step S12 is similar to the step S3, and therefore, will not be described herein.
Step S13: detecting whether the environment in which the shooting terminal is located has a wireless network. When it does not, step S14 is performed; when it does, step S50 is performed.
In this embodiment, the shooting terminal may further detect whether a network is available, specifically whether the wireless network where it is located is available or whether its mobile communication network is available.
The shooting terminal 1 needs to upload the tracking video information over a network, and since data traffic on the mobile communication network is relatively expensive, this embodiment preferably transmits the tracking video information over the wireless network; in some special cases, however, the tracking video information may also be transmitted over the mobile communication network.
In step S14, the tracking video information is stored.
In this embodiment, the tracking video information is stored in a storage device of the shooting terminal 1, such as a hard disk.
In this embodiment, when no network is available (no wireless network is present and the mobile communication network is not connected), the tracking video information is stored automatically for later use, which improves the user experience.
As an improvement to the second embodiment, on the basis of the present embodiment, in other embodiments, after the step S14, the method further includes:
step S20, when receiving a call request of a preset application, deriving tracking video information to the preset application, and performing an editing operation on the tracking video information in the preset application, where the editing operation includes one or a combination of cutting, image rendering, and text labeling.
In this embodiment, the preset application includes a beauty-camera application, a video editing application, a short-video creation application and the like. The technical solution is described in detail below taking a video editing application as an example.
Specifically, the video editing application selects the tracking video information according to its storage path and outputs and displays it in the application; the application then cuts off redundant video frames and performs operations such as color and scene adjustment, text labeling and image rendering on specific frames to meet the user's requirements.
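Purely as an illustration, the trimming and text-labeling an editing application performs could be approximated with an ffmpeg command line as below; ffmpeg with the drawtext filter must be installed, and the file paths, times and label are made-up examples rather than values from the patent.

```python
# Sketch: cut off redundant frames and burn a text label into the exported tracking video.
import subprocess

def trim_and_label(src, dst, start="00:00:05", end="00:00:20", label="target A"):
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-ss", start, "-to", end,                       # keep only the wanted segment
        "-vf", f"drawtext=text='{label}':x=20:y=20:fontsize=28:fontcolor=white",
        dst,
    ], check=True)

# trim_and_label("tracking_video.mp4", "edited_video.mp4")
```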
The tracking video information stored in this embodiment can be used for calling other applications on the shooting terminal 1 to perform editing processing operation, so as to obtain video information meeting the user requirements, thereby further improving the user experience.
It should be noted that the tracking video information may be exported to the preset application either as a copy or by cutting. When exported as a copy, the original tracking video information is untouched and can still be called by other applications; when exported by cutting, the original tracking video information no longer needs to be stored, since what the user needs is the processed video information, which reduces the amount of data stored and improves storage efficiency.
As an improvement to the second embodiment, on the basis of the present embodiment, in other embodiments, after the step S13, the method further includes:
and step S50, generating an identifier and adding the identifier to the tracking video information to obtain shared video information.
In this embodiment, the identifier may be a string of randomly generated characters, a string of randomly generated numbers, or an encoded message generated according to a preset rule.
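A couple of straightforward ways to generate such an identifier in Python are sketched below; the rule-based variant (terminal id plus timestamp plus random suffix) is a hypothetical example of a "preset rule", not one taken from the patent.

```python
# Sketch: identifier generation for the shared video information.
import secrets
import time
import uuid

def random_identifier():
    return secrets.token_hex(8)              # string of random characters

def rule_based_identifier(terminal_id):
    # Hypothetical preset rule: terminal id + capture timestamp + short random suffix.
    return f"{terminal_id}-{int(time.time())}-{uuid.uuid4().hex[:6]}"
```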
Step S51: uploading the shared video information to the server 3; when the server 3 receives a sharing request sent by at least one viewing terminal 4 and determines that the target identifier of the sharing request matches the identifier of the shared video information, it feeds the shared video information back to the viewing terminal 4 that initiated the request.
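A rough server-side sketch of this matching step is given below using Flask, chosen only for illustration; the routes, the in-memory table and the /tmp paths are assumptions, since the patent does not prescribe any particular server implementation.

```python
# Sketch: the shooting terminal uploads shared video information under its identifier,
# and a viewing terminal's sharing request is answered only when the target identifier matches.
from flask import Flask, request, send_file, abort

app = Flask(__name__)
SHARED = {}   # identifier -> path of the uploaded shared video information

@app.post("/upload/<identifier>")
def upload(identifier):
    path = f"/tmp/{identifier}.mp4"          # example storage location
    request.files["video"].save(path)
    SHARED[identifier] = path
    return {"status": "stored"}

@app.get("/share")
def share():
    target_id = request.args.get("target_identifier")
    if target_id in SHARED:                  # identifiers match: feed the video back
        return send_file(SHARED[target_id])
    abort(404)                               # no matching shared video information
```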
In this embodiment, the shooting terminal 1 shoots a scene to obtain video information, and uploads the video information to the server 3 in real time, and the viewing terminal 4 is used for viewing the video information shot in real time.
It should be noted that the tracking shooting system of this embodiment can be applied in many scenes, such as a multi-person learning scene or a remote command scene. To explain the technical solution of the invention in detail, a multi-person learning scene is taken as an example below.
Assume that a trainer gives a lesson at location A, the shooting terminal 1 is at location A, and two viewing terminals 4 are at locations B and C respectively, where A, B and C are different locations.
The shooting terminal 1 collects video information of the training session in real time and uploads it to the server 3 in real time; the viewing terminals 4 at locations B and C each send a sharing request to the server 3, and after receiving the requests the server 3 feeds the video information back to the viewing terminals 4 at locations B and C respectively. Learners at different locations can thus be trained at the same time, saving the trainer's time.
In another embodiment, before step S51, the method further includes:
step S60, the shared video information is compressed.
In this embodiment, compressing the shared video information before uploading it reduces the amount of data transmitted, increases the data transmission rate and improves the synchronous display effect at the viewing terminal.
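A minimal sketch of step S60 follows, re-encoding the shared video at a lower bitrate with ffmpeg before upload; the codec and bitrate values are arbitrary examples, not parameters from the patent.

```python
# Sketch: compress the shared video information before uploading it to the server.
import subprocess

def compress_shared_video(src, dst, bitrate="1200k"):
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264", "-b:v", bitrate,   # lower-bitrate H.264 video
        "-c:a", "aac", "-b:a", "96k",
        dst,
    ], check=True)
```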
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiments of the present invention are described above in detail by way of example, but the invention is not limited to these embodiments. It will be apparent to those skilled in the art that equivalent modifications or substitutions can be made without departing from the scope of the invention, and such equivalent changes, modifications and improvements shall fall within the scope of the invention.

Claims (7)

1. A tracking shooting system, applied to a multi-person learning scene, characterized in that: the system comprises a shooting terminal (1), a shooting auxiliary carrier (2), a server (3) and a viewing terminal (4); the shooting terminal (1) is detachably connected with the shooting auxiliary carrier (2) and is in communication connection with the shooting auxiliary carrier (2); a camera is arranged on the shooting terminal (1); the server (3) is in communication connection with the shooting terminal (1) and the viewing terminal (4) respectively; the shooting terminal (1) comprises a processor and a memory, the processor is coupled with the memory, a computer program executable on the processor is stored in the memory, and a face contour template is prestored in the memory; the shooting terminal (1) is provided with a confirmation unit (120), the confirmation unit (120) being used to confirm a shooting area of the shooting terminal, to confirm at least one target object in the shooting area according to a received touch operation of a user, and to determine the priority of the at least one target object according to the input order of the touch operations; the shooting terminal (1) is provided with a motion information collecting unit (130) and a tracking object switching unit (131), the motion information collecting unit (130) being used to collect motion information of each target object when at least two target objects exist in the shooting area, and the tracking object switching unit (131) being used, when the motion information matches preset motion information, to continue tracking and shooting the target object corresponding to that motion information and to stop tracking and shooting the remaining target objects; the shooting terminal (1) is provided with an identifier adding module (40) and an uploading module (41), the identifier adding module (40) being used, when the environment in which the shooting terminal is located has a wireless network, to generate an identifier and add it to the tracking video information to obtain shared video information, and the uploading module (41) being used to upload the shared video information to the server (3); when the server (3) receives a sharing request sent by at least one viewing terminal (4) and determines that the target identifier of the sharing request matches the identifier of the shared video information, it feeds the shared video information back to the viewing terminal (4) that initiated the sharing request;
when at least two target objects exist in the shooting area, the shooting terminal (1) collects face contour information of each target object; when the collected face contour information is consistent with the pre-stored face contour template, the target object corresponding to that face contour information continues to be tracked and shot, and the tracking and shooting of the remaining target objects is stopped;
the tracking object switching unit (131) is further configured to resume tracking shooting of a plurality of target objects from tracking shooting of one target object when motion information of the currently tracked target object is received.
2. The tracking shooting system according to claim 1, characterized in that: the shooting terminal (1) comprises a first Bluetooth communication module, the shooting auxiliary carrier (2) comprises a second Bluetooth communication module, and the shooting terminal (1) establishes a communication connection with the shooting auxiliary carrier (2) through the first Bluetooth communication module and the second Bluetooth communication module.
3. The tracking shooting system according to claim 1, characterized in that: a communication port is arranged on the shooting terminal (1), the shooting auxiliary carrier (2) comprises a data line, a terminal of the data line is inserted into the communication port, and the shooting terminal (1) is in communication connection with the shooting auxiliary carrier (2) through the communication port and the data line.
4. The tracking shooting system according to claim 1, characterized in that: the shooting auxiliary carrier (2) comprises a rotary seat (20) and a shooting terminal placement groove (21), the lower end of the rotary seat (20) is provided with a 360-degree rotating mechanism, and the rotary seat (20) is provided with the shooting terminal placement groove (21) matched with the shooting terminal (1).
5. The tracking shooting system according to claim 1, characterized in that: the shooting terminal (1) comprises a connection establishing module (10), a mode entering module (11) and a shooting module (12); the connection establishing module (10) is detachably connected to the shooting auxiliary carrier (2) and is in communication connection with the shooting auxiliary carrier (2); the mode entering module (11) is used to receive a tracking shooting instruction input by a user and execute the tracking shooting instruction to enter a tracking shooting mode; and the shooting module (12) is used to perform shooting in the tracking shooting mode to obtain tracking video information.
6. The tracking shooting system according to claim 5, characterized in that: the connection establishing module (10) comprises a starting unit (100) and a communication connection establishing unit (101), and the shooting module (12) comprises a confirming unit (120) and an omnidirectional shooting unit (121); the confirming unit (120) is used to confirm the shooting area of the shooting module and to confirm at least one target object in the shooting area, and the omnidirectional shooting unit (121) is used to control the shooting auxiliary carrier (2) to rotate when it detects that the at least one target object moves, so that the at least one target object remains in the shooting area.
7. A tracking shooting method, applied to a multi-person learning scene, characterized in that the method comprises the following steps:
step S1, the shooting terminal (1) establishes communication connection with the shooting auxiliary carrier (2) through wired communication or wireless communication;
step S2, a user inputs a tracking shooting instruction to the viewing terminal (4), and the viewing terminal (4) receives the tracking shooting instruction input by the user and feeds the tracking shooting instruction back to the server (3);
step S3, the server (3) transmits the tracking shooting instruction to the shooting terminal (1); the shooting terminal (1) receives the tracking shooting instruction, enters a tracking shooting mode, performs the shooting operation and obtains tracking video information, wherein obtaining the tracking video information comprises: confirming a shooting area of the shooting terminal, confirming at least one target object in the shooting area according to a received touch operation of the user, and determining the priority of the at least one target object according to the input order of the touch operations;
after step S3, the method further includes:
step S40, when at least two target objects exist in the shooting area, the shooting terminal (1) collects motion information of each target object;
step S41, when the motion information matches preset motion information, continuing to track and shoot the target object corresponding to the motion information, and stopping tracking and shooting the remaining target objects;
after step S3, the method further includes:
step S50, generating an identifier and adding the identifier to the tracking video information to obtain shared video information;
step S51, uploading shared video information to a server (3), when the server (3) receives a sharing request sent by at least one viewing terminal (4), and when the target identifier of the sharing request is judged to be matched with the identifier of the shared video information, feeding back the shared video information to the viewing terminal (4) initiating the sharing request;
wherein obtaining the tracking video information in step S3 comprises: when at least two target objects exist in the shooting area, the shooting terminal (1) collects face contour information of each target object; when the collected face contour information is consistent with the pre-stored face contour template, the target object corresponding to that face contour information continues to be tracked and shot, and the tracking and shooting of the remaining target objects is stopped;
after step S41, the method further comprises: when motion information of the currently tracked target object is received, resuming tracking shooting of a plurality of target objects from tracking shooting of one target object.
CN201811088412.XA 2018-09-17 2018-09-17 Tracking shooting system and shooting method Active CN109218612B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811088412.XA CN109218612B (en) 2018-09-17 2018-09-17 Tracking shooting system and shooting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811088412.XA CN109218612B (en) 2018-09-17 2018-09-17 Tracking shooting system and shooting method

Publications (2)

Publication Number Publication Date
CN109218612A CN109218612A (en) 2019-01-15
CN109218612B true CN109218612B (en) 2022-04-22

Family

ID=64983926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811088412.XA Active CN109218612B (en) 2018-09-17 2018-09-17 Tracking shooting system and shooting method

Country Status (1)

Country Link
CN (1) CN109218612B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112822388B (en) 2019-11-15 2022-07-22 北京小米移动软件有限公司 Shooting mode triggering method, device, equipment and storage medium
CN114600445A (en) * 2020-12-16 2022-06-07 深圳市大疆创新科技有限公司 Tracking algorithm operation method and device, electronic equipment and computer readable storage medium


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106024003B (en) * 2016-05-10 2020-01-31 北京地平线信息技术有限公司 Voice positioning and enhancing system and method combined with image

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102239687A (en) * 2009-10-07 2011-11-09 松下电器产业株式会社 Device, method, program, and circuit for selecting subject to be tracked
CN102594990A (en) * 2012-02-10 2012-07-18 中兴通讯股份有限公司 Smart mobile phone base, mobile phone and implementation methods thereof
CN104853162A (en) * 2015-05-11 2015-08-19 杭州轨物科技有限公司 Network real-time monitoring system based on smart phone
CN106791535A (en) * 2016-11-28 2017-05-31 合网络技术(北京)有限公司 Video recording method and device
CN206398292U (en) * 2017-01-18 2017-08-11 东莞市壹佳柒电子科技有限公司 All-around intelligent tracks bluetooth auto heterodyne head
CN107249037A (en) * 2017-07-02 2017-10-13 东华大学 A kind of mobile communication Monitor Equipment of controllable head

Also Published As

Publication number Publication date
CN109218612A (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN108933915B (en) Video conference device and video conference management method
CN110944109B (en) Photographing method, device and equipment
CN109194916B (en) Movable shooting system with image processing module
JP2016527800A (en) Wireless video camera
US11463270B2 (en) System and method for operating an intelligent face framing management system for videoconferencing applications
CN112995566B (en) Sound source positioning method based on display device, display device and storage medium
EP2892205A1 (en) Method and device for determining terminal to be shared and system
WO2016008209A1 (en) Tool of mobile terminal and intelligent audio-video integration server
CN104715246A (en) Photographing assisting system, device and method with a posture adjusting function,
CN109218612B (en) Tracking shooting system and shooting method
WO2020227996A1 (en) Photography control method and apparatus, control device and photography device
US9007531B2 (en) Methods and apparatus for expanding a field of view in a video communication session
WO2017193805A1 (en) Method and apparatus for realizing remote monitoring in conference television system
CN111988555B (en) Data processing method, device, equipment and machine readable medium
CN106791681A (en) Video monitoring and face identification method, apparatus and system
WO2023236848A1 (en) Device control method, apparatus and system, and electronic device and readable storage medium
CN105187708A (en) Method and system for shooting panorama
CN109194918B (en) Shooting system based on mobile carrier
CN111526280A (en) Control method and device of camera device, electronic equipment and storage medium
US10447969B2 (en) Image processing device, image processing method, and picture transmission and reception system
CN111526295B (en) Audio and video processing system, acquisition method, device, equipment and storage medium
CN112995565B (en) Camera adjustment method of display device, display device and storage medium
CN203930321U (en) A kind of embedded projector and tele-conferencing system
CN202907049U (en) Automatic cruising camera
US20240104857A1 (en) Electronic system and method to provide spherical background effects for video generation for video call

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant