CN114245013A - Virtual shooting system and shooting method

Info

Publication number
CN114245013A
Authority
CN
China
Prior art keywords
module
virtual
characteristic
matrix
scene
Prior art date
Legal status
Pending
Application number
CN202111544587.9A
Other languages
Chinese (zh)
Inventor
陈剑 (Chen Jian)
杨晓龙 (Yang Xiaolong)
Current Assignee
Shanghai Sokailish Multimedia Technology Co., Ltd.
Original Assignee
Shanghai Sokailish Multimedia Technology Co., Ltd.
Priority date
2021-12-16
Filing date
2021-12-16
Publication date
2022-03-25
Application filed by Shanghai Sokailish Multimedia Technology Co., Ltd.
Priority to CN202111544587.9A (2021-12-16)
Publication of CN114245013A (2022-03-25)
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Abstract

The invention discloses a virtual shooting system and a shooting method in the technical field of virtual imaging. The virtual shooting system comprises a virtual base-body storage module, a scene base-body processing module, a feature edge-tracing detection module and a reset query module. The virtual base-body storage module is used for storing big-data image information of each divided feature of the virtual base body; the scene base-body processing module is used for performing feature division on a base body implanted into the virtual simulation scene; the feature edge-tracing detection module is used for performing edge-line detection on the divided features of the implanted base body; and the reset query module is used for removing implanted features whose traced edges do not meet a preset requirement, querying the virtual base-body storage module for a feature image that reaches a specified similarity with the removed feature, and resetting that feature image into the corresponding feature region of the implanted base body.

Description

Virtual shooting system and shooting method
Technical Field
The present invention relates to the field of virtual image technology, and in particular, to a virtual shooting system and a virtual shooting method.
Background
"Virtual shooting" refers to the shooting a director performs for a film entirely inside a computer: every shot is taken in a virtual scene. All elements required for a shot, including scenes, characters and lights, are built into the computer, and the director can then "direct" the characters' performance and actions and move the camera to any angle according to his own intentions.
In the prior art, after a base body has been implanted into a virtual scene during virtual shooting, poor blending between the traced edge of the base body and the virtual scene has to be corrected by processing the affected surface regions one by one with manual matting.
However, in the process of implementing the technical solution of the invention in the embodiments of the present application, the inventors found that this approach has at least the following technical problems:
when poor edge blending is handled by matting, processing is slow, and the edge lines in the matting result are easily degraded, which harms the overall effect after fusion.
Based on this, the invention designs a virtual shooting system and a shooting method to solve the above problems.
Disclosure of Invention
In order to solve the technical problems mentioned in the background art, the present invention provides a virtual shooting system and a shooting method.
In order to achieve the purpose, the invention adopts the following technical scheme:
a virtual shooting system comprises a virtual base storage module, a scene base processing module, a feature edge detection module and a reset query module;
the virtual matrix storage module is used for storing image big data information of each division characteristic of the virtual matrix;
the scene matrix processing module is used for carrying out characteristic division on an implanted matrix in the virtual simulation scene;
the characteristic delineation detection module is used for performing delineation linear detection on the division characteristics implanted in the matrix;
the resetting and inquiring module is used for eliminating the implanted substrate characteristics of which the stroking linearity does not meet the preset requirement and resetting the characteristic image reaching the specified similarity with the implanted substrate characteristics in the virtual substrate storage module to the implanted substrate characteristic region.
Preferably, the virtual base-body storage module comprises a base-body image acquisition module, a feature division module and a divided-image storage module;
the base-body image acquisition module is used for acquiring big-data image information of the base body from all viewing angles;
the feature division module is used for dividing the acquired virtual base-body image data into its individual detail features;
the divided-image storage module is used for classifying and storing all action images of each detail feature.
Preferably, the scene base-body processing module comprises a scene simulation module, a base-body implantation module and an implanted-feature division module;
the scene simulation module is used for simulating the virtual scene;
the base-body implantation module is used for acquiring information on the implantation of the base body into the virtual scene;
the implanted-feature division module is used for performing feature division on the base body implanted in the virtual scene.
Preferably, the feature edge-tracing detection module comprises an edge smoothness analysis module and a feature clearing module;
the edge smoothness analysis module is used for performing edge-line processing on the divided implanted features and for query analysis of the pixels surrounding the edge line;
the feature clearing module is used for acquiring a control signal when the number of pixels surrounding the edge line reaches a preset value and clearing the corresponding feature.
Preferably, the edge smoothness analysis module comprises an edge-line query module, a line data storage module and a peripheral pixel query module;
the line data storage module is used for storing the line-curve data corresponding to each kind of edge-traced image;
the edge-line query module is used for querying the line-curve data corresponding to the edge-traced image of an implanted feature;
the peripheral pixel query module is used for querying the pixel data on both sides of the line curve.
Preferably, the reset query module comprises a feature search module and an implantation processing module;
the feature search module is used for searching the virtual base-body storage module for a virtual feature image whose similarity to the cleared implanted feature reaches a specified value;
the implantation processing module is used for implanting the retrieved virtual feature image at the corresponding position on the base body.
Preferably, the feature search module comprises a cleared-feature acquisition module and a feature comparison module;
the cleared-feature acquisition module is used for acquiring the data of the implanted feature to be cleared;
the feature comparison module is used for comparing the similarity between the cleared implanted feature and the features stored in the virtual base-body storage module, and for outputting the virtual feature image that reaches the specified similarity.
Preferably, the implantation processing module comprises a scale adjustment module, a scene implantation module and a combined-color adjustment module;
the scale adjustment module is used for adjusting the scale of the virtual feature to match the proportions of the cleared implanted feature;
the scene implantation module is used for implanting the scale-adjusted virtual feature at the corresponding position on the base body in the scene;
the combined-color adjustment module is used for adjusting the color of the virtual feature after it is implanted on the base body.
A virtual shooting method comprises the following steps:
S1, acquiring virtual base-body images, dividing them into features, and classifying and storing them;
S2, collecting divided-feature information of the base body implanted in the simulated scene;
S3, detecting the number of pixels surrounding the edge line of each divided feature of the implanted base body;
S4, upon detecting the control signal indicating that the pixel count exceeds a preset value, triggering a similarity query among the stored virtual features;
S5, outputting the virtual feature that reaches the specified similarity value and implanting it at the corresponding position of the implanted feature on the base body.
One or more technical solutions provided in the embodiments of the present invention have at least the following technical effects or advantages:
1. By dividing the features of the base body implanted into the virtual scene and analyzing the pixels surrounding the traced edge of each implanted feature, the invention makes it convenient to detect how well each feature of the implanted base body blends into the virtual scene;
2. By dividing and storing the features of the virtual base body for every orientation, when the blending of an implanted feature does not meet the preset requirement, a similarity query can be performed in the virtual feature library and the virtual feature that reaches the specified similarity can be placed at the corresponding position of the implanted base body, ensuring that the virtual feature blends in after being added to the base body in the virtual scene;
In conclusion, the method achieves accurate fusion queries for the virtual base body and better blending between the virtual scene and the base body.
Drawings
The invention is described in further detail below with reference to the following figures and detailed description:
FIG. 1 is a diagram of the overall system architecture of the present invention;
FIG. 2 is a system diagram of the feature edge-tracing detection module of the present invention;
FIG. 3 is a system diagram of a reset query module according to the present invention;
FIG. 4 is a flowchart of the shooting method of the present invention.
Detailed Description
The following description of the embodiments of the present invention is provided for illustrative purposes, and other advantages and effects of the present invention will become apparent to those skilled in the art from the present disclosure.
Example one
Referring to FIG. 1 to FIG. 3, the invention provides the following technical solution: a virtual shooting system comprising a virtual base-body storage module, a scene base-body processing module, a feature edge-tracing detection module and a reset query module;
the virtual base-body storage module is used for storing big-data image information of each divided feature of the virtual base body;
the scene base-body processing module is used for performing feature division on a base body implanted into the virtual simulation scene;
the feature edge-tracing detection module is used for performing edge-line detection on the divided features of the implanted base body;
the reset query module is used for removing implanted features whose traced edges do not meet a preset requirement, and for resetting a feature image from the virtual base-body storage module that reaches a specified similarity with the removed feature into the corresponding feature region of the implanted base body.
From the above it can be seen that, during virtual shooting, the virtual base body may be a human body or an object, and the object may be an intelligent device such as a robot. The virtual base-body storage module collects the features of the virtual base body, for example the arms, fingers and feet of a human body or the robotic arm of an intelligent robot. When the base body is implanted into the virtual simulation scene by the scene base-body processing module, the implanted base body is divided into the corresponding features. The feature edge-tracing detection module extracts the edge-traced image of each divided feature, performs edge-line detection and queries the number of pixels surrounding the edge line. When that number exceeds the threshold, the implanted feature is extracted, the virtual base-body storage module is searched for a virtual feature image that reaches the specified similarity, and that image is imported into the corresponding position of the base body in the virtual scene, ensuring that the base body fuses well with the virtual scene.
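The patent describes these modules only at the functional level. The following Python sketch is a minimal illustration, under stated assumptions, of how the four modules could hand data to one another; every class name, method name and threshold is invented here for illustration and is not part of the disclosure.

```python
# Minimal sketch (assumptions only): one way the four modules could interact.
# Class names, method names and thresholds are illustrative, not from the patent.

class VirtualBaseBodyStore:
    """Holds feature images of the virtual base body, keyed by feature name."""

    def __init__(self):
        self._features = {}                      # e.g. {"left_arm": [img0, img1, ...]}

    def add(self, feature_name, image):
        self._features.setdefault(feature_name, []).append(image)

    def candidates(self, feature_name):
        return self._features.get(feature_name, [])


class SceneBaseBodyProcessor:
    """Divides the base body implanted in the virtual scene into named features."""

    def divide(self, base_body_image):
        # Placeholder segmentation: a real system would separate arms, fingers, feet, etc.
        h = base_body_image.shape[0]
        return {"upper": base_body_image[: h // 2], "lower": base_body_image[h // 2:]}


class EdgeTracingDetector:
    """Flags features whose traced edge is surrounded by too many stray pixels."""

    def __init__(self, count_stray_pixels, pixel_threshold=400):
        self.count_stray_pixels = count_stray_pixels     # injected analysis function
        self.pixel_threshold = pixel_threshold

    def poorly_fused(self, feature_image):
        return self.count_stray_pixels(feature_image) > self.pixel_threshold


class ResetQueryModule:
    """Replaces a poorly fused implanted feature with a similar stored feature."""

    def __init__(self, store, similarity_fn, similarity_threshold=0.8):
        self.store = store
        self.similarity_fn = similarity_fn
        self.similarity_threshold = similarity_threshold

    def find_replacement(self, feature_name, cleared_feature):
        for candidate in self.store.candidates(feature_name):
            if self.similarity_fn(cleared_feature, candidate) >= self.similarity_threshold:
                return candidate                 # caller re-implants this image
        return None
```

The stray-pixel counter and the similarity function are injected as plain callables, so any concrete measure, such as those sketched later in this description, can be plugged in.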
In order to better realize the classified storage of the virtual base-body features, the virtual base-body storage module comprises a base-body image acquisition module, a feature division module and a divided-image storage module;
the base-body image acquisition module is used for acquiring big-data image information of the base body from all viewing angles;
the feature division module is used for dividing the acquired virtual base-body image data into its individual detail features;
the divided-image storage module is used for classifying and storing all action images of each detail feature.
In this embodiment, image data of the base body is acquired from every orientation, and the acquired base-body images are divided by feature and stored in classified form, so that a virtual feature reaching the specified similarity can conveniently be retrieved for any implanted feature, ensuring seamless fusion after implantation.
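As one possible reading of this classified storage, the short sketch below saves each detail feature's action images on disk under an assumed feature-name/viewing-angle directory layout; the naming scheme and file format are illustrative only.

```python
# Sketch of classified storage (assumed layout): <root>/<feature name>/view_<angle>/action_<id>.png
import cv2
from pathlib import Path


def store_feature_image(root, feature_name, view_angle, action_id, image_bgr):
    """Save one action image of one detail feature, classified by name and viewing angle."""
    folder = Path(root) / feature_name / f"view_{view_angle:03d}"
    folder.mkdir(parents=True, exist_ok=True)
    cv2.imwrite(str(folder / f"action_{action_id:04d}.png"), image_bgr)


def load_feature_images(root, feature_name):
    """Load every stored action image of a detail feature, across all viewing angles."""
    return [cv2.imread(str(p)) for p in sorted(Path(root, feature_name).rglob("*.png"))]
```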
In order to better realize the query processing of the scene base body, the scene base-body processing module comprises a scene simulation module, a base-body implantation module and an implanted-feature division module;
the scene simulation module is used for simulating the virtual scene;
the base-body implantation module is used for acquiring information on the implantation of the base body into the virtual scene;
the implanted-feature division module is used for performing feature division on the base body implanted in the virtual scene.
In this embodiment, the base body is implanted into the virtual scene and its features are divided within the scene, which facilitates the subsequent feature fusion query.
In order to better realize the fusion query of features in the virtual scene, as shown in FIG. 2, the feature edge-tracing detection module comprises an edge smoothness analysis module and a feature clearing module;
the edge smoothness analysis module is used for performing edge-line processing on the divided implanted features and for query analysis of the pixels surrounding the edge line;
the feature clearing module is used for acquiring a control signal when the number of pixels surrounding the edge line reaches a preset value and clearing the corresponding feature.
In this embodiment, the edge smoothness analysis makes it easy to assess the pixels surrounding the edge line after the base body has been implanted into the virtual scene, so that the feature corresponding to an edge line can be cleared when its surrounding pixel count exceeds the threshold.
In order to better analyze and process the traced edges, the edge smoothness analysis module comprises an edge-line query module, a line data storage module and a peripheral pixel query module;
the line data storage module is used for storing the line-curve data corresponding to each kind of edge-traced image;
the edge-line query module is used for querying the line-curve data corresponding to the edge-traced image of an implanted feature;
the peripheral pixel query module is used for querying the pixel data on both sides of the line curve.
In this embodiment, the line-curve data corresponding to every kind of traced edge is stored, the line curve corresponding to a feature's edge-traced image is looked up, and the scattered pixels around that line are then examined: the more such pixels there are, the poorer the fusion with the virtual scene.
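The patent does not specify how the pixels around the edge line are counted. The sketch below is one plausible OpenCV-based reading: edge pixels that fall in a narrow band around the feature's main outline, but not on the outline itself, are treated as the stray pixels whose count drives the clearing signal. The band width and threshold values are assumptions.

```python
# Assumed reading of the edge-line check: count edge pixels scattered in a band
# around the feature's main outline. Thresholds are illustrative only.
import cv2
import numpy as np


def stray_pixel_count(feature_bgr, band_width=7):
    """Count edge pixels lying in a band around the feature's traced edge line."""
    gray = cv2.cvtColor(feature_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                       # all edge pixels

    # Take the largest contour as the feature's traced edge line.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0
    outline = max(contours, key=cv2.contourArea)
    outline_mask = np.zeros_like(gray)
    cv2.drawContours(outline_mask, [outline], -1, 255, thickness=1)

    # Band on both sides of the outline, excluding the outline itself.
    kernel = np.ones((band_width, band_width), np.uint8)
    band = cv2.dilate(outline_mask, kernel) & ~outline_mask

    # Edge pixels inside the band but not on the outline are counted as "stray".
    return int(np.count_nonzero((edges > 0) & (band > 0)))


def needs_clearing(feature_bgr, pixel_threshold=400):
    """Mimics the clearing control signal: too many stray pixels around the edge line."""
    return stray_pixel_count(feature_bgr) > pixel_threshold
```

In practice the band width and pixel threshold would be tuned per scene; the patent only states that exceeding the preset value triggers the control signal for clearing.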
In order to better realize the resetting of virtual features on the base body, as shown in FIG. 3, the reset query module comprises a feature search module and an implantation processing module;
the feature search module is used for searching the virtual base-body storage module for a virtual feature image whose similarity to the cleared implanted feature reaches a specified value;
the implantation processing module is used for implanting the retrieved virtual feature image at the corresponding position on the base body.
In this embodiment, once a virtual feature with the specified similarity to the implanted feature has been found, it is implanted so that it fuses seamlessly at the corresponding position on the base body.
In order to better realize the query processing of virtual features, the feature search module comprises a cleared-feature acquisition module and a feature comparison module;
the cleared-feature acquisition module is used for acquiring the data of the implanted feature to be cleared;
the feature comparison module is used for comparing the similarity between the cleared implanted feature and the features stored in the virtual base-body storage module, and for outputting the virtual feature image that reaches the specified similarity.
In this embodiment, the cleared feature is collected and a virtual-feature query at the specified similarity is performed in the virtual feature repository, which ensures the accuracy of the query.
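The similarity measure itself is left open by the patent; it only requires that a specified similarity be reached. As a hedged example, the sketch below compares color histograms by correlation and returns the stored feature image that meets an assumed threshold.

```python
# Assumed similarity measure: color-histogram correlation between the cleared
# implanted feature and each stored virtual feature image.
import cv2


def color_histogram(image_bgr, bins=8):
    hist = cv2.calcHist([image_bgr], [0, 1, 2], None,
                        [bins, bins, bins], [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()


def similarity(img_a, img_b):
    """Histogram correlation in [-1, 1]; 1 means identical color distributions."""
    return float(cv2.compareHist(color_histogram(img_a),
                                 color_histogram(img_b),
                                 cv2.HISTCMP_CORREL))


def best_stored_match(cleared_feature, stored_features, threshold=0.8):
    """Return the stored feature image whose similarity reaches the assumed threshold."""
    best_img, best_score = None, threshold
    for candidate in stored_features:
        score = similarity(cleared_feature, candidate)
        if score >= best_score:
            best_img, best_score = candidate, score
    return best_img
```

Any other measure, such as template matching or a learned embedding distance, could stand in for the histogram correlation without changing the surrounding flow.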
In order to better realize the resetting of the virtual feature at the position of the implanted feature on the base body, the implantation processing module comprises a scale adjustment module, a scene implantation module and a combined-color adjustment module;
the scale adjustment module is used for adjusting the scale of the virtual feature to match the proportions of the cleared implanted feature;
the scene implantation module is used for implanting the scale-adjusted virtual feature at the corresponding position on the base body in the scene;
the combined-color adjustment module is used for adjusting the color of the virtual feature after it is implanted on the base body.
In this embodiment, the virtual feature is scaled to the proportions of the implanted feature, implanted at the corresponding position on the base body, and its color is then adjusted, which reduces the color difference between the virtual feature and the base body after they are combined.
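The scale and color adjustments are likewise described only functionally. The following sketch uses plain resizing plus a LAB mean/std color transfer as one possible realization; both choices, and the simple rectangular paste, are assumptions rather than the patent's method.

```python
# Assumed realization of the scale, implantation and combined-color steps:
# resize to the cleared region, shift LAB statistics toward the surroundings,
# then paste at the cleared position.
import cv2
import numpy as np


def rescale_to_region(virtual_feature, region_w, region_h):
    """Scale the retrieved virtual feature to the cleared feature's region size."""
    return cv2.resize(virtual_feature, (region_w, region_h), interpolation=cv2.INTER_LINEAR)


def match_color(feature_bgr, surrounding_bgr):
    """Shift the feature's LAB mean/std toward the surrounding base-body region."""
    feat = cv2.cvtColor(feature_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    ref = cv2.cvtColor(surrounding_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    for c in range(3):
        f_mean, f_std = feat[..., c].mean(), feat[..., c].std() + 1e-6
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std() + 1e-6
        feat[..., c] = (feat[..., c] - f_mean) * (r_std / f_std) + r_mean
    feat = np.clip(feat, 0, 255).astype(np.uint8)
    return cv2.cvtColor(feat, cv2.COLOR_LAB2BGR)


def implant(scene_bgr, adjusted_feature, x, y):
    """Paste the adjusted feature at the cleared position on the base body."""
    h, w = adjusted_feature.shape[:2]
    out = scene_bgr.copy()
    out[y:y + h, x:x + w] = adjusted_feature
    return out
```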
Example two
Referring to FIG. 4, the present invention further provides a virtual shooting method, which comprises the following steps:
S1, acquiring virtual base-body images, dividing them into features, and classifying and storing them;
S2, collecting divided-feature information of the base body implanted in the simulated scene;
S3, detecting the number of pixels surrounding the edge line of each divided feature of the implanted base body;
S4, upon detecting the control signal indicating that the pixel count exceeds a preset value, triggering a similarity query among the stored virtual features;
S5, outputting the virtual feature that reaches the specified similarity value and implanting it at the corresponding position of the implanted feature on the base body.
In the above virtual shooting method, the features of the virtual base body are divided and the virtual feature images are stored in advance. When the base body is implanted into the virtual scene, it is divided into features within the scene, and the pixels surrounding the traced edge of each feature are examined. If the pixel count exceeds the threshold, the virtual feature image repository is automatically queried for the image that reaches the specified similarity with the implanted feature, and that image is reset at the specified position of the base body in the virtual scene, ensuring that the implanted features fuse into the virtual scene.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical idea of the present invention shall be covered by the claims of the present invention.

Claims (10)

1. A virtual shooting system, characterized by comprising a virtual base-body storage module, a scene base-body processing module, a feature edge-tracing detection module and a reset query module;
the virtual base-body storage module is used for storing big-data image information of each divided feature of the virtual base body;
the scene base-body processing module is used for performing feature division on a base body implanted into the virtual simulation scene;
the feature edge-tracing detection module is used for performing edge-line detection on the divided features of the implanted base body;
the reset query module is used for removing implanted features whose traced edges do not meet a preset requirement, and for resetting a feature image from the virtual base-body storage module that reaches a specified similarity with the removed feature into the corresponding feature region of the implanted base body.
2. The virtual shooting system of claim 1, wherein the virtual base-body storage module comprises a base-body image acquisition module, a feature division module and a divided-image storage module;
the base-body image acquisition module is used for acquiring big-data image information of the base body from all viewing angles;
the feature division module is used for dividing the acquired virtual base-body image data into its individual detail features;
the divided-image storage module is used for classifying and storing all action images of each detail feature.
3. The virtual shooting system of claim 1, wherein the scene base-body processing module comprises a scene simulation module, a base-body implantation module and an implanted-feature division module;
the scene simulation module is used for simulating the virtual scene;
the base-body implantation module is used for acquiring information on the implantation of the base body into the virtual scene;
the implanted-feature division module is used for performing feature division on the base body implanted in the virtual scene.
4. The virtual shooting system of claim 1, wherein the feature edge-tracing detection module comprises an edge smoothness analysis module and a feature clearing module;
the edge smoothness analysis module is used for performing edge-line processing on the divided implanted features and for query analysis of the pixels surrounding the edge line;
the feature clearing module is used for acquiring a control signal when the number of pixels surrounding the edge line reaches a preset value and clearing the corresponding feature.
5. The virtual shooting system of claim 4, wherein the edge smoothness analysis module comprises an edge-line query module, a line data storage module and a peripheral pixel query module;
the line data storage module is used for storing the line-curve data corresponding to each kind of edge-traced image;
the edge-line query module is used for querying the line-curve data corresponding to the edge-traced image of an implanted feature;
the peripheral pixel query module is used for querying the pixel data on both sides of the line curve.
6. The virtual shooting system of claim 5, wherein the feature clearing module comprises a pixel amount detection module and a removal control module;
the pixel amount detection module is used for monitoring for a signal indicating that the total number of pixels queried by the peripheral pixel query module exceeds a preset value;
the removal control module is used for controlling the removal of the implanted feature from the base body when the over-limit signal is acquired.
7. The virtual shooting system of claim 1, wherein the reset query module comprises a feature search module and an implantation processing module;
the feature search module is used for searching the virtual base-body storage module for a virtual feature image whose similarity to the cleared implanted feature reaches a specified value;
the implantation processing module is used for implanting the retrieved virtual feature image at the corresponding position on the base body.
8. The virtual shooting system of claim 7, wherein the feature search module comprises a cleared-feature acquisition module and a feature comparison module;
the cleared-feature acquisition module is used for acquiring the data of the implanted feature to be cleared;
the feature comparison module is used for comparing the similarity between the cleared implanted feature and the features stored in the virtual base-body storage module, and for outputting the virtual feature image that reaches the specified similarity.
9. The virtual shooting system of claim 8, wherein the implantation processing module comprises a scale adjustment module, a scene implantation module and a combined-color adjustment module;
the scale adjustment module is used for adjusting the scale of the virtual feature to match the proportions of the cleared implanted feature;
the scene implantation module is used for implanting the scale-adjusted virtual feature at the corresponding position on the base body in the scene;
the combined-color adjustment module is used for adjusting the color of the virtual feature after it is implanted on the base body.
10. A virtual shooting method, characterized by comprising the following steps:
S1, acquiring virtual base-body images, dividing them into features, and classifying and storing them;
S2, collecting divided-feature information of the base body implanted in the simulated scene;
S3, detecting the number of pixels surrounding the edge line of each divided feature of the implanted base body;
S4, upon detecting the control signal indicating that the pixel count exceeds a preset value, triggering a similarity query among the stored virtual features;
S5, outputting the virtual feature that reaches the specified similarity value and implanting it at the corresponding position of the implanted feature on the base body.
CN202111544587.9A (filed 2021-12-16, priority 2021-12-16): Virtual shooting system and shooting method; status Pending; published as CN114245013A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111544587.9A  2021-12-16  2021-12-16  Virtual shooting system and shooting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111544587.9A  2021-12-16  2021-12-16  Virtual shooting system and shooting method

Publications (1)

Publication Number Publication Date
CN114245013A  2022-03-25

Family

ID=80757579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111544587.9A  Virtual shooting system and shooting method  2021-12-16  2021-12-16  (Pending; published as CN114245013A)

Country Status (1)

Country Link
CN: CN114245013A

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100202682A1 (en) * 2006-10-11 2010-08-12 Gta Geoinformatik Gmbh Method for texturizing virtual three-dimensional objects
US20150379335A1 (en) * 2014-06-27 2015-12-31 Microsoft Corporation Dynamic Remapping of Components of a Virtual Skeleton
JP2019220932A (en) * 2018-06-14 2019-12-26 株式会社バーチャルキャスト Content distribution system, content distribution method, computer program, content distribution server, and transmission path
CN112714337A (en) * 2020-12-22 2021-04-27 北京百度网讯科技有限公司 Video processing method and device, electronic equipment and storage medium
CN113330484A (en) * 2018-12-20 2021-08-31 斯纳普公司 Virtual surface modification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination