CN111639720B - Stage light-following positioning system

Info

Publication number
CN111639720B
CN111639720B
Authority
CN
China
Prior art keywords
unit
user
target
information
time
Prior art date
Legal status
Active
Application number
CN202010528934.8A
Other languages
Chinese (zh)
Other versions
CN111639720A (en)
Inventor
朱国良
张航
谢海歧
吴立锋
薛焕新
Current Assignee
Zhejiang Dafeng Industry Co Ltd
Original Assignee
Zhejiang Dafeng Industry Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dafeng Industry Co Ltd
Priority to CN202010528934.8A
Publication of CN111639720A
Application granted
Publication of CN111639720B
Legal status: Active


Classifications

    • G06F18/253 Pattern recognition: fusion techniques of extracted features
    • G06T7/246 Image analysis: analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/66 Image analysis: analysis of geometric attributes of image moments or centre of gravity
    • G06T7/73 Image analysis: determining position or orientation of objects or cameras using feature-based methods
    • G06V10/56 Image or video recognition: extraction of features relating to colour
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • G06T2207/30204 Indexing scheme for image analysis: marker
    • G06T2207/30241 Indexing scheme for image analysis: trajectory
    • Y02B20/40 Energy efficient lighting: control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a stage light-following positioning system comprising a target input unit, a target locking unit, an aperture locking unit, a circle trace analysis unit, a fusion analysis unit, a central processing unit, a display unit, a storage unit, a personal data recording unit, a characteristic analysis unit, a data pre-exercise unit, a light driving unit and a management unit, wherein the target input unit is connected with the target locking unit. A real-time whole-body picture of the user is input through the target input unit, and five corresponding features are extracted from it. The target locking unit then processes the received picture according to its time stamp to obtain a related specified calculated value, and selects features at random according to the size of that value and the digits at each position of the time stamp. Once the features are selected, the corresponding target person is locked according to them, so that the specific features used for identification on any given occasion cannot be known in advance and imitated by others to interfere with the performance.

Description

Stage light-following positioning system
Technical Field
The invention belongs to the field of light-following positioning, relates to a stage light-following positioning technology, and in particular relates to a stage light-following positioning system.
Background
The patent with publication number CN205655171U discloses a wireless stage light-following lamp comprising a base, a support and a light-following lamp main body. The support is mounted on the base and the lamp main body is mounted on the support; a horizontal rotation control device is installed in the support, and a power module, a main control unit, a vertical rotation control module, a light-following lamp driving module, a wireless communication module, a heat dissipation module and a temperature measurement module are installed in the lamp main body. The vertical rotation control module, the light-following lamp driving module, the wireless communication module, the heat dissipation module and the temperature measurement module are all electrically connected with the main control unit, and the light-following lamp is electrically connected with the light-following lamp driving module. This wireless stage light-following lamp is controlled by wireless signal transmission, which avoids wiring, makes installation convenient, suits temporary stages in particular, and has a certain degree of general applicability.
However, although that application discloses a stage light-following lamp, it does not disclose a corresponding automatic control method for the lamp. Moreover, existing automatic light-following systems recognize the target by a single, unvarying set of characteristics; this lack of variation can be exploited, since anyone who imitates those characteristics gains the ability to interfere with the stage light-following. The present invention provides a solution to this technical problem.
Disclosure of Invention
The invention aims to provide a stage light-following positioning system.
The aim of the invention can be achieved by the following technical scheme:
a stage light-following positioning system comprises a target input unit, a target locking unit, an aperture locking unit, a circle trace analysis unit, a fusion analysis unit, a central processing unit, a display unit, a storage unit, a personal data recording unit, a characteristic analysis unit, a data pre-exercise unit, a light driving unit and a management unit;
the target input unit is used for inputting a real-time whole-body picture of the light-following object and marking it as real-time picture information, wherein the real-time whole-body picture refers to a picture of the target user taken after stage makeup is finished; the target input unit transmits the real-time picture information to the target locking unit, which receives it and performs a locking operation in combination with the real-time picture information; the specific steps of the locking operation are as follows:
step one: firstly, acquire the real-time picture information and the time stamp at which it was received;
step two: intercept the time stamp to obtain it in month-day-hour format, and mark its six digits correspondingly as X1-X6, obtaining the time digit group Xi, where i = 1..6;
step three: acquire the time digit group Xi;
step four: calculate the specified calculated value Zd according to a formula; the specific calculation formula is:
[formula image not reproduced in the source; the specified calculated value Zd is computed from the time digit group Xi]
step five: carry out remainder analysis on Zd, calculating the remainder value Y by the formula Y = Zd % 3, and obtain the updated specified calculated value Zd;
step six: acquire the real-time picture information and extract features to obtain the feature combination Tj, j = 1..5;
step seven: select the judging features for the occasion according to the specified calculated value Zd, obtaining one to three measured features;
step eight: obtain the measured features, lock the position of the target user according to them, and mark the target user's position information as the target position point;
the target locking unit is used for transmitting the target position point to the fusion analysis unit and the aperture locking unit; the fusion analysis unit receives the target position point transmitted by the target locking unit;
the circle trace analysis unit receives the aperture position transmitted by the aperture locking unit and carries out circle trace analysis on it, so as to obtain the circle center point and circle diameter information, together with the circle fusion information formed by fusing them with a stage coordinate system;
the circle trace analysis unit is used for transmitting the circle fusion information to the fusion analysis unit, and the fusion analysis unit is used for transmitting the circle fusion information and the target position point to the central processing unit; the central processing unit receives the circle fusion information and the target position point transmitted by the fusion analysis unit;
the data pre-exercise unit is used for pre-exercising the user's movement on the stage; specifically, the user's moving speeds are measured in advance in the four directions front, back, left and right, and in the four diagonal directions front-left, front-right, back-left and back-right, where the diagonal directions refer to movement at 45 degrees to the vertical direction; the data pre-exercise unit collects the moving speeds in these eight directions into a moving speed group, each speed being the average over three runs in the corresponding direction; the data pre-exercise unit transmits the moving speed group to the characteristic analysis unit, which fuses the moving speed group with the corresponding user to form personal speed information and transmits it to the personal data recording unit; the personal data recording unit receives the personal speed information transmitted by the characteristic analysis unit and stores it in real time;
the central processing unit is used for carrying out circle transfer analysis on the circle fusion information and the target position point in combination with the personal data recording unit and the light driving unit; the specific analysis steps are as follows:
SS01: acquire the target position point and the circle fusion information;
SS02: first acquire the circle center point and circle diameter information in the circle fusion information, together with the stage coordinate system;
SS03: mark the user's target position point on the stage coordinate system;
SS04: acquire the distance between the target position point and the circle center point, and mark it as the circle center distance Qx;
SS05: acquire the circle diameter information and mark it as Jq;
SS06: before starting, make the circle center point coincide with the target position point, so that the user is positioned at the circle center point;
SS07: then monitor the user's circle center distance Qx in real time; when Qx exceeds Q1, where Q1 is a preset value, a movement-direction judgment signal is generated and the moving direction is automatically analyzed at that moment, in the following way:
SS071: the user's moving direction is obtained directly from the relative positions of the target position point and the circle center point; the included angle between this direction and the vertical direction is obtained automatically, and the moving direction given by the included angle is compared with the moving speed group in the user's personal speed information held by the personal data recording unit; when the user's real-time moving direction lies between two of the pre-exercised moving directions, the average of the moving speeds of those two directions is taken and marked as the predicted speed value;
SS08: mark the user's moving direction as the transfer direction and the predicted speed value as the transfer speed;
SS09: when the circle diameter information Jq minus the circle center distance Qx is greater than or equal to Q2, where Q2 is a preset value larger than Q1, a movement signal is generated; the light driving unit is then driven to adjust the aperture position according to the transfer direction and the corresponding transfer speed until the circle center distance Qx is zero;
SS10: repeat steps SS01-SS09 to track the user's target position;
the management unit is in communication connection with the central processing unit.
Further, the specific steps of the remainder analysis in step five of the locking operation, which yield the updated specified calculated value Zd, are as follows:
s1: re-establish the specific value of the specified calculated value Zd according to the value of Y:
when Y = 0, let Zd = 3;
otherwise, let Zd = Y;
this gives the updated specified calculated value Zd.
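A minimal sketch of this remainder mapping, with the arithmetic exactly as stated above (the function name is illustrative):

```python
def update_zd(zd: int) -> int:
    """Map a raw specified calculated value Zd onto {1, 2, 3}.

    Step five of the locking operation: Y = Zd % 3; when Y == 0 the
    updated Zd is 3, otherwise the updated Zd is Y.
    """
    y = zd % 3  # remainder value Y
    return 3 if y == 0 else y
```

This guarantees that Zd always lands on 1, 2 or 3, which is what the feature-selection steps s01-s06 below rely on.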
Further, the feature extraction in step six of the locking operation specifically comprises:
s1: define the facial feature as feature one T1, where the facial feature is the face information in the corresponding real-time picture information;
s2: obtain the user's coat characterization color and mark it as feature two T2; the coat characterization color is obtained as follows: acquire all colors in the coat and the area of each color; acquire the total area of the coat, divide each color's area by the total area to obtain its area ratio, and mark any color whose area ratio exceeds the preset ratio B1 as a coat characterization color (a code sketch follows this list);
s3: obtain the lower-garment characterization color in the same way as the coat characterization color in step S2, and mark it as feature three T3;
s4: obtain the shoe characterization color in the same way as the coat characterization color in step S2, and mark it as feature four T4;
s5: obtain the user's head-top height, i.e. the maximum height from the ground to the top of the user's head when standing, marked as top length K, and mark it correspondingly as feature five T5;
s6: obtain the feature combination Tj, j = 1..5, consisting of features one to five.
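A minimal sketch of the characterization-color rule above, assuming the garment region has already been segmented and its colors quantized upstream; the function name and input format are illustrative, not taken from the patent:

```python
from collections import Counter

def characterization_colors(region_pixels, b1: float):
    """Return the characterization color(s) of a garment region.

    region_pixels: iterable of quantized color values, one per pixel of
    the segmented garment; b1: the preset area-ratio threshold B1.
    A color is kept when its area divided by the total garment area
    exceeds B1.
    """
    counts = Counter(region_pixels)
    total = sum(counts.values())  # total garment area in pixels
    return [color for color, area in counts.items() if area / total > b1]
```

The same routine serves steps S3 and S4, applied to the lower-garment and shoe regions respectively.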
Further, the specific method of obtaining the measured features in step seven of the locking operation is as follows (a code sketch follows this list):
s01: acquire the specified calculated value Zd; when Zd = 1, let i = 1 and obtain the specific value of X1;
s02: starting from the first feature value, count to the X1-th value and mark the corresponding feature Tj as a measured feature;
s03: when Zd = 2, let i = 2 and 3 and obtain the specific values of X2 and X3;
s04: starting from the first feature value, count to the X2-th value and mark the corresponding feature Tj as a measured feature; then count to the X3-th value and mark that feature Tj as a measured feature, obtaining two measured features;
s05: when Zd = 3, let i = 4, 5 and 6 and obtain the specific values of X4, X5 and X6;
s06: starting from the first feature value, count to the X4-th value and mark the corresponding feature Tj as a measured feature; then count to the X5-th and X6-th values and mark those features Tj as measured features, obtaining three measured features.
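A minimal sketch of this selection rule; the 1-based indexing into T1..T5 follows the counting description above, while the modulo wrap for digits equal to 0 or greater than 5 is an assumption, since the text does not spell those cases out:

```python
def select_measured_features(zd: int, x: list[int], features: list):
    """Pick the measured feature(s) from T1..T5 using Zd and the time
    digit group X1..X6 (x[0] corresponds to X1).

    Zd = 1 uses digit X1 (one feature); Zd = 2 uses X2 and X3 (two
    features); Zd = 3 uses X4, X5 and X6 (three features).
    """
    digit_indices = {1: [0], 2: [1, 2], 3: [3, 4, 5]}
    measured = []
    for k in digit_indices[zd]:
        idx = (x[k] - 1) % len(features)  # wrap into range (assumption)
        measured.append(features[idx])
    return measured
```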
Further, the circle trace analysis specifically comprises the following steps:
step one: designate any point of the stage as the origin and establish a plane coordinate system covering the stage, denoted the stage coordinate system;
step two: acquire the aperture position and mark it on the stage coordinate system;
step three: acquire the aperture radius, from which the circle diameter information is obtained;
step four: acquire the center of the aperture, mark this point as the circle center point, and mark it on the stage coordinate system;
step five: track the aperture position in real time, and fuse the circle center point and circle diameter information with the stage coordinate system to form the circle fusion information.
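One plausible way to represent the resulting circle fusion information on the stage coordinate system (the record and method names are illustrative, not taken from the patent):

```python
import math
from dataclasses import dataclass

@dataclass
class CircleFusion:
    """Circle fusion information: the aperture expressed on the stage
    coordinate system, as produced by the circle trace analysis unit."""
    center: tuple[float, float]  # circle center point (stage coordinates)
    diameter: float              # circle diameter information Jq

    def center_distance(self, target: tuple[float, float]) -> float:
        """Circle center distance Qx between a target position point
        and the aperture's circle center point."""
        return math.dist(self.center, target)
```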
Further, the management unit is used for inputting all preset values Q1, Q2 and B1.
Further, the central processing unit is used for time-stamping the movement signal to form a light-following record and transmitting it to the storage unit for real-time storage; the central processing unit is also used for transmitting the light-following record to the display unit for real-time display.
The invention has the beneficial effects that:
the method includes the steps that a real-time whole-body picture of a user is input through a target input unit, and then five corresponding features are summarized on the real-time whole-body picture; then processing the received picture according to the time stamp of the received picture by a target locking unit to obtain a related appointed calculated value, and selecting random features according to the size of the appointed calculated value and the numerical value at each position of the time stamp; after the characteristics are selected, corresponding target personnel are locked according to the selected characteristics, so that interference is avoided because specific contents of the follow-up identification are known, and other people are utilized to imitate the corresponding characteristics to influence the performance of the scene;
the method comprises the steps that a diaphragm locking unit is used for locking the position of a diaphragm, a diaphragm trace analysis unit is used for obtaining key data of the diaphragm, a fused analysis unit is used for transmitting all collected characteristics, relevant tracks and coordinate systems of the position points of a target user and the diaphragm to a central processing unit, and the central processing unit is used for controlling the projection position of the diaphragm in combination with relevant rules so as to ensure that the user is positioned at the center of the diaphragm; thereby perfecting the whole control process; the invention is simple and effective, and is easy and practical.
Drawings
The present invention is further described below with reference to the accompanying drawings for the convenience of understanding by those skilled in the art.
Fig. 1 is a system block diagram of the present invention.
Detailed Description
As shown in fig. 1, the stage light-following positioning system comprises a target input unit, a target locking unit, an aperture locking unit, a circle trace analysis unit, a fusion analysis unit, a central processing unit, a display unit, a storage unit, a personal data recording unit, a characteristic analysis unit, a data pre-exercise unit, a light driving unit and a management unit;
the target input unit is used for inputting a real-time whole-body picture of the light-following object and marking it as real-time picture information, wherein the real-time whole-body picture refers to a picture of the target user taken after stage makeup is finished; the target input unit transmits the real-time picture information to the target locking unit, which receives it and performs a locking operation in combination with the real-time picture information; the specific steps of the locking operation are as follows:
step one: firstly, acquire the real-time picture information and the time stamp at which it was received;
step two: intercept the time stamp to obtain it in month-day-hour format, and mark its six digits correspondingly as X1-X6, obtaining the time digit group Xi, where i = 1..6; for example, 17:00 on June 10 gives 061017, corresponding to X1-X6;
step three: acquire the time digit group Xi;
step four: calculate the specified calculated value Zd according to a formula; the specific calculation formula is:
[formula image not reproduced in the source; the specified calculated value Zd is computed from the time digit group Xi]
step five: carry out remainder analysis on Zd, calculating the remainder value Y by the formula Y = Zd % 3;
s1: re-establish the specific value of the specified calculated value Zd according to the value of Y:
when Y = 0, let Zd = 3;
otherwise, let Zd = Y;
this gives the updated specified calculated value Zd;
step six: acquire the real-time picture information and extract features; the specific steps are as follows:
s1: define the facial feature as feature one T1, where the facial feature is the face information in the corresponding real-time picture information;
s2: obtain the user's coat characterization color and mark it as feature two T2; the coat characterization color is obtained as follows: acquire all colors in the coat and the area of each color; acquire the total area of the coat, divide each color's area by the total area to obtain its area ratio, and mark any color whose area ratio exceeds the preset ratio B1 as a coat characterization color;
s3: obtain the lower-garment characterization color in the same way as the coat characterization color in step S2, and mark it as feature three T3;
s4: obtain the shoe characterization color in the same way as the coat characterization color in step S2, and mark it as feature four T4;
s5: obtain the user's head-top height, i.e. the maximum height from the ground to the top of the user's head when standing, marked as top length K, and mark it correspondingly as feature five T5;
s6: obtain the feature combination Tj, j = 1..5, consisting of features one to five;
step seven: select the judging features for the occasion according to the specified calculated value Zd; the method comprises the following steps:
s01: acquire the specified calculated value Zd; when Zd = 1, let i = 1 and obtain the specific value of X1;
s02: starting from the first feature value, count to the X1-th value and mark the corresponding feature Tj as a measured feature;
s03: when Zd = 2, let i = 2 and 3 and obtain the specific values of X2 and X3;
s04: starting from the first feature value, count to the X2-th value and mark the corresponding feature Tj as a measured feature; then count to the X3-th value and mark that feature Tj as a measured feature, obtaining two measured features;
s05: when Zd = 3, let i = 4, 5 and 6 and obtain the specific values of X4, X5 and X6;
s06: starting from the first feature value, count to the X4-th value and mark the corresponding feature Tj as a measured feature; then count to the X5-th and X6-th values and mark those features Tj as measured features, obtaining three measured features;
step eight: obtain the measured features, lock the position of the target user according to them, and mark the target user's position information as the target position point;
the target locking unit is used for transmitting the target position point to the fusion analysis unit and the aperture locking unit; the fusion analysis unit receives the target position point transmitted by the target locking unit;
the circle trace analysis unit receives the aperture position transmitted by the aperture locking unit and performs circle trace analysis on it; the circle trace analysis specifically comprises the following steps:
step one: designate any point of the stage as the origin and establish a plane coordinate system covering the stage, denoted the stage coordinate system;
step two: acquire the aperture position and mark it on the stage coordinate system;
step three: acquire the aperture radius, from which the circle diameter information is obtained;
step four: acquire the center of the aperture, mark this point as the circle center point, and mark it on the stage coordinate system;
step five: track the aperture position in real time, and fuse the circle center point and circle diameter information with the stage coordinate system to form the circle fusion information;
the circle trace analysis unit is used for transmitting the circle fusion information to the fusion analysis unit, and the fusion analysis unit is used for transmitting the circle fusion information and the target position point to the central processing unit; the central processing unit receives the circle fusion information and the target position point transmitted by the fusion analysis unit;
the data pre-exercise unit is used for pre-exercising the user's movement on the stage; specifically, the user's moving speeds are measured in advance in the four directions front, back, left and right, and in the four diagonal directions front-left, front-right, back-left and back-right, where the diagonal directions refer to movement at 45 degrees to the vertical direction; the data pre-exercise unit collects the moving speeds in these eight directions into a moving speed group, each speed being the average over three runs in the corresponding direction; the data pre-exercise unit transmits the moving speed group to the characteristic analysis unit, which fuses the moving speed group with the corresponding user to form personal speed information and transmits it to the personal data recording unit; the personal data recording unit receives the personal speed information transmitted by the characteristic analysis unit and stores it in real time;
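A minimal sketch of how the moving speed group might be assembled from the three rehearsal runs per direction; the direction names and the angles measured from the vertical (front) direction follow the text, while the data layout itself is an assumption:

```python
from statistics import mean

# The eight pre-exercised directions, as angles in degrees measured
# clockwise from the stage's vertical (front) direction.
DIRECTIONS = {
    "front": 0, "front_right": 45, "right": 90, "back_right": 135,
    "back": 180, "back_left": 225, "left": 270, "front_left": 315,
}

def build_speed_group(trials: dict[str, list[float]]) -> dict[str, float]:
    """Average the three rehearsal runs per direction into the moving
    speed group stored as personal speed information."""
    return {direction: mean(runs) for direction, runs in trials.items()}

# e.g. build_speed_group({"front": [1.2, 1.1, 1.3], ...}) -> {"front": 1.2, ...}
```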
the central processing unit is used for carrying out circle transfer analysis on the circle fusion information and the target position point in combination with the personal data recording unit and the light driving unit; the specific analysis steps are as follows:
SS01: acquire the target position point and the circle fusion information;
SS02: first acquire the circle center point and circle diameter information in the circle fusion information, together with the stage coordinate system;
SS03: mark the user's target position point on the stage coordinate system;
SS04: acquire the distance between the target position point and the circle center point, and mark it as the circle center distance Qx;
SS05: acquire the circle diameter information and mark it as Jq;
SS06: before starting, make the circle center point coincide with the target position point, so that the user is positioned at the circle center point;
SS07: then monitor the user's circle center distance Qx in real time; when Qx exceeds Q1, where Q1 is a preset value, a movement-direction judgment signal is generated and the moving direction is automatically analyzed at that moment, in the following way:
SS071: the user's moving direction is obtained directly from the relative positions of the target position point and the circle center point; the included angle between this direction and the vertical direction is obtained automatically, and the moving direction given by the included angle is compared with the moving speed group in the user's personal speed information held by the personal data recording unit; when the user's real-time moving direction lies between two of the pre-exercised moving directions, the average of the moving speeds of those two directions is taken and marked as the predicted speed value;
SS08: mark the user's moving direction as the transfer direction and the predicted speed value as the transfer speed;
SS09: when the circle diameter information Jq minus the circle center distance Qx is greater than or equal to Q2, where Q2 is a preset value larger than Q1, a movement signal is generated; the light driving unit is then driven to adjust the aperture position according to the transfer direction and the corresponding transfer speed until the circle center distance Qx is zero;
SS10: repeat steps SS01-SS09 to track the user's target position;
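Putting the pieces together, a sketch of one SS01-SS09 pass, reusing the CircleFusion record and DIRECTIONS table from the sketches above; the drive callback stands in for the light driving unit and is purely illustrative, and the SS09 condition is kept exactly as the text states it:

```python
import math

def angle_from_vertical(target, center):
    """Included angle of the user's displacement with the stage's
    vertical (front) direction, in degrees in [0, 360)."""
    dx, dy = target[0] - center[0], target[1] - center[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def predicted_speed(angle, speed_group):
    """SS071: average the speeds of the two pre-exercised directions
    that bracket the real-time moving direction."""
    lo = int(angle // 45) * 45          # nearest rehearsed angle below
    hi = (lo + 45) % 360                # nearest rehearsed angle above
    by_angle = {DIRECTIONS[name]: v for name, v in speed_group.items()}
    return (by_angle[lo] + by_angle[hi]) / 2

def follow_step(target, fusion, q1, q2, speed_group, drive):
    """One SS01-SS09 pass: compare the circle center distance Qx with
    the presets Q1 and Q2 and, when both conditions hold, drive the
    aperture toward the target at the transfer speed."""
    qx = fusion.center_distance(target)          # SS04: circle center distance Qx
    if qx > q1 and fusion.diameter - qx >= q2:   # SS07 and SS09, as written
        theta = angle_from_vertical(target, fusion.center)
        drive(direction=theta, speed=predicted_speed(theta, speed_group))
```

SS10 then simply repeats this pass for as long as the performance runs.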
the management unit is used for inputting all preset values Q1, Q2 and B1; a step of
The central processing unit is used for time-stamping the movement signal to form a light-following record and transmitting it to the storage unit for real-time storage; the central processing unit is also used for transmitting the light-following record to the display unit for real-time display.
When the stage light-following positioning system works, a real-time whole-body picture of the user is first input through the target input unit, and five corresponding features are extracted from it; the target locking unit then processes the received picture according to its time stamp to obtain the related specified calculated value, and selects features at random according to the size of that value and the digits at each position of the time stamp; once the features are selected, the corresponding target person is locked according to them, so that the specific features used for identification on the occasion cannot be known in advance and imitated by others to interfere with the performance;
the method comprises the steps that a diaphragm locking unit is used for locking the position of a diaphragm, a diaphragm trace analysis unit is used for obtaining key data of the diaphragm, a fused analysis unit is used for transmitting all collected characteristics, relevant tracks and coordinate systems of the position points of a target user and the diaphragm to a central processing unit, and the central processing unit is used for controlling the projection position of the diaphragm in combination with relevant rules so as to ensure that the user is positioned at the center of the diaphragm; thereby perfecting the whole control process; the invention is simple and effective, and is easy and practical.
The foregoing is merely illustrative of the structures of this invention. Those skilled in the art can make various modifications, additions and substitutions to the described embodiments without departing from the scope of the invention as defined in the accompanying claims.

Claims (5)

1. The stage light-following positioning system is characterized by comprising a target input unit, a target locking unit, an aperture locking unit, a circle trace analysis unit, a fusion analysis unit, a central processing unit, a display unit, a storage unit, a personal data recording unit, a characteristic analysis unit, a data pre-exercise unit, a light driving unit and a management unit;
the target input unit is used for inputting a real-time whole-body picture of the light-following object and marking it as real-time picture information, wherein the real-time whole-body picture refers to a picture of the target user taken after stage makeup is finished; the target input unit transmits the real-time picture information to the target locking unit, which receives it and performs a locking operation in combination with the real-time picture information; the specific steps of the locking operation are as follows:
step one: firstly, acquire the real-time picture information and the time stamp at which it was received;
step two: intercept the time stamp to obtain it in month-day-hour format, and mark its six digits correspondingly as X1-X6, obtaining the time digit group Xi, where i = 1..6;
step three: acquire the time digit group Xi;
step four: calculate the specified calculated value Zd according to a formula; the specific calculation formula is:
[formula image not reproduced in the source; the specified calculated value Zd is computed from the time digit group Xi]
step five: carry out remainder analysis on Zd, calculating the remainder value Y by the formula Y = Zd % 3, and obtain the updated specified calculated value Zd;
step six: acquire the real-time picture information and extract features to obtain the feature combination Tj, j = 1..5;
step seven: select the judging features for the occasion according to the specified calculated value Zd, obtaining one to three measured features;
step eight: obtain the measured features, lock the position of the target user according to them, and mark the target user's position information as the target position point;
the target locking unit is used for transmitting the target position point to the fusion analysis unit and the aperture locking unit; the fusion analysis unit receives the target position point transmitted by the target locking unit;
the circle trace analysis unit receives the aperture position transmitted by the aperture locking unit and carries out circle trace analysis on it, so as to obtain the circle center point and circle diameter information, together with the circle fusion information formed by fusing them with a stage coordinate system;
the circle trace analysis unit is used for transmitting the circle fusion information to the fusion analysis unit, and the fusion analysis unit is used for transmitting the circle fusion information and the target position point to the central processing unit; the central processing unit receives the circle fusion information and the target position point transmitted by the fusion analysis unit;
the data pre-exercise unit is used for pre-exercising the user's movement on the stage; specifically, the user's moving speeds are measured in advance in the four directions front, back, left and right, and in the four diagonal directions front-left, front-right, back-left and back-right, where the diagonal directions refer to movement at 45 degrees to the vertical direction; the data pre-exercise unit collects the moving speeds in these eight directions into a moving speed group, each speed being the average over three runs in the corresponding direction; the data pre-exercise unit transmits the moving speed group to the characteristic analysis unit, which fuses the moving speed group with the corresponding user to form personal speed information and transmits it to the personal data recording unit; the personal data recording unit receives the personal speed information transmitted by the characteristic analysis unit and stores it in real time;
the central processing unit is used for carrying out circle transfer analysis on the circle fusion information and the target position point in combination with the personal data recording unit and the light driving unit; the specific analysis steps are as follows:
SS01: acquire the target position point and the circle fusion information;
SS02: first acquire the circle center point and circle diameter information in the circle fusion information, together with the stage coordinate system;
SS03: mark the user's target position point on the stage coordinate system;
SS04: acquire the distance between the target position point and the circle center point, and mark it as the circle center distance Qx;
SS05: acquire the circle diameter information and mark it as Jq;
SS06: before starting, make the circle center point coincide with the target position point, so that the user is positioned at the circle center point;
SS07: then monitor the user's circle center distance Qx in real time; when Qx exceeds Q1, where Q1 is a preset value, a movement-direction judgment signal is generated and the moving direction is automatically analyzed at that moment, in the following way:
SS071: the user's moving direction is obtained directly from the relative positions of the target position point and the circle center point; the included angle between this direction and the vertical direction is obtained automatically, and the moving direction given by the included angle is compared with the moving speed group in the user's personal speed information held by the personal data recording unit; when the user's real-time moving direction lies between two of the pre-exercised moving directions, the average of the moving speeds of those two directions is taken and marked as the predicted speed value;
SS08: mark the user's moving direction as the transfer direction and the predicted speed value as the transfer speed;
SS09: when the circle diameter information Jq minus the circle center distance Qx is greater than or equal to Q2, where Q2 is a preset value larger than Q1, a movement signal is generated; the light driving unit is then driven to adjust the aperture position according to the transfer direction and the corresponding transfer speed until the circle center distance Qx is zero;
SS10: repeat steps SS01-SS09 to track the user's target position;
the management unit is in communication connection with the central processing unit;
the specific steps of the remainder analysis in step five of the locking operation, which yield the updated specified calculated value Zd, are as follows:
s1: re-establish the specific value of the specified calculated value Zd according to the value of Y:
when Y = 0, let Zd = 3;
otherwise, let Zd = Y;
this gives the updated specified calculated value Zd;
the specific method of obtaining the measured features in step seven of the locking operation is as follows:
s01: acquire the specified calculated value Zd; when Zd = 1, let i = 1 and obtain the specific value of X1;
s02: starting from the first feature value, count to the X1-th value and mark the corresponding feature Tj as a measured feature;
s03: when Zd = 2, let i = 2 and 3 and obtain the specific values of X2 and X3;
s04: starting from the first feature value, count to the X2-th value and mark the corresponding feature Tj as a measured feature; then count to the X3-th value and mark that feature Tj as a measured feature, obtaining two measured features;
s05: when Zd = 3, let i = 4, 5 and 6 and obtain the specific values of X4, X5 and X6;
s06: starting from the first feature value, count to the X4-th value and mark the corresponding feature Tj as a measured feature; then count to the X5-th and X6-th values and mark those features Tj as measured features, obtaining three measured features.
2. The stage light-following positioning system according to claim 1, wherein the feature extraction in step six of the locking operation comprises the following specific steps:
s1: define the facial feature as feature one T1, where the facial feature is the face information in the corresponding real-time picture information;
s2: obtain the user's coat characterization color and mark it as feature two T2; the coat characterization color is obtained as follows: acquire all colors in the coat and the area of each color; acquire the total area of the coat, divide each color's area by the total area to obtain its area ratio, and mark any color whose area ratio exceeds the preset ratio B1 as a coat characterization color;
s3: obtain the lower-garment characterization color in the same way as the coat characterization color in step S2, and mark it as feature three T3;
s4: obtain the shoe characterization color in the same way as the coat characterization color in step S2, and mark it as feature four T4;
s5: obtain the user's head-top height, i.e. the maximum height from the ground to the top of the user's head when standing, marked as top length K, and mark it correspondingly as feature five T5;
s6: obtain the feature combination Tj, j = 1..5, consisting of features one to five.
3. The stage light-following positioning system according to claim 1, wherein the circle trace analysis comprises the following specific steps:
step one: designate any point of the stage as the origin and establish a plane coordinate system covering the stage, denoted the stage coordinate system;
step two: acquire the aperture position and mark it on the stage coordinate system;
step three: acquire the aperture radius, from which the circle diameter information is obtained;
step four: acquire the center of the aperture, mark this point as the circle center point, and mark it on the stage coordinate system;
step five: track the aperture position in real time, and fuse the circle center point and circle diameter information with the stage coordinate system to form the circle fusion information.
4. The stage light-following positioning system according to claim 2, wherein the management unit is used for inputting all the preset values Q1, Q2 and B1.
5. The stage light-following positioning system according to claim 1, wherein the central processing unit is used for time-stamping the movement signal to form a light-following record and transmitting it to the storage unit for real-time storage; the central processing unit is also used for transmitting the light-following record to the display unit for real-time display.
CN202010528934.8A · Priority date 2020-06-11 · Filing date 2020-06-11 · Stage light-following positioning system · Granted as CN111639720B (Active)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010528934.8A 2020-06-11 2020-06-11 Stage light-following positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010528934.8A 2020-06-11 2020-06-11 Stage light-following positioning system

Publications (2)

Publication Number Publication Date
CN111639720A CN111639720A (en) 2020-09-08
CN111639720B (en) 2023-06-27

Family

ID=72331251

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010528934.8A Stage light-following positioning system 2020-06-11 2020-06-11 (Active; granted as CN111639720B)

Country Status (1)

Country Link
CN (1) CN111639720B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112347847A (en) * 2020-09-27 2021-02-09 浙江大丰实业股份有限公司 Automatic positioning system for stage safety monitoring
CN113993250B (en) * 2021-12-24 2022-03-15 深圳市奥新科技有限公司 Stage lighting control method, device, equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716476A (en) * 2019-11-08 2020-01-21 珠海市鸿瑞信息技术股份有限公司 Industrial control system network security situation perception system based on artificial intelligence

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103179765A (en) * 2013-04-22 2013-06-26 浙江大丰实业有限公司 Stage light effect control system
US9429398B2 (en) * 2014-05-21 2016-08-30 Universal City Studios Llc Optical tracking for controlling pyrotechnic show elements
CN105045237A (en) * 2015-07-22 2015-11-11 浙江大丰实业股份有限公司 Intelligent distributed stage data mining system
CN109661086A (en) * 2019-01-15 2019-04-19 广州黑豹演艺科技有限公司 A kind of stage follow spotlight autocontrol method and system

Also Published As

Publication number Publication date
CN111639720A (en) 2020-09-08

Similar Documents

Publication Publication Date Title
CN111639720B (en) Stage light-following positioning system
CN110017841A (en) Vision positioning method and its air navigation aid
CN109525937B (en) Positioning method of indoor positioning management system integrating multiple positioning modes
US9973275B2 (en) System and method for lighting and building occupant tracking
CN208675549U (en) A kind of fence management system
US10333620B2 (en) System and method for lighting and building occupant tracking
WO2017045467A1 (en) Position determination method, device, system and processing center
CN108234927A (en) Video frequency tracking method and system
CN206559654U (en) A kind of transformer station management system positioned based on UWB
CN107071899A (en) Real-time positioning system in a kind of quick high accuracy room
CN111083642A (en) Method for positioning personnel in treasury
CN109116298A (en) A kind of localization method, storage medium and positioning system
CN107972027B (en) Robot positioning method and device and robot
CN110068791A (en) Indoor locating system based on array antenna
CN107600113A (en) A kind of mobile device personnel are close to monitor and alarm system and method
CN110266999A (en) A kind of system and method for garden place security control
CN112037477A (en) Indoor electronic fence positioning method and system based on RFID
CN109826668B (en) Underground multi-source accurate personnel positioning system and method
CN107886543A (en) A kind of mine personnel localization method and device
CN112068567A (en) Positioning method and positioning system based on ultra-wideband and visual image
CN114493084A (en) Park emergency linkage method and system based on BIM + GIS
CN207010998U (en) Real-time positioning apparatus in a kind of quick high accuracy room
CN111856394A (en) Accurate positioning device and method based on combination of UWB and monitoring
CN114268900B (en) Indoor positioning method and system
EP4102258A1 (en) Measurement method and measurement system for nuclear power plant radiation dose distribution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant