CN112289150A - Space-time mirror array device, interaction control method thereof and storage medium - Google Patents

Space-time mirror array device, interaction control method thereof and storage medium

Info

Publication number
CN112289150A
CN112289150A (application CN202011248635.5A)
Authority
CN
China
Prior art keywords
sound
optical lenses
moving object
sound source
mirror array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011248635.5A
Other languages
Chinese (zh)
Other versions
CN112289150B (en)
Inventor
郑洵 (Zheng Xun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen EBT Environmental Art Engineering Design Co., Ltd.
Original Assignee
Shenzhen EBT Environmental Art Engineering Design Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen EBT Environmental Art Engineering Design Co., Ltd.
Priority to CN202011248635.5A
Publication of CN112289150A
Application granted
Publication of CN112289150B
Legal status: Active

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/06: Models for scientific, medical, or mathematical purposes for physics
    • G09B23/14: Models for scientific, medical, or mathematical purposes for physics, for acoustics
    • G09B23/22: Models for scientific, medical, or mathematical purposes for physics, for optics
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Algebra (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Optics & Photonics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The application relates to a space-time mirror array device, an interaction control method thereof, and a storage medium. The interaction control method comprises the following steps: acquiring the spatial position of a moving object around the plurality of optical lenses and the sound of surrounding sound sources; rotating at least one of the plurality of optical lenses according to the spatial position of the moving object so that the optical lens faces the moving object; and adjusting the lighting order and brightness of the plurality of light emitting parts according to the sound of the sound source to illuminate the plurality of optical lenses respectively. The interaction control method gives the space-time mirror array device strong interactive performance: through somatosensory interaction it presents striking phenomena, any slight action of the audience is amplified and displayed hundreds of times over by the device, and the ordinarily imperceptible connections between sound, light, space and people become easy to observe during the interaction.

Description

Space-time mirror array device, interaction control method thereof and storage medium
Technical Field
The invention relates to the technical field of multimedia teaching, and in particular to a spatio-temporal mirror array device, an interaction control method thereof, and a storage medium.
Background
Art design is an independent artistic discipline that mainly covers the making and design of industrial art, environmental design, graphic design, multimedia design and the like. It is a highly comprehensive subject involving social, cultural, economic, market, scientific and technological factors, and its aesthetic standards change as those factors change. Art design is, in essence, an expression of a designer's overall abilities, such as expressiveness, perceptiveness and imagination.
With the development of multimedia, computer and telemetry technologies, interactive teaching-instrument displays have become a common means of presenting information in everyday life. Interactive multimedia is increasingly applied in many fields to present a scene or a product at exhibitions and popular-science venues. However, most current multimedia displays are limited to one-way, lecture-style presentation on flat surfaces and therefore fail to achieve a good display effect.
Some multimedia display platforms currently on the market are too simply constructed and lack personalized, human-centred design. Although they can handle simple popular-science presentations, they have no distinctive features and struggle to attract an audience, cannot be viewed from all directions, produce poor display effects with serious occlusion, and rely on manual control. The result is a low level of intelligence, an inability to change operating state according to the situation, and wasted electricity.
Disclosure of Invention
The main technical problem addressed by the invention is how to improve the interactive performance of existing multimedia display platforms. To solve this problem, the application provides a spatio-temporal mirror array device, an interaction control method thereof, and a storage medium.
According to a first aspect, an embodiment provides a spatio-temporal mirror array device comprising: a plurality of optical lenses distributed in space in an array; a plurality of rotating mechanisms connected to the optical lenses in one-to-one correspondence and used to rotate the respective optical lenses; a position detection component for detecting the spatial position of a moving object around the plurality of optical lenses; and a controller in signal connection with the plurality of rotating mechanisms and the position detection component, the controller being used to drive at least one of the rotating mechanisms to rotate so that the connected optical lens faces the moving object.
The space-time mirror array device also comprises a plurality of light-emitting components and a sound detection component which is in signal connection with the controller; the plurality of light-emitting parts are respectively arranged on the plurality of optical lenses and used for respectively illuminating the plurality of optical lenses; the sound detection component is used for detecting the sound of the sound source around the plurality of optical lenses; the controller is in signal connection with the plurality of light emitting parts and is further configured to adjust the lighting order and brightness of the plurality of light emitting parts to adapt to sound fluctuations of the sound source.
Each optical lens is a rectangular double-sided mirror, and the height of each optical lens is greater than or equal to that of the movable object.
The plurality of rotating mechanisms are fixed on a display platform, and each rotating mechanism is connected to the corresponding optical lens through a rotating shaft extending upwards; and a servo motor is arranged in each rotating mechanism and used for driving the optical lens connected to the rotating shaft to rotate under the control action of the controller.
The position detection component is one or more depth cameras, and the depth cameras are used for collecting images around the optical lenses so as to detect the spatial position of the moving object through the images.
According to a second aspect, an embodiment provides a method for interactive control of a spatiotemporal mirror array device, comprising the steps of: acquiring the spatial positions of moving objects around the plurality of optical lenses and the sound of surrounding sound sources; rotating at least one of the plurality of optical lenses according to the spatial position of the moving object to rotate the optical lens to face the moving object; the lighting order and brightness of a plurality of light emitting parts are adjusted according to the sound of the sound source to illuminate the plurality of optical lenses, respectively.
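By way of illustration only, the three steps of the method could be organised as a single control loop as in the sketch below; the class name, the sensor interfaces (objects offering locate_moving_object() and measure()), the 3 m selection radius and the 40-90 dB brightness mapping are assumptions of this sketch, not details from the application, and the sound-direction-dependent lighting order is left to the later sketches.

```python
import math
import time

class SpatioTemporalMirrorController:
    """Hypothetical control loop: sense position and sound, aim the mirrors, drive the lights."""

    def __init__(self, lens_positions, rotators, light_strips, position_detector, sound_detector):
        self.lens_positions = lens_positions      # list of (x, y) lens positions in metres
        self.rotators = rotators                  # one rotating-mechanism driver per lens
        self.light_strips = light_strips          # one LED strip per lens
        self.position_detector = position_detector
        self.sound_detector = sound_detector

    def step(self):
        # Step S210: acquire the moving object's position and the surrounding sound.
        target = self.position_detector.locate_moving_object()   # (x, y) or None
        sound = self.sound_detector.measure()                     # (direction_deg, level_db) or None

        # Step S220: rotate the lenses nearest the moving object so they face it.
        if target is not None:
            for i, (lx, ly) in enumerate(self.lens_positions):
                if math.hypot(target[0] - lx, target[1] - ly) < 3.0:   # assumed 3 m radius
                    angle = math.degrees(math.atan2(target[1] - ly, target[0] - lx))
                    self.rotators[i].rotate_to(angle)

        # Step S230: adjust the brightness of the light strips to follow the sound level.
        if sound is not None:
            _direction_deg, level_db = sound
            brightness = min(1.0, max(0.0, (level_db - 40.0) / 50.0))
            for strip in self.light_strips:
                strip.set_brightness(brightness)

    def run(self, period_s=0.05):
        while True:
            self.step()
            time.sleep(period_s)
```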
The acquiring the spatial position and the surrounding sound of the moving object around the plurality of optical lenses comprises the following steps: acquiring images around the plurality of optical lenses, and identifying the spatial position of the moving object according to the images; and acquiring acoustic wave electric signals around the plurality of optical lenses, and determining the sound magnitude and the propagation direction of the sound source according to the acoustic wave electric signals.
Rotating at least one of the plurality of optical lenses according to the spatial position of the moving object so that the optical lens faces the moving object includes: selecting a number of optical lenses close to the moving object according to the spatial position of the moving object; and sending motor control signals respectively to the rotating mechanisms corresponding to the selected optical lenses, so that each rotating mechanism drives its connected optical lens to rotate to face the moving object.
The adjusting of the lighting order and brightness of a plurality of light emitting parts according to the sound of the sound source to illuminate the plurality of optical lenses, respectively, includes: determining the propagation direction and the sound size of the sound source according to the sound generated by the sound source; sequentially illuminating the plurality of light emitting parts in a propagation direction of the sound source; adjusting the brightness of the lighted light emitting part to adapt to the sound size of the sound source.
According to a third aspect, an embodiment provides a computer-readable storage medium comprising a program executable by a processor to implement the interactive control method described in the second aspect above.
The beneficial effects of this application are:
in the spatio-temporal mirror array device, the interaction control method thereof and the storage medium according to the above embodiments, the interaction control method comprises: acquiring the spatial position of a moving object around the plurality of optical lenses and the sound of surrounding sound sources; rotating at least one of the plurality of optical lenses according to the spatial position of the moving object so that the optical lens faces the moving object; and adjusting the lighting order and brightness of the plurality of light emitting parts according to the sound of the sound source to illuminate the plurality of optical lenses respectively. On the one hand, the claimed space-time mirror array device is highly intelligent: it realizes deep interaction between sound, light and space on one side and moving objects (such as people) on the other, uncovers the relationship between a moving object and its surrounding spatial environment in a natural way, and makes the otherwise hard-to-see connections between sound, light, space and moving objects visible. In this way the audience and the device converse with each other, the audience sees a striking scene presented jointly by sound, light and space, and the audience's interest in natural science is broadly stimulated. On the other hand, the claimed interaction control method gives the space-time mirror array device strong interactive performance. It presents striking phenomena through somatosensory interaction, builds a space out of rotating optical lenses, captures information such as surrounding sound and light, and reacts to the behaviour of nearby moving objects: sound is shown by light strips that jump with the audio, while light and space are shown through the mirror images of objects, which multiply geometrically within the mirror array before being presented to the audience. Any slight action of the audience is thus amplified and displayed hundreds of times over, and the ordinarily imperceptible connections between sound, light, space and people become easy to observe during the interaction.
Drawings
FIG. 1 is a perspective view of a spatiotemporal mirror array apparatus according to an embodiment of the present application;
FIG. 2 is a front view of a spatiotemporal mirror array arrangement;
FIG. 3 is a side view of a spatiotemporal mirror array arrangement;
FIG. 4 is a top view of a spatiotemporal mirror array arrangement;
FIG. 5 is a block diagram of a single optical lens;
FIG. 6 is a state diagram of the spatiotemporal mirror array device in an interactive situation;
FIG. 7 is a state diagram of the spatiotemporal mirror array arrangement under illumination;
FIG. 8 is a flowchart illustrating an interaction control method for the spatiotemporal mirror array apparatus according to a second embodiment of the present invention;
FIG. 9 is a flowchart illustrating an interactive control method;
FIG. 10 is a schematic structural diagram of a spatiotemporal mirror array control apparatus according to a third embodiment of the present application.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments are numbered with like associated elements. In the following description, numerous details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in detail in order to avoid obscuring the core of the present application from excessive description, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Also, the order of the various steps or actions in the method descriptions may be swapped or rearranged, as will be apparent to one of ordinary skill in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and are not intended to imply a required sequence unless otherwise indicated where such sequence must be followed.
The numbering of the components as such, e.g., "first", "second", etc., is used herein only to distinguish the objects as described, and does not have any sequential or technical meaning. The term "connected" and "coupled" when used in this application, unless otherwise indicated, includes both direct and indirect connections (couplings).
In this application, the spatio-temporal mirror array device gives tangible form to abstract space and to the interaction between people and nature. Space is a relative concept: it becomes an abstract notion only with reference to objects, and the abstract notion of an object in turn exists with reference to space. Through a special display method, the spatio-temporal mirror array then links people with the materials of the space, so that, within the experience of an illusory space, the audience can think again about and re-understand the conceptual relationship between concrete references and spatial abstraction. The technical scheme mainly combines art, science and data in the form of material space and information design, in the hope of vividly conveying spatial and physical information through the organic linkage between nature and people and of encouraging the audience to actively exercise their imagination and curiosity.
Embodiment I
Referring to fig. 1, the present embodiment discloses a spatio-temporal mirror array apparatus, which includes a plurality of optical lenses 11, a plurality of rotating mechanisms 12, a position detecting member 15 and a controller 14, which are described below.
A plurality of optical lenses 11 are distributed in space in an array to form a distribution area with several rows and several columns; when a moving object (such as a pedestrian R) passes around this distribution area, it can see the optical lenses 11 and can also interact with them.
It should be noted that each optical lens 11 may be a glass mirror, a metal plate, a reflective film or another component, as long as its surface has a smooth, optically reflective area that images objects in the environment according to the principle of mirror imaging.
The plurality of rotating mechanisms 12 are connected to the plurality of optical lenses 11 in one-to-one correspondence and are configured to rotate the respective optical lenses. Each rotating mechanism 12 can be arranged on a platform so as to connect upwards to its optical lens 11 and support it; alternatively, each rotating mechanism 12 can be arranged on a ceiling or gantry so as to connect downwards to its optical lens 11 and suspend it. Preferably, each rotating mechanism 12 is provided on a platform, which facilitates installation and commissioning.
The position detecting unit 15 is disposed in or around the distribution area of the plurality of optical lenses 11 and detects the spatial position of a moving object around the plurality of optical lenses 11. For example, as shown in fig. 1, when the pedestrian R passes around the optical lens distribution area or pauses briefly near the plurality of optical lenses 11, the position detection part 15 detects the position of the pedestrian R, that is, at which specific location around the array the pedestrian R is. It is understood that the position detecting component 15 may employ a sensing component such as an infrared sensor, an ultrasonic sensor or an image sensor to detect objects and human bodies in the space.
The controller 14 is in signal connection with the plurality of rotating mechanisms 12 and the position detecting part 15; its main function is to drive at least one of the plurality of rotating mechanisms 12 to rotate so that the connected optical lens 11 faces the moving object. It can be understood that when the number of optical lenses 11 is very large, their distribution becomes dense, and a lens far away from the moving object cannot obtain an image of it; in that case, some of the rotating mechanisms close to the moving object are driven so that the optical lenses directly in front of the moving object face it squarely and image it from the front.
In one embodiment, referring to fig. 1-4, a plurality of rotating mechanisms 12 are fixed on a display platform 13, and each rotating mechanism 12 is connected to a corresponding optical lens by an upwardly extending rotating shaft 121. Normally, each rotating mechanism 12 is at the initial position, and at this time, each optical lens 11 faces in the same direction, and the rotating angle is zero. The display platform 13 is a base platform adapted to mount a plurality of rotating mechanisms 12, and the inside of the base platform can be wired and can also be provided with components such as a controller, a driver and the like.
In a specific embodiment, referring to fig. 2, a servo motor (not shown in fig. 2) is disposed in each rotating mechanism 12, and the servo motor is used for driving the optical lens connected to the rotating shaft 121 to rotate under the control of the controller 14. In addition, the servo motors of the respective rotating mechanisms 12 are in signal connection with the controller 14 via a bus driver (not shown in fig. 2), which converts the motor control signals generated by the controller 14 into electric drive signals for the servo motors. Because of the large number of rotating mechanisms, several bus drivers can be provided, so that each bus driver is in signal connection with one or more servo motors.
It should be noted that, referring to fig. 1, since each optical lens 11 requires a different orientation to face the pedestrian R, the controller 14 calculates the rotation angle at which each optical lens 11 faces the pedestrian R, and then controls the bus driver to send an electric drive signal to the corresponding servo motor according to that angle, so that the servo motor rotates the optical lens 11 until it reaches the calculated rotation angle.
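By way of illustration only, the following is a minimal sketch of such a rotation-angle calculation for a single lens; the planar coordinate convention and the assumption that a lens at zero rotation faces along the +y axis are choices made for this sketch, not details taken from the application.

```python
import math

def rotation_angle_deg(lens_xy, pedestrian_xy, zero_heading_deg=90.0):
    """Angle the servo must turn so the mirror front faces the pedestrian.

    Assumes a lens at zero rotation faces along `zero_heading_deg`
    (90 degrees = +y axis); both conventions are illustrative assumptions.
    """
    dx = pedestrian_xy[0] - lens_xy[0]
    dy = pedestrian_xy[1] - lens_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))      # direction from lens to pedestrian
    angle = bearing - zero_heading_deg              # rotation relative to the rest pose
    return (angle + 180.0) % 360.0 - 180.0          # normalize to [-180, 180)

# Example: a lens at (1.0, 0.0) m and a pedestrian at (2.0, -1.5) m
print(rotation_angle_deg((1.0, 0.0), (2.0, -1.5)))  # about -146.3 degrees
```

The controller could then convert such an angle into the motor control signal that the bus driver turns into an electric drive signal for the corresponding servo motor.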
Of course, in order to achieve the rotating effect of fast response and accurate positioning, a high-precision and high-rotation-speed servo motor can be adopted, and even a special motion controller is configured for the servo motor.
Further, referring to fig. 2, 3 and 4, the top ends of the plurality of optical lenses 11 lie in the same plane, and within that plane the optical lenses 11 are distributed as a rectangular, square or triangular array. Preferably, as in fig. 4, the plurality of optical lenses 11 forms a rectangular array with 10 rows of 9 uniformly spaced optical lenses each, and a moving object (such as a pedestrian R) can only move around the outside of the rectangular array.
Further, referring to fig. 1 and 5, each optical lens 11 is a rectangular double-sided mirror, and the height of the optical lens 11 is greater than or equal to the height of a moving object (e.g., a pedestrian R). This is so arranged that the pedestrian R can see his image clearly and completely regardless of the angle from which the optical lens array is viewed.
Of course, referring to fig. 5, in order to achieve undistorted imaging effect, each optical lens 11 may employ a planar rectangular double-sided mirror.
In a specific embodiment, referring to fig. 1, the position detection means 15 is one or more depth cameras for capturing images around the plurality of optical lenses so as to detect the spatial position of the moving object from the images. It can be understood that a depth camera is a camera with a depth-of-field measurement function; since its imaging range is limited, providing several depth cameras (for example, 4) allows the whole periphery of the optical lens array to be captured, so that every pedestrian can be monitored while the objects in the captured images are identified. The depth camera's image-based object recognition may use existing techniques; for example, pedestrian detection can be performed with a support vector machine (SVM), an Adaboost classifier, a neural network classifier or similar means, so as to determine whether a pedestrian is present and to provide accurate positioning.
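The application does not prescribe a particular detector; purely as an illustration of the SVM-based pedestrian detection mentioned above, OpenCV's built-in HOG + linear-SVM people detector could be applied to each camera frame roughly as in the sketch below. The confidence threshold and the use of the colour stream only (ignoring depth) are assumptions of this sketch.

```python
import cv2

def detect_pedestrians(frame):
    """Return bounding boxes of pedestrians in a BGR frame using OpenCV's HOG + SVM people detector."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)
    return [tuple(box) for box, weight in zip(boxes, weights) if float(weight) > 0.5]

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # any colour stream; a depth camera's RGB stream would also work
    ok, frame = cap.read()
    if ok:
        for (x, y, w, h) in detect_pedestrians(frame):
            print(f"pedestrian at x={x}, y={y}, width={w}, height={h}")
    cap.release()
```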
In one embodiment, referring to fig. 1 and 5, the spatiotemporal mirror array device further includes a plurality of light emitting parts 17, the plurality of light emitting parts 17 being respectively provided on the plurality of optical lenses 11 for respectively illuminating the plurality of optical lenses 11. In order to achieve a better interactive experience effect, the spatio-temporal mirror array device is often installed in an indoor or night place with weak light, and at this time, the light emitting part 17 is required to assist in illuminating the optical lens 11, so that a pedestrian can see the image of the pedestrian on the optical lens.
Further, referring to fig. 5, 6 and 7, the light emitting component 17 is a strip light, and surrounds the periphery of the optical lens 11. Therefore, the lighting effect of the lamp strip on the optical lens 11 can be enhanced, and the jump lighting function of the lamp strip is further facilitated.
In one embodiment, referring to FIGS. 1 and 5, the spatiotemporal mirror array device further comprises a sound detection component 16 in signal connection with the controller 14. The sound detection component 16 is configured to detect the sound of a sound source around the plurality of optical lenses, where the source may be the pedestrian R, another pedestrian, an animal or a sound emitting device. Preferably, the sound detection component 16 employs one or more sound sensors disposed in or around the distribution area of the optical lenses, so that sound can be picked up all around the optical lens array; the sound source can then be located from the sound intensity and arrival-time differences, and the propagation direction and loudness of the sound can be determined.
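As an illustration of locating a sound source from arrival-time differences, the following sketch estimates a bearing from two microphone channels by cross-correlation; the two-microphone geometry, the channel names and the RMS-based loudness measure are assumptions made for this sketch, not details from the application.

```python
import numpy as np

def estimate_direction_deg(sig_left, sig_right, mic_distance_m, sample_rate_hz, speed_of_sound=343.0):
    """Estimate a source bearing (degrees from broadside) from two microphone signals.

    The cross-correlation peak gives the time difference of arrival (TDOA),
    which is then converted to an angle relative to the microphone pair.
    """
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_right) - 1)   # positive: sound reached the left mic later
    tdoa_s = lag / sample_rate_hz
    sin_theta = np.clip(tdoa_s * speed_of_sound / mic_distance_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

def sound_level_db(signal, ref=1.0):
    """Rough loudness estimate: RMS level in decibels relative to `ref`."""
    rms = np.sqrt(np.mean(np.square(signal)))
    return float(20.0 * np.log10(max(rms, 1e-12) / ref))
```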
In one embodiment, the controller 14 is in signal connection with the plurality of light emitting parts 17, and the controller 14 is further configured to adjust the lighting order and brightness of the plurality of light emitting parts 17 to match the sound fluctuations of the sound source. For example, after the controller 14 detects the direction and loudness of the sound source through the sound detection part 16, it turns on the light emitting parts close to the sound source first and those farther from the sound source later, and drives the light emitting parts brighter the louder the sound is. The light emitting effect of the light emitting parts 17 is thus tied to the fluctuation of the sound source, realizing a good interaction between the pedestrian R and the spatio-temporal mirror array.
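By way of illustration, a possible way to turn the detected direction and loudness into a lighting order and a brightness level is sketched below; the lens coordinates, the per-step delay and the 40-90 dB brightness mapping are assumptions for this sketch only.

```python
def lighting_schedule(lens_positions, source_xy, level_db, step_delay_s=0.05,
                      min_db=40.0, max_db=90.0):
    """Order the light strips by distance from the sound source and pick a brightness.

    Returns (lens_index, turn_on_delay_s, brightness) tuples: strips nearer the
    source light up first, and louder sound maps to higher brightness.
    """
    order = sorted(range(len(lens_positions)),
                   key=lambda i: (lens_positions[i][0] - source_xy[0]) ** 2
                                 + (lens_positions[i][1] - source_xy[1]) ** 2)
    brightness = min(1.0, max(0.0, (level_db - min_db) / (max_db - min_db)))
    return [(idx, rank * step_delay_s, brightness) for rank, idx in enumerate(order)]

# Example: a 2 x 2 array of lenses and a 70 dB source near the first lens
print(lighting_schedule([(0, 0), (1, 0), (0, 1), (1, 1)], source_xy=(0.1, 0.0), level_db=70.0))
```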
Embodiment II
On the basis of the spatio-temporal mirror array apparatus disclosed in the first embodiment, an interactive control method is disclosed in this embodiment, and the interactive control method is mainly applied to the controller 14 in fig. 1.
Referring to fig. 8, the interactive control method according to the embodiment includes steps S210 to S230, which are described below.
Step S210, the spatial positions of the moving objects around the plurality of optical lenses and the sound of the surrounding sound source are acquired. Referring to fig. 1, since the position detecting part 15 and the sound detecting part 16 are provided, the controller 14 easily acquires the spatial position of a moving object (such as a pedestrian R) around the optical lens distribution area, and the sound of a sound source (such as a pedestrian R).
In one embodiment, referring to fig. 9, step S210 may specifically include steps S211-S212, which are respectively described as follows.
In step S211, the controller acquires images around the plurality of optical lenses, and identifies a spatial position of the moving object from the images.
As shown in fig. 1, the position detection section 15 employs one or more depth cameras for capturing images around the plurality of optical lenses so as to detect the spatial position of the moving object from the images. For example, 4 depth cameras can be provided to capture the whole periphery of the optical lens array, so that every pedestrian can be monitored while the controller 14 identifies the objects in the captured images. The image-based object recognition of the depth camera may use existing techniques; for example, pedestrian detection can be performed with a support vector machine (SVM), an Adaboost classifier, a neural network classifier or similar means, so as to determine whether a pedestrian is present and to provide accurate positioning.
In step S212, the controller obtains the acoustic wave electrical signals around the plurality of optical lenses, and determines the sound level according to the acoustic wave electrical signals.
For example, as shown in fig. 1, the sound detection component 16 employs one or more sound sensors, and these sound sensors are disposed in or around the distribution area of the optical lens, so that complete sound pickup can be performed around the optical lens array; the controller 14 receives the electrical sound signals from the sound sensor, and analyzes the sound intensity and sound time difference of the sound source, thereby locating the sound source and determining the propagation direction and magnitude of the sound.
Step S220, rotating at least one optical lens of the plurality of optical lenses according to the spatial position of the active object to rotate the optical lens to face the active object. Referring to fig. 1, since a corresponding rotating mechanism 12 is provided for each optical lens 11, the controller 14 can easily control the rotating mechanism 12 to drive the connected optical lens 11 to rotate, so as to reach the rotation angle specified by the controller 14.
In one embodiment, referring to fig. 9, the step S220 may specifically include steps S221-S222, which are respectively described as follows.
In step S221, the controller selects a number of optical lenses close to the moving object according to the spatial position of the moving object.
For example, as shown in fig. 1 and fig. 6, when the pedestrian R stands on one side of the optical lens array, the controller 14 selects the first few rows of optical lenses 11 closest to the pedestrian R and controls those rows to take part in the interaction. It will be appreciated that this both reduces the number of optical lenses 11 that must be controlled for a single pedestrian and avoids affecting the interaction of other pedestrians on the remaining sides of the optical lens array.
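For illustration only, row selection of this kind might be sketched as follows; the grid representation and the number of rows selected are assumptions made for the sketch (the 10 x 9 layout mentioned earlier would give ten rows of nine positions).

```python
def select_front_rows(lens_grid, pedestrian_xy, rows_to_use=2):
    """Pick the lenses in the few rows whose centroids are closest to the pedestrian.

    `lens_grid` is a list of rows, each row a list of (x, y) lens positions.
    """
    def row_distance_sq(row):
        cx = sum(p[0] for p in row) / len(row)   # row centroid
        cy = sum(p[1] for p in row) / len(row)
        return (cx - pedestrian_xy[0]) ** 2 + (cy - pedestrian_xy[1]) ** 2

    nearest_rows = sorted(lens_grid, key=row_distance_sq)[:rows_to_use]
    return [lens for row in nearest_rows for lens in row]
```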
In step S222, the controller sends motor control signals to the rotating mechanisms corresponding to the selected optical lenses, so that each rotating mechanism drives its connected optical lens to rotate to face the moving object.
For example, referring to fig. 1 and 6, after the controller 14 selects the first few rows of optical lenses 11 close to the pedestrian R, it calculates the rotation angle at which each of these optical lenses 11 faces the pedestrian R, and then controls the bus driver to send an electric drive signal to the servo motor of the corresponding rotating mechanism 12 according to that angle, so that the servo motor rotates the optical lens 11 until it reaches the calculated angle and the front surface of the optical lens 11 faces the pedestrian R.
In step S230, the lighting order and brightness of the plurality of light emitting parts are adjusted according to the sound of the sound source to respectively illuminate the plurality of optical lenses. Referring to fig. 1 and 6, since each optical lens 11 is provided with the light emitting part 17, the controller 14 can easily perform lighting control on the light emitting parts, so as to realize different lens lighting effects.
In one embodiment, referring to fig. 9, the step S230 may specifically include steps S231-S232, which are respectively described as follows.
In step S231, the controller determines a propagation direction and a sound size of the sound source according to the sound generated by the sound source.
Referring to fig. 1, after the controller 14 obtains the sound wave electric signals from the sound detection component 16, it can use the sound intensity and the sound time difference to locate the sound source and determine the propagation direction and magnitude of the sound.
In step S232, the controller sequentially lights the plurality of light emitting parts in the propagation direction of the sound source, and the controller adjusts the brightness of the lighted light emitting parts to fit the sound magnitude of the sound source.
For example, as shown in fig. 1 and 6, after the controller 14 detects the direction and loudness of the sound source through the sound detection component 16, it turns on the light emitting components close to the sound source first and those farther from the sound source later, and the louder the sound is, the brighter it drives the light emitting components. The light emitting effect of the light emitting components 17 is thus tied to the fluctuation of the sound source, and each light emitting component 17 lights up in a jumping sequence that simulates the propagation direction and loudness of the sound source, realizing a good interaction between the pedestrian R and the spatio-temporal mirror array and finally achieving the desired lighting effect.
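Purely as a sketch, the jump-style switch-on described above could be driven from a schedule in the format of the earlier lighting sketch; the set_brightness() method on the light strips is a hypothetical interface assumed for illustration.

```python
import time

def run_jump_lighting(schedule, light_strips):
    """Play back a (lens_index, delay_s, brightness) schedule so the strips light up in sequence.

    Strips are switched on in order of increasing delay, reproducing the jump-style
    lighting that follows the sound's propagation direction.
    """
    t0 = time.monotonic()
    for idx, delay_s, brightness in sorted(schedule, key=lambda item: item[1]):
        remaining = delay_s - (time.monotonic() - t0)
        if remaining > 0:
            time.sleep(remaining)                     # stagger the switch-on moments
        light_strips[idx].set_brightness(brightness)  # louder sound -> brighter strip
```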
Embodiment III
On the basis of the interactive control method of the spatio-temporal mirror array device disclosed in the second embodiment, the present embodiment discloses a spatio-temporal mirror array control apparatus.
Referring to fig. 10, the spatiotemporal mirror array control device 3 mainly includes a memory 31 and a processor 32. The memory 31 serves as a computer-readable storage medium for storing a program, which may be the program code corresponding to steps S210-S230 of the interaction control method in Embodiment II.
The processor 32 is connected to the memory 31 and executes the program stored in the memory 31 to implement the interaction control method. The functions performed by the processor 32 can refer to the controller 14 in Embodiment II and will not be described in detail here.
Those skilled in the art will appreciate that all or part of the functions of the various methods in the above embodiments may be implemented by hardware, or may be implemented by computer programs. When all or part of the functions of the above embodiments are implemented by a computer program, the program may be stored in a computer-readable storage medium, and the storage medium may include: a read only memory, a random access memory, a magnetic disk, an optical disk, a hard disk, etc., and the program is executed by a computer to realize the above functions. For example, the program may be stored in a memory of the device, and when the program in the memory is executed by the processor, all or part of the functions described above may be implemented. In addition, when all or part of the functions in the above embodiments are implemented by a computer program, the program may be stored in a storage medium such as a server, another computer, a magnetic disk, an optical disk, a flash disk, or a removable hard disk, and may be downloaded or copied to a memory of a local device, or may be version-updated in a system of the local device, and when the program in the memory is executed by a processor, all or part of the functions in the above embodiments may be implemented.
The present invention has been described in terms of specific examples, which are provided to aid understanding of the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (10)

1. A spatiotemporal mirror array apparatus, comprising:
a plurality of optical lenses distributed in the space in an array manner;
the rotating mechanisms are connected with the optical lenses in a one-to-one correspondence mode and used for respectively rotating the optical lenses;
position detection means for detecting a spatial position of a moving object around the plurality of optical lenses;
and the controller is in signal connection with the plurality of rotating mechanisms and the position detection component and is used for driving at least one rotating mechanism in the plurality of rotating mechanisms to rotate so that the connected optical lens faces the moving object.
2. The spatiotemporal mirror array apparatus of claim 1, further comprising a plurality of light emitting components and sound detecting components;
the plurality of light-emitting parts are respectively arranged on the plurality of optical lenses and used for respectively illuminating the plurality of optical lenses;
the sound detection component is used for detecting the sound of the sound source around the plurality of optical lenses;
the controller is in signal connection with the plurality of light emitting parts and the sound detecting part, and is further configured to adjust the lighting order and brightness of the plurality of light emitting parts to adapt to sound fluctuations of the sound source.
3. The spatiotemporal mirror array apparatus according to claim 2, wherein each of the optical lenses is a rectangular double-sided mirror, and a height of the optical lens is greater than or equal to a height of the moving object.
4. The spatiotemporal mirror array apparatus of claim 1, wherein the plurality of rotation mechanisms are fixed to a display platform and each of the rotation mechanisms is connected to a corresponding one of the optical lenses by an upwardly extending rotation shaft;
and a servo motor is arranged in each rotating mechanism and used for driving the optical lens connected to the rotating shaft to rotate under the control action of the controller.
5. The spatiotemporal mirror array apparatus of claim 1, wherein the position detection means is one or more depth cameras for capturing images around the plurality of optical lenses to detect the spatial position of the moving object from the images.
6. An interactive control method of a spatiotemporal mirror array device is characterized by comprising the following steps:
acquiring the spatial positions of moving objects around the plurality of optical lenses and the sound of surrounding sound sources;
rotating at least one of the plurality of optical lenses according to the spatial position of the moving object to rotate the optical lens to face the moving object;
the lighting order and brightness of a plurality of light emitting parts are adjusted according to the sound of the sound source to illuminate the plurality of optical lenses, respectively.
7. The interactive control method of claim 6, wherein the obtaining the spatial position of the moving object around the plurality of optical lenses and the sound of the surrounding sound source comprises:
acquiring images around the plurality of optical lenses, and identifying the spatial position of the moving object according to the images;
and acquiring acoustic wave electric signals around the plurality of optical lenses, and determining the sound magnitude and the propagation direction of the sound source according to the acoustic wave electric signals.
8. The interactive control method of claim 6, wherein said rotating at least one of the plurality of optical lenses according to the spatial position of the active object to rotate the optical lens to face the active object comprises:
selecting a number of optical lenses proximate to the moving object according to the spatial position of the moving object;
and sending motor control signals respectively to the rotating mechanisms corresponding to the selected optical lenses, so that each rotating mechanism drives its connected optical lens to rotate to face the moving object.
9. The interactive control method according to claim 6, wherein the adjusting of the lighting order and the brightness of the plurality of light emitting parts according to the sound of the sound source to illuminate the plurality of optical lenses, respectively, comprises:
determining the propagation direction and the sound size of the sound source according to the sound generated by the sound source;
sequentially illuminating the plurality of light emitting parts in a propagation direction of the sound source;
adjusting the brightness of the lighted light emitting part to adapt to the sound size of the sound source.
10. A computer-readable storage medium, characterized by comprising a program executable by a processor to implement the interactive control method according to any one of claims 6 to 9.
CN202011248635.5A 2020-11-10 2020-11-10 Space-time mirror array device, interaction control method thereof and storage medium Active CN112289150B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011248635.5A CN112289150B (en) 2020-11-10 2020-11-10 Space-time mirror array device, interaction control method thereof and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011248635.5A CN112289150B (en) 2020-11-10 2020-11-10 Space-time mirror array device, interaction control method thereof and storage medium

Publications (2)

Publication Number Publication Date
CN112289150A true CN112289150A (en) 2021-01-29
CN112289150B CN112289150B (en) 2023-03-14

Family

ID=74350899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011248635.5A Active CN112289150B (en) 2020-11-10 2020-11-10 Space-time mirror array device, interaction control method thereof and storage medium

Country Status (1)

Country Link
CN (1) CN112289150B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201238326A (en) * 2011-03-04 2012-09-16 Tung-Fa Wu Real-time interactive 3D entertainment device and 3D replication
CN107654960A (en) * 2012-05-07 2018-02-02 陈家铭 Lamp light control system and method
CN203706134U (en) * 2013-11-25 2014-07-09 吉林省装饰工程设计院有限公司 360-degree panoramic display motion sensing interactive system
CN104643747A (en) * 2013-11-25 2015-05-27 西安思创达通讯科技有限责任公司 Mirror capable of automatically swerving
CN203630757U (en) * 2013-12-02 2014-06-04 上海禾木城市规划设计有限公司 Interactive magic mirror formed by splicing of multiple projectors
CN104461006A (en) * 2014-12-17 2015-03-25 卢晨华 Internet intelligent mirror based on natural user interface
CN108027645A (en) * 2016-06-07 2018-05-11 宝娜科技有限公司 Mirror face display equipment and its operating method
CN206349055U (en) * 2016-12-27 2017-07-21 上海宝瓶建筑装饰工程有限公司 A kind of Fresnel mirror principle show stand component
CN206672488U (en) * 2017-03-08 2017-11-24 上海亿品展示创意有限公司 A kind of optical principle interaction display device
US20190291647A1 (en) * 2018-03-22 2019-09-26 Simplehuman, Llc Voice-activated vanity mirror
CN110353449A (en) * 2018-03-22 2019-10-22 新璞修人有限公司 The dressing glass of acoustic control
CN108804064A (en) * 2018-05-31 2018-11-13 北京爱国小男孩科技有限公司 A kind of intelligent display system and its application process
CN209515169U (en) * 2019-04-14 2019-10-18 武汉景智云数字技术有限公司 Based on the visual wisdom exhibition room display systems of three-dimensional
CN210324104U (en) * 2019-09-30 2020-04-14 深圳市中亿睿科技有限公司 Magic mirror anthropology interaction display system
CN110639215A (en) * 2019-10-08 2020-01-03 凯奇集团有限公司 Interactive installation based on optics reflection/refraction

Also Published As

Publication number Publication date
CN112289150B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
JP6513074B2 (en) Display device
US9949346B2 (en) Candle flame simulation using a projection system
EP3238432B1 (en) Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with movable illuminated region of interest
US7605861B2 (en) Apparatus and method for performing motion capture using shutter synchronization
US9723293B1 (en) Identifying projection surfaces in augmented reality environments
US9268520B1 (en) Altering content projection
US20110211110A1 (en) A method and an interactive system for controlling lighting and/or playing back images
CN110673716B (en) Method, device, equipment and storage medium for interaction between intelligent terminal and user
US10976905B2 (en) System for rendering virtual objects and a method thereof
WO2019229789A1 (en) Trained model suggestion system, trained model suggestion method, and program
US9547162B2 (en) Interactive projection system
US10079966B2 (en) Systems and techniques for capturing images for use in determining reflectance properties of physical objects
CN112289150B (en) Space-time mirror array device, interaction control method thereof and storage medium
WO2021249938A1 (en) A control system and method of configuring a light source array
EP4162773A1 (en) A control system and method of configuring a light source array
US20160119614A1 (en) Display apparatus, display control method and computer readable recording medium recording program thereon
CN111279796A (en) Lighting apparatus
US9124786B1 (en) Projecting content onto semi-persistent displays
KR20190104177A (en) Mobile robot device and its motion control method
JP2016118816A (en) Display system, display method, and program
KR20200102732A (en) Content display apparatus assembled digital signage and smart lighting
EP3841511B1 (en) Visual modelling system and method thereof
CN115002984A (en) Miniature stage lamp system
JP5176056B2 (en) Simultaneous control system for device units, lighting control system, and home appliance control system
WO2020219471A1 (en) Digital shadow box

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant