CN109085779B - Control system and method for non-contact type mutual entertainment experience and totem pole - Google Patents

Info

Publication number: CN109085779B
Application number: CN201810868435.6A
Authority: CN (China)
Prior art keywords: module, mouth, triode, right eye, left eye
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other versions: CN109085779A (in Chinese)
Inventors: 郭鑫, 刘超, 邵鹏鹏, 吴楠
Current and original assignee: Haining Harbin Institute Of New Investment Partnership LP (the listed assignees may be inaccurate; Google has not performed a legal analysis)
History: application filed by Haining Harbin Institute Of New Investment Partnership LP with priority to CN201810868435.6A; publication of CN109085779A; application granted; publication of CN109085779B; expired due to fee non-payment

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control using digital processors
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2664: Audio light, animation, stage, theatre light

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control system and a control method for a non-contact interactive entertainment experience, and a totem pole using them. The control system comprises a human-computer interaction server, a posture recognition module, an audio processing circuit, a loudspeaker, a motion control CPU, a lamp set, an upper left eye module, an upper right eye module, a lower left eye module, a lower right eye module, a mouth driver module and a mouth module. The lamp set comprises a lamp control module; the upper left eye module comprises an upper left eye steering unit, the upper right eye module an upper right eye steering unit, the lower left eye module a lower left eye steering unit and the lower right eye module a lower right eye steering unit; the mouth module comprises a mouth servo motor. The system overcomes shortcomings of conventional totem poles such as insufficient attraction for experience clients and the inability to satisfy their curiosity; it enables targeted audio explanation and propagation of totem culture, and avoids the large floor area occupied by peripheral items such as display boards.

Description

Control system and method for non-contact type mutual entertainment experience and totem pole
Technical Field
The invention relates to the technical field of robots, in particular to a control system and method for non-contact type mutual entertainment experience and a totem pole.
Background
The totem pole is a symbol of the ancient culture of the Indigenous peoples of the North American west coast. The cultural ecology of these peoples nurtured a totem pole culture with unique characteristics. Each totem pole pattern has a designated symbolic meaning, chiefly human figures, animal figures and figures of material objects; because the meanings of the patterns differed between tribes and shifted with the passage of time, totem poles are roughly divided into five types: house poles, mortuary poles, memorial poles, shame (ridicule) poles and welcome poles. Most are carved from red cedar, with no strict limits on thickness or length, though they are typically tall and occupy a large area. Because the wooden structure is exposed to the natural environment for long periods, a totem pole cannot be preserved indefinitely, generally no more than 200 years.
At present, the totem poles in totem-themed parks are protected relics or objects of cultural value, and touching them is generally not allowed. Their main structure is wooden and unpowered, so they offer only ornamental value to experience clients and lack interactivity.
Disclosure of Invention
In order to solve the technical problems, the invention provides the following scheme:
in a first aspect, the present invention provides a control system for a contactless interactive entertainment experience, comprising:
the control system comprises a human-computer interaction server, a posture recognition module, an audio processing circuit, a loudspeaker, a motion control CPU, a lamp set, an upper left eye module, an upper right eye module, a lower left eye module, a lower right eye module, a mouth driver module and a mouth module; the lamp set comprises a lamp control module, the upper left eye module comprises an upper left eye steering unit, the upper right eye module comprises an upper right eye steering unit, the lower left eye module comprises a lower left eye steering unit, the lower right eye module comprises a lower right eye steering unit, and the mouth module comprises a mouth servo motor;
the human-computer interaction server is connected with the posture recognition module, the audio processing circuit and the motion control CPU respectively; it receives experience client action data from the posture recognition module, processes the data and sends the result to the motion control CPU and the audio processing circuit;
the motion control CPU is connected with the lamp control module, the mouth driver module, the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit; the mouth driver module is connected with the mouth servo motor and controls it, so that the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module dynamically feed back the actions of the experience client;
the audio processing circuit is connected with the loudspeaker to dynamically feed back the experience client's sound;
the lamp set is used for presenting a dazzling colored-light effect.
Further, a state acquisition module is arranged between the human-computer interaction server and the mouth servo motor; it acquires the state of the mouth servo motor and feeds it back to the human-computer interaction server, so that the server can perform feedback control of the mouth driver module according to the signal fed back by the state acquisition module.
Further, the state acquisition module is a photoelectric switch module. The mouth servo motor rests at a middle (dead point) position until it receives a control signal from the mouth driver module; after the commanded motion is completed, the servo automatically returns to this dead point position, which the photoelectric switch module detects.
Further, that the human-computer interaction server receives experience client action data from the posture recognition module, processes it and sends it to the motion control CPU and the audio processing circuit comprises:
the posture recognition module receives static image or dynamic image data of an external experience client within a designated range;
the posture recognition module converts the static image or dynamic image data into static posture commands and dynamic posture commands and transmits them to the human-computer interaction server;
the human-computer interaction server handles the communication and interfaces among the modules and the various static and dynamic posture commands, compares and fits them with the data of a software database, retrieves the corresponding data instructions from the database, and sends them to the audio processing circuit and the motion control CPU.
Further, the motion control CPU adopts an STM32F103VET6 single chip microcomputer, and the lamp set comprises at least one LED. The lamp control module comprises a triode Q1, a MOS transistor Q2, a triode Q3, a MOS transistor Q4, a triode Q5, a triode Q6 and a triode Q7, wherein the base of the triode Q7 is connected with a PWM signal output terminal of the single chip microcomputer, the collector of the triode Q7 is connected with the power input terminal of the single chip microcomputer through a diode D1, and the emitter of the triode Q7 is connected with the bases of the triodes Q1 and Q3. The collector of the triode Q1 is connected with the source of the MOS transistor Q2, and the emitter of the triode Q1 is connected with the gate of the MOS transistor Q2; the drain of the MOS transistor Q2 is connected with the first end of an inductor L1, the second end of the inductor L1 is connected with the drain of the MOS transistor Q4, and the source of the MOS transistor Q4 is connected with the collector of the triode Q3 and the power input. The emitter of the triode Q3 is connected with the gate of the MOS transistor Q4; the second end of the inductor L1 is connected with the power return through a diode D6, and the first end of the inductor L1 is connected with the first end of a resistor R3 through a diode D3. The second end of the resistor R3 serves as the positive power input end of the lamp set and is connected with the collector of the triode Q5 through a resistor R8; the emitter of the triode Q5 is connected with the first end of a resistor R12, and the second end of the resistor R12 serves as the negative power input end of the lamp set. The base of the triode Q5 is connected with the base of the triode Q6 and with the power input end of the single chip microcomputer; the collector of the triode Q6 is connected with the first end of the resistor R3 through a resistor R9 and with the positive input end of an operational amplifier OPA; the emitter of the triode Q6 is connected with the reference electrode of a voltage regulator D5 and, through a resistor R13, with the negative power input end of the lamp set and the cathode of the regulator D5; the anode of the regulator D5 is connected with the base of the triode Q6. The output end of the operational amplifier OPA is connected with the clock signal end of the single chip microcomputer and, through a resistor R14, with the inverting input end of the OPA.
Furthermore, a steering engine control circuit is arranged between the motion control CPU and each of the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit. The steering engine control circuit comprises an optocoupler OC1: the input end of the light-emitting diode in OC1 is connected to a 3.3 V supply through a resistor R16, the output end of the light-emitting diode is connected to a PWM signal output pin of the single chip microcomputer, the input end of the photoreceiver in OC1 is connected to the rotation angle control signal input of the steering engine, and the output end of the photoreceiver is grounded.
Further, each of the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit comprises at least one 180° HS-645MG steering engine module.
Further, the mouth driver module circuit includes an optocoupler OC2 and an optocoupler OC3. The input end of the light-emitting diode in OC2 is connected to a 3.3 V supply through a resistor R20, the output end of the light-emitting diode in OC2 is connected to a pulse output pin of the single chip microcomputer, the input end of the photoreceiver in OC2 is connected to the pulse input of the mouth driver module, and the output end of the photoreceiver is grounded. The input end of the light-emitting diode in OC3 is connected to 3.3 V through a resistor R22, the output end of the light-emitting diode in OC3 is connected to the direction output pin of the single chip microcomputer, the input end of the photoreceiver in OC3 is connected to the direction input of the mouth driver module, and the output end of the photoreceiver is grounded.
In a second aspect, the present invention provides a control method for the above control system for a non-contact interactive entertainment experience, comprising:
SS1: the posture recognition module receives static image or dynamic image data of an external experience client within a designated range;
SS2: the posture recognition module converts the static image or dynamic image data into static posture commands and dynamic posture commands and transmits them to the human-computer interaction server;
SS3: the human-computer interaction server handles the communication and interfaces among the modules and the various static and dynamic posture commands, compares and fits them with the data of the software database, retrieves the corresponding data instructions from the database, and sends them to the audio processing circuit and the motion control CPU;
SS4: the motion control CPU receives and parses the data instructions and sends control signals to the lamp control module, the mouth driver module, the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit;
SS5: the mouth driver module, the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit respectively control the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module to dynamically feed back the actions of the experience client;
SS6: the audio processing circuit dynamically feeds back the experience client's sound through the loudspeaker, and the lamp control module drives the lamp set to present a colorful light effect;
SS7: the state acquisition module collects the state information of the mouth servo motor and feeds it back to the human-computer interaction server, which adjusts its control commands accordingly; the mouth servo motor rests at the middle position until it receives a control signal from the mouth driver module, after which it automatically returns to the dead point position monitored by the photoelectric switch module.
In a third aspect, the present invention provides a totem pole comprising the above control system for a non-contact interactive entertainment experience and operating according to the above control method.
The control system for a non-contact interactive entertainment experience overcomes shortcomings such as insufficient attraction for experience clients and the inability to satisfy their curiosity; it enables targeted audio explanation and propagation of totem culture, and avoids the large floor area occupied by peripheral items such as display boards.
Drawings
Fig. 1 illustrates a block diagram of a control system for a contactless interactive entertainment experience according to the present invention.
Fig. 2 shows a circuit configuration diagram of a light control module of the present invention.
Fig. 3 shows a steering engine control circuit configuration diagram of the present invention.
FIG. 4 shows a diagram of an RS232 communication circuit according to the present invention.
Fig. 5 shows an audio processing circuit configuration diagram of the present invention.
Fig. 6 shows a circuit configuration diagram of the mouth driver module of the present invention.
Detailed Description
The technical scheme of the invention is further explained by combining the drawings and the specific embodiments in the specification. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
The embodiment of the application provides a control system for a non-contact interactive entertainment experience. The system principle is as follows: an experience client makes a designated action or posture according to instructions; the action or posture signal is received by a KINECT (posture recognition) signal receiving device and sent to the server, where the software analyzes and recognizes it; according to the corresponding analysis content, signals are output to the loudspeaker and to the motion control CPU. The loudspeaker directly outputs the audio information, while the motion control CPU drives the upper and lower eye action feedback groups and sends a signal to the mouth driver module; the mouth driver module in turn drives the mouth servo motor to realize mouth action feedback. At the same time, the motion control CPU drives the lamp control module so that the lamp set emits colorful light.
Fig. 1 illustrates a block diagram of a control system for a contactless interactive entertainment experience according to the present invention.
As shown in fig. 1, a control system for a non-contact interactive entertainment experience comprises: a human-computer interaction server, a posture recognition module, an audio processing circuit, a loudspeaker, a motion control CPU, a lamp set, an upper left eye module, an upper right eye module, a lower left eye module, a lower right eye module, a mouth driver module and a mouth module, wherein the lamp set comprises a lamp control module;
the human-computer interaction server is connected with the posture recognition module, the audio processing circuit and the motion control CPU respectively; it receives experience client action data from the posture recognition module, processes the data and sends the result to the motion control CPU and the audio processing circuit;
the motion control CPU is connected with the lamp control module, the mouth driver module, the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit; the mouth driver module is connected with the mouth servo motor and controls it, so that the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module dynamically feed back the actions of the experience client;
the audio processing circuit is connected with the loudspeaker to dynamically feed back the experience client's sound;
the lamp set is used for presenting a dazzling colored-light effect.
According to an embodiment of the invention, a state acquisition module is further arranged between the human-computer interaction server and the mouth servo motor; it acquires the state of the mouth servo motor and feeds it back to the human-computer interaction server, so that the server performs feedback control of the mouth driver module according to the signal fed back by the state acquisition module.
According to an embodiment of the present invention, the state acquisition module is a photoelectric switch module. The mouth servo motor rests at a middle (dead point) position until it receives a control signal from the mouth driver module; after the commanded motion is completed, the servo automatically returns to this dead point position, which the photoelectric switch module detects.
According to an embodiment of the present invention, that the human-computer interaction server receives experience client action data from the posture recognition module, processes it and sends it to the motion control CPU and the audio processing circuit comprises:
the posture recognition module receives static image or dynamic image data of an external experience client within a designated range;
the posture recognition module converts the static image or dynamic image data into static posture commands and dynamic posture commands and transmits them to the human-computer interaction server;
the human-computer interaction server handles the communication and interfaces among the modules and the various static and dynamic posture commands, compares and fits them with the data of a software database, retrieves the corresponding data instructions from the database, and sends them to the audio processing circuit and the motion control CPU.
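The posture-command matching and dispatch described above can be sketched as follows. This is a minimal illustration only: the posture names, database contents and instruction format are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the server-side lookup: a recognized posture command
# is matched against a software database, and the corresponding data
# instructions are returned for the audio processing circuit and the motion
# control CPU. All names and entries here are illustrative assumptions.

GESTURE_DATABASE = {
    # posture command -> (audio instruction, motion instruction)
    "wave_hand":  ("greeting.wav", "blink_eyes"),
    "raise_arms": ("legend.wav",   "open_mouth"),
    "bow":        ("farewell.wav", "close_eyes"),
}

def dispatch(posture_command):
    """Compare a posture command with the database; return the instructions
    for the audio circuit and motion CPU, or None if there is no match."""
    entry = GESTURE_DATABASE.get(posture_command)
    if entry is None:
        return None  # unrecognized posture: no feedback is triggered
    audio_instruction, motion_instruction = entry
    return {"audio": audio_instruction, "motion": motion_instruction}
```

A real implementation would also have to fit noisy KINECT skeleton data to these commands; here the matching is reduced to an exact dictionary lookup.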
According to one embodiment of the invention, the motion control CPU adopts an STM32F103VET6 single chip microcomputer, and the lamp group comprises at least one LED.
Fig. 2 shows a circuit configuration diagram of a light control module of the present invention.
As shown in fig. 2, the lamp control module comprises a triode Q1, a MOS transistor Q2, a triode Q3, a MOS transistor Q4, a triode Q5, a triode Q6 and a triode Q7, wherein the base of the triode Q7 is connected with a PWM signal output terminal of the single chip microcomputer, the collector of the triode Q7 is connected with the power input terminal of the single chip microcomputer through a diode D1, and the emitter of the triode Q7 is connected with the bases of the triodes Q1 and Q3. The collector of the triode Q1 is connected with the source of the MOS transistor Q2, and the emitter of the triode Q1 is connected with the gate of the MOS transistor Q2; the drain of the MOS transistor Q2 is connected with the first end of an inductor L1, the second end of the inductor L1 is connected with the drain of the MOS transistor Q4, and the source of the MOS transistor Q4 is connected with the collector of the triode Q3 and the power input. The emitter of the triode Q3 is connected with the gate of the MOS transistor Q4; the second end of the inductor L1 is connected with the power return through a diode D6, and the first end of the inductor L1 is connected with the first end of a resistor R3 through a diode D3. The second end of the resistor R3 serves as the positive power input end of the lamp set and is connected with the collector of the triode Q5 through a resistor R8; the emitter of the triode Q5 is connected with the first end of a resistor R12, and the second end of the resistor R12 serves as the negative power input end of the lamp set. The base of the triode Q5 is connected with the base of the triode Q6 and with the power input end of the single chip microcomputer; the collector of the triode Q6 is connected with the first end of the resistor R3 through a resistor R9 and with the positive input end of an operational amplifier OPA; the emitter of the triode Q6 is connected with the reference electrode of a voltage regulator D5 and, through a resistor R13, with the negative power input end of the lamp set and the cathode of the regulator D5; the anode of the regulator D5 is connected with the base of the triode Q6. The output end of the operational amplifier OPA is connected with the clock signal end of the single chip microcomputer and, through a resistor R14, with the inverting input end of the OPA.
When the PWM signal is high, the triode Q7 is turned on and the gates of the MOS transistors Q2 and Q4 are both high, so Q2 and Q4 conduct; the input power supply charges the inductor L1, whose current increases linearly as its stored energy grows, the current path being Vin, the MOS transistor Q4, the inductor L1 and the MOS transistor Q2, and back. When the PWM signal is low, the triode Q7 is turned off and the gates of Q2 and Q4 are both low, so Q2 and Q4 are cut off; the inductor L1 releases its energy and supplies the lamp set, the current path being the diode D6, the inductor L1, the diode D3 and the lamp set, and back.
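If this stage is idealized as a buck-boost converter (the inductor L1 stores energy from the input while the PWM signal is high and releases it into the lamp set while it is low), volt-second balance gives the steady-state lamp supply voltage. The sketch below is that idealization only; it ignores diode drops and switching losses and is not a claim of the patent.

```python
def buckboost_vout(vin, duty):
    """Ideal buck-boost steady-state output voltage.

    Volt-second balance over one PWM period: vin * duty = vout * (1 - duty),
    so vout = vin * duty / (1 - duty). Losses are ignored.
    """
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    return vin * duty / (1.0 - duty)
```

At a 50% duty cycle the lamp voltage equals the input voltage; below that it is stepped down, above it stepped up, which is how the PWM duty cycle can dim or brighten the LED string.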
Fig. 3 shows a steering engine control circuit configuration diagram of the present invention.
As shown in fig. 3, a steering engine control circuit is arranged between the motion control CPU and each of the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit. The steering engine control circuit comprises an optocoupler OC1: the input end of the light-emitting diode in OC1 is connected to a 3.3 V supply through a resistor R16, the output end of the light-emitting diode is connected to a PWM signal output pin of the single chip microcomputer, the input end of the photoreceiver in OC1 is connected to the rotation angle control signal input of the steering engine, and the output end of the photoreceiver is grounded.
The optocoupler OC1 adopts a TLP521 type optocoupler.
According to an embodiment of the present invention, each of the upper left eye steering unit, the upper right eye steering unit, the lower left eye steering unit and the lower right eye steering unit comprises at least one 180° HS-645MG steering engine module.
The single chip microcomputer's control program outputs a square wave with a period of 20 ms; by changing the duty cycle of the waveform, the high-level time is varied between 0.5 ms and 2.5 ms, i.e. a PWM wave. This signal drives the light-emitting diode in the optocoupler through the current-limiting resistor R16, and hence the transistor on the amplifying side of the optocoupler; the optocoupler OC1 isolates the signal and converts it into a 5 V steering engine control signal through the current-limiting resistor R17, which controls the rotation angle of the steering engine.
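The 20 ms period and the 0.5 ms to 2.5 ms high-time range map linearly onto the 0° to 180° travel of the steering engine; the linearity itself is the usual hobby-servo convention, assumed here rather than stated in the patent. A minimal sketch of the mapping:

```python
PERIOD_MS = 20.0    # PWM period output by the single chip microcomputer
PULSE_MIN_MS = 0.5  # high time commanding 0 degrees
PULSE_MAX_MS = 2.5  # high time commanding 180 degrees

def angle_to_pulse_ms(angle_deg):
    """Map a steering engine angle in [0, 180] degrees to the PWM high time."""
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("angle out of range for a 180-degree steering engine")
    return PULSE_MIN_MS + (PULSE_MAX_MS - PULSE_MIN_MS) * angle_deg / 180.0

def duty_cycle(angle_deg):
    """The corresponding duty cycle of the 20 ms PWM waveform."""
    return angle_to_pulse_ms(angle_deg) / PERIOD_MS
```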
FIG. 4 shows a diagram of an RS232 communication circuit according to the present invention.
A MAX3232CE transceiver chip converts the level of the computer communication port into the level signal used by the single chip microcomputer, realizing communication between the server and the control board.
Fig. 5 shows an audio processing circuit configuration diagram of the present invention.
As shown in fig. 5, the audio processing circuit includes a power amplifier chip. The non-inverting input of the power amplifier chip is connected to the audio acquisition signal input through a variable resistor R19 and a capacitor C19, and the Vout output of the power amplifier chip serves as the audio signal output and is connected to the loudspeaker.
The power amplifier chip adopts an LM386 chip.
The audio acquisition signal port collects the audio signal, which the LM386 audio power amplifier amplifies to an amplitude the single chip microcomputer can recognize; the single chip microcomputer samples this amplitude with its AD converter and converts it into a mouth motion amplitude, so that the mouth moves in step with the audio.
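The amplitude-to-mouth mapping can be sketched as below. The 12-bit full scale matches the ADC of the STM32F103 named earlier; the noise floor and maximum mouth angle are illustrative assumptions, not values from the patent.

```python
ADC_FULL_SCALE = 4095  # 12-bit ADC of the STM32F103 single chip microcomputer
NOISE_FLOOR = 100      # assumed: readings at or below this leave the mouth closed
MOUTH_MAX_DEG = 30.0   # assumed maximum mouth opening angle

def mouth_angle_from_adc(adc_value):
    """Convert an AD-sampled audio amplitude into a mouth opening angle,
    so the mouth motion tracks the loudness of the audio being played."""
    if adc_value <= NOISE_FLOOR:
        return 0.0
    span = ADC_FULL_SCALE - NOISE_FLOOR
    return MOUTH_MAX_DEG * (adc_value - NOISE_FLOOR) / span
```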
Fig. 6 shows a circuit configuration diagram of the mouth driver module of the present invention.
As shown in fig. 6, the mouth driver module circuit includes an optocoupler OC2 and an optocoupler OC3. The input end of the light-emitting diode in OC2 is connected to a 3.3 V supply through a resistor R20, the output end of the light-emitting diode in OC2 is connected to a pulse output pin of the single chip microcomputer, the input end of the photoreceiver in OC2 is connected to the pulse input of the mouth driver module, and the output end of the photoreceiver is grounded. The input end of the light-emitting diode in OC3 is connected to 3.3 V through a resistor R22, the output end of the light-emitting diode in OC3 is connected to the direction output pin of the single chip microcomputer, the input end of the photoreceiver in OC3 is connected to the direction input of the mouth driver module, and the output end of the photoreceiver is grounded.
The single chip microcomputer outputs pulses; through the current-limiting resistor R20, the optocoupler converts them into 5 V level signals usable by the servo motor, realizing the positioning function of the servo motor. The direction control pin of the single chip microcomputer switches between high and low levels to control the rotation direction of the servo motor.
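The pulse-plus-direction interface amounts to computing a direction level and a pulse count for each commanded mouth movement. In this sketch the steps-per-degree resolution and the meaning of the high level are illustrative assumptions:

```python
STEPS_PER_DEGREE = 10  # assumed resolution of the mouth servo drive train

def move_command(current_deg, target_deg):
    """Return (direction_high, pulse_count): the level to put on the direction
    pin and the number of pulses to emit on the pulse pin."""
    delta = target_deg - current_deg
    direction_high = delta >= 0  # assumed: high level opens the mouth wider
    pulses = round(abs(delta) * STEPS_PER_DEGREE)
    return direction_high, pulses
```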
In addition, a control method for the above control system for a non-contact interactive entertainment experience is also provided, comprising the following steps:
SS1: the gesture recognition module receives static image or dynamic image data of an external experience client within a designated range;
SS2: the gesture recognition module converts the static image or dynamic image data into static gesture commands and dynamic gesture commands and transmits them to the human-computer interaction server;
SS3: the human-computer interaction server handles communication and interfaces among the modules as well as the various static and dynamic gesture commands, compares and fits them with the data of the software database, calls the corresponding data instructions in the database and sends them to the audio processing circuit and the motion control CPU;
SS4: the motion control CPU receives and parses the data instructions and sends control signals to the light control module, the mouth driver module, the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit;
SS5: the mouth driver module, the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit respectively control the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module to dynamically feed back the actions of the experience client;
SS6: the audio processing circuit dynamically feeds back the experience client's sound through the sound device, and the light control module controls the lamp group to present a colorful light effect;
SS7: the state acquisition module acquires state information of the mouth servo motor and feeds it back to the human-computer interaction server, and the human-computer interaction server adjusts its control commands according to this state information; when the mouth servo motor has stopped in the middle position before receiving a mouth driver module control signal, the photoelectric switch module automatically restores it to the stop position once the control signal is received.
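The compare-and-dispatch flow of steps SS2–SS4 can be sketched as a table lookup. The gesture names and instruction ids below are invented for illustration; the patent does not list concrete gestures or database contents.

```c
#include <string.h>

/* Illustrative sketch of SS3: a recognized gesture command is compared
 * ("fitted") against a small database, and the matching data instruction
 * (an audio clip id for the audio processing circuit plus a motion
 * pattern id for the motion control CPU) is returned. All entries are
 * hypothetical examples. */
struct data_instruction {
    const char *gesture;   /* static or dynamic gesture command */
    int audio_id;          /* clip for the audio processing circuit */
    int motion_id;         /* pattern for the motion control CPU */
};

static const struct data_instruction db[] = {
    { "wave", 1, 10 },     /* e.g. mouth opens, eyes blink */
    { "bow",  2, 20 },
    { "clap", 3, 30 },
};

/* Returns the matched instruction, or a null pointer when no database
 * entry fits the incoming command. */
static const struct data_instruction *lookup(const char *gesture)
{
    for (size_t i = 0; i < sizeof db / sizeof db[0]; i++)
        if (strcmp(db[i].gesture, gesture) == 0)
            return &db[i];
    return 0;
}
```

In the full system, the returned ids would be forwarded over the interfaces that the human-computer interaction server manages: the audio id to the audio processing circuit and the motion id to the motion control CPU (SS4).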
In addition, the invention also provides a totem pole which adopts the above control system for the non-contact interactive entertainment experience and operates according to the above control method.
The totem pole analyzes and processes the static image or dynamic image data of external experience clients within a designated range and sends the result to the server software end; the corresponding data instructions in the database are compared, called, and sent to the audio output module and the motion control CPU module. The corresponding audio data is output directly, while the motion control CPU module drives several groups of steering engines and servo motors to perform the corresponding movements.
The scheme of the invention adds an interactive entertainment function: an experience client can make the totem pole dynamically feed back and produce sound through designated actions or postures. This raises the play value for the experience client, lets the client relax and participate in the interactive experience, increases enjoyment, and adds positive effects such as learning about the related culture. In addition, the totem pole can be stored for a long time, is not damaged when an experience client touches it, can be used both indoors and outdoors, and occupies a small area.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Program instructions which invoke the methods of the present application may be stored on a fixed or removable recording medium and/or transmitted via a data stream on a broadcast or other signal-bearing medium and/or stored within a working memory of a computer device operating in accordance with the program instructions. An embodiment according to the present application comprises a device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the device to perform a method and/or a solution according to the aforementioned embodiments of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of devices recited in the device claims may also be implemented by one device through software or hardware.

Claims (9)

1. A control system for a contactless interactive entertainment experience, comprising:
a human-computer interaction server, a gesture recognition module, an audio processing circuit, a sound device, a motion control CPU, a lamp group, an upper left eye module, an upper right eye module, a lower left eye module, a lower right eye module, a mouth driver module and a mouth module, wherein the lamp group comprises a light control module;
the human-computer interaction server is respectively connected with the gesture recognition module, the audio processing circuit and the motion control CPU, receives experience client action data according to the gesture recognition module, processes and sends the experience client action data to the motion control CPU and the audio processing circuit;
the motion control CPU is connected with the light control module, the mouth driver module, the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit, and the mouth driver module is connected with the mouth servo motor and is used for controlling the mouth servo motor, so as to control the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module to dynamically feed back the actions of experience clients;
the audio processing circuit is connected with the sound device to dynamically feed back the sound of an experience client;
the lamp group is used for presenting a colorful light effect;
the motion control CPU adopts an STM32F103VET6 single chip microcomputer, the lamp group comprises at least one LED, and the light control module comprises a triode Q1, an MOS tube Q2, a triode Q3, an MOS tube Q4, a triode Q5, a triode Q6 and a triode Q7, wherein the base of the triode Q7 is connected with the PWM signal output end of the single chip microcomputer, the collector of the triode Q7 is connected with the power input end of the single chip microcomputer through a diode D1, the emitter of the triode Q7 is connected with the base of the triode Q3, the collector of the triode Q3 is connected with the source of the MOS tube Q3, the emitter of the triode Q3 is connected with the gate of the MOS tube Q3, the drain of the MOS tube Q3 is connected with the first end of an inductor L3, the second end of the inductor L3 is connected with the drain of the MOS tube Q3, the source of the MOS tube Q3 is connected with the collector of the triode Q3 and the power input end, the emitter of the triode Q3 is connected with the gate of the triode Q3 through a diode D3, the first end of an inductor L1 is connected with the first end of a resistor R3 through a diode D3, the second end of the resistor R3 is used as the positive power input end of the lamp group and is connected with the collector of a triode Q5 through a resistor R8, the emitter of the triode Q5 is connected with the first end of a resistor R12, the second end of the resistor R12 is used as the negative power input end of the lamp group, the base of the triode Q5 is connected with the base of a triode Q6 and the power input end of the single chip microcomputer, the collector of the triode Q6 is connected with the first end of the resistor R3 and the positive input end of an operational amplifier OPA through a resistor R9, the emitter of the triode Q6 is connected with the reference electrode of a regulator D5 and is connected with the negative power input end of the lamp group and the cathode of the regulator D5 through a resistor R13, the anode of the regulator D5 is connected with the base of the triode Q6, and the output end of the operational amplifier OPA is connected with the inverting input end of the operational amplifier OPA through a resistor R14.
2. The control system for non-contact type interactive entertainment experience according to claim 1, further comprising a state acquisition module between the human-computer interaction server and the mouth servo motor for acquiring the state of the mouth servo motor and feeding back the state to the human-computer interaction server, so that the human-computer interaction server performs feedback control on the mouth driver module according to a signal fed back by the state acquisition module.
3. The control system for a non-contact type interactive entertainment experience of claim 2, wherein the state acquisition module is a photoelectric switch module, and when the mouth servo motor stops at a middle position before receiving the mouth driver module control signal, the photoelectric switch module automatically returns to a stop position after the mouth servo motor receives the mouth driver module control signal.
4. The control system for a non-contact type mutual entertainment experience according to claim 1, wherein the human-computer interaction server receives experience client action data according to a gesture recognition module, processes and sends the experience client action data to the motion control CPU and the audio processing circuit, and comprises:
the gesture recognition module receives static image or dynamic image data of an external experience client in a specified range;
the gesture recognition module converts the static image or dynamic image data into a static gesture command and a dynamic gesture command and transmits the static gesture command and the dynamic gesture command to the man-machine interaction server;
the human-computer interaction server handles communication and interfaces among the modules as well as the various static gesture commands and dynamic gesture commands, compares and fits them with the data of the software database, calls the corresponding data instructions in the database and sends them to the audio processing circuit and the motion control CPU.
5. The control system for non-contact type interactive entertainment experience according to claim 1, wherein a steering engine control circuit is arranged between the motion control CPU and each of the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit, the steering engine control circuit comprises an optical coupler OC1, an input end of a light emitting tube in the optical coupler OC1 is connected with 3.3V voltage through a resistor R16, an output end of the light emitting tube in the optical coupler OC1 is connected with a single chip microcomputer PWM signal output end, an input end of a light receiver in the optical coupler OC1 is connected with a rotation angle control signal input end of the steering engine control circuit, and an output end of the light receiver is grounded.
6. The control system for a contactless interactive entertainment experience of claim 5 wherein each of the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit, the lower right eye rudder unit includes at least one 180 ° HS-645MG steering engine module.
7. The control system for a non-contact type interactive entertainment experience according to claim 5, wherein the mouth driver module comprises an optical coupler OC2 and an optical coupler OC3, an input end of a light emitting tube in the optical coupler OC2 is connected with 3.3V voltage through a resistor R20, an output end of the light emitting tube in the optical coupler OC2 is connected with a pulse output pin of the single chip microcomputer, an input end of a light receiver in the optical coupler OC2 is connected with a pulse input end of the mouth driver module, an output end of the light receiver is grounded, an input end of a light emitting tube in the optical coupler OC3 is connected with 3.3V voltage through a resistor R22, an output end of the light emitting tube in the optical coupler OC3 is connected with a direction control pin of the single chip microcomputer, an input end of a light receiver in the optical coupler OC3 is connected with a direction signal input end of the mouth driver module, and an output end of the light receiver is grounded.
8. A control method for a control system for a contactless interactive entertainment experience according to any of the claims 1-7, characterized in that it comprises:
SS1: the gesture recognition module receives static image or dynamic image data of an external experience client within a designated range;
SS2: the gesture recognition module converts the static image or dynamic image data into static gesture commands and dynamic gesture commands and transmits them to the human-computer interaction server;
SS3: the human-computer interaction server handles communication and interfaces among the modules as well as the various static and dynamic gesture commands, compares and fits them with the data of the software database, calls the corresponding data instructions in the database and sends them to the audio processing circuit and the motion control CPU;
SS4: the motion control CPU receives and parses the data instructions and sends control signals to the light control module, the mouth driver module, the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit;
SS5: the mouth driver module, the upper left eye rudder unit, the upper right eye rudder unit, the lower left eye rudder unit and the lower right eye rudder unit respectively control the mouth module, the upper left eye module, the upper right eye module, the lower left eye module and the lower right eye module to dynamically feed back the actions of the experience client;
SS6: the audio processing circuit dynamically feeds back the experience client's sound through the sound device, and the light control module controls the lamp group to present a colorful light effect;
SS7: the state acquisition module acquires state information of the mouth servo motor and feeds it back to the human-computer interaction server, and the human-computer interaction server adjusts its control commands according to this state information; when the mouth servo motor has stopped in the middle position before receiving a mouth driver module control signal, the photoelectric switch module automatically restores it to the stop position once the control signal is received.
9. A totem pole comprising a control system for a contactless interactive entertainment experience according to any of claims 1 to 7 and operating with the control method of claim 8.
CN201810868435.6A 2018-08-02 2018-08-02 Control system and method for non-contact type mutual entertainment experience and totem pole Expired - Fee Related CN109085779B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810868435.6A CN109085779B (en) 2018-08-02 2018-08-02 Control system and method for non-contact type mutual entertainment experience and totem pole


Publications (2)

Publication Number Publication Date
CN109085779A CN109085779A (en) 2018-12-25
CN109085779B true CN109085779B (en) 2021-05-25

Family

ID=64833617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810868435.6A Expired - Fee Related CN109085779B (en) 2018-08-02 2018-08-02 Control system and method for non-contact type mutual entertainment experience and totem pole

Country Status (1)

Country Link
CN (1) CN109085779B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101332362A (en) * 2008-08-05 2008-12-31 北京中星微电子有限公司 Interactive delight system based on human posture recognition and implement method thereof
CN101954191A (en) * 2010-08-25 2011-01-26 颜小洋 Intelligent entertainment mobile robot
CN103593691A (en) * 2013-11-26 2014-02-19 刘启强 Real scene interactive entertainment system
CN105171752A (en) * 2015-09-10 2015-12-23 刘玉伟 Robot system and method thereof
CN107471229A (en) * 2017-09-30 2017-12-15 江西洪都航空工业集团有限责任公司 A kind of Edutainment robot based on ROS frameworks

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8461724B2 (en) * 2009-07-27 2013-06-11 Live-FX, LLC Universal control system with universal interface to operate a plurality of devices


Also Published As

Publication number Publication date
CN109085779A (en) 2018-12-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210525
