CN112328126A - System and method for judging screen shooting of ejection toy - Google Patents

System and method for judging screen shooting of ejection toy

Info

Publication number
CN112328126A
CN112328126A (application CN202011106427.1A)
Authority
CN
China
Prior art keywords
impact
microphone
sound
screen
transparent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011106427.1A
Other languages
Chinese (zh)
Inventor
刘亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Ziyao Information Technology Co ltd
Original Assignee
Shanghai Ziyao Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Ziyao Information Technology Co ltd filed Critical Shanghai Ziyao Information Technology Co ltd
Priority to CN202011106427.1A priority Critical patent/CN112328126A/en
Publication of CN112328126A publication Critical patent/CN112328126A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00Games not otherwise provided for
    • A63F9/02Shooting or hurling games
    • A63F9/0204Targets therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Toys (AREA)

Abstract

The invention relates to a positioning method, and in particular to a system and method for judging the impact point when an ejection toy shoots a screen. The system comprises a display screen for displaying images; a transparent impact shielding screen arranged in front of the display screen; and sound collection devices arranged around the transparent impact shielding screen. The sound collection devices capture the impact sound produced when an external object strikes the transparent impact shielding screen and send it to a data processor, which calculates the coordinates of the impact point. By exploiting the differences in transmission distance and arrival time of the impact sound at the different collection positions, the invention computes the coordinates of the impact point, so that the impact point can later be matched with game software and used inside a game program.

Description

System and method for judging screen shooting of ejection toy
Technical Field
The invention relates to a positioning method, in particular to a system and a method for judging a screen shooting impact point of an ejection toy.
Background
In the prior art, touch-screen games are widely used in the entertainment field: touch-screen technology lets users interact with a game by touching the screen. However, touch screens are expensive to manufacture, and for games that involve strong impacts, such as archery or ball throwing, today's relatively delicate touch screens cannot withstand the force.
To provide the click-positioning function of a touch screen while overcoming the poor impact resistance of ordinary touch screens, locating the impact point on an impact-resistant screen becomes the problem to solve.
Disclosure of Invention
To solve the above problems in the prior art, the invention provides a system and method for judging the impact point when an ejection toy shoots a screen. The coordinates of the impact point are calculated from the differences in transmission distance and arrival time of the impact sound at different collection positions; the impact point can then be matched with game software so that it is available inside the game program.
The technical scheme adopted by the invention is as follows:
the utility model provides a launch toy and penetrate screen decision-making system, is including the display screen that shows the image, and display screen the place ahead sets up transparent striking and hides the screen, transparent striking be provided with sound collection system around hiding the screen, sound collection system bump into external object transparent striking and hide the striking sound collection that produces on the screen and send for data processor and calculate, calculate the coordinate that obtains the striking point.
The sound collection device comprises at least 2 groups of microphones, the microphones are distributed around the transparent impact shielding screen, the microphones are respectively and electrically connected with a sound processor with a noise reduction module through lines, and the sound processor is electrically connected with the data processor.
The microphones may be 2 groups.
The microphones may be 3 groups, distributed around the transparent impact shielding screen such that the three groups are not collinear.
The microphones may be 4 groups, distributed around the transparent impact shielding screen such that no three groups are collinear.
A screen shooting judgment method for an ejection toy comprises the following steps:
A. An external object strikes the transparent impact shielding screen, and the impact sound produced propagates toward the edges of the transparent impact shielding screen;
B. The microphones arranged around the transparent impact shielding screen capture the impact sound and send it to the sound processor; the sound processor denoises and identifies the impact sound, converts it into a digital signal, and sends the digital signal to the data processor for calculation;
C. The coordinates of the impact point are obtained by calculation.
The specific way of identification is: identification based on the frequency and amplitude of the impact sound and on the transverse- or longitudinal-wave vibration of the air.
After the impact sound is converted into a digital signal, it is compared with a pre-recorded digital signal of an impact sound; the digital signals that belong to the impact sound are screened out, and the time at which the impact sound was captured is recorded.
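The patent does not specify how the captured signal is compared with the pre-recorded impact sound. A minimal sketch of one plausible approach, normalized cross-correlation against a pre-recorded template with a detection threshold (the function name and the threshold value are illustrative assumptions, not taken from the patent), might look like:

```python
import numpy as np

def detect_impact(signal, template, fs, threshold=0.6):
    """Slide a pre-recorded impact-sound template over the captured signal
    and return the arrival time (seconds) of the best match, or None if
    nothing correlates strongly enough. The threshold is an illustrative
    value, not taken from the patent."""
    t = template - template.mean()
    t = t / (np.linalg.norm(t) + 1e-12)
    best_score, best_idx = -1.0, None
    for i in range(len(signal) - len(t) + 1):
        w = signal[i:i + len(t)]
        w = w - w.mean()
        w = w / (np.linalg.norm(w) + 1e-12)  # normalized correlation score
        score = float(np.dot(w, t))
        if score > best_score:
            best_score, best_idx = score, i
    if best_idx is None or best_score < threshold:
        return None
    return best_idx / fs
```

Running each microphone channel through such a detector yields one arrival time per microphone, which is the quantity the calculation in step C consumes.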
The algorithm adopted in the calculation in step C comprises the following steps:
C1. Take the speed of sound in air as 340 m/s, and assume the coordinates of the impact point are (x_i, y_i, z_i);
C2. Select the microphones to be used and assume the coordinates of their locations: the first microphone at (x_1, y_1, z_1), the second at (x_2, y_2, z_2), the third at (x_3, y_3, z_3), the fourth at (x_4, y_4, z_4), and so on;
C3. The time difference between the impact sound reaching the first and second microphones is t_{i,12}, between the second and third microphones t_{i,23}, between the third and fourth microphones t_{i,34}, between the fourth and first microphones t_{i,41}, and so on; take the absolute values of the time differences;
The difference between the distances from the impact point to the first and second microphones is d_{i,12}, between the second and third microphones d_{i,23}, between the third and fourth microphones d_{i,34}, between the fourth and first microphones d_{i,41}, and so on; take the absolute values of the distance differences;
Since distance is the speed of sound multiplied by time, d_{i,12} = 340 · t_{i,12}; d_{i,23}, d_{i,34}, d_{i,41}, and so on are obtained in the same way;
C4. Calculate according to the following system of equations (reconstructed from the definitions above; the original formula image, BDA0002727068700000021, is not reproduced):

√((x_1 − x_i)² + (y_1 − y_i)² + (z_1 − z_i)²) − √((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) = ±d_{i,12}
√((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) − √((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) = ±d_{i,23}
√((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) − √((x_4 − x_i)² + (y_4 − y_i)² + (z_4 − z_i)²) = ±d_{i,34}

Solving this system yields the coordinates (x_i, y_i, z_i) of the impact point;
C5. Convert the coordinates of the impact point into the coordinates of a pixel rendering point on the display screen, and transmit those pixel coordinates to the display screen for display.
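Steps C1 through C4 leave the method of solving the system unspecified. One illustrative numerical approach is a coarse-to-fine grid search over the screen plane; the assumption that the impact point lies on the plane z = 0, the grid parameters, and all names below are ours, not the patent's:

```python
import itertools

import numpy as np

SPEED_OF_SOUND = 340.0  # m/s, as in step C1

def tdoa_residual(p, mics, dist_diffs):
    """Sum of squared mismatches between the measured |distance differences|
    and those implied by a candidate impact point p (steps C3-C4)."""
    r = [np.linalg.norm(p - m) for m in mics]
    return sum((abs(r[a] - r[b]) - d) ** 2 for (a, b), d in dist_diffs.items())

def locate_impact(mics, time_diffs, extent=1.0, iters=4):
    """Coarse-to-fine grid search for the impact point, assuming it lies on
    the screen plane z = 0 within a square of side `extent` (metres).

    time_diffs maps microphone index pairs to |arrival-time differences|."""
    dist_diffs = {k: SPEED_OF_SOUND * abs(t) for k, t in time_diffs.items()}
    cx = cy = extent / 2.0
    half = extent / 2.0
    for _ in range(iters):
        xs = np.linspace(cx - half, cx + half, 21)
        ys = np.linspace(cy - half, cy + half, 21)
        _, cx, cy = min(
            (tdoa_residual(np.array([x, y, 0.0]), mics, dist_diffs), x, y)
            for x, y in itertools.product(xs, ys))
        half /= 10.0  # zoom in around the current best grid cell
    return cx, cy
```

A grid search is slower than an analytic multilateration solution but is robust to the sign ambiguity introduced by taking absolute values of the time differences.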
The technical scheme provided by the invention has the beneficial effects that:
the invention collects the sound generated by the impact of external objects such as water bombs and the like on the transparent impact shielding screen through the microphones arranged at fixed positions, positions the impact points of the delay time of the sound collected by different microphones through calculation, and converts the delay time into digital signals for storage and retransmission, thereby being matched with the impact point data in a game system and realizing the reproduction of the impact effect.
The method allows impact-point positioning even on a high-strength, impact-resistant protective screen, avoiding the problems of touch screens, which are fragile, easily damaged by long use, and prone to losing sensitivity.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is an algorithm diagram (two groups of microphones) of the screen-shooting judgment system and method for an ejection toy of the present invention;
FIG. 2 is an algorithm diagram (three groups of microphones) of the screen-shooting judgment system and method for an ejection toy of the present invention;
FIG. 3 is an algorithm diagram (four groups of microphones) of the screen-shooting judgment system and method for an ejection toy of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Example one
This embodiment provides a screen-shooting impact-point judgment system for an ejection toy, comprising a display screen for displaying images. A transparent impact shielding screen is arranged in front of the display screen, and sound collection devices are arranged around the transparent impact shielding screen. The sound collection devices capture the impact sound produced when an external object strikes the transparent impact shielding screen and send it to a data processor, which calculates the coordinates of the impact point.
If the transparent impact shielding screen and all sound collection devices lie in the plane of the screen, and the screen lies within a single quadrant, the sound collection system of this embodiment needs only 2 groups of microphones. The microphones are electrically connected by wires to a sound processor with a noise reduction module, and the sound processor is electrically connected to the data processing module; the microphones are located on the left and right sides, or the upper and lower sides, of the transparent impact shielding screen.
As shown in fig. 1, F1 and F2 are the microphone positions, M is the impact point, and the dotted portion marks the position of the transparent impact shield. The data processor obtains, from the sound processor, the time points at which the impact sound was captured by the first microphone F1 and the second microphone F2, computes the time differences, calculates the position of a mark on the display screen, and displays a mark pattern there corresponding to the area where the external object, such as a water bomb, struck the transparent impact shielding screen.
Example two
As shown in fig. 2, the present embodiment differs from the first embodiment in that the microphones are 3 groups, i.e., F1, F2, and F3, distributed around the transparent impact shielding screen such that the three groups are not collinear.
EXAMPLE III
As shown in fig. 3, the present embodiment differs from the first embodiment in that the microphones are 4 groups, i.e., F1, F2, F3, and F4, distributed around the transparent impact shielding screen such that no three groups are collinear.
Example four
The embodiment provides a method for judging screen shooting of an ejection toy, which comprises the following steps:
A. An external object strikes the transparent impact shielding screen, and the impact sound produced propagates toward the edges of the transparent impact shielding screen;
B. The microphones arranged around the transparent impact shielding screen capture the impact sound and send it to the sound processor; the sound processor denoises and identifies the impact sound, converts it into a digital signal, and sends the digital signal to the data processor for calculation;
C. The coordinates of the impact point are obtained by calculation.
The specific way of identification is: identification based on the frequency and amplitude of the impact sound and on the transverse- or longitudinal-wave vibration of the air.
After the impact sound is converted into a digital signal, it is compared with a pre-recorded digital signal of an impact sound; the digital signals that belong to the impact sound are screened out, and the time at which the impact sound was captured is recorded.
The algorithm adopted in the calculation in step C comprises the following steps:
C1. Take the speed of sound in air as 340 m/s, and assume the coordinates of the impact point are (x_i, y_i, z_i);
C2. Select the microphones to be used and assume the coordinates of their locations: the first microphone at (x_1, y_1, z_1), the second at (x_2, y_2, z_2), the third at (x_3, y_3, z_3), the fourth at (x_4, y_4, z_4), and so on;
C3. The time difference between the impact sound reaching the first and second microphones is t_{i,12}, between the second and third microphones t_{i,23}, between the third and fourth microphones t_{i,34}, between the fourth and first microphones t_{i,41}, and so on; take the absolute values of the time differences;
The difference between the distances from the impact point to the first and second microphones is d_{i,12}, between the second and third microphones d_{i,23}, between the third and fourth microphones d_{i,34}, between the fourth and first microphones d_{i,41}, and so on; take the absolute values of the distance differences;
Since distance is the speed of sound multiplied by time, d_{i,12} = 340 · t_{i,12}; d_{i,23}, d_{i,34}, d_{i,41}, and so on are obtained in the same way;
C4. Calculate according to the following system of equations (reconstructed from the definitions above; the original formula image, BDA0002727068700000051, is not reproduced):

√((x_1 − x_i)² + (y_1 − y_i)² + (z_1 − z_i)²) − √((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) = ±d_{i,12}
√((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) − √((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) = ±d_{i,23}
√((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) − √((x_4 − x_i)² + (y_4 − y_i)² + (z_4 − z_i)²) = ±d_{i,34}

Solving this system yields the coordinates (x_i, y_i, z_i) of the impact point;
C5. Convert the coordinates of the impact point into the coordinates of a pixel rendering point on the display screen, and transmit those pixel coordinates to the display screen for display.
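The conversion in step C5 from physical impact coordinates to display pixels is not detailed in the patent. A minimal sketch under assumed conventions (the screen dimensions, resolution, and both origin placements are illustrative assumptions) might be:

```python
def impact_to_pixel(x_m, y_m, screen_w_m, screen_h_m, res_w_px, res_h_px):
    """Map a physical impact point (metres, origin at the screen's lower-left
    corner) to display pixel coordinates (origin at the upper-left corner).
    Screen size, resolution, and both origin conventions are illustrative
    assumptions; the patent does not specify them."""
    px = round(x_m / screen_w_m * (res_w_px - 1))
    py = round((1.0 - y_m / screen_h_m) * (res_h_px - 1))
    # Clamp so impacts just off the edge still render at the border pixel.
    px = max(0, min(res_w_px - 1, px))
    py = max(0, min(res_h_px - 1, py))
    return px, py
```

The y axis is flipped because framebuffer coordinates conventionally grow downward while physical height grows upward.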
As shown in fig. 1, F1 and F2 are the microphone positions, M is the impact point, and the dotted portion marks the position of the transparent impact shield. The data processor obtains, from the sound processor, the time points at which the impact sound was captured by the first microphone F1 and the second microphone F2, computes the time differences, calculates the position of a mark on the display screen, and displays a mark pattern there corresponding to the area where the external object, such as a water bomb, struck the transparent impact shielding screen.
The data processor comprises a first receiving module, an operation-parameter acquiring module, a mark confirming module, and a mark-pattern display module. The first receiving module receives the signal of the impact sound captured by a microphone. The operation-parameter acquiring module, in response to a microphone capturing the impact sound, records the time at which each microphone captured it; the operation parameters are the capture time points and the time difference between any 2 microphones. The mark confirming module determines the position of the mark pattern on the display screen from the time difference between any 2 microphones. The mark-pattern display module displays the mark pattern, which marks the impact point, on the display screen.
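The four modules described above could be wired together roughly as follows. This is an illustrative arrangement only; the class and method names are our own, not from the patent, and the TDOA solver is assumed to be supplied separately:

```python
from dataclasses import dataclass, field
from itertools import combinations

@dataclass
class ImpactPipeline:
    """Illustrative wiring of the four data-processor modules:
    receive -> acquire operation parameters -> confirm mark -> display."""
    arrival_times: dict = field(default_factory=dict)  # mic index -> seconds

    def receive(self, mic_index, arrival_time):
        # First receiving module: record when each microphone heard the impact.
        self.arrival_times[mic_index] = arrival_time

    def operation_parameters(self):
        # Operation-parameter module: |time difference| for any 2 microphones.
        return {(a, b): abs(self.arrival_times[a] - self.arrival_times[b])
                for a, b in combinations(sorted(self.arrival_times), 2)}

    def confirm_mark(self, solver):
        # Mark confirming module: delegate localization to a TDOA solver.
        return solver(self.operation_parameters())

    def display_mark(self, position, renderer):
        # Mark-pattern display module: hand the screen position to a renderer.
        renderer(position)
```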
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A screen-shooting judgment system for an ejection toy, comprising a display screen for displaying images, characterized in that a transparent impact shielding screen is arranged in front of the display screen, sound collection devices are arranged around the transparent impact shielding screen, and the sound collection devices capture the impact sound produced when an external object strikes the transparent impact shielding screen and send it to a data processor, which calculates the coordinates of the impact point.
2. The screen-shooting judgment system for an ejection toy of claim 1, wherein the sound collection device comprises at least 2 groups of microphones distributed around the transparent impact shielding screen; the microphones are electrically connected by wires to a sound processor with a noise reduction module, and the sound processor is electrically connected with the data processor.
3. The screen-shooting judgment system for an ejection toy of claim 2, wherein the microphones are 2 groups.
4. The screen-shooting judgment system for an ejection toy of claim 2, wherein the microphones are 3 groups, distributed around the transparent impact shielding screen such that the three groups are not collinear.
5. The screen-shooting judgment system for an ejection toy of claim 2, wherein the microphones are 4 groups, distributed around the transparent impact shielding screen such that no three groups are collinear.
6. A screen-shooting judgment method for an ejection toy, comprising the following steps:
A. An external object strikes the transparent impact shielding screen, and the impact sound produced propagates toward the edges of the transparent impact shielding screen;
B. The microphones arranged around the transparent impact shielding screen capture the impact sound and send it to the sound processor; the sound processor denoises and identifies the impact sound, converts it into a digital signal, and sends the digital signal to the data processor for calculation;
C. The coordinates of the impact point are obtained by calculation.
7. The screen-shooting judgment method for an ejection toy of claim 6, wherein the specific way of identification is: identification based on the frequency and amplitude of the impact sound and on the transverse- or longitudinal-wave vibration of the air.
8. The screen-shooting judgment method for an ejection toy of claim 6, wherein after the impact sound is converted into a digital signal, it is compared with a pre-recorded digital signal of an impact sound; the digital signals that belong to the impact sound are screened out, and the time at which the impact sound was captured is recorded.
9. The screen-shooting judgment method for an ejection toy of claim 6, wherein the algorithm adopted in the calculation in step C comprises the following steps:
C1. Take the speed of sound in air as 340 m/s, and assume the coordinates of the impact point are (x_i, y_i, z_i);
C2. Select the microphones to be used and assume the coordinates of their locations: the first microphone at (x_1, y_1, z_1), the second at (x_2, y_2, z_2), the third at (x_3, y_3, z_3), the fourth at (x_4, y_4, z_4), and so on;
C3. The time difference between the impact sound reaching the first and second microphones is t_{i,12}, between the second and third microphones t_{i,23}, between the third and fourth microphones t_{i,34}, between the fourth and first microphones t_{i,41}, and so on; take the absolute values of the time differences;
The difference between the distances from the impact point to the first and second microphones is d_{i,12}, between the second and third microphones d_{i,23}, between the third and fourth microphones d_{i,34}, between the fourth and first microphones d_{i,41}, and so on; take the absolute values of the distance differences;
Since distance is the speed of sound multiplied by time, d_{i,12} = 340 · t_{i,12}; d_{i,23}, d_{i,34}, d_{i,41}, and so on are obtained in the same way;
C4. Calculate according to the following system of equations (reconstructed from the definitions above; the original formula image, FDA0002727068690000021, is not reproduced):

√((x_1 − x_i)² + (y_1 − y_i)² + (z_1 − z_i)²) − √((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) = ±d_{i,12}
√((x_2 − x_i)² + (y_2 − y_i)² + (z_2 − z_i)²) − √((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) = ±d_{i,23}
√((x_3 − x_i)² + (y_3 − y_i)² + (z_3 − z_i)²) − √((x_4 − x_i)² + (y_4 − y_i)² + (z_4 − z_i)²) = ±d_{i,34}

Solving this system yields the coordinates (x_i, y_i, z_i) of the impact point;
C5. Convert the coordinates of the impact point into the coordinates of a pixel rendering point on the display screen, and transmit those pixel coordinates to the display screen for display.
CN202011106427.1A 2020-10-16 2020-10-16 System and method for judging screen shooting of ejection toy Pending CN112328126A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011106427.1A CN112328126A (en) 2020-10-16 2020-10-16 System and method for judging screen shooting of ejection toy


Publications (1)

Publication Number Publication Date
CN112328126A true CN112328126A (en) 2021-02-05

Family

ID=74313805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011106427.1A Pending CN112328126A (en) 2020-10-16 2020-10-16 System and method for judging screen shooting of ejection toy

Country Status (1)

Country Link
CN (1) CN112328126A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916153A (en) * 2010-08-04 2010-12-15 中兴通讯股份有限公司 Method, device and terminal for positioning touch point of touch screen
CN102043529A (en) * 2009-10-16 2011-05-04 卡西欧计算机株式会社 Indicated position detecting apparatus and indicated position detecting method
US20120070009A1 (en) * 2010-03-19 2012-03-22 Nike, Inc. Microphone Array And Method Of Use
WO2014061242A1 (en) * 2012-10-15 2014-04-24 パナソニック株式会社 Coordinate input device, electronic calculation device and coordinate input method
CN104181506A (en) * 2014-08-26 2014-12-03 山东大学 Sound source locating method based on improved PHAT weighting time delay estimation and implementation system thereof


Similar Documents

Publication Publication Date Title
CN109151442B (en) Image shooting method and terminal
US7030905B2 (en) Real-time method and apparatus for tracking a moving object experiencing a change in direction
KR101898782B1 (en) Apparatus for tracking object
US20120188371A1 (en) Stabilizing Directional Audio Input from a Moving Microphone Array
CN111368811B (en) Living body detection method, living body detection device, living body detection equipment and storage medium
CN113281706B (en) Target positioning method, device and computer readable storage medium
CN107534725A (en) A kind of audio signal processing method and device
CN103404169A (en) Microphone array steering with image-based source location
CA2496785A1 (en) Sound source search system
CN106887236A (en) A kind of remote speech harvester of sound image combined positioning
JP2006516728A5 (en)
CN110188179B (en) Voice directional recognition interaction method, device, equipment and medium
JP2003514298A (en) How to capture motion capture data
CN110505403A (en) A kind of video record processing method and device
CN107566749A (en) Image pickup method and mobile terminal
CN106886017B (en) Underwater target space position calculation method based on double-frequency identification sonar
CN109005314B (en) Image processing method and terminal
CN112130154A (en) Self-adaptive K-means outlier de-constraint optimization method for fusion grid LOF
CN110572600A (en) video processing method and electronic equipment
CN112328126A (en) System and method for judging screen shooting of ejection toy
WO2019011017A1 (en) Method and device for noise processing
CN105023255B (en) The Infrared DIM-small Target Image sequence emulation mode of Gauss model area array cameras shake
EP1071045A1 (en) Device and process for displaying an image on a screen according to a perspective that depends on the user's position
JP2008198111A (en) Instructed position calculation system, instruction object, and game system
CN102034112B (en) Method for identifying moving and static targets by using phased array three-dimensional acoustic image pickup sonar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210205