WO2013061998A1 - Electronic acoustic signal generating device and electronic acoustic signal generating method - Google Patents
Electronic acoustic signal generating device and electronic acoustic signal generating method
- Publication number
- WO2013061998A1 (PCT/JP2012/077461; JP2012077461W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- acceleration data
- acceleration
- touch panel
- touch
- pieces
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/02—Means for controlling the tone frequencies, e.g. attack or decay; Means for producing special musical effects, e.g. vibratos or glissandos
- G10H1/04—Means for controlling the tone frequencies, e.g. attack or decay; Means for producing special musical effects, e.g. vibratos or glissandos by additional modulation
- G10H1/053—Means for controlling the tone frequencies, e.g. attack or decay; Means for producing special musical effects, e.g. vibratos or glissandos by additional modulation during execution only
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/221—Keyboards, i.e. configuration of several keys or key-like input devices relative to one another
- G10H2220/241—Keyboards, i.e. configuration of several keys or key-like input devices relative to one another on touchscreens, i.e. keys, frets, strings, tablature or staff displayed on a touchscreen display for note input purposes
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/395—Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing
Definitions
- The present invention relates to an electronic acoustic signal generating device that detects the strength of a user's touch operation on an operator displayed on a touch panel display (hereinafter abbreviated as "touch panel") and determines an acoustic signal characteristic according to the detected strength, and to a program for realizing its control method.
- an electronic acoustic signal generator that detects the strength of a user's touch operation with respect to an operator displayed on a touch panel and determines acoustic signal characteristics according to the detected strength is conventionally known.
- Non-Patent Document 1 describes application software that is installed in a mobile terminal and causes the mobile terminal to exhibit the above functions.
- Since the touch panel cannot detect the strength of a user's touch operation, it is presumed that the strength of the touch operation is detected using the acceleration sensor.
- However, Non-Patent Document 1 does not describe at all how the sensor output from the acceleration sensor is used to detect the strength of the touch operation.
- When the intensity of the touch operation is detected using the acceleration sensor, it is presumed that the volume of the sound to be generated is determined according to the magnitude of the acceleration detected by the acceleration sensor.
- However, the timing at which the touch panel detects the touch operation and the timing at which the acceleration sensor detects it generally do not completely match. Therefore, if the output value from the acceleration sensor is sampled at the moment the touch panel detects a touch operation and the sound volume is determined and generated accordingly, the determined volume may not be appropriate for the strength of the user's touch operation; in other words, a sound with a volume corresponding to the user's touch operation may not be generated.
- The present invention has been made in view of this point, and its objective is to provide an electronic acoustic signal generating device and an electronic acoustic signal generating method capable of determining acoustic signal characteristics that correspond to a user's touch operation.
- In order to achieve this objective, an electronic acoustic signal generating device of the present invention includes: a touch panel; a display circuit that displays an operator on the touch panel; an acceleration sensor that detects acceleration when a touch operation is performed on the operator on the touch panel; a storage unit that acquires acceleration data indicating the acceleration detected by the acceleration sensor at predetermined time intervals and stores the n most recent pieces of the acceleration data; a processing circuit that, after a user's touch operation on the displayed operator is detected, stores k (k < n) pieces of acceleration data in the storage unit and then selects, from the n pieces of acceleration data including those k pieces, at least one piece of acceleration data that matches a predetermined condition; and a sound source/effect circuit that determines the signal characteristics of the acoustic signal to be generated based on the acceleration data selected by the processing circuit.
- the values of n and k can be changed to arbitrary values as long as the condition k < n is satisfied.
- Preferably, the sound source/effect circuit converts the selected acceleration data into a velocity value based on a predetermined conversion characteristic, and determines the signal characteristics of the acoustic signal according to the converted velocity value.
- the electronic acoustic signal generation program of the present invention causes a computer to perform the electronic acoustic signal generation method of the present invention.
- According to the present invention, after a user's touch operation is detected, k pieces of acceleration data are stored; at least one piece of acceleration data meeting a predetermined condition is then selected from the n stored pieces including those k, and a signal characteristic of the acoustic signal to be generated is determined based on the selected acceleration data. That is, even if the timing at which the user's touch operation is detected via the touch panel and the timing at which it is detected via the acceleration sensor vary, the n stored pieces of acceleration data include acceleration data that appropriately represents the acceleration of the user's touch operation. Because the acceleration data is selected according to whether it matches the predetermined condition, and the acoustic signal characteristics are determined from the selected data, it is possible to determine acoustic signal characteristics that match the user's touch operation.
- FIG. 1 is a block diagram showing a schematic configuration of an electroacoustic signal generator according to an embodiment of the present invention.
- As shown in FIG. 1, the electronic acoustic signal generating device includes a setting operator unit 1, composed of a plurality of switches for inputting various information, and a touch panel 2 that displays a plurality of performance operators, setting controls, and various information for selecting and setting musical tone parameters and operation modes, and that lets the user select and set the corresponding performance state, musical tone parameters, and operation modes by touching the displayed controls and information.
- the setting operator 1 is connected to a detection circuit 3 that detects the operation state.
- The touch panel 2 is connected to a detection circuit 4, which detects a user's touch operation on the touch panel 2, and to a display circuit 5, which displays a GUI (graphical user interface) for selectively setting various music-related states and information, including the performance state, various musical tone parameters, and various operation modes.
- the detection circuit 3, the detection circuit 4, and the display circuit 5 are connected to a bus 14 connected to a CPU 6 that controls the entire apparatus.
- The bus 14 is further connected to a ROM 7 that stores the control program executed by the CPU 6 and various table data; a RAM 8 that temporarily stores performance information, various input information, calculation results, and the like; and a storage device 9 that stores various application programs including the control program, various music data, and other data.
- Also connected to the bus 14 is a communication interface (I/F) 10 that transmits and receives data to and from an external device (not shown).
- the electroacoustic signal generator of this embodiment further includes an acceleration sensor 11 connected to the bus 14.
- the acceleration sensor 11 detects acceleration generated in the electronic acoustic signal generator when the user shakes or strikes the electronic acoustic signal generator.
- Further connected to the bus 14 is a sound source/effect circuit 12, which converts performance information input using a performance operator, performance information obtained by reproducing music data stored in the storage device 9, and the like into a musical sound signal and applies various effects to that signal.
- the sound source / effect circuit 12 is connected to a sound system 13 for converting a musical sound signal from the sound source / effect circuit 12 into sound.
- the sound system 13 includes, for example, a DAC (digital-to-analog converter), an amplifier, a speaker, and the like.
- As the touch panel 2 of this embodiment, a panel having a multipoint recognition function, which recognizes simultaneous press operations by the user at a plurality of positions, is employed; however, a touch panel that cannot recognize multiple points may also be used.
- The electronic acoustic signal generating device according to the present embodiment is assumed to be a small portable terminal (specifically, a general-purpose slate PC or a smartphone) that can be held in the palm and operated with one hand. The touch panel 2 is therefore also small.
- On the touch panel 2, a keyboard (hereinafter simply referred to as "keyboard") 2a and a plurality of setting controls for making various settings, such as tone settings, are displayed; the keyboard 2a is used as the performance operator.
- The performance operator is not limited to this; it may be of any kind, such as the various drums of a drum set or guitar frets.
- Furthermore, the present invention may be applied to any device that generates an acoustic signal in response to the operation of an operator and controls signal characteristics such as volume, for example the controls of a DJ device or of a game machine.
- the storage device 9 is, for example, a storage medium such as a flexible disk (FD), a hard disk (HD), a CD-ROM, a DVD (digital versatile disk), a magneto-optical disk (MO), and a semiconductor memory, and a driving device thereof.
- the storage medium may be detachable from the drive device, or the storage device 9 itself may be detachable from the electroacoustic signal generator of this embodiment. Alternatively, neither the storage medium nor the storage device 9 may be detachable.
- the storage device 9 (the storage medium) can also store a control program executed by the CPU 6 as described above.
- When the control program is not stored in the ROM 7, it can be stored in the storage device 9 and read into the RAM 8, causing the CPU 6 to operate in the same way as when the control program is stored in the ROM 7. This makes it easy to add and upgrade control programs.
- As the communication I/F 10, for example, any of the following may be employed: a music-dedicated wired I/F that exclusively transmits and receives music signals such as MIDI signals; a general-purpose short-distance wired I/F such as USB (universal serial bus) or IEEE 1394; a general-purpose network I/F such as Ethernet (registered trademark); a general-purpose wireless I/F such as a wireless LAN (local area network); a general-purpose short-range wireless I/F such as Bluetooth (registered trademark); or a communication I/F for a digital telephone network.
- the acceleration sensor 11 is a three-axis acceleration sensor that detects acceleration in the three-axis directions of the x-axis, y-axis, and z-axis, and outputs a sensor value for each axis.
- the sensor output value of each axis is read from the acceleration sensor 11 at a predetermined read cycle and stored in a sensor output storage area secured at a predetermined position in the RAM 8.
- A processing unit 101 (mainly constituted by the CPU 6, the ROM 7, and the RAM 8) reads out the sensor output values of each axis from the sensor output storage area at a predetermined cycle (in this embodiment, every 10 msec) and uses them for the control processing described later.
- In this embodiment, the sound source/effect circuit 12 is assumed to generate a musical sound signal and to apply various effects to it entirely in hardware.
- a musical sound signal may be generated only by software processing and various effects may be imparted thereto, or processing may be shared between hardware and software.
- Moreover, the sound source/effect circuit 12 is not limited to one provided within the electronic acoustic signal generating device; it may instead be provided in an external device connected via the communication I/F 10, in which case a sound generation instruction including sound generation characteristic control information (the velocity described later) is sent from the generating device to the external device, and the external device generates the acoustic signal.
- As the form of the electronic acoustic signal generating device, a general-purpose slate PC or smartphone equipped with the touch panel 2 is assumed, as described above, but a music-dedicated hardware configuration with LEDs (light emitting diodes) and physical controls may also be used.
- When the user touches (strikes) a key of the keyboard 2a with, for example, a finger, the electronic acoustic signal generating device according to the present embodiment generates a sound at the pitch assigned to the touched key.
- the strength of this touch is detected based on the sensor output value from the acceleration sensor 11 built in the apparatus.
- The feature of this embodiment lies in how the strength of the touch is detected using the sensor output values. Note that the number of touched keys is not limited to one; when a plurality of keys are touched, a corresponding number of pitches are generated.
- When a touch is detected, the CPU 6 reads the sensor output values of each axis from the sensor output storage area of the RAM 8 and calculates the root sum square of their differences according to the following equation (1):

  a_t = √( (x_t − x_{t−1})² + (y_t − y_{t−1})² + (z_t − z_{t−1})² ) … (1)

- Here, x_t, y_t, and z_t denote the sensor output values of the x-, y-, and z-axes at the current time t, and x_{t−1}, y_{t−1}, and z_{t−1} denote the sensor output values of the x-, y-, and z-axes at time t−1, one cycle before the current time t.
- Because the sensor output values of each axis are needed both for the current time and for the time one cycle earlier, a current sensor output storage area and an old sensor output storage area are secured at predetermined positions in the RAM 8.
- The sensor output values of each axis read from the sensor output storage area are first stored in the current sensor output storage area; after the above calculation is completed, they are read out from the current sensor output storage area and overwritten into the old sensor output storage area.
- The differences of the sensor output values of each axis are taken in equation (1) in order to remove the influence of gravitational acceleration from the sensor output values. The root sum square of the per-axis differences is taken so that a value is obtained that does not depend on the posture of the device (whether it is held vertically in the hand, held diagonally, held horizontally, and so on) or on the direction of the force applied by the touch.
- the arithmetic expression is not limited to the above expression (1).
- The calculation results (acceleration data) obtained in this way are stored as a log of the most recent plurality of values (five in this embodiment, but not limited thereto), counting back from the current time. That is, the five newest pieces of acceleration data are kept in the RAM 8, in order from the newest.
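- As an illustration of the per-cycle processing described so far (differencing consecutive samples to cancel gravity, taking the root sum square, and keeping only the newest five results), a minimal sketch follows; the class and variable names are hypothetical, not taken from the patent:

```python
import math
from collections import deque

class AccelerationLog:
    """Keeps the newest n acceleration magnitudes computed by equation (1)."""

    def __init__(self, n: int = 5):
        self.prev = None            # sensor outputs from one cycle earlier
        self.log = deque(maxlen=n)  # ring-buffer-like: oldest entry drops out

    def update(self, x: float, y: float, z: float) -> None:
        """Called once per read cycle (every 10 msec in the embodiment)."""
        if self.prev is not None:
            px, py, pz = self.prev
            # Differencing consecutive samples cancels the constant gravity
            # component; the root sum square makes the result independent of
            # device posture and of the direction of the touch force.
            self.log.append(math.sqrt((x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2))
        self.prev = (x, y, z)
```

- A deque with a maxlen behaves like the ring buffer described later: appending beyond the capacity silently discards the oldest entry.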
- FIG. 3 is a diagram illustrating an example of a method for selecting a plurality of calculation results (acceleration data) stored as a log to be used for control processing.
- the detection circuit 4 detects that the user has touched the key. “X” in FIG. 3 indicates the timing at which this touch is detected.
- Examples of the predetermined condition include taking the maximum value, or taking the maximum value together with the next largest value.
- In this embodiment, the maximum value is adopted as the predetermined condition. Therefore, in the illustrated example, the calculation result indicated by the arrow labeled "adopted value" is selected. The selected calculation result is used to determine the volume as described above; the determination method is described later.
- In this way, a plurality of sensor outputs from the acceleration sensor 11 (strictly speaking, not the sensor outputs themselves but the calculation results obtained by applying the predetermined calculation above to them, also referred to as acceleration data) are stored in order from the newest, and a selection is made from among them.
- the plurality of sensor outputs to be stored include not only the sensor output after the user's touch operation on the touch panel 2 is detected, but also the sensor output before that.
- Sensor outputs both before and after detection of the touch operation are made selection targets because, in some cases, a large acceleration is obtained at the moment of the touch, while in other cases it is obtained slightly before or slightly after that moment.
- The numbers of sensor outputs to be stored before and after detection of the touch operation may be determined arbitrarily, taking into consideration, for example, the cycle at which the output values of the acceleration sensor 11 are read and how much delay is permitted between detection of the touch operation and issuance of the sound generation instruction; they are not limited to a total of five before and after detection, with two of them after detection, as in the present embodiment.
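- As a sketch of the selection step (names hypothetical as before; the averaging branch follows the alternative condition mentioned in the description, under which the maximum and the next largest value are both taken and averaged):

```python
def select_acceleration(log: AccelerationLog, condition: str = "max") -> float:
    """Pick the adopted value from the n stored magnitudes."""
    values = sorted(log.log, reverse=True)
    if condition == "max":
        return values[0]               # this embodiment: the maximum value
    if condition == "max_and_next":
        return sum(values[:2]) / 2     # maximum and next largest, averaged
    raise ValueError(condition)
```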
- As described above, in the electronic acoustic signal generating device of this embodiment, when a plurality of sensor outputs from the acceleration sensor 11 are stored, outputs from both before and after the point in time when the user's touch operation on the touch panel 2 is detected are stored, and the one taking the maximum value among them is selected and used to determine the volume of the sound to be generated. That is, even if the timing at which the user's touch operation is detected by the touch panel 2 and the timing at which the corresponding acceleration is detected by the acceleration sensor 11 vary, the stored sensor outputs (acceleration data) include one that appropriately represents the acceleration of the touch. Because the sensor output is selected according to whether it matches a predetermined condition (here, the maximum value), and the musical sound characteristic (here, the volume) is determined based on the selected sensor output (acceleration data), it is possible to determine a musical sound characteristic that matches the user's touch operation.
- In this embodiment, the selected sensor output is used to determine the volume of the sound to be generated, but the present invention is not limited to this; the selected sensor output may also be used to determine other musical tone characteristics, such as timbre.
- FIG. 4 is a flowchart showing the procedure of the touch / movement / release detection process executed by the electronic acoustic signal generator of the present embodiment, particularly the CPU 6.
- The touch/movement/release detection process is activated and executed every 5 msec, for example.
- In this process, two kinds of user operations are detected: (O1) a series of user operations consisting of a first operation of touching a key of the keyboard 2a and a second operation of releasing the touching finger; and (O2) a series of user operations consisting of a first operation of touching a key of the keyboard 2a, a second operation of moving the touching finger over the key while maintaining the touch, and a third operation of releasing the touching finger.
- Each of the series of user operations (O1) and (O2) concerns a single finger; when a plurality of fingers are used, the series (O1) or (O2) may be detected in parallel, one per finger.
- When the detection circuit 4 detects a touch, movement, or release operation, the CPU 6 is notified of the detected operation type (touch/movement/release) and the coordinates on the touch panel 2 at that time. When a movement operation is detected, the CPU 6 is notified of two sets of coordinates, those before and after the movement.
- First, the detection circuit 4 notifies the CPU 6 of the touch operation and the coordinates at that time, and the CPU 6 then starts the touch/movement/release detection process.
- the CPU 6 creates a new record in the touch management register, and records the touched coordinates (the coordinates of the touch position on the touch panel 2) as initial coordinates and current coordinates.
- Next, the CPU 6 obtains the operator ID (identification) corresponding to the coordinates and notifies the sound generation management process of FIGS. 5A and 5B, described later, that the operator with that ID has been turned on (steps S1→S2→S3→S4).
- The touch management register is a register secured at a predetermined position in the RAM 8 and, as shown in FIG. 6A, has an area capable of storing a plurality of sets of data each consisting of initial coordinates (x, y) and current coordinates (x, y). Each such set of data is called a "record".
- The operator ID is an ID assigned to each key image of the keyboard 2a. An image coordinate table (not shown) registers the ID of each key image (operator) together with the coordinates at which the image is arranged, so the operator ID displayed at the touched coordinates can be obtained from those coordinates.
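- A minimal sketch of this bookkeeping follows; the record layout mirrors FIG. 6A, while the rectangle-based lookup and the key names are assumptions, since the patent only states that the table maps each key image to its coordinates:

```python
from dataclasses import dataclass

@dataclass
class TouchRecord:
    initial: tuple   # coordinates where the touch began (x, y)
    current: tuple   # updated by every movement operation

touch_register: list = []

# Hypothetical image coordinate table: operator ID -> bounding box (x0, y0, x1, y1).
image_coord_table = {"key_C4": (0, 0, 40, 120), "key_D4": (40, 0, 80, 120)}

def operator_id_at(x, y):
    """Return the ID of the key image displayed at the given coordinates."""
    for op_id, (x0, y0, x1, y1) in image_coord_table.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return op_id
    return None

def on_touch(x, y):
    """Steps S2-S3: create a record with identical initial/current coordinates."""
    touch_register.append(TouchRecord((x, y), (x, y)))
    # Step S4 would notify the sound generation management process that the
    # operator operator_id_at(x, y) has been turned on.
```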
- Next, when the user performs the release operation, the detection circuit 4 notifies the CPU 6 of the release operation and the coordinates at that time. The CPU 6 searches the touch management register for a record in which the released coordinates (the coordinates of the release position on the touch panel 2) match the current coordinates, obtains the operator ID corresponding to the initial coordinates recorded in that record, notifies the sound generation management process that the operator with that ID has been turned off, and deletes the record from the touch management register (steps S1→S5→S7→S8→S9). Note that, because a release is always preceded by a touch operation, a record whose current coordinates coincide with the released coordinates is always present in the touch management register.
- For the series of user operations (O2), when the first operation (touch operation) is performed, the detection circuit 4 notifies the CPU 6 of the touch operation and the coordinates at that time. Immediately afterwards, the touch/movement/release detection process is started, and the CPU 6 executes the same processing as for the first operation (touch operation) of the series of user operations (O1) (steps S1→S2→S3→S4).
- When the user performs the second operation (movement operation), the detection circuit 4 notifies the CPU 6 that the operation is a movement operation, together with the coordinates at that time and the coordinates before the movement.
- the touch / movement / release detection process is started immediately after that, and the CPU 6 searches the touch management register for a record in which the coordinates before the movement coincide with the current coordinates. If there is a matching record, the CPU 6 records the coordinates after movement as the current coordinates (updates the current coordinates with the coordinates after movement) (steps S1 ⁇ S5 ⁇ S6).
- Through this second operation, when the user moves his or her finger, only the current coordinates are updated if the moved position is still on the same operator. In the present embodiment, only the current coordinates are updated even when the finger slips off the original operator and moves onto another operator or into an area outside any operator.
- When the touch is later released, the touch management register is referred to in order to obtain the initial coordinates, and the operator ID corresponding to those initial coordinates is acquired to perform the mute operation. That is, even when the finger is displaced from the operator by a movement operation, the state in which that operator is operated is maintained.
- Finally, when the user performs the third operation (release operation), the detection circuit 4 notifies the CPU 6 of the release operation and the coordinates at that time. The touch/movement/release detection process is started immediately afterwards, and the CPU 6 executes processing similar to that executed for the second operation (release operation) of the series of user operations (O1) (steps S1→S5→S7→S8→S9).
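- Continuing the same sketch (hypothetical names as before), the movement and release operations of (O2) could be handled as below; note how the release path uses the initial coordinates, so the originally touched operator is muted even if the finger has drifted:

```python
def on_move(before, after):
    """Steps S5-S6: update the current coordinates of the matching record."""
    for rec in touch_register:
        if rec.current == before:
            rec.current = after   # the operated state is kept even when the
            break                 # finger has slipped off the original operator

def on_release(x, y):
    """Steps S5, S7-S9: mute via the initial coordinates and drop the record."""
    for rec in touch_register:
        if rec.current == (x, y):
            op_id = operator_id_at(*rec.initial)  # operator originally touched
            # ... notify the sound generation management process that the
            # operator with op_id has been turned off ...
            touch_register.remove(rec)
            break
```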
- FIGS. 5A and 5B are flowcharts showing the procedure of sound generation management processing executed by the electronic acoustic signal generator of this embodiment, particularly the CPU 6. This sound generation management process is started and executed every 10 msec, for example.
- This sound generation management process is mainly constituted by: (1) sensor output related processing, related to the sensor output from the acceleration sensor 11 (steps S11 to S13); (2) sound generation processing (steps S16 and S17); (3) mute processing (step S19); (4) sound generation management register initialization processing (step S21); and (5) mute management register initialization processing (step S23).
- In the (1) sensor output related processing, the CPU 6 reads the sensor output values of each axis from the sensor output storage area, as described above in the outline of the control processing, and stores them in the current sensor output storage area (step S11).
- the CPU 6 performs the calculation of the equation (1) on the sensor output values of the respective axes stored in the current sensor output storage area and the old sensor output storage area (step S12).
- the calculation result is stored in a ring buffer (not shown) secured at a predetermined position in the RAM 8 (step S13).
- Of the calculation results obtained in time series, only the latest plurality (for example, five) are kept; therefore, this embodiment adopts a ring buffer in which a new calculation result simply overwrites the oldest one when more results are obtained than the number to be stored.
- The present invention is not limited to this; an ordinary memory may be employed, with the number of stored results managed by software.
- Although not shown in the flowchart, in this (1) sensor output related processing, after the processing of step S13 the CPU 6 reads the sensor output values of each axis stored in the current sensor output storage area and stores them in the old sensor output storage area. In addition, when this sound generation management process is activated, the sensor output values of each axis may not yet be stored in the old sensor output storage area. In that case, the CPU 6 cannot execute the calculation of step S12; therefore, following the process of storing the sensor output values of each axis read from the sensor output storage area into the current sensor output storage area, it stores those values into the old sensor output storage area without executing the processing of steps S12 and S13.
- the CPU 6 increments the value of each counter in the sound generation management register and the mute management register by “1” (step S14).
- Both the sound generation management register and the mute management register are registers secured at predetermined positions in the RAM 8 and, as shown in FIGS. 6B and 6C, each has an area capable of storing a plurality of sets of data (records) consisting of an operator ID and a (software) counter. In some cases neither register stores a single record; in that case, it goes without saying that the CPU 6 performs no increment in step S14.
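- A sketch of this counter mechanism follows, continuing the running sketch above (hypothetical names; the assumption that sound generation waits until the counter reaches +2, i.e. until k = 2 new samples have entered the ring buffer, follows the timing discussion in the description):

```python
from dataclasses import dataclass

@dataclass
class ManagedNote:
    operator_id: str
    counter: int = 0   # (software) counter, incremented on each 10 msec tick

sound_gen_register: list = []   # records awaiting sound generation
mute_register: list = []        # records awaiting muting

def on_operator_on(op_id):
    """Steps S20-S21: create a record with its counter set to 0."""
    sound_gen_register.append(ManagedNote(op_id))

def on_tick(log: AccelerationLog):
    """One activation of the sound generation management process (every 10 msec)."""
    # Steps S11-S13 would read the sensors and push an equation (1) result into log.
    for rec in sound_gen_register + mute_register:
        rec.counter += 1                       # step S14
    for rec in [r for r in sound_gen_register if r.counter >= 2]:
        if not log.log:
            continue                           # no acceleration data yet
        adopted = max(log.log)                 # k = 2 new samples have arrived
        # Step S17: convert `adopted` to a velocity (see the sigmoid sketch
        # below) and instruct the tone generator to start sounding.
        sound_gen_register.remove(rec)
```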
- the CPU 6 advances the process to the sound generation process (2) (steps S15 ⁇ S16).
- FIG. 6D is a diagram illustrating an example of a function that is a conversion characteristic used when converting the employed calculation result into a velocity value.
- the function in FIG. 6D is a sigmoid function, which is an example of a non-linear function.
- Of course, another non-linear function may be used, and the conversion characteristic is not limited to non-linear functions; a linear function may also be used. Furthermore, a "velocity sensitivity" parameter may be defined so that the shape of the function can be changed by this parameter.
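- For illustration, one plausible form of such a conversion is sketched below; the sigmoid shape follows FIG. 6D, while the constants and the 1-127 velocity range are assumptions, not values from the patent:

```python
import math

def to_velocity(adopted: float, sensitivity: float = 1.0) -> int:
    """Map an adopted acceleration value onto a velocity with a sigmoid curve.

    `sensitivity` stands in for the "velocity sensitivity" parameter and
    steepens or flattens the curve; `midpoint` centers it. Both constants
    and the output range are assumptions for illustration.
    """
    midpoint = 5.0
    s = 1.0 / (1.0 + math.exp(-sensitivity * (adopted - midpoint)))
    return max(1, min(127, round(1 + s * 126)))  # clamp to a 1-127 range
```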
- Next, the CPU 6 instructs the sound source/effect circuit 12, as the tone generator unit, to start sound generation using the pitch corresponding to the operator ID of the record and the converted velocity value, and deletes the record from the sound generation management register (step S17).
- The sound source/effect circuit 12 generates a musical tone signal whose characteristics, such as volume and timbre, are determined according to the instructed velocity value.
- On the other hand, the CPU 6 advances the process to the (3) mute processing (steps S18→S19), instructs the sound source/effect circuit 12 to mute the pitch corresponding to the operator ID of the record, and deletes the record from the mute management register.
- When notified that an operator has been turned on, the CPU 6 advances the process to the (4) sound generation management register initialization processing (steps S20→S21): a new record is created in the sound generation management register, the operator ID is recorded, and the counter is set to "0".
- Similarly, when notified that an operator has been turned off, the CPU 6 advances the process to the (5) mute management register initialization processing (steps S22→S23): a new record is created in the mute management register, the operator ID is recorded, and the counter is set to "0".
- As the velocity value used when a finger moves from one operator to another by a movement operation, a predetermined default value may be adopted, or the velocity value used to generate the sound of the pre-movement operator may be stored and reused when the sound of the post-movement operator is generated. When a finger moves onto an operator from outside any operator, there is no velocity value from a pre-movement operator, so a predetermined default value is preferably adopted.
- In this embodiment, a plurality of calculation results obtained by performing a predetermined calculation on the detected acceleration values are stored as a log. Alternatively, a plurality of detected acceleration values may themselves be stored as a log, with the predetermined calculation applied to the stored values in the sound generation processing (step S16 in FIG. 5A) before one result is adopted.
- It goes without saying that the object of the present invention can also be achieved by supplying a system or an apparatus with a storage medium storing program code that realizes the functions of the embodiment described above, and by having the computer (or CPU or MPU) of the system or apparatus read and execute the program code stored in the storage medium.
- In this case, the program code itself read from the storage medium realizes the novel functions of the present invention, and the program code and the storage medium storing it constitute the present invention.
- As the storage medium for supplying the program code, for example, a flexible disk, a hard disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a DVD-ROM, a DVD-RAM, a DVD-RW, a DVD+RW, a magnetic tape, a non-volatile memory card, a ROM, or the like can be used.
- the program code may be supplied from a server computer via a communication network.
- It also goes without saying that the functions of the embodiment described above may be realized when a CPU or the like provided on a function expansion board or in a function expansion unit performs part or all of the actual processing based on the instructions of the program code.
Description
This application claims priority based on Japanese Patent Application No. 2011-232614, filed in Japan on October 24, 2011, the contents of which are incorporated herein by reference.
The detection circuit 3, the detection circuit 4, and the display circuit 5 are connected to a bus 14, which is connected to a CPU 6 that controls the entire apparatus. Further connected to the bus 14 are a ROM 7 that stores the control program executed by the CPU 6 and various table data; a RAM 8 that temporarily stores performance information, various input information, calculation results, and the like; and a storage device 9 that stores various application programs including the control program, various music data, and other data. Also connected to the bus 14 is a communication interface (I/F) 10 that connects an external device (not shown) and transmits and receives data to and from that external device.
Here, x_t, y_t, and z_t denote the sensor output values of the x-, y-, and z-axes at the current time t, and x_{t−1}, y_{t−1}, and z_{t−1} denote the sensor output values of the x-, y-, and z-axes at time t−1, one cycle before the current time t. Because the sensor output values of each axis are thus needed both for the current time and for the time one cycle earlier, a current sensor output storage area and an old sensor output storage area are secured at predetermined positions in the RAM 8; the sensor output values of each axis read from the sensor output storage area are first stored in the current sensor output storage area, and after the above calculation is completed they are read out from the current sensor output storage area and overwritten into the old sensor output storage area.
Examples of the predetermined condition include taking the maximum value, or taking the maximum value and the next largest value. When a plurality of calculation results are selected, as under the latter condition, their average may be taken. In the present embodiment, the maximum value is adopted as the predetermined condition. Accordingly, in the illustrated example, the calculation result indicated by the arrow labeled "adopted value" is selected. The selected calculation result is used to determine the volume, as described above; the determination method is described later.
Sensor outputs both before and after detection of the touch operation are made selection targets because experiments by the inventors of the present application confirmed that in some cases a large acceleration is obtained at the moment of the touch, while in other cases a large acceleration is obtained slightly before or slightly after that moment. The reason is thought to be that, as described above, many processes are executed in parallel, so the time at which a large sensor output value (a value considered to reflect the true strength of the touch) appears varies with the manner of the touch (whether the panel surface of the touch panel 2 was touched perpendicularly or somewhat obliquely, and so on) and with how the device is held (held in the hand, placed on a desk, and so on).
(O1) A series of user operations consisting of a first operation of touching a key of the keyboard 2a and a second operation of releasing the touching finger.
(O2) A series of user operations consisting of a first operation of touching a key of the keyboard 2a, a second operation of moving the touching finger over the key while maintaining the touch, and a third operation of releasing the touching finger. Each of the series of user operations (O1) and (O2) concerns a single finger; when a plurality of fingers are used, the series (O1) or (O2) may be detected in parallel, one per finger.
Next, the CPU 6 obtains the operator ID (identification) corresponding to the coordinates and notifies the sound generation management process of FIGS. 5A and 5B, described later, that the operator with that ID has been turned on (steps S1→S2→S3→S4). The touch management register is a register secured at a predetermined position in the RAM 8 and, as shown in FIG. 6A, has an area capable of storing a plurality of sets of data each consisting of initial coordinates (x, y) and current coordinates (x, y). Each such set of data is called a "record".
The operator ID is an ID assigned to each key image of the keyboard 2a. An image coordinate table (not shown) registers the ID of each key image (operator) together with the coordinates at which the image is arranged, so the operator ID displayed at the touched coordinates can be obtained from those coordinates.
Alternatively, when the finger slips off an operator and moves onto a new operator through a movement operation, the device may treat this as if a release operation occurred at the pre-movement coordinates, executing the processing corresponding to the third operation (release operation) described later, and as if a new touch operation occurred at the post-movement coordinates, executing the processing corresponding to the first operation (touch operation) described above. In that case, when the finger slips off an operator through a movement operation, the sound is muted once and a musical tone corresponding to the new operator is generated. When the finger moves from outside any operator onto an operator, only a new touch operation is assumed; when it moves from an operator to outside, only a release operation is assumed. The user may also be allowed to select whether the device behaves as in the present embodiment or in this alternative manner.
(1) Sensor output related processing, related to the sensor output from the acceleration sensor 11 (steps S11 to S13)
(2) Sound generation processing (steps S16 and S17)
(3) Mute processing (step S19)
(4) Sound generation management register initialization processing (step S21)
(5) Mute management register initialization processing (step S23)
The sound generation management process is constituted by the above.
As described above, of the calculation results obtained in time series, only the latest plurality (for example, five) are kept; this embodiment therefore employs a ring buffer in which a calculation result can simply be saved by overwriting when more results are obtained than the number to be kept. The invention is not limited to this; an ordinary memory may be employed, with the number of stored results managed by software.
Although not shown in the flowchart, in this (1) sensor output related processing, after the processing of step S13 the CPU 6 reads out the sensor output values of each axis stored in the current sensor output storage area and stores them in the old sensor output storage area.
When this sound generation management process is activated, the sensor output values of each axis may not yet be stored in the old sensor output storage area. In that case, the CPU 6 cannot execute the calculation of step S12; therefore, following the process of storing the sensor output values of each axis read from the sensor output storage area into the current sensor output storage area, it stores those values into the old sensor output storage area without executing the processing of steps S12 and S13.
FIG. 6D is a diagram showing an example of a function serving as the conversion characteristic used when converting the adopted calculation result into a velocity value. The function of FIG. 6D is a sigmoid function, an example of a non-linear function. Of course, another non-linear function may be used, and the characteristic is not limited to non-linear functions; a linear function may also be used. Furthermore, a "velocity sensitivity" parameter may be defined so that the shape of the function can be changed by this parameter.
Here, a counter value of +2 in the mute management register means that the sound generation management process has been activated twice since the "operator off" notification was received from the touch/movement/release detection process. In other words, the sensor output values of each axis have been acquired twice since the release of the user's touch on the touch panel 2 was detected.
However, unlike the (2) sound generation processing, this (3) mute processing does not use the calculation results stored in the ring buffer. There is therefore no need to monitor the counter value in the mute management register in order to wait for a new calculation result to enter the ring buffer; the counter value of +2 in the mute management register serves only to make the time from detection of the release of the user's touch on the touch panel 2 until muting starts equal to the time from detection of the user's touch on the touch panel 2 until sound generation starts.
As for the velocity value when a finger moves onto an operator from outside any operator, there is no velocity value that was used to generate the sound of a pre-movement operator, so a predetermined default value is preferably adopted.
5 … Display circuit
6 … CPU
8 … RAM
11 … Acceleration sensor
Claims (9)
- An electronic acoustic signal generating device comprising:
a touch panel;
a display circuit that displays an operator on the touch panel;
an acceleration sensor that detects acceleration when a touch operation is performed on the operator on the touch panel;
a storage unit that acquires acceleration data indicating the acceleration detected by the acceleration sensor at predetermined time intervals and stores the n most recent pieces of the acceleration data;
a processing circuit that, after a user's touch operation on the operator displayed on the touch panel is detected, stores k (k < n) pieces of the acceleration data in the storage unit and selects, from the n pieces of acceleration data including the k pieces, at least one piece of acceleration data that matches a predetermined condition; and
a sound source/effect circuit that determines a signal characteristic of an acoustic signal to be generated based on the acceleration data selected by the processing circuit. - The electronic acoustic signal generating device according to claim 1, wherein the values of n and k can be changed to arbitrary values as long as the condition k < n is satisfied.
- The electronic acoustic signal generating device according to claim 1 or 2, wherein the sound source/effect circuit converts the selected acceleration data into a velocity value based on a predetermined conversion characteristic, and determines the signal characteristic of the acoustic signal according to the converted velocity value.
- An electronic acoustic signal generating method comprising:
displaying an operator on a touch panel;
detecting, with an acceleration sensor, acceleration when a touch operation is performed on the operator on the touch panel;
acquiring acceleration data indicating the acceleration detected by the acceleration sensor at predetermined time intervals and storing the n most recent pieces of the acceleration data in a storage unit;
detecting a user's touch operation on the operator displayed on the touch panel;
storing k (k < n) pieces of the acceleration data in the storage unit after the touch operation is detected;
selecting, with a processing circuit, at least one piece of acceleration data that matches a predetermined condition from the n pieces of acceleration data, including the k pieces, stored in the storage unit; and
determining a signal characteristic of an acoustic signal to be generated based on the acceleration data selected by the processing circuit. - The electronic acoustic signal generating method according to claim 4, wherein the values of n and k can be changed to arbitrary values as long as the condition k < n is satisfied.
- The electronic acoustic signal generating method according to claim 4 or 5, wherein determining the signal characteristic of the acoustic signal comprises converting the selected acceleration data into a velocity value based on a predetermined conversion characteristic, and determining the signal characteristic of the acoustic signal according to the converted velocity value.
- An electronic acoustic signal generating program that causes a computer provided with a touch panel and an acceleration sensor to execute:
displaying an operator on the touch panel;
detecting, with the acceleration sensor, acceleration when a touch operation is performed on the operator on the touch panel;
acquiring acceleration data indicating the acceleration detected by the acceleration sensor at predetermined time intervals and storing the n most recent pieces of the acceleration data in a storage unit;
detecting a user's touch operation on the operator displayed on the touch panel;
storing k (k < n) pieces of the acceleration data in the storage unit after the touch operation is detected;
selecting, with a processing circuit, at least one piece of acceleration data that matches a predetermined condition from the n pieces of acceleration data, including the k pieces, stored in the storage unit; and
determining a signal characteristic of an acoustic signal to be generated based on the acceleration data selected by the processing circuit. - The electronic acoustic signal generating program according to claim 7, wherein the values of n and k can be changed to arbitrary values as long as the condition k < n is satisfied.
- The electronic acoustic signal generating program according to claim 7 or 8, wherein determining the signal characteristic of the acoustic signal comprises converting the selected acceleration data into a velocity value based on a predetermined conversion characteristic, and determining the signal characteristic of the acoustic signal according to the converted velocity value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280004878.1A CN103348407B (zh) | 2011-10-24 | 2012-10-24 | Electronic acoustic signal generating device and electronic acoustic signal generating method |
EP12842697.0A EP2772903B1 (en) | 2011-10-24 | 2012-10-24 | Electroacoustic signal emitter device and electroacoustic signal emitter method |
KR1020137010455A KR101461448B1 (ko) | 2011-10-24 | 2012-10-24 | Electronic acoustic signal generating device, electronic acoustic signal generating method, and computer-readable recording medium storing an electronic acoustic signal generating program |
US13/883,044 US8978672B2 (en) | 2011-10-24 | 2012-10-24 | Electronic acoustic signal generating device and electronic acoustic signal generating method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011232614A JP5565399B2 (ja) | 2011-10-24 | Electronic acoustic signal generating device and program for realizing its control method |
JP2011-232614 | 2011-10-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013061998A1 true WO2013061998A1 (ja) | 2013-05-02 |
Family
ID=48167825
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/077461 WO2013061998A1 (ja) | 2011-10-24 | 2012-10-24 | Electronic acoustic signal generating device and electronic acoustic signal generating method |
Country Status (6)
Country | Link |
---|---|
US (1) | US8978672B2 (ja) |
EP (1) | EP2772903B1 (ja) |
JP (1) | JP5565399B2 (ja) |
KR (1) | KR101461448B1 (ja) |
CN (1) | CN103348407B (ja) |
WO (1) | WO2013061998A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
US11029785B2 (en) | 2014-09-24 | 2021-06-08 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
EP3028125B1 (en) * | 2013-08-02 | 2023-04-05 | Qeexo, Co. | Capture of vibro-acoustic data used to determine touch types |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104516670B (zh) * | 2013-12-25 | 2018-08-28 | 柳州金之源商务服务有限公司 | Musical instrument key with force sensitivity |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
US9829577B2 (en) | 2014-12-19 | 2017-11-28 | The Regents Of The University Of Michigan | Active indoor location sensing for mobile devices |
JP6569479B2 (ja) * | 2015-11-02 | 2019-09-04 | Yamaha Corporation | Music apparatus and program |
US20170206055A1 (en) * | 2016-01-19 | 2017-07-20 | Apple Inc. | Realtime audio effects control |
US11158294B2 (en) | 2018-09-07 | 2021-10-26 | Keith Groover | Electronic musical instrument |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11338470A (ja) * | 1998-05-28 | 1999-12-10 | Canon Inc | Electronic musical instrument and electronic musical instrument control method |
US20090322498A1 (en) * | 2008-06-25 | 2009-12-31 | Lg Electronics Inc. | Haptic effect provisioning for a mobile communication terminal |
US20100156822A1 (en) * | 2008-12-24 | 2010-06-24 | Samsung Electro-Mechanics Co., Ltd. | Touch-sensitive interface device |
JP2010204401A (ja) * | 2009-03-04 | 2010-09-16 | Casio Computer Co Ltd | Electronic musical instrument |
JP2010271440A (ja) * | 2009-05-20 | 2010-12-02 | Yamaha Corp | Performance control device and program |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5634046A (en) * | 1994-09-30 | 1997-05-27 | Microsoft Corporation | General purpose use of a stack pointer register |
CN2239634Y (zh) * | 1995-11-24 | 1996-11-06 | 南京远志资讯科技开发有限公司 | Computer audio-video controller |
JP4165248B2 (ja) * | 2003-02-19 | 2008-10-15 | Yamaha Corporation | Acoustic signal processing device and parameter display control program |
US7728316B2 (en) * | 2005-09-30 | 2010-06-01 | Apple Inc. | Integrated proximity sensor and light sensor |
US9335869B2 (en) * | 2007-10-01 | 2016-05-10 | Igt | Method and apparatus for detecting lift off on a touchscreen |
KR20090093766A (ko) * | 2008-02-28 | 2009-09-02 | 황재엽 | Apparatus and method for displaying the fingerboard of a mobile virtual guitar |
JP2010020608A (ja) * | 2008-07-11 | 2010-01-28 | Olympus Imaging Corp | Electronic device, camera, object selection method, and object selection program |
KR101554221B1 (ko) * | 2009-05-11 | 2015-09-21 | Samsung Electronics Co., Ltd. | Method and apparatus for playing a musical instrument using a portable terminal |
US20130120282A1 (en) * | 2010-05-28 | 2013-05-16 | Tim Kukulski | System and Method for Evaluating Gesture Usability |
- 2011
- 2011-10-24 JP JP2011232614A patent/JP5565399B2/ja active Active
- 2012
- 2012-10-24 US US13/883,044 patent/US8978672B2/en active Active
- 2012-10-24 WO PCT/JP2012/077461 patent/WO2013061998A1/ja active Application Filing
- 2012-10-24 EP EP12842697.0A patent/EP2772903B1/en not_active Not-in-force
- 2012-10-24 CN CN201280004878.1A patent/CN103348407B/zh active Active
- 2012-10-24 KR KR1020137010455A patent/KR101461448B1/ko not_active IP Right Cessation
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11338470A (ja) * | 1998-05-28 | 1999-12-10 | Canon Inc | Electronic musical instrument and electronic musical instrument control method |
US20090322498A1 (en) * | 2008-06-25 | 2009-12-31 | Lg Electronics Inc. | Haptic effect provisioning for a mobile communication terminal |
US20100156822A1 (en) * | 2008-12-24 | 2010-06-24 | Samsung Electro-Mechanics Co., Ltd. | Touch-sensitive interface device |
JP2010204401A (ja) * | 2009-03-04 | 2010-09-16 | Casio Computer Co Ltd | Electronic musical instrument |
JP2010271440A (ja) * | 2009-05-20 | 2010-12-02 | Yamaha Corp | Performance control device and program |
Non-Patent Citations (1)
Title |
---|
"Zen Piano - Use the Force,", GREATAPPS LTD,, 22 May 2009 (2009-05-22), Retrieved from the Internet <URL:http://itunes.apple.com/app/zen-piano-use-force/id315585257?rnt=8#> |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
EP3028125B1 (en) * | 2013-08-02 | 2023-04-05 | Qeexo, Co. | Capture of vibro-acoustic data used to determine touch types |
US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US11029785B2 (en) | 2014-09-24 | 2021-06-08 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11543922B2 (en) | 2019-06-28 | 2023-01-03 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
Also Published As
Publication number | Publication date |
---|---|
EP2772903A1 (en) | 2014-09-03 |
JP5565399B2 (ja) | 2014-08-06 |
CN103348407A (zh) | 2013-10-09 |
US8978672B2 (en) | 2015-03-17 |
KR101461448B1 (ko) | 2014-11-13 |
CN103348407B (zh) | 2015-05-20 |
JP2013092545A (ja) | 2013-05-16 |
EP2772903B1 (en) | 2017-09-13 |
KR20130080847A (ko) | 2013-07-15 |
US20130215070A1 (en) | 2013-08-22 |
EP2772903A4 (en) | 2015-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013061998A1 (ja) | Electronic acoustic signal generating device and electronic acoustic signal generating method | |
TWI479476B (zh) | System and method for electronic processing of cymbal vibration | |
US9779710B2 (en) | Electronic apparatus and control method thereof | |
WO2012037009A2 (en) | Graphical user interface for music sequence programing | |
WO2021033593A1 (ja) | Signal processing device and method, and program | |
US9368095B2 (en) | Method for outputting sound and apparatus for the same | |
JP5194985B2 (ja) | Control device | |
JP2007322683A (ja) | Musical tone control device and program | |
JP6835247B2 (ja) | Data generation device and program | |
JP6222964B2 (ja) | Electronic device, electronic system, acoustic device, control method of electronic device, and program | |
US10805475B2 (en) | Resonance sound signal generation device, resonance sound signal generation method, non-transitory computer readable medium storing resonance sound signal generation program and electronic musical apparatus | |
JP6210057B2 (ja) | Control device for electronic musical instrument | |
JP5821170B2 (ja) | Electronic music apparatus and program | |
JP2008008946A (ja) | Musical tone control device and program | |
JP6213455B2 (ja) | Control device for electronic musical instrument | |
KR20120094797A (ko) | Computer | |
JP6357772B2 (ja) | Electronic musical instrument, program, and sounding pitch selection method | |
JP2007155806A (ja) | Control device for electronic musical instrument and computer program | |
JP5481843B2 (ja) | Musical tone generating device and musical tone generating program | |
JP2007240929A (ja) | Key range division position determining device | |
JP2018097157A (ja) | Electronic percussion instrument, tempo setting method, and tempo setting program | |
CN114730556A (zh) | Information processing system, keyboard instrument, information processing method, and program | |
JP2017129606A (ja) | Signal generation device | |
JP2007256413A (ja) | Musical tone control device | |
JP2016038538A (ja) | Reproduction control device, control method of reproduction control device, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20137010455 Country of ref document: KR Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2012842697 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13883044 Country of ref document: US Ref document number: 2012842697 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12842697 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |