WO2021090592A1 - Input system, input method, and program - Google Patents

Input system, input method, and program

Info

Publication number
WO2021090592A1
WO2021090592A1 (PCT/JP2020/036076)
Authority
WO
WIPO (PCT)
Prior art keywords
input
input device
area
orientation
acquired
Prior art date
Application number
PCT/JP2020/036076
Other languages
English (en)
Japanese (ja)
Inventor
哲法 中山
Original Assignee
株式会社ソニー・インタラクティブエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ソニー・インタラクティブエンタテインメント (Sony Interactive Entertainment Inc.)
Priority to JP2021554835A (JP7340031B2)
Priority to CN202080075076.4A (CN114600069B)
Publication of WO2021090592A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H18/00Highways or trackways for toys; Propulsion by special interaction between vehicle and track
    • A63H18/02Construction or arrangement of the trackway
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H5/00Musical or noise-producing devices for additional toy effects other than acoustical
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/04Sound-producing devices

Definitions

  • the present invention relates to an input system, an input method and a program.
  • the pattern in which the position information is encoded is printed on the sheet
  • the pattern taken by the camera provided on the tip of the pen or the like is decoded to acquire the position information.
  • the position information indicates the coordinates in the sheet
  • the acquired position information indicates the position on the sheet pointed by the tip of the pen or the like.
  • the input system determines which of the plurality of predetermined areas the acquired position information is included in, and inputs information according to the determined area. The entered information will be used for subsequent processing.
  • Patent Document 1 discloses that a paper on which a game selection area is printed is read by a camera built in an input device, and a game is started according to the reading result.
  • the amount of information that can be input by one operation is small.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a technique for improving an input interface for a sheet on which a pattern in which position information is encoded is printed.
  • the input system according to the present invention includes a sheet on which a pattern encoding positions is printed, an input device including a camera that photographs the pattern, acquisition means for acquiring the position and orientation of the input device based on the pattern in an image captured by the camera, and execution means for executing processing based on the position and orientation acquired by the acquisition means.
  • the input method according to the present invention includes a step of acquiring the position and orientation of an input device recognized from an image, taken by the input device, of a sheet on which a pattern encoding positions is printed, and a step of executing processing based on the acquired position and orientation.
  • the program according to the present invention causes a computer to function as acquisition means for acquiring the position and orientation of an input device recognized from an image, taken by the input device, of a sheet on which a pattern encoding positions is printed, and as execution means for executing processing based on the acquired position and orientation.
  • the input interface for a sheet on which a pattern in which position information is encoded is printed is improved.
  • the executing means may output sound and determine the parameters of the output sound according to the user's operation based on the acquired position and orientation.
  • the sheet comprises a setting area
  • when the position of the input device is included in the setting area, the executing means may determine the type, pitch, and volume of the output sound according to the user's operation based on the acquired orientation.
  • when the position of the input device is included in the setting area, the executing means may determine any one of the type, pitch, or volume of the output sound according to the user's operation based on the acquired orientation, and output the determined sound.
  • the sheet further includes a plurality of playing areas
  • when the position of the input device is included in any of the plurality of playing areas, the executing means outputs a sound having a pitch corresponding to the playing area including the position, and when the position of the input device is included in the setting area, the executing means may change the type, pitch, or volume of the output sound, or the scale, based on the acquired orientation.
  • the executing means may determine the parameters of the output sound according to the amount of change in the acquired orientation while the acquired position is within a predetermined area.
  • the executing means may determine the parameters of the sound to be output by the input device based on the orientation of the input device with respect to the sheet, when the acquired position is within a predetermined range.
  • the executing means may select information based on the acquired position and orientation, and execute processing based on the selected information.
  • FIG. 1 is a diagram showing an example of the operation system according to the first embodiment. FIG. 2 is a diagram showing the hardware configuration of the operation system. FIG. 3 is a diagram showing the bottom surface of the input device. FIG. 4 is a diagram showing an example of a sheet and input devices. FIG. 5 is a diagram showing an example of a sheet. FIG. 6 is a diagram schematically showing the pattern on a sheet. FIG. 7 is a block diagram showing the functions realized by the operation system. FIG. 8 is a flow chart showing an outline of the processing of the operation system. FIG. 9 is a diagram showing another example of a sheet. FIG. 10 is a diagram showing another example of a sheet. FIG. 11 is a flow chart showing an example of the processing for the setting area.
  • the user holds a self-propelled input device and inputs information by pointing to an area.
  • the process is executed according to the input information.
  • FIG. 1 is a diagram showing an example of an operation system according to the first embodiment.
  • the operation system according to the present invention includes a device control device 10, position input devices 20a and 20b, a controller 17, and a cartridge 18.
  • the position input devices 20a and 20b are self-propelled devices having a camera 24, and both have the same function. In the following, these position input devices 20a and 20b will be referred to as position input devices 20 unless otherwise specified.
  • the device control device 10 controls the position input device 20 wirelessly.
  • the device control device 10 has a recess 32, and when the position input device 20 is fitted in the recess 32, the device control device 10 charges the position input device 20.
  • the controller 17 is an input device that acquires an operation by the user, and is connected to the device control device 10 by a cable.
  • the cartridge 18 has a built-in non-volatile memory.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the operation system according to the embodiment of the present invention.
  • the device control device 10 includes a processor 11, a storage unit 12, a communication unit 13, and an input / output unit 14.
  • the position input device 20 includes a processor 21, a storage unit 22, a communication unit 23, a camera 24, and two motors 25.
  • the device control device 10 may be a dedicated device optimized for controlling the position input device 20, or may be a general-purpose computer.
  • the processor 11 operates according to the program stored in the storage unit 12, and controls the communication unit 13, the input / output unit 14, and the like.
  • the processor 21 operates according to a program stored in the storage unit 22, and controls the communication unit 23, the camera 24, the motor 25, and the like.
  • the program is stored and provided in a computer-readable storage medium such as a flash memory in the cartridge 18, but may be provided via a network such as the Internet.
  • the storage unit 12 is composed of a DRAM and a non-volatile memory built in the device control device 10, a non-volatile memory in the cartridge 18, and the like.
  • the storage unit 22 is composed of a DRAM, a non-volatile memory, and the like.
  • the storage units 12 and 22 store the above program. Further, the storage units 12 and 22 store information and calculation results input from the processors 11, 21 and the communication units 13, 23 and the like.
  • Communication units 13 and 23 are composed of integrated circuits, antennas, etc. for communicating with other devices.
  • the communication units 13 and 23 have a function of communicating with each other according to, for example, a Bluetooth (registered trademark) protocol.
  • based on the control of the processors 11 and 21, the communication units 13 and 23 input information received from other devices to the processors 11 and 21 and the storage units 12 and 22, and transmit information to other devices.
  • the communication unit 13 may have a function of communicating with another device via a network such as a LAN.
  • the input / output unit 14 includes a circuit for acquiring information from an input device such as a controller 17 and a circuit for controlling an output device such as an audio output device or an image display device.
  • the input / output unit 14 acquires an input signal from the input device, and inputs the converted information of the input signal to the processor 11 and the storage unit 12. Further, the input / output unit 14 causes the speaker to output the sound and outputs the image to the display device based on the control of the processor 11 or the like.
  • the motor 25 is a so-called servomotor in which the rotation direction, the amount of rotation, and the rotation speed are controlled by the processor 21.
  • the camera 24 is arranged so as to photograph the area below the position input device 20, and photographs the pattern 71 (see FIG. 6) printed on the sheet 31 (see FIGS. 4 and 5) on which the position input device 20 is placed.
  • the pattern 71 is printed on the sheet 31 so as to be recognizable in the infrared frequency domain, and the camera 24 captures infrared images.
  • FIG. 3 is a diagram showing an example of the position input device 20.
  • FIG. 3 is a view of the position input device 20 as viewed from below.
  • the position input device 20 further includes a power switch 250, a switch 222, and two wheels 254.
  • One motor 25 is assigned to each of the two wheels 254, and the motor 25 drives the assigned wheel 254.
  • FIG. 4 is a diagram showing an example of the sheet 31 and the position input devices 20a and 20b.
  • a cover 75 is attached to the position input device 20.
  • the cover 75 has a shape that does not interfere with the camera 24's photographing of the sheet 31 and that lets the user easily recognize the orientation of the position input device 20.
  • the cover 75 is placed over the position input device 20 and covers everything except its lower surface.
  • FIG. 18 is a view showing an example of the cover 75, and is a view showing the front surface and the side surface of the cover 75.
  • the cover 75 shown in FIG. 18 is covered from above the position input device 20a.
  • the cover 75 makes it easier for the user to recognize the front of the position input device 20.
  • the user holds the position input device 20a with his right hand and holds the position input device 20b with his left hand. Then, when the user places the position input device 20 in the area on the sheet 31, information corresponding to the area is input, and the sound corresponding to the input information is output from the device control device 10.
  • FIG. 5 is a diagram showing an example of the sheet 31.
  • An image that can be visually recognized by the user is printed on the sheet 31.
  • a plurality of setting areas 41a to 41i, a plurality of keyboard areas 43, a plurality of rhythm areas 44, and a plurality of chord areas 45 are provided on the sheet 31a shown in FIG. 5, and the user can visually recognize these areas.
  • a pattern 71 that can be read by the camera 24 is further printed on the sheet 31.
  • FIG. 6 is a diagram schematically showing the pattern 71 on the sheet 31.
  • Patterns 71 of a predetermined size (for example, 0.2 mm square) are arranged over the sheet 31.
  • Each of the patterns 71 is an image in which the coordinates of the position where the pattern 71 is arranged are encoded.
  • the sheet 31 is assigned an area corresponding to the size of the sheet 31 in the coordinate space that can be represented by the encoded coordinates.
  • the camera 24 of the position input device 20 photographs the pattern 71 printed on the sheet 31 or the like, and the position input device 20 or the device control device 10 decodes the pattern 71 to acquire the coordinates.
  • further, the position input device 20 or the device control device 10 detects the orientation of the position input device 20 (for example, the angle A from the reference direction) by detecting the inclination of the pattern 71 in the image captured by the camera 24, as sketched below.
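  • the following is a minimal sketch of this acquisition step, assuming hypothetical helpers decode_coordinates and pattern_tilt_deg for the pattern decoding, which the specification does not detail:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # sheet coordinate decoded from pattern 71
    y: float
    angle_a: float  # orientation, in degrees from the sheet's reference direction

def acquire_pose(frame) -> Pose | None:
    coords = decode_coordinates(frame)  # hypothetical: pattern 71 -> (x, y)
    if coords is None:                  # no pattern visible: device is not on a sheet
        return None
    tilt = pattern_tilt_deg(frame)      # hypothetical: tilt of the pattern in the image
    # The pattern is printed axis-aligned on the sheet, so its apparent tilt in
    # the camera image equals the device's rotation relative to the sheet.
    return Pose(coords[0], coords[1], tilt % 360.0)
```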
  • the plurality of keyboard areas 43 are arranged side by side in the horizontal direction.
  • the keyboard area 43 is an area for instructing the output of the performance sound.
  • when the position input device 20 is placed on one of the keyboard areas 43, a sound is output.
  • the further to the right the keyboard area 43 on which the position input device 20 is placed, the higher the output sound.
  • the plurality of rhythm regions 44 are arranged above the keyboard region 43 in FIG.
  • the rhythm area 44 is an area for instructing the output of the performance sound.
  • when the position input device 20 is placed on a rhythm area 44, the sound corresponding to that rhythm area 44 is output from the device control device 10.
  • the plurality of chord regions 45 are arranged on the right side of the rhythm region 44 in FIG.
  • the chord area 45 is an area for instructing the output of the performance sound.
  • the plurality of setting areas 41a to 41i are areas for acquiring instructions for setting sound parameters output when the position input device 20 is arranged in the keyboard area 43 or the rhythm area 44. In the following, when distinction is unnecessary, these are referred to as setting areas 41.
  • the setting area 41 corresponds to the parameter of the output sound.
  • the setting area 41a is an area for designating the parameters of the volume of the output sound
  • the setting area 41b is an area for adjusting the parameters of the pitch of the sound output when the position input device 20 is arranged in the keyboard area 43.
  • the setting area 41d is an area for setting the scale (for example, major, minor, pentatonic, or Ryukyu scale) applied to the sounds associated with the plurality of keyboard areas 43.
  • the setting areas 41h, 41i, and 41g are areas for setting parameters of the type of sound output when the position input device 20 is arranged in the keyboard area 43, the rhythm area 44, and the chord area 45, respectively.
  • FIG. 7 is a block diagram showing the functions realized by the operation system.
  • the operation system functionally includes a position acquisition unit 51, an information input unit 52, an application execution unit 58, and a voice output unit 59.
  • the information input unit 52 functionally includes an area determination unit 53 and an input information determination unit 55.
  • These functions are realized mainly by the processor 11 included in the device control device 10 executing a program stored in the storage unit 12 and controlling the position input device 20 via the communication unit 13. Some functions, such as part of the position acquisition unit 51, may instead be realized by the processor 21 included in the position input device 20 executing a program stored in the storage unit 22, exchanging data with the device control device 10 via the communication unit 23, and controlling the camera 24 and the motors 25.
  • the position acquisition unit 51 recognizes the coordinate-encoded pattern 71 in the image captured by the camera 24, and acquires the position coordinates and the orientation of the position input device 20 from the coordinates indicated by the pattern 71.
  • the information input unit 52 acquires input information input by the user based on the position and orientation acquired by the position acquisition unit 51.
  • the area determination unit 53 included in the information input unit 52 determines in which area the position of the position input device 20 is included.
  • the input information determination unit 55 included in the information input unit 52 determines the input information based on the area including the position of the position input device 20. Further, when the area including the position of the position input device 20 is a predetermined area, the input information determination unit 55 determines the input information based on both that area and the orientation of the position input device 20.
  • in addition, the input information determination unit 55 determines the input information based on whether the position of one position input device 20 is included in a certain area and whether the position of the other position input device 20 is included in another area.
  • the application execution unit 58 executes the process based on the acquired input information.
  • the application execution unit 58 sets sound parameters based on the acquired input information.
  • the audio output unit 59 included in the application execution unit 58 outputs a sound according to the set parameters.
  • FIG. 8 is a flow chart showing an outline of processing of the operation system.
  • the position input device 20a will be referred to as a first device
  • the position input device 20b will be referred to as a second device.
  • the process shown in FIG. 8 is repeatedly executed.
  • the position acquisition unit 51 determines whether the first device has photographed the surface of the sheet 31 (step S101). This determination may be made based on whether an image element peculiar to the pattern 71 is present in the image captured by the camera 24 of the first device.
  • when the first device has photographed the surface of the sheet 31, the position acquisition unit 51 acquires the first position, which is the position of the first device on the sheet 31, and the orientation of the first device (step S102).
  • the area determination unit 53 detects the area including the acquired first position (step S103).
  • in step S102, the position acquisition unit 51 acquires the orientation of the first device based on the inclination, in the image, of the pattern 71 photographed by the camera 24 of the first device.
  • in step S103, the area determination unit 53 selects the area including the first position based on information relating each of the plurality of areas to a coordinate range on the sheet 31 and on the coordinates of the first position.
  • the processing corresponding to steps S101 to S103 is also performed on the second device.
  • the position acquisition unit 51 acquires the second position, which is the position of the second device on the sheet 31, and the orientation of the second device.
  • the area determination unit 53 detects the area including the acquired second position (step S106).
  • the areas determined by the area determination unit 53 need not be limited to the setting areas 41a to 41i, the keyboard areas 43, the rhythm areas 44, and the chord areas 45 on a single sheet 31.
  • the area determination unit 53 may also determine whether the first position or the second position is included in an area on another sheet 31. Patterns 71 encoding coordinates in mutually different ranges are printed on different sheets 31. As a result, even if the areas are spread over a plurality of sheets 31, an area is determined simply from the coordinates of the first position or the second position, as sketched below.
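  • a minimal sketch of this lookup (steps S103 and S106), assuming each area is registered as an axis-aligned range in the global coordinate space shared by all sheets; the area names and ranges are illustrative:

```python
# Because every sheet is assigned a distinct range of the encoded coordinate
# space, a single table can cover areas on any number of sheets.
AREAS = {
    "setting_41a":   (0, 0, 40, 40),      # (x_min, y_min, x_max, y_max), made-up values
    "rhythm_44_0":   (0, 120, 40, 160),
    "keyboard_43_0": (0, 200, 40, 240),
}

def area_containing(x: float, y: float) -> str | None:
    for name, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # the position lies outside every registered area
```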
  • FIG. 9 is a diagram showing another example of the sheet 31.
  • a plurality of character areas 46, a plurality of switching areas 47, and a free area 42 are provided on the sheet 31b shown in FIG. 9, and the user can visually recognize these areas.
  • the user places the position input device 20 on the sheet 31b shown in FIG. 9 and points to a character area 46 or a switching area 47, whereupon the operation system outputs the voice of the character corresponding to the pointed area.
  • the plurality of character areas 46 are areas for selecting characters.
  • the number of character areas 46 is smaller than the number of character types that can be input.
  • the character area 46 is an area in which characters mainly indicating seion (basic, unvoiced syllables) in Japanese are displayed.
  • the character areas 46 do not themselves indicate voiced sounds (dakuon), semi-voiced sounds (handakuon), or contracted sounds (yoon); rather, combining a switching area 47, described later, with a character area 46 makes it possible to input a voiced sound, a semi-voiced sound, a yoon, or a combination thereof.
  • the plurality of switching areas 47 are areas for inputting more types of characters than the number of character areas 46.
  • Each of the plurality of character areas 46 is associated with one of the candidate values included in a given candidate set, and is also associated with one of the candidate values included in each of the other candidate sets.
  • the candidate value corresponding to the indicated character area 46 is selected as input information.
  • specifically, the plurality of switching areas 47 include an area indicating voiced sounds, an area indicating semi-voiced sounds, an area indicating yoon (contracted sounds), an area indicating sokuon (geminate consonants), an area indicating combinations of voiced sounds and yoon, and an area indicating combinations of semi-voiced sounds and yoon. The processing related to these areas will be described later.
  • the free area 42 included in the sheet 31b is an area for inputting a set value according to the planar position of the position input device 20 when the position input device 20 is arranged on the sheet 31b.
  • although the x-axis Lx and the y-axis Ly are drawn with broken lines in the free area 42 of FIG. 9, they need not be printed.
  • the settings of a plurality of items may be made based on the movement of one input device 20 in the x direction and in the y direction. For example, the pitch of the output sound may be determined according to the y coordinate of the position of the input device 20, and the mixing volume for the L/R speakers may be determined according to the x coordinate.
  • here the orientation of the position input device 20 is not used; the free area 42 allows a single position input device 20 to provide input corresponding to two sliders, one for movement in the x-axis direction and one for the y-axis direction, as sketched below.
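  • a minimal sketch of this two-sliders-in-one mapping; the coordinate bounds and parameter ranges are assumptions, not values from the specification:

```python
def free_area_params(x: float, y: float, x_max: float, y_max: float) -> tuple[float, float]:
    """Map one device position in the free area 42 to two sound parameters."""
    pan = x / x_max                  # x coordinate -> L/R mixing balance, 0..1
    pitch = 48 + (y / y_max) * 36.0  # y coordinate -> pitch over an assumed 3-octave span
    return pan, pitch
```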
  • FIG. 10 is a diagram showing another example of the sheet 31.
  • a faucet region 48 and a glass region 49 are provided on the sheet 31c shown in FIG. 10, and the user can visually recognize these regions.
  • the faucet area 48 is an area for acquiring, from the amount of rotation of the position input device 20 placed on it, an instruction as to whether to simulate pouring water into the glass.
  • the glass area 49 is an area for outputting, in a simulated manner, the sound of a glass containing water. The processing related to these areas will be described later.
  • when the first position is acquired, the information input unit 52 acquires input information by processing according to the type of the area including the first position, and the voice output unit 59 outputs sound (step S107).
  • the information input unit 52 acquires input information by processing according to the type of the region including the second position, and the voice output unit 59 outputs sound (step S108).
  • the details of the process of step S107, that is, the process according to the type of the area, will be described below.
  • the process of step S108 is obtained from step S107 simply by exchanging the first position and orientation of the first device with the second position and orientation of the second device, so detailed description thereof is omitted.
  • FIG. 11 is a flow chart showing an example of processing for the setting area 41.
  • FIG. 11 shows a process of acquiring input information and outputting a sound when the first position is included in the setting area 41 in step S107.
  • the input information determination unit 55 acquires correspondence information indicating the correspondence between the direction of the position input device 20 and the input value according to the setting area 41 including the first position (step S301).
  • Correspondence information may include a plurality of items, and each item may associate a range of directions with an input value.
  • the input value is a set value of the parameter corresponding to the setting area 41.
  • Correspondence information is prepared in advance for each of the setting areas 41a to 41i.
  • the input information determination unit 55 acquires an input value based on the acquired correspondence information and the direction of the first device (step S302).
  • the input information determination unit 55 acquires, for example, the input value associated with the item whose range includes the orientation of the first device, among the items included in the correspondence information.
  • the application execution unit 58 sets the acquired input value in the parameter corresponding to the setting area 41 to which the first position belongs (step S303).
  • the volume is set according to the direction.
  • the pitch of the sound (the pitch of the reference sound) output when any one of the keyboard areas 43 is pointed to changes according to the direction thereof.
  • when the first position is included in the setting area 41b, the pitches of the keyboard areas 43 are set according to the orientation while the relative pitch differences between the keyboard areas 43 are maintained.
  • when the first position is included in the setting area 41d, the application execution unit 58 may set the scale (for example, major, minor, pentatonic, or Ryukyu scale) of the sounds associated with the plurality of keyboard areas 43 according to the orientation in which the first device is placed.
  • when the first position is included in the setting area 41h, 41i, or 41g, the application execution unit 58 may set, according to the orientation in which the first device is placed, the type of sound output when the position input device 20 is placed in the keyboard area 43, the rhythm area 44, or the chord area 45, respectively.
  • the area required for inputting information can be reduced. Further, since the rotation of the position input device 20 is associated with the operation of the knob, the user can input the value more intuitively.
  • when an input value is set in a parameter, the audio output unit 59 outputs a confirmation sound according to the parameter (step S304). By outputting the confirmation sound, the user can easily recognize the current setting, which makes setting easier. The orientation-to-value lookup of steps S301 to S303 is sketched below.
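  • a minimal sketch of steps S301 to S303, assuming the correspondence information is a list of items each pairing an orientation range with an input value; the ranges and values are illustrative:

```python
# Correspondence information for setting area 41a: orientation range -> volume.
VOLUME_ITEMS = [
    ((0.0, 120.0), "low"),
    ((120.0, 240.0), "medium"),
    ((240.0, 360.0), "high"),
]

def input_value_for(angle_a: float, items=VOLUME_ITEMS) -> str:
    angle_a %= 360.0  # normalize the orientation to [0, 360)
    for (lo, hi), value in items:
        if lo <= angle_a < hi:
            return value
    return items[-1][1]  # unreachable after normalization, kept as a safe default
```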
  • FIG. 12 is a flow chart showing an example of processing for the keyboard area 43.
  • FIG. 12 shows a process of acquiring input information and outputting a sound when the first position is included in the keyboard area 43 in step S107.
  • the input information determination unit 55 acquires information identifying which keyboard area 43 includes the first position (step S401).
  • next, the input information determination unit 55 acquires, based on the orientation of the first device, whether the pitch is to be changed (step S402). More specifically, the input information determination unit 55 acquires, as input information, an instruction to raise the pitch by a semitone when the orientation of the first device is tilted to the right by a predetermined angle or more with respect to the reference direction of the sheet. Further, the input information determination unit 55 may acquire, as input information, an instruction to lower the pitch by a semitone when the orientation of the first device is tilted to the left by a predetermined angle or more with respect to the reference direction of the sheet 31.
  • the application execution unit 58 determines the pitch of the output sound based on the scale, the pitch of the reference sound, the acquired identification information, and whether the pitch is to be changed (step S403). More specifically, the application execution unit 58 obtains a relative pitch based on the scale and the identification information of the keyboard area 43, obtains the absolute pitch of the output sound based on the obtained relative pitch and the reference pitch, and, if a pitch change was indicated, shifts the pitch by a semitone.
  • the audio output unit 59 included in the application execution unit 58 outputs a sound having the determined pitch, with the sound type and volume set as parameters (step S404).
  • the pitch of the output sound can be raised or lowered by a semitone using the direction of the position input device 20 when pointing to the keyboard area 43.
  • in addition, the application execution unit 58 may determine the length of the output sound (the type of note corresponding to the output sound) according to the orientation of the position input device 20. For example, it is difficult for a child user to accurately control the time for which the position input device 20 is placed on the keyboard area 43, so using the orientation makes it easier to indicate the length of the sound. The pitch computation of steps S402 and S403 is sketched below.
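  • a minimal sketch of steps S402 and S403, under assumed conventions: pitches as MIDI note numbers, a major-scale offset table, and a 15-degree tilt threshold (none of these constants come from the specification):

```python
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11, 12]  # relative pitches for eight keyboard areas 43

def output_note(key_index: int, reference_note: int, tilt_deg: float) -> int:
    """Absolute pitch = reference pitch + relative pitch, then an optional semitone shift."""
    note = reference_note + MAJOR_SCALE[key_index]
    if tilt_deg >= 15.0:     # tilted right: raise the pitch by a semitone
        note += 1
    elif tilt_deg <= -15.0:  # tilted left: lower the pitch by a semitone
        note -= 1
    return note
```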
  • FIG. 13 is a flow chart showing an example of processing for the rhythm region 44.
  • FIG. 13 shows a process of acquiring input information and outputting a sound when the first position is included in the rhythm region 44 in step S107.
  • first, the input information determination unit 55 acquires information (a rhythm set) indicating the correspondence between each of the plurality of rhythm areas 44 and a type of rhythm sound, based on the parameter previously set via the setting area 41i (step S501).
  • there are a plurality of rhythm sets, and each of the plurality of rhythm areas 44 is associated with one of the rhythm sound types included in a given rhythm set.
  • Each of the plurality of rhythm regions 44 is also associated with any of the plurality of rhythm sound types included in the other rhythm set.
  • the types of rhythm sound are, for example, drums, percussion, Japanese drums, and animal calls, and the parameter used for selecting the rhythm set is set according to the orientation of the position input device 20 when the setting area 41i is pointed to.
  • the input information determination unit 55 acquires, as input information, the type of rhythm sound corresponding to the rhythm area 44 including the first position, based on the acquired rhythm set (step S502). Then, the voice output unit 59 outputs the acquired type of rhythm sound at the set volume (step S503). This lookup is sketched below.
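  • a minimal sketch of steps S501 and S502; the set names and sound names are illustrative stand-ins for the drum, percussion, Japanese drum, and animal-call sets mentioned above:

```python
RHYTHM_SETS = {
    "drums":   ["kick", "snare", "hi-hat", "tom"],
    "animals": ["dog", "cat", "cow", "bird"],
}

def rhythm_sound(selected_set: str, rhythm_area_index: int) -> str:
    # The same rhythm area 44 yields a different sound depending on the set
    # previously chosen via setting area 41i.
    return RHYTHM_SETS[selected_set][rhythm_area_index]
```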
  • FIG. 14 is a flow chart showing an example of processing for the character area 46.
  • FIG. 14 shows a process of acquiring input information and outputting voice when the first position is included in the character area 46 in step S107.
  • first, the input information determination unit 55 determines whether the second position is included in any of the plurality of switching areas 47 (step S601). If the second position is not included in any switching area 47 (N in step S601), the input information determination unit 55 selects the default candidate set from among the plurality of candidate sets (step S602). Each of the plurality of character areas 46 is associated with one of the candidate values included in a given candidate set, and is also associated with one of the candidate values included in each of the other candidate sets.
  • on the other hand, if the second position is included in one of the switching areas 47 (Y in step S601), the input information determination unit 55 selects, from the plurality of candidate sets, the candidate set corresponding to the switching area 47 including the second position (step S603). Once a candidate set is selected, the input information determination unit 55 acquires, as the input value, the candidate value corresponding to the character area 46 including the first position from the selected candidate set (step S604). Further, the input information determination unit 55 acquires an instruction for the type and pitch of the output voice based on the orientation of the first device (step S605).
  • for example, the input information determination unit 55 divides the 360-degree range of the angle A into two ranges, acquiring an instruction for a male voice as the voice type when the angle A of the first device is in one range, and an instruction for a female voice when the angle A is in the other range. Further, the input information determination unit 55 may acquire an instruction for the pitch of the voice based on the difference between the angle A and a reference angle within each range.
  • the voice output unit 59 outputs the voice corresponding to the acquired input value according to the type and pitch of the acquired voice (step S606).
  • in this way, the switching areas 47 can be used to switch the candidate set of input values, so that many types of information can be input even with a small number of character areas 46. Further, in the process shown in FIG. 14, the candidate set is switched by leaving the position input device 20 in a switching area 47. Since the user can always see, from the switching area 47 on which the position input device 20 is placed, which candidate set's information (for example, voiced or handakuon characters) will be input, input by pointing to the printed areas becomes intuitive. Further, by using the orientation of the position input device 20, the pitch of the voice can be easily adjusted. The candidate-set switch is sketched below.
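  • a minimal sketch of the candidate-set switch in steps S601 to S604, shown for the ha row only; the keys and the set contents are illustrative:

```python
CANDIDATE_SETS = {
    None:         ["は", "ひ", "ふ", "へ", "ほ"],  # default set: basic (unvoiced) sounds
    "voiced":     ["ば", "び", "ぶ", "べ", "ぼ"],  # dakuon switching area
    "semivoiced": ["ぱ", "ぴ", "ぷ", "ぺ", "ぽ"],  # handakuon switching area
}

def character_for(char_area_index: int, switching_area: str | None) -> str:
    # Steps S602/S603 pick the set; step S604 picks the value within the set.
    candidate_set = CANDIDATE_SETS.get(switching_area, CANDIDATE_SETS[None])
    return candidate_set[char_area_index]
```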
  • when the first position is included in the free area 42, the input information determination unit 55 acquires position correspondence information indicating the relationship between the position and the parameter item set via the free area 42, and acquires the input value based on the position correspondence information and the acquired position (x and y coordinates) of the first device. The application execution unit 58 then sets the acquired input value in that parameter item.
  • FIG. 15 is a flow chart showing an example of processing for the faucet region 48.
  • FIG. 15 shows a process of acquiring input information and outputting sound when the first position is included in the faucet area 48 in step S107. In this process, the processing is executed according to the amount of change in the orientation of the first device.
  • first, the input information determination unit 55 determines whether the previously acquired first position was included in the faucet area 48 (step S701). If the previous first position was not included in the faucet area 48 (N in step S701), the input information determination unit 55 stores the current orientation of the first device as the initial orientation (step S702).
  • otherwise, the input information determination unit 55 acquires an instruction to turn the running-water mode on or off based on the difference (the amount of orientation change) between the initial orientation and the current orientation of the first device (step S703). More specifically, the input information determination unit 55 acquires an instruction to turn the running-water mode on when the amount of orientation change is equal to or greater than a threshold value, and an instruction to turn it off when the amount of change is less than the threshold value.
  • when the instruction to turn the running-water mode on is acquired, the application execution unit 58 turns the running-water mode on, and the voice output unit 59 outputs the sound of running water (step S704). Further, the application execution unit 58 determines the pitch of the sound to be output when the glass area 49 is pointed to, according to the length of the period during which the running-water mode has been on (step S705). This behavior is sketched below.
  • the sound (output sound) output when the glass area 49 is pointed to corresponds to the sound produced when a glass containing water is struck.
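  • a minimal sketch of steps S701 to S705 as a small state machine; the 90-degree threshold and the pitch formula are assumptions:

```python
class FaucetState:
    def __init__(self) -> None:
        self.initial_angle: float | None = None  # stored in step S702
        self.running = False                     # running-water mode
        self.on_time = 0.0                       # seconds the mode has been on

    def update(self, on_faucet_area: bool, angle_a: float, dt: float) -> bool:
        if not on_faucet_area:
            self.initial_angle = None  # re-stored on the next entry (step S702)
            return self.running
        if self.initial_angle is None:
            self.initial_angle = angle_a
        change = abs(angle_a - self.initial_angle)  # orientation change (step S703)
        self.running = change >= 90.0               # assumed on/off threshold
        if self.running:
            self.on_time += dt                      # "filling" time used in step S705
        return self.running

    def glass_pitch(self) -> float:
        # The longer the water has run, the fuller the glass and the higher the
        # pitch when the glass area 49 is struck; the 220 Hz base is assumed.
        return 220.0 * (1.0 + min(self.on_time, 10.0) / 10.0)
```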
  • in other examples, a set value may be increased or decreased according to the amount of change in the orientation of the position input device 20.
  • for example, the input information determination unit 55 may determine the amount of change in the volume, or the raising or lowering of the key (for example, of the reference sound) of the keyboard area 43, according to the amount of rotation of the input device 20 on the setting area 41a or 41b, respectively.
  • the application execution unit 58 may then set the volume or the key of the keyboard area 43 based on the determined increase or decrease and the previously set volume or key. Changing a set value such as the volume according to the amount of rotation of the input device 20 enables intuitive operation; this effect is also obtained by the process shown in FIG. 15. A minimal sketch of such an incremental update follows.
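  • the rotate-to-adjust variant can be sketched as an incremental update; the gain of one full turn per full volume range is an assumption:

```python
def adjust_volume(volume: float, angle_change_deg: float) -> float:
    # Clockwise rotation raises the volume, counter-clockwise lowers it,
    # clamped to the valid range [0, 1].
    return min(1.0, max(0.0, volume + angle_change_deg / 360.0))
```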
  • FIG. 16 is a diagram showing an example of the sheet 31d according to the second embodiment.
  • a 9 × 9 grid of squares 80 is printed on the sheet 31d shown in FIG. 16. One of the four corners is the first determination area 81, and another corner is the second determination area 82.
  • the application execution unit 58 executes the processing of a game in which the position input device 20 runs on the sheet 31d.
  • the application execution unit 58 starts the game from any of a plurality of initial states stored in the storage unit 12 in advance, and changes the values of internal variables and the position of the position input device 20 according to the user's operation.
  • the initial position of the position input device 20 in each initial state is also predetermined, and the position input device 20 propels itself toward the initial position at the start of the game.
  • FIG. 17 is a diagram showing an example of a process of acquiring input information and controlling the position input device 20, and is a process corresponding to steps S107 and S108 of FIG.
  • the input information determination unit 55 acquires a restart instruction as the input value when the first position is included in the first determination area 81 (Y in step S901) and the second position is included in the second determination area 82 (Y in step S902) (step S903). When the first position is not included in the first determination area 81 (N in step S901) or the second position is not included in the second determination area 82 (N in step S902), the input information determination unit 55 acquires input values according to the areas including the first position and the second position (step S904). The restart instruction may also be acquired as the input value when the first position is included in the second determination area 82 and the second position is included in the first determination area 81.
  • when the restart instruction is acquired as the input value (Y in step S906), the application execution unit 58 initializes the variables of the game currently being executed and controls the position input devices 20 so that they move to their initial positions (step S907). On the other hand, if the restart instruction is not acquired as the input value (N in step S906), the application execution unit 58 continues the game processing (step S908). The restart check is sketched below.
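  • a minimal sketch of the check in steps S901 to S903, order-independent as in the variant described above; the area names are illustrative:

```python
def restart_requested(first_area: str | None, second_area: str | None) -> bool:
    # True only when the two devices occupy the two corner determination areas,
    # in either assignment.
    return {first_area, second_area} == {"determination_81", "determination_82"}
```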
  • in this way, information can be input by placing the two position input devices 20 at special positions, so that the area on the sheet 31 can be used more effectively.
  • in the embodiments described above, the position input device 20 is self-propelled, but it need not be. Further, the present invention may be used not only for sound output and game restart but also for general-purpose information input.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides an input system offering an improved input interface for a sheet on which a pattern encoding position information is printed. The input system comprises: a sheet (31) on which a pattern (71) encoding positions is printed; an input device (20) that includes a camera (24) for photographing said pattern; acquisition means for acquiring a position and an orientation of the input device based on the pattern contained in an image captured by the camera; and execution means for executing processing based on the position and orientation acquired by the acquisition means.
PCT/JP2020/036076 2019-11-08 2020-09-24 Input system, input method, and program WO2021090592A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021554835A JP7340031B2 (ja) 2019-11-08 2020-09-24 入力システム、入力方法およびプログラム (Input system, input method, and program)
CN202080075076.4A CN114600069B (zh) 2019-11-08 2020-09-24 输入系统、输入方法以及存储介质 (Input system, input method, and storage medium)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-203467 2019-11-08
JP2019203467 2019-11-08

Publications (1)

Publication Number Publication Date
WO2021090592A1 (fr)

Family

ID=75849898

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/036076 WO2021090592A1 (fr) Input system, input method, and program

Country Status (3)

Country Link
JP (1) JP7340031B2 (fr)
CN (1) CN114600069B (fr)
WO (1) WO2021090592A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5598788A (en) * 1978-12-28 1980-07-28 Gaber Howard S Method and device for generating acoustic output from musical toy
US20020102910A1 (en) * 2001-01-29 2002-08-01 Donahue Kevin Gerard Toy vehicle and method of controlling a toy vehicle from a printed track
JP2010240345A (ja) * 2009-04-02 2010-10-28 Koto:Kk 移動体玩具 (Moving toy)
WO2018025467A1 (fr) * 2016-08-04 2018-02-08 ソニー株式会社 Information processing device, information processing method, and storage medium
JP3215614U (ja) * 2017-12-20 2018-04-05 安譜國際股▲分▼有限公司 教育玩具 (Educational toy)
US20190164447A1 (en) * 2017-11-30 2019-05-30 Beijing Xiaomi Mobile Software Co., Ltd. Story machine, control method and control device therefor, storage medium and story machine player system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5705823B2 (ja) * 2012-12-28 2015-04-22 株式会社東芝 画像形成装置および画像形成装置における確認音発生方法
JP6900705B2 (ja) * 2017-02-28 2021-07-07 コニカミノルタ株式会社 情報処理システム、情報処理装置、および、プログラム


Also Published As

Publication number Publication date
JPWO2021090592A1 (fr) 2021-05-14
JP7340031B2 (ja) 2023-09-06
CN114600069A (zh) 2022-06-07
CN114600069B (zh) 2024-04-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20885521; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021554835; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20885521; Country of ref document: EP; Kind code of ref document: A1)