WO2021230228A1 - Method, program, and electronic device - Google Patents

Method, program, and electronic device

Info

Publication number
WO2021230228A1
Authority
WO
WIPO (PCT)
Prior art keywords
value
determined
speed
displacement
data points
Prior art date
Application number
PCT/JP2021/017826
Other languages
English (en)
Japanese (ja)
Inventor
修一 倉林 (Shuichi Kurabayashi)
Original Assignee
株式会社Cygames
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Cygames
Priority to CN202180049658.XA (published as CN115867364A)
Publication of WO2021230228A1
Priority to US18/054,005 (published as US20230117127A1)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a method, a program, and an electronic device, and more particularly to a method executed in a computer equipped with a touch panel, a program for executing each step of the method, and an electronic device provided with a touch panel.
  • Patent Document 1 discloses a game device provided with a touch panel, which can set an origin according to a user's touch operation and perform an operation imitating a joystick.
  • The game device sets the reference coordinates based on the coordinates at which detection starts when the touch panel changes from a state in which no touch is detected to a state in which a touch is detected, and, when touch detection continues thereafter, treats the subsequently detected coordinates as the indicated coordinates.
  • The game device realizes a virtual joystick and operates a virtual object by recognizing the direction of the vector from the reference coordinates to the indicated coordinates as the direction in which the joystick is tilted, and the magnitude of that vector as the degree to which the joystick is tilted.
  • In Patent Document 1, the user touches one point on the touch panel with a finger to make the game device recognize the reference coordinates, then slides the finger while keeping it in contact with the touch panel, and makes the device recognize the indicated coordinates based on the contact position of the finger after the slide.
  • With the prior art configured in this way, it is difficult to achieve high responsiveness because the distance from the reference coordinates to the indicated coordinates must be generated whenever the user inputs a direction. For example, when the user wants to perform an operation in which the virtual joystick is tilted greatly, a vector from the reference coordinates to the indicated coordinates whose magnitude corresponds to the degree of that large tilt must be generated.
  • The present invention has been made to solve such a problem, and an object of the present invention is to provide a method and the like capable of further improving operability in an operation on an operation target object via a touch panel.
  • the method as one aspect of the present invention is a method executed in a computer provided with a touch panel.
  • A step of holding, as a data point sequence at predetermined time intervals, one or more data points each indicated by a value on a first axis and a value on a second axis acquired based on a touch event generated by the user's operation on the touch panel;
  • a step of determining, based on the displacement of the data points in the held data point sequence, a displacement speed corresponding to the displacement speed of the position where the touch event occurred, and determining a weighted speed based at least on the bias of the most recently determined displacement speed with respect to the average value of the previously determined displacement speeds; and
  • a step of determining a converted weighted speed for determining the parameters of the operation target object. The method is characterized by including these steps.
  • The step of holding as a data point sequence holds one or more data points as a data point sequence for each time corresponding to the frame rate.
  • The step of determining the weighted speed determines, based on the displacement of the data points in the data point sequence held within the time of one frame, a displacement speed corresponding to the displacement speed of the position where the touch event occurred in that frame, and determines the weighted speed based at least on the bias of the displacement speed in the latest frame with respect to the average value of the displacement speeds in the frames before the latest frame.
  • The method includes a step of ending the holding of the maximum value of the predetermined function when a touchend or touchcancel touch event occurs due to the user's operation on the touch panel.
  • The predetermined function is configured to determine, as the value corresponding to the input value, a constant whose magnitude corresponds to a predetermined threshold value when the input value is equal to or greater than that threshold value.
  • The predetermined function is configured to determine the value corresponding to an input value by applying a function that maps each input value to a single value within a certain value range.
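As an illustration, the two configurations of the predetermined function described above can be sketched as a single conversion routine. This is a minimal sketch, not the patent's actual equation: the function name, the threshold value, and the linear below-threshold mapping are all assumptions.

```python
def convert(value, threshold=1.0):
    # Illustrative 'predetermined function' (names/values are assumptions):
    # inputs at or above the threshold yield a constant sized by the
    # threshold; inputs below it map one-to-one into [0, threshold).
    if value >= threshold:
        return threshold
    return max(0.0, value)
```

Either branch maps one input value to one output value, so the output stays within a bounded range regardless of how fast the input grows.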
  • The method includes: a step of holding data points indicated by a value on the first axis and a value on the second axis acquired based on touch events generated by the user's operation on the touch panel; a step of ending the holding of held data points that exceed a predetermined holding time; a step of determining the slope of a regression line based on the held data points; a step of determining an amount of rotation for rotating the slope of the determined regression line based on the displacement direction of the set of held data points; and a step of determining, at predetermined time intervals, an angle based on the slope of the determined regression line and the determined amount of rotation.
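The regression-and-rotation steps above can be sketched with an ordinary least-squares slope. This is a hedged illustration: the patent does not reproduce its formulas here, so using atan of the slope plus a rotation of pi when the displacement runs in the negative-x direction is an assumed convention, not the claimed method.

```python
import math

def regression_angle(points):
    # Least-squares slope of the held data points; the slope alone cannot
    # distinguish opposite drag directions, so rotate by pi when the overall
    # displacement runs in the negative-x direction (an assumed convention).
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx if sxx else 0.0
    angle = math.atan(slope)
    dx = points[-1][0] - points[0][0]  # displacement over the held points
    if dx < 0:
        angle += math.pi
    return angle
```

A rightward drag along y = x yields pi/4, while the same points traversed leftward yield pi/4 + pi, so opposite swipes map to opposite directions.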
  • The step of determining the converted weighted speed and the step of determining the angle each determine the converted weighted speed and the angle at each predetermined time interval.
  • The method includes a step of generating, at each predetermined time interval, a composite vector based on the determined converted weighted speed and a unit vector having the determined angle.
  • The direction of the determined composite vector is treated as the moving direction of the operation target object,
  • the magnitude of the determined composite vector is treated as the moving speed of the operation target object,
  • and a step of determining the moving state of the operation target object based on the magnitude of the determined composite vector is included.
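The composite-vector steps above can be sketched directly: scale a unit vector at the determined angle by the converted weighted speed, then read the direction off as the moving direction and the magnitude off as the moving speed. The function names are illustrative.

```python
import math

def composite_vector(converted_weighted_speed, angle):
    # Scale a unit vector with the determined angle by the converted
    # weighted speed; direction -> moving direction, magnitude -> moving speed.
    return (converted_weighted_speed * math.cos(angle),
            converted_weighted_speed * math.sin(angle))

def magnitude(v):
    # Norm of the composite vector ('the magnitude of a vector can mean
    # the norm of a vector', as stated below).
    return math.hypot(v[0], v[1])
```

By construction the magnitude of the result equals the converted weighted speed, so the speed input survives the angle composition unchanged.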
  • The program as one aspect of the present invention causes a computer to execute each step of the method described above.
  • the electronic device as one aspect of the present invention is an electronic device provided with a touch panel.
  • The electronic device holds, as a data point sequence at predetermined time intervals, one or more data points each indicated by a value on the first axis and a value on the second axis acquired based on a touch event generated by the user's operation on the touch panel, determines, based on the displacement of the data points in the held data point sequence, a displacement speed corresponding to the displacement speed of the position where the touch event occurred, and determines a weighted speed based at least on the bias of the most recently determined displacement speed with respect to the average value of the previously determined displacement speeds.
  • the magnitude of a vector can mean the norm of a vector.
  • the electronic device 10 is installed with a game application that presents a virtual object arranged in a virtual space to a user and advances a game.
  • The electronic device 10 is provided with a virtual controller for controlling, in response to the user's operation, an operation target object, which is a virtual object operated by the user in the virtual space.
  • the virtual space and the virtual controller are defined by the game application, and the virtual space can be a two-dimensional space or a three-dimensional space.
  • the operation target object is an operation target character which is a character arranged in the virtual space.
  • The electronic device 10 determines parameters such as the moving direction and moving speed of the operation target character by using the direction and magnitude input via the virtual controller.
  • the operation target object may be an item placed in the virtual space, a virtual camera, or the like.
  • the electronic device 10 may implement an application capable of controlling the operation target object according to the operation of the user.
  • the electronic device 10 may be equipped with an input support application or a simulation application that operates an operation target object according to a user's operation in place of or in addition to the game application.
  • the application means an application program in general, and can mean an application installed on a smartphone or a tablet terminal.
  • FIG. 1 is a block diagram showing a hardware configuration of an electronic device 10 according to an embodiment of the present invention.
  • the electronic device 10 includes a processor 11, an input device 12, a display device 13, a storage device 14, and a communication device 15. Each of these components is connected by a bus 16. It is assumed that an interface is interposed between the bus 16 and each component as necessary.
  • the electronic device 10 is a smartphone.
  • the electronic device 10 may be a tablet computer, a computer equipped with a contact-type input device such as a touch pad, or the like, as long as it has the above configuration.
  • the processor 11 controls the operation of the entire electronic device 10.
  • the processor 11 is a CPU.
  • an electronic circuit such as an MPU may be used.
  • the processor 11 executes various processes by reading and executing a program or data stored in the storage device 14.
  • The processor 11 may be composed of a plurality of processors.
  • the input device 12 is a user interface that receives input from the user to the electronic device 10.
  • the display device 13 is a display that displays an application screen or the like to the user of the electronic device 10 under the control of the processor 11.
  • the input device 12 is a touch panel 12, and has a structure integrated with the display device 13 (display).
  • the "touch panel 12" can indicate the input device 12 and the display device 13.
  • The input device 12 may be a touch pad, and correspondingly the display device 13 may be a display separate from the touch pad.
  • the storage device 14 includes a main storage device and an auxiliary storage device.
  • the main storage device is a volatile storage medium capable of high-speed reading and writing of information, and is used as a storage area and a work area when the processor 11 processes information.
  • the auxiliary storage device stores various programs and data used by the processor 11 when executing each program.
  • the auxiliary storage device is, for example, a hard disk device, but may be any non-volatile storage or non-volatile memory as long as it can store information, and may be removable.
  • the storage device 14 is a storage device included in a general smartphone including a volatile memory and a non-volatile memory.
  • the storage device 14 stores various programs including a game application.
  • the storage device 14 stores an operating system (OS), middleware, application programs, various data that can be referred to when these programs are executed, and the like.
  • the communication device 15 is a wireless LAN module capable of exchanging data with another computer such as a user terminal or a server via a network.
  • the communication device 15 can be another wireless communication device such as a Bluetooth (registered trademark) module, or can be a wired communication device such as an Ethernet module or a USB interface.
  • the electronic device 10 downloads the program from the server by the communication device 15 and stores it in the storage device 14. When not transmitting / receiving data to / from another computer, the electronic device 10 may not include the communication device 15.
  • FIG. 2 is a functional block diagram of the electronic device 10 according to the embodiment of the present invention.
  • the electronic device 10 includes an input unit 21, a display unit 22, an engine unit 23, a controller control unit 24, and an application unit 25.
  • these functions are realized by executing the program by the processor 11.
  • the program to be executed is a program stored in the storage device 14 or received via the communication device 15.
  • Part or all of one unit (function) may be possessed by another unit, and units other than those described in this embodiment may also be included.
  • these functions may be realized by hardware by configuring an electronic circuit or the like for realizing a part or all of each function.
  • the virtual controller is realized by a three-layer structure, and the engine unit 23, the controller control unit 24, and the application unit 25 correspond to each layer.
  • the virtual controller is realized by the processor 11 executing a program composed of each program corresponding to each layer.
  • the input unit 21 is configured by using the touch panel 12, and receives input from the user to the electronic device 10. In the present embodiment, the input unit 21 receives a user's touch operation on the touch panel 12 and generates a touch event.
  • the input unit 21 is a function generally possessed by a smartphone.
  • the display unit 22 displays the game application screen (game screen) on the touch panel 12 and displays the screen according to the user operation.
  • The engine unit 23 determines, for each frame time corresponding to the frame rate (the time between frames), a converted weighted speed for determining the parameters of the operation target character, using the touch events generated by the user's touch operations on the touch panel 12.
  • the engine unit 23 determines the angle for determining the parameters of the operation target character by using the touch event generated by the user's touch operation on the touch panel 12 for each frame time.
  • the frame time F (seconds) is the time corresponding to the frame rate for executing the game.
  • the frame rate is generally 30 fps (frames per second) or 60 fps. For example, when the frame rate is 30 fps, F is 1/30 second.
  • "Determination" can include calculation, computation, and judgment.
  • the engine unit 23 acquires the data points indicated by the values of the first axis and the values of the second axis based on the touch event generated by the user's operation on the touch panel 12.
  • The touch events include touchstart, which occurs when the user touches the touch panel 12 with a finger; touchmove, which occurs when the user moves the finger while keeping it in contact with the touch panel 12; and touchend, which occurs when the user releases the finger from the touch panel 12.
  • The touch events can also include touchcancel, which occurs when the touch is cancelled.
  • the engine unit 23 acquires the touch event when the touch event occurs.
  • When the engine unit 23 acquires a touch event, it acquires a set of numerical values (x, y) consisting of two variables corresponding to the position where the capacitance on the touch panel 12 changed, and stores the set in the first buffer.
  • The data of the numerical value set consisting of the two variables is acquired by the engine unit 23 in association with the touch event, and corresponds to the data point indicated by the value of the first axis and the value of the second axis.
  • The first buffer is an example of a storage area in the storage device 14 prepared for temporarily holding data; any storage area capable of temporarily holding data may be used. The same applies to the other buffers.
  • The engine unit 23 acquires the set of numerical values (x, y) consisting of two variables together with the time t at which (x, y) was acquired, yielding a set of numerical values (x, y, t) consisting of three variables.
  • t is a value representing the data point acquisition time, which is the time when (x, y) is acquired, and is stored in the first buffer in association with (x, y) as described above.
  • t can be an integer value, the so-called UNIX (registered trademark) time, obtainable from the OS, or it can be a character string such as "2017/07/14 15:48:43.444".
  • Holding (or ending the holding of) a data point by the engine unit 23 can include holding (or ending the holding of) the data point acquisition time t associated with that data point.
  • FIG. 3 is a diagram showing coordinate axes including the first axis and the second axis of the present embodiment.
  • the first axis is a horizontal axis (x-axis) substantially parallel to the short side of the touch panel 12, which is substantially rectangular.
  • the second axis is a vertical axis (y-axis) substantially parallel to the long side of the touch panel 12.
  • the position on the touch panel 12 is represented as coordinates (x, y) by the first axis (x axis) and the second axis (y axis).
  • The coordinates (x, y) of a data point correspond to a position on the touch panel 12, and the engine unit 23 holds the coordinates (x, y) as a data point in the first buffer.
  • the coordinate setting shown in FIG. 3 is an example, and can be set differently from the above example by a program implemented by the electronic device 10.
  • The engine unit 23 ends the holding of any data point held in the first buffer that exceeds the predetermined holding time. For example, when the engine unit 23 ends the retention of a data point's data, the data may be deleted, the data may be invalidated, or a flag indicating that retention has ended may be associated with the data, which is then deleted as appropriate.
  • the engine unit 23 defines a variable Da that specifies the life of the data point stored in the first buffer in milliseconds. The time specified by the variable Da corresponds to the default retention time. However, the value of the variable Da is not limited to milliseconds.
  • In the present embodiment, the variable Da is set to 167.
  • In other words, the engine unit 23 holds each data point stored in the first buffer for 167 milliseconds, and when 167 milliseconds have elapsed, ends the holding of that data point in the first buffer. 167 milliseconds corresponds to 5 frames when the frame rate for executing the game is 30 fps.
  • FIG. 4 is a flowchart illustrating a process related to data point retention executed in the electronic device 10 according to the embodiment of the present invention.
  • In step 101, the engine unit 23 determines whether or not a touch event has occurred. If a touch event has occurred, the flowchart proceeds to step 102; if not, it proceeds to step 104.
  • In step 102, the engine unit 23 determines whether or not the generated touch event is touchend or touchcancel. If it is not, the flowchart proceeds to step 103; if it is, the flowchart proceeds to step 104.
  • In step 103, the engine unit 23 acquires a data point from the generated touch event and stores it in the first buffer.
  • The engine unit 23 associates each stored data point with an elapsed time Ta, which indicates in milliseconds how long the point has been stored, and with the variable Da, which indicates in milliseconds how long the point can be held in the first buffer.
  • In step 104, the engine unit 23 ends the holding of those data points held in the first buffer whose elapsed time Ta is equal to or greater than the variable Da.
  • Specifically, the engine unit 23 compares the elapsed time Ta with the variable Da for each data point stored in the first buffer, and ends the holding of any data point whose elapsed time Ta is equal to or greater than the variable Da.
  • In step 105, the flowchart returns to step 101 unless the process is terminated, for example because the game application has ended.
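Steps 101 through 104 of FIG. 4 can be sketched as one tick of a retention loop. The names (`on_tick`, `first_buffer`) and the event-tuple shape are illustrative assumptions; the 167 ms lifetime follows the variable Da described above.

```python
DA_MS = 167  # lifetime Da of a data point in the first buffer (5 frames at 30 fps)

first_buffer = []  # held data points: dicts with x, y and storage time in ms

def on_tick(events, now_ms):
    # One pass of the FIG. 4 flow. `events` is a list of (kind, x, y) tuples.
    for kind, x, y in events:                    # step 101: a touch event occurred
        if kind in ("touchend", "touchcancel"):  # step 102: skip end/cancel
            continue
        first_buffer.append({"x": x, "y": y, "t_ms": now_ms})  # step 103
    # step 104: end the holding of points whose elapsed time Ta >= Da
    first_buffer[:] = [p for p in first_buffer if now_ms - p["t_ms"] < DA_MS]
```

Calling `on_tick` with an empty event list still performs step 104, matching the flowchart branch that skips straight from step 101 to step 104 when no event occurred.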
  • Next, the determination of the weighted speed by the engine unit 23 will be explained.
  • As the method by which the engine unit 23 determines the weighted speed, for example, the same method as that of the engine unit described in Japanese Patent No. 6560801 can be used.
  • The engine unit 23 holds, as a data point sequence for each frame time, one or more of the data points held in the first buffer in a second buffer in the storage device 14. Alternatively, the engine unit 23 holds, in the second buffer as a data point sequence for each frame time, one or more data points acquired based on the touch events generated by the user's operation on the touch panel 12, without going through the first buffer.
  • The data point sequence P(i) held by the engine unit 23 in the i-th frame is represented by P(i) = {(x_{i,0}, y_{i,0}), (x_{i,1}, y_{i,1}), ..., (x_{i,m}, y_{i,m})}.
  • Each data point included in the data point sequence P (i) is a data point held in the first buffer within the time of the i-th frame.
  • The engine unit 23 holds P(i) after one frame time F (seconds) has elapsed since holding the (i-1)-th sequence P(i-1), and holds P(i+1) after a further frame time has elapsed. The variable m is the number of data points included in P(i), and therefore differs from one P(i) to another.
  • The engine unit 23 ends the holding of any data point sequence held in the second buffer that exceeds the predetermined holding time. For example, when the engine unit 23 ends the retention of the data in a data point sequence, the data may be deleted, the data may be invalidated, or a flag indicating that retention has ended may be associated with the data, which is then deleted as appropriate.
  • the engine unit 23 defines a variable Db that specifies the lifetime of the data points stored in the second buffer.
  • the time specified by the variable Db corresponds to the default holding time, and in this embodiment, it corresponds to the frame time.
  • The engine unit 23 holds the data point sequence P(i) in the second buffer in association with the time t_i at which P(i) was held.
  • When the engine unit 23 stores a data point sequence P(i) in the second buffer, it monitors the elapsed time Tb since the sequence was stored and continuously compares it with the variable Db. For example, the engine unit 23 calculates the elapsed time Tb from the held time t_i. When the elapsed time Tb of a monitored data point sequence P(i) exceeds the variable Db, the engine unit 23 ends the holding of the data point sequence P(i) in the second buffer.
  • the engine unit 23 holds one data point sequence P (i) in the second buffer for a time of 5F (seconds) corresponding to 5 frames. Therefore, the engine unit 23 holds five data point sequences.
  • the engine unit 23 holds five data point sequences so as to be P (5), P (4), P (3), P (2), P (1) in order from the new data point sequence. Therefore, when the engine unit 23 holds the data point sequence corresponding to 5 frames, P (5) becomes the latest held data point sequence.
  • When the engine unit 23 holds a new data point sequence, it holds it as P(5) and replaces each P(i) (1 ≤ i ≤ 4) with the data of P(i+1). At this time, the engine unit 23 ends the holding of the previous P(1), which has exceeded the predetermined holding time.
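The per-frame shift described above behaves like a fixed-length queue: appending a sixth sequence ends the holding of the oldest one. A sketch using Python's `collections.deque` (the names `second_buffer` and `on_frame` are illustrative):

```python
from collections import deque

N_FRAMES = 5  # sequences corresponding to 5 frames are held

second_buffer = deque(maxlen=N_FRAMES)  # oldest P(1) ... newest P(5)

def on_frame(points_this_frame):
    # Hold this frame's points as the newest sequence; once the deque is
    # full, each append drops the oldest entry, mirroring the described
    # shift of P(i) <- P(i+1) and the end of holding of the old P(1).
    second_buffer.append(list(points_this_frame))
```

The deque's `maxlen` makes the expiry implicit: no explicit comparison with the holding time is needed because exactly one sequence arrives per frame.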
  • The term "data point sequence" can denote a sequence of data points or a matrix of data points.
  • the engine unit 23 holds the x-coordinate value and the y-coordinate value separately as a data point sequence for each frame time.
  • The set X of the x-coordinate values and the set Y of the y-coordinate values held by the engine unit 23 are as shown in equations (1) and (2), respectively: X = {x_{1,0}, ..., x_{1,m}, ..., x_{n,0}, ..., x_{n,m}} (1), Y = {y_{1,0}, ..., y_{1,m}, ..., y_{n,0}, ..., y_{n,m}} (2).
  • n is the number of data point strings held in the second buffer by the engine unit 23, and corresponds to the number of frames.
  • In the present embodiment, n is 5.
  • The x-coordinate values of the most recently held data point sequence P(n) are x_{n,0}, ..., x_{n,m}, and the y-coordinate values are y_{n,0}, ..., y_{n,m}.
  • the maximum value of n differs depending on the time for which the engine unit 23 holds the data point sequence.
  • the engine unit 23 determines the displacement speed in the data point sequence based on the displacement of the data points in the data point sequence held in the second buffer for each frame time.
  • the engine unit 23 determines the weighted speed based on at least the deviation of the latest determined displacement speed with respect to the average value of the previously determined displacement speed.
  • the engine unit 23 holds a determined weighted speed value in a predetermined storage area in a storage device 14 that can be referenced by other functional units or programs.
  • the displacement speed corresponds to the displacement speed of the data point (the position where the touch event occurs) in the time of the target frame.
  • the displacement speed can be said to be the speed corresponding to the movement speed of the user's finger calculated from the set of data points (positions where the touch event has occurred) in the time of the target frame.
  • the displacement speed v_i is the displacement speed in the i-th data point sequence, that is, the displacement speed in the i-th frame.
  • the engine unit 23 determines the weighted speed based on the deviation of the displacement speed v_n of the latest data point sequence, among the held data point sequences, with respect to the average value of the displacement speeds v_1 to v_{n-1} of the data point sequences held before the latest one.
  • in one example, the weighted speed is determined based on the bias of the displacement speed v_i with respect to the average value of the displacement speeds v_0 to v_{i-1} in the data point sequences.
  • the bias of the displacement speed v_i with respect to the average value of the displacement speeds v_0 to v_{i-1} is, for example, the deviation of v_i from that average value.
  • the engine unit 23 calculates the output value of the Cumulative Pointwize Deviation function (CPD function) defined by the equation (3) and determines it as the weighted speed.
  • when the engine unit 23 holds the x-coordinate values and the y-coordinate values separately as a data point sequence, the engine unit 23 determines the displacement speed based on the displacement of the x-coordinate values and the displacement of the y-coordinate values in the data point sequence held in the second buffer. In one example, the engine unit 23 determines the displacement speed based on the amount of displacement between data points adjacent in time series in the data point sequence held in the second buffer and the quantity of data points included in the data point sequence.
  • the displacement speed is calculated by Eq. (4).
  • one coefficient in Eq. (4) corresponds to the pixel density DPI (Dot-Per-Inch) of the display; it is a real number of 0 or more, and is generally 1.
  • the other coefficient is an integration weight. Increasing this weight makes it easier for the displacement speed to reflect sudden changes, and decreasing it makes it harder for the displacement speed to reflect sudden changes.
  • when P (i) does not include any data points, the displacement speed is treated as 0.
  • avg_{i-1}(v) is the average of the displacement speeds up to just before the i-th frame, and is calculated by the equation (5).
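Since equations (3) to (5) are not reproduced in this text, the following sketch only illustrates the idea described above: a per-frame displacement speed leveled by the number of data points, and a CPD-style weighted speed built from the deviation of each frame's speed from the running average of the preceding speeds. The function names, the exact formulas, and the coefficients `alpha` and `beta` are assumptions.

```python
import math

def displacement_speed(points, alpha=1.0):
    # Hypothetical reading of Eq. (4): total displacement between data
    # points adjacent in time series, scaled by the DPI coefficient alpha
    # and leveled by the number of data points in the sequence.
    if len(points) < 2:
        return 0.0
    total = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(points, points[1:])
    )
    return alpha * total / len(points)

def cpd(speeds, beta=1.0):
    # Hypothetical reading of Eqs. (3) and (5): accumulate the positive
    # deviation of each frame's speed from the running average of the
    # preceding speeds (weighted by beta), then level by 1/n.
    n = len(speeds)
    if n == 0:
        return 0.0
    total = 0.0
    for i, v in enumerate(speeds):
        prev = speeds[:i]
        avg = sum(prev) / len(prev) if prev else 0.0  # avg_{i-1}(v)
        total += beta * max(0.0, v - avg)
    return total / n
```

Under this reading, a speed series that keeps rising yields a larger output than a single one-frame spike, which matches the leveling behavior the text attributes to the 1/n factor.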
  • FIG. 5 is a flowchart illustrating a process related to determination of weighted speed executed in the electronic device 10 according to the embodiment of the present invention.
  • the electronic device 10 executes steps 201 to 204 of this flowchart every time corresponding to the frame rate.
  • in step 201, the engine unit 23 holds the data points acquired during one frame in the second buffer as the data point sequence P (i). At this time, the engine unit 23 associates with the held data point sequence P (i) the value Tb, which indicates the elapsed time since storage in milliseconds, and the variable Db, which indicates the time (retention lifetime) for which P (i) can be stored in the second buffer, in milliseconds.
  • in step 202, the engine unit 23 ends holding any data point sequence P (i) whose elapsed time Tb is equal to or greater than the variable Db among the data point sequences P (i) held in the second buffer.
  • step 203 the engine unit 23 determines the weighted speed using the equations (3) to (5).
  • step 204 this flowchart returns to step 201 unless it is terminated, for example, due to the termination of the game application.
  • when this flowchart ends, the engine unit 23 ends holding all the data point sequences P (i).
  • next, the determination of the angle by the engine unit 23 will be described.
  • as the method by which the engine unit 23 determines the angle, for example, the same method as that of the angle determination unit described in Japanese Patent No. 6389581 can be used.
  • the engine unit 23 determines whether or not the angle can be calculated for each frame time. If the angle can be calculated, the engine unit 23 uses the data points held in the first buffer to determine the angle indicated by the set of data points. By obtaining the angle indicated by the set of data points, the engine unit 23 can obtain the angle in the direction intended by the user who performed a touch operation on the touch panel 12. In one example, the engine unit 23 determines the variable B, and when the quantity of data points held in the first buffer is equal to or greater than the variable B, the engine unit 23 calculates (determines) the angle using the data points and holds the determined angle. In this case, the engine unit 23 holds only the latest determined angle.
  • when the quantity of data points held in the first buffer is less than the variable B, the engine unit 23 determines the held angle without calculating the angle.
  • the variable B is set to three or more.
  • the engine unit 23 determines the slope of the regression line based on the data points held in the first buffer. In determining the slope of the regression line, the engine unit 23 determines one of the x-axis and the y-axis as the axis of the independent variable based on the amount of displacement of the x-axis values and the amount of displacement of the y-axis values at the data points held in the first buffer. The engine unit 23 determines the other axis as the axis of the dependent variable. In one example, the engine unit 23 calculates the slope of the regression line using the method of least squares.
  • in one example, the engine unit 23 determines one of the x-axis and the y-axis as the axis of the independent variable, and the other axis as the axis of the dependent variable, based on the difference between the maximum and minimum x-axis values and the difference between the maximum and minimum y-axis values at the data points held in the first buffer. In one example, the engine unit 23 makes this determination after weighting the difference between the maximum and minimum x-axis values using a weighting coefficient.
  • the engine unit 23 uses the least squares method to calculate the slope of the regression line, for example, within the range of 0 to 90 degrees and 270 to 360 degrees. Therefore, for example, whether the angle indicated by the set of data points is 45 degrees or 225 degrees, 45 degrees is calculated as the slope of the regression line. Therefore, after determining the slope of the regression line, the engine unit 23 determines the amount of rotation, which indicates whether or not to rotate 180 degrees with respect to the determined slope (angle) of the regression line, based on the displacement direction of the set of data points.
  • the engine unit 23 determines the amount of rotation by comparing the numbers of positive and negative difference values between temporally adjacent values of the determined independent variable at the data points.
  • the displacement direction as a set of data points indicates a direction in which the data points are displaced with time, and corresponds to, for example, a rough direction in which the user moves a finger on the touch panel 12.
  • the engine unit 23 determines the angle using the function aop (x, y) shown in the equation (6).
  • the function aop (x, y) calculates the angle with a real number from 0 to 360 degrees.
  • when the engine unit 23 calculates the angle using the function aop (x, y), the first buffer holds n data points P.
  • the function aop (x, y) uses the function rotate (x, y), the function left (x), and the function down (y) in order to divide the cases.
  • the function aop (x, y) uses the function rotate (x, y) to determine one of x and y as an independent variable.
  • the function rotate (x, y) is defined by the equation (7).
  • the function rotate (x, y) determines whether or not the n data points P are mainly displaced in the y-axis direction, and returns, for example, a boolean value. In this way, the function rotate (x, y) determines whether the n data points P are displaced mainly in the x-axis (left and right) direction or mainly in the y-axis (up and down) direction. Then, it is determined whether the value on the x-axis or the value on the y-axis is appropriate as the independent variable.
  • (max (x) - min (x)) is the absolute value of the difference between the maximum and minimum of the x values (x_1, x_2, ..., x_n) of the n data points P, and indicates the amount of displacement of the n data points P in the x-axis direction.
  • (max (y) - min (y)) is the absolute value of the difference between the maximum and minimum of the y values (y_1, y_2, ..., y_n) of the n data points P, and indicates the amount of displacement of the n data points P in the y-axis direction.
  • the variable w is a weighting coefficient that weights (max (x) ⁇ min (x)).
  • the function rotate (x, y) satisfies the inequality when (max (y) - min (y)) is larger than the product of (max (x) - min (x)) and the variable w. In this case, the function aop (x, y) performs coordinate conversion: it determines the y-axis as the axis of the independent variable and the x-axis as the axis of the dependent variable, and determines the amount of rotation using the function down (y).
  • the function rotate (x, y) does not satisfy the inequality when (max (y) - min (y)) is equal to or less than the product of (max (x) - min (x)) and the variable w. In this case, the function aop (x, y) does not perform coordinate conversion: the x-axis is the axis of the independent variable, the y-axis is the axis of the dependent variable, and the function left (x) is used to determine the amount of rotation.
  • the variable w is set to, for example, 0.5 or 2, depending on the dimensions in the x-axis direction and the y-axis direction, and how the user holds the smartphone.
  • the function left (x) is represented by the equation (8).
  • when the function rotate (x, y) does not satisfy the inequality, the function left (x) determines whether or not the displacement direction of the n data points P is the -x-axis direction (left direction), and returns, for example, a boolean value.
  • the function left (x) calculates the difference values between temporally adjacent x values (x_2 - x_1, x_3 - x_2, ..., x_n - x_{n-1}) of the n data points P.
  • the function left (x) determines whether the number of negative difference values is greater than the number of positive difference values, thereby determining whether or not the displacement direction of the n data points P is the -x-axis direction (left direction). In this way, the function left (x) determines whether the displacement direction of the n data points P is the -x-axis direction (left direction) or the +x-axis direction (right direction), and thereby determines the amount of rotation indicating whether or not to rotate 180 degrees with respect to the determined slope of the regression line.
  • if the function left (x) is true, the function aop (x, y) determines the amount of rotation to be 180 degrees, and if the function left (x) is false, the function aop (x, y) determines the amount of rotation to be 0 degrees.
  • the function down (y) is represented by the equation (9). When the function rotate (x, y) satisfies the inequality, the function down (y) determines whether or not the displacement direction of the n data points P is the -y-axis direction (downward direction), and returns, for example, a boolean value.
  • the function down (y) calculates the difference values between temporally adjacent y values (y_2 - y_1, y_3 - y_2, ..., y_n - y_{n-1}) of the n data points P.
  • the function down (y) determines whether the number of negative difference values is greater than the number of positive difference values, thereby determining whether or not the displacement direction of the n data points P is the -y-axis direction (downward direction). In this way, the function down (y) determines whether the displacement direction of the n data points P is the -y-axis direction (downward direction) or the +y-axis direction (upward direction), and thereby determines the amount of rotation indicating whether or not to rotate 180 degrees with respect to the determined slope of the regression line.
  • if the function down (y) is true, the function aop (x, y) determines the amount of rotation to be 180 degrees, and if the function down (y) is false, the function aop (x, y) determines the amount of rotation to be 0 degrees.
  • the function aop (x, y) calculates the angle obtained from the slope of the regression line as it is when the function rotate (x, y) is false and the function left (x) is false.
  • the function aop (x, y) calculates an angle obtained by adding 180 degrees to the angle obtained from the slope of the regression line when the function rotate (x, y) is false and the function left (x) is true.
  • the function aop (x, y) calculates the angle by subtracting the angle obtained from the slope of the regression line from 90 degrees when the function rotate (x, y) is true and the function down (y) is false.
  • the function aop (x, y) calculates the angle obtained by adding 180 degrees to the angle obtained by subtracting the angle of the slope of the regression line from 90 degrees when the function rotate (x, y) is true and the function down (y) is true.
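The case analysis of aop (x, y) described above can be sketched as follows. Equations (6) to (11) are not reproduced here, so the regression-slope formula and all function names are assumptions; only the rotate/left/down case structure follows the text.

```python
import math

def slope_angle(ind, dep):
    # Least-squares regression slope of dep on ind, returned as an angle
    # in degrees (a hypothetical reading of Eqs. (10)/(11)).
    n = len(ind)
    mi, md = sum(ind) / n, sum(dep) / n
    num = sum((a - mi) * (b - md) for a, b in zip(ind, dep))
    den = sum((a - mi) ** 2 for a in ind)
    return math.degrees(math.atan2(num, den))

def aop(xs, ys, w=1.0):
    # Sketch of aop(x, y): pick the independent axis via rotate(x, y),
    # compute the regression-line angle, then add 180 degrees when the
    # points are displaced leftward (left(x)) or downward (down(y)).
    rotate = (max(ys) - min(ys)) > w * (max(xs) - min(xs))

    def mostly_negative(vals):
        # True when negative adjacent differences outnumber positive ones.
        diffs = [b - a for a, b in zip(vals, vals[1:])]
        return sum(d < 0 for d in diffs) > sum(d > 0 for d in diffs)

    if not rotate:
        angle = slope_angle(xs, ys)
        if mostly_negative(xs):       # left(x): moving in -x direction
            angle += 180.0
    else:
        angle = 90.0 - slope_angle(ys, xs)
        if mostly_negative(ys):       # down(y): moving in -y direction
            angle += 180.0
    return angle % 360.0
```

For a diagonal up-right drag this yields 45 degrees; reversing the same points yields 225 degrees, illustrating how the rotation amount disambiguates the regression slope.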
  • the engine unit 23 holds the value of the determined angle in a predetermined storage area in the storage device 14 that can be referred to by other functional units or programs, unless a touchend or touchcancel occurs. In this embodiment, the engine unit 23 holds only the latest determined angle.
  • FIG. 6 is a flowchart illustrating a process related to angle determination executed in the electronic device 10 according to the embodiment of the present invention.
  • the electronic device 10 executes steps 301 to 309 of this flowchart for each frame time corresponding to the frame rate.
  • in step 301, the engine unit 23 determines whether or not a touch event of touchend or touchcancel has occurred. In one example, the engine unit 23 determines whether or not the touch event has occurred during one frame. If touchend or touchcancel has not occurred, this flowchart proceeds to step 302. If touchend or touchcancel occurs, this flowchart proceeds to step 309.
  • in step 302, the engine unit 23 determines whether or not the number n of data points held in the first buffer is equal to or greater than the variable B. If the number n is greater than or equal to the variable B, the flowchart proceeds to step 303, and if the number n is less than the variable B, the flowchart proceeds to step 307.
  • in step 303, the engine unit 23 determines one of the x-axis and the y-axis as the axis of the independent variable based on the amount of displacement of the x-axis values and the amount of displacement of the y-axis values at the data points held in the first buffer. At the same time, the engine unit 23 determines the other axis as the axis of the dependent variable.
  • in step 304, when the x-axis is determined as the axis of the independent variable in step 303, the engine unit 23 calculates the inclination angle using the equation (10) and determines it as the inclination angle of the regression line.
  • when the y-axis is determined as the axis of the independent variable, the engine unit 23 calculates the inclination angle using the equation (11) and subtracts the calculated angle from 90 degrees to determine the inclination angle of the regression line.
  • the engine unit 23 calculates the angle of inclination using the equation (10) or the equation (11) in the range of 0 degrees to 90 degrees and 270 degrees to 360 degrees.
  • in step 305, the engine unit 23 determines the amount of rotation, which indicates whether or not to rotate 180 degrees with respect to the slope of the determined regression line, based on the displacement direction of the set of data points held in the first buffer.
  • when the x-axis is the axis of the independent variable, the engine unit 23 calculates the difference values between temporally adjacent x-axis values. The engine unit 23 determines the rotation amount to be 180 degrees when the number of negative difference values is larger than the number of positive difference values, and to be 0 degrees when it is smaller.
  • when the y-axis is the axis of the independent variable, the engine unit 23 calculates the difference values between temporally adjacent y-axis values and determines the rotation amount in the same manner.
  • step 306 the engine unit 23 determines and holds the angle based on the inclination of the determined regression line and the determined rotation amount.
  • the engine unit 23 determines the angle by adding the determined amount of rotation to the angle corresponding to the slope of the determined regression line. When the amount of rotation is 0 degrees, the angle determined by the engine unit 23 is an angle corresponding to the inclination of the determined regression line. This flowchart proceeds to step 308.
  • step 307 the engine unit 23 determines (outputs) the held angle.
  • when no angle is held, the engine unit 23 determines (outputs) data, such as NULL, indicating that no angle is held.
  • in step 308, if this flowchart is to end, for example, due to the termination of the game application, the process proceeds to step 309; otherwise, the process returns to step 301. In step 309, the engine unit 23 ends holding the angle.
  • the controller control unit 24 determines the converted weighted speed for each frame time by inputting the determined weighted speed into a predetermined function.
  • the predetermined function is a function that determines a value corresponding to an input value and retains and determines (outputs) the maximum value among the determined values.
  • the converted weighted speed determined by the controller control unit 24 is the weighted speed for determining parameters such as the movement direction and speed of the operation target character.
  • the controller control unit 24 inputs a determined weighted speed into a predetermined function for each frame time.
  • the predetermined function determines a value corresponding to the input value at each frame time, and outputs and holds the larger of the determined value and the maximum value among the values determined so far by the predetermined function.
  • the predetermined function holds the value output in the i-th frame, and in the i + 1th frame, the held value is used as the maximum value among the values determined so far by the predetermined function.
  • the controller control unit 24 holds the determined converted weighted speed value in a predetermined storage area in the storage device 14 that can be referenced by other functional units or programs, unless a touchend or touchcancel occurs.
  • the predetermined function is the activation function f (s, p) defined by the equation (12) using the function max ⁇ A, B ⁇ that outputs the maximum value of A and B.
  • the controller control unit 24 determines the output value of the activation function f (s, p) as the converted weighted speed.
  • s is the output value (weighted speed) of the latest determined CPD function
  • p is the value output and held immediately before by the activation function f (s, p).
  • the value output immediately before is the value output in the frame immediately before the frame in which the controller control unit 24 determines the output value (converted weighted speed) of the activation function f (s, p).
  • the activation function f (s, p) determines a real number o of 0 to 1 with s as an input, compares o and p, outputs o if o is greater than or equal to p, and outputs p if o is less than p. Therefore, the output value of the activation function f (s, p) does not fall below p.
  • the real number o is a value obtained by dividing the output value s of the CPD function by a predetermined constant C.
  • the predetermined constant C is set to the maximum value that the output value of the CPD function can take so that the real number o is a real number of 0 to 1.
  • the range 0 to 1 of the real number o determined by the activation function f (s, p) can be another numerical range.
  • FIG. 7 is an example of the output value of the activation function f (s, p).
  • the output value of the activation function f (s, p) may increase with the passage of time, but does not decrease, and changes continuously. In this way, the activation function f (s, p) converts the value of the CPD function into a stable output value.
  • the activation function f (s, p) ends the holding of p when a touch event of touch end (touchend) or touch cancel (touchcancel) is generated by the user's operation on the touch panel 12, and its output value becomes zero or NULL.
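A minimal sketch of the ratchet behavior attributed to f (s, p) above, assuming the normalization by a predetermined constant C described in the text; equation (12) itself is not reproduced, and the constant's value and the function name are hypothetical.

```python
def activation(s, p, C=100.0):
    # Sketch of f(s, p): normalize the CPD output s by an assumed
    # constant C into a real number o in [0, 1], then never let the
    # output fall below the previously held value p (ratchet behavior).
    o = min(s / C, 1.0)
    return o if o >= p else p

# The held value p is reset (e.g. to 0 or NULL) on touchend/touchcancel.
```

Because the return value is fed back as `p` in the next frame, the output is monotone non-decreasing within a single touch, matching FIG. 7's description.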
  • FIG. 8 is a flowchart illustrating a process related to determination of converted weighted speed executed in the electronic device 10 according to the embodiment of the present invention.
  • the electronic device 10 executes steps 401 to 404 of this flowchart for each frame time corresponding to the frame rate.
  • step 401 the controller control unit 24 determines whether or not a touch event of touchend or touchcancel has occurred. In one example, the controller control unit 24 determines whether or not the touch event has occurred during one frame. If touchend or touchcancel has not occurred, the flowchart proceeds to step 402 and then to step 404. If touchend or touchcancel occurs, the flowchart proceeds to step 403 and then to step 404.
  • step 402 the controller control unit 24 determines the converted weighted speed using the equation (12). At this time, the controller control unit 24 holds the determined converted weighted speed in a predetermined storage area in the storage device 14. The retained converted weighted speed is a value referred to as p when the controller control unit 24 uses the equation (12) in the next process of step 402.
  • in step 402, which is executed at each time corresponding to the frame rate, the controller control unit 24 determines the converted weighted speed using the latest weighted speed determined in step 203, which is likewise executed at each time corresponding to the frame rate, and the equation (12).
  • step 403 the controller control unit 24 ends the holding of the converted weighted speed.
  • step 404 this flowchart returns to step 401 unless it is terminated, for example, due to the termination of the game application.
  • when this flowchart ends, the controller control unit 24 ends holding the converted weighted speed.
  • the controller control unit 24 generates a composite vector for each frame time using the angle determined by the engine unit 23 and the converted weighted speed determined by the controller control unit 24. In one example, the controller control unit 24 generates the composite vector based on the converted weighted speed value held in a predetermined storage area and a unit vector having the angle value held in the predetermined storage area. The controller control unit 24 holds the calculated composite vector data in a predetermined storage area.
  • the controller control unit 24 calculates the composite vector using the equation (13). In equation (13), one term is the composite vector determined and held by the controller control unit 24 immediately before, another term is the addition vector to be added to that held composite vector, f (s, p) is the latest determined converted weighted speed, the unit vector is a unit vector having the angle determined by the engine unit 23, and the remaining coefficient is a weight applied to the addition vector.
  • the composite vector determined and held immediately before is the composite vector determined and held in the frame immediately before the frame in which the controller control unit 24 determines the composite vector.
  • the composite vector is normalized so that its magnitude does not exceed 1.0. It should be noted that the maximum value of 1.0 for the composite vector is an example, and the maximum value of the composite vector can be another numerical value.
  • FIG. 9 is an example showing the generation of the composite vector of the controller control unit 24.
  • in the first frame, the addition vector is a zero vector and the vector held by the controller control unit 24 is also a zero vector. Therefore, the composite vector is also a zero vector.
  • the addition vector becomes a zero vector mainly when the engine unit 23 does not acquire and hold data points, that is, when a touch event does not occur.
  • the addition vector and the composite vector in the i-th frame are each denoted with a subscript indicating the frame.
  • the second frame assumes that the addition vector is not a zero vector.
  • in this case, since the vector held by the controller control unit 24 is a zero vector, it can be confirmed that the addition vector becomes the composite vector as it is.
  • the third frame assumes the case where an addition vector is further added.
  • since the controller control unit 24 holds the composite vector of the previous frame, the new composite vector is the sum of the held vector and the addition vector.
  • the fourth frame assumes a case where an addition vector is further added and the magnitude (norm) of the composite vector exceeds 1.
  • in this case, the composite vector is the sum of the vector held by the controller control unit 24 and the addition vector, normalized so that its norm is 1.
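The per-frame update walked through above (add a weighted addition vector, then clamp the norm to 1.0) can be sketched as follows; equation (13)'s exact form and the weight's name are assumptions, and the function name is hypothetical.

```python
import math

def update_composite(prev, speed, angle_deg, gamma=1.0):
    # Sketch of one frame of Eq. (13): add a weighted addition vector of
    # magnitude `speed` in direction `angle_deg` to the previous composite
    # vector `prev`, then normalize so the norm never exceeds 1.0.
    rad = math.radians(angle_deg)
    vx = prev[0] + gamma * speed * math.cos(rad)
    vy = prev[1] + gamma * speed * math.sin(rad)
    norm = math.hypot(vx, vy)
    if norm > 1.0:
        vx, vy = vx / norm, vy / norm
    return (vx, vy)
```

Starting from the zero vector, repeated additions in the same direction grow the composite vector until the norm saturates at 1, reproducing the first-to-fourth frame progression of FIG. 9.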
  • the engine unit 23 determines the angle and the weighted speed for each frame time, and the controller control unit 24 determines the converted weighted speed for each frame time.
  • the controller control unit 24 is configured to be able to generate a composite vector at each frame.
  • the direction (angle) and magnitude of the composite vector determined by the controller control unit 24 correspond to the input of the direction and magnitude of the virtual controller.
  • the converted weighted speed determined by the controller control unit 24 corresponds to the magnitude of the addition vector for generating the composite vector, and is a value that contributes to the input of the magnitude of the virtual controller.
  • the angle determined by the engine unit 23 corresponds to the direction of the addition vector for generating the composite vector, and is a value that contributes to the input of the direction of the virtual controller.
  • the concept of a virtual controller is not indispensable; it suffices that the controller control unit 24 can generate (determine) a composite vector in each frame and pass the determined composite vector to the application unit 25.
  • the application unit 25 corresponds to a specific game application that implements in-game operations and the like.
  • the application unit 25 is a function implemented in a game application installed in the electronic device 10.
  • the game application processes the main loop of the main program in the same manner as a general game application, for example, every time corresponding to the frame rate.
  • the application unit 25 can correspond to an input support application or a simulation application that operates the operation target object according to the operation of the user.
  • FIG. 10 is a flowchart illustrating a process relating to determination of a composite vector executed in the electronic device 10 according to the embodiment of the present invention.
  • the electronic device 10 executes steps 501 to 505 of this flowchart every time corresponding to the frame rate.
  • Step 501 is a step corresponding to step 402, in which the controller control unit 24 determines the converted weighted speed.
  • Step 502 is a step corresponding to step 306 or 307, and the engine unit 23 determines the angle. Steps 501 and 502 may be in reverse order.
  • in step 503, the controller control unit 24 determines the addition vector using the converted weighted speed determined in step 501, the angle determined in step 502, and the equation (13).
  • in step 504, the controller control unit 24 determines and holds the composite vector based on the sum of the addition vector determined in step 503 and the composite vector determined and held by the controller control unit 24 immediately before. This held composite vector is the vector that the controller control unit 24 refers to as the immediately preceding composite vector in the next execution of step 504.
  • step 505 this flowchart returns to step 501 unless it is terminated, for example, due to the termination of the game application.
  • when this flowchart ends, the controller control unit 24 ends holding the composite vector.
  • the angle and magnitude of the composite vector generated by the controller control unit 24 are converted into parameters of the operation target character such as the motion of the operation target character.
  • the application unit 25 executes processing that uses the direction and magnitude of the composite vector determined for each frame time as the moving direction and moving speed of the operation target character for each frame time.
  • the application unit 25 determines the moving state of the operation target character based on the magnitude of the composite vector. For example, the application unit 25 determines (a) "walking" when the magnitude of the composite vector is equal to or less than the threshold value t1, (b) "fast walking" when it is larger than the threshold value t1 and equal to or less than the threshold value t2, and (c) "running" when it is larger than the threshold value t2.
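The three-state mapping in the example above can be sketched as follows; the threshold values are hypothetical, since the text does not give concrete values for t1 and t2.

```python
def movement_state(magnitude, t1=0.3, t2=0.7):
    # Map the composite vector's magnitude to a moving state:
    # (a) walking when <= t1, (b) fast walking when in (t1, t2],
    # (c) running when > t2. The thresholds here are assumed values.
    if magnitude <= t1:
        return "walking"
    elif magnitude <= t2:
        return "fast walking"
    return "running"
```

In practice the thresholds would be tuned per title, since the composite vector's norm is capped at 1.0.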
  • the engine unit 23 acquires a data point based on a touch event generated by a user's operation on the touch panel 12 and stores it in the first buffer.
  • the engine unit 23 holds the data points held in the first buffer as a data point sequence in the second buffer for each frame time.
  • the engine unit 23 determines, for each frame time, the displacement speed in the data point sequence based on the displacement of the data points in the data point sequence held in the second buffer, and, using the CPD function, determines the weighted speed based at least on the deviation of the latest displacement speed with respect to the average value of the previously determined displacement speeds.
  • the controller control unit 24 determines a value corresponding to the input value for each frame time, and inputs the determined weighted speed to the activation function f (s, p) to convert the converted weight. Determine the speed of attachment.
  • the CPD function outputs a larger value when the displacement speed in each frame remains larger than the average of the displacement speeds up to the immediately preceding frame.
  • when the user intentionally moves the finger quickly, the finger inevitably keeps accelerating for a certain period, so the CPD function outputs a larger value.
  • when the displacement speed increases for only one frame time, for example because of the contact condition between the touch panel 12 and the finger, the CPD function levels the spike by multiplying it by 1/n; even when n is at its maximum, the CPD function therefore does not output an extremely large value. In this way, the output value of the CPD function becomes large when the finger is intentionally kept accelerating and does not take a large value when the finger is not intentionally accelerated. Furthermore, because the CPD function uses the bias (deviation) with respect to the average value, it absorbs each user's habit of moving the finger and individual differences, so the electronic device 10 can detect only intentional acceleration.
  • since the CPD function continuously calculates a value for each frame, it often outputs a value that changes discontinuously, reflecting the discontinuous movement of the user's finger.
  • by inputting the output of the CPD function into the activation function f(s, p), which has a ratchet-like characteristic of never outputting a value smaller than the value p output immediately before, the output of the CPD function can be converted into a more stable output value. With such a configuration, the output value of the CPD function, which reflects the speed of discontinuous finger movement, can be converted into a more stable output value.
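A minimal sketch of the ratchet characteristic (equation (12) itself is not reproduced in this record; the release behavior on touchend/touchcancel is the one described in the following bullets):

```python
class RatchetActivation:
    """Sketch of a ratchet-like activation f(s, p): the output never drops
    below the previously output value p while the touch continues, and the
    held maximum is released when the touch ends."""

    def __init__(self):
        self.p = 0.0  # value output immediately before

    def __call__(self, s: float) -> float:
        self.p = max(s, self.p)
        return self.p

    def release(self) -> None:
        # Touchend / touchcancel: end the holding of the maximum value,
        # so the output can return to zero.
        self.p = 0.0
```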
  • the activation function f ends the holding of the maximum value when a touchend or touchcancel touch event is generated by the user's operation on the touch panel 12.
  • the unstable output value of the CPD function, which reflects the speed of discontinuous finger movement, is converted into a stable output value only while the touch continues; when the touch ends, the output value can be set to zero.
  • the engine unit 23 ends the holding of any data point held in the first buffer that has exceeded the predetermined holding time.
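A minimal sketch of this expiry behavior of the first buffer (the holding-time value and the class/field names are assumptions):

```python
from collections import deque


class DataPointBuffer:
    """Sketch of the first buffer: data points older than a predetermined
    holding time are evicted, so the buffer only ever holds the most
    recent stretch of the touch trajectory."""

    def __init__(self, holding_time: float = 0.25):
        self.holding_time = holding_time  # seconds; illustrative value
        self.points = deque()             # entries are (t, x, y)

    def add(self, t: float, x: float, y: float) -> None:
        self.points.append((t, x, y))
        self.evict(t)

    def evict(self, now: float) -> None:
        # End the holding of data points that exceed the holding time.
        while self.points and now - self.points[0][0] > self.holding_time:
            self.points.popleft()
```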
  • the engine unit 23 determines the slope of the regression line based on the data points held in the first buffer.
  • the engine unit 23 determines a rotation amount indicating whether or not to rotate the slope of the determined regression line by 180 degrees, based on the displacement direction of the data points as a set.
  • the engine unit 23 determines the angle based on the slope of the determined regression line and the determined rotation amount.
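The angle determination described in the three bullets above can be sketched as follows. This assumes x is the independent variable and that the x values are not all equal; the embodiment's handling of axis selection is omitted here:

```python
import math


def stroke_angle(points):
    """Sketch: slope of the least-squares regression line over the buffered
    data points, then a 0/180-degree rotation amount chosen from the
    displacement direction of the points as a set."""
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx  # assumes the x values are not all identical
    angle = math.atan2(slope, 1.0)
    # Rotation amount: rotate by 180 degrees when the net displacement of
    # the point set goes toward -x, since the slope alone is ambiguous.
    if xs[-1] - xs[0] < 0:
        angle += math.pi
    return angle % (2 * math.pi)
```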
  • the controller control unit 24 outputs a composite vector based on a unit vector having the determined angle and scaled by the determined converted weighted speed.
  • the controller control unit 24 preferably outputs a composite vector using equation (13). In this way, in the present embodiment, a composite vector having the direction and magnitude intended by the user who performed the touch operation on the touch panel 12 can be determined (output).
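Equation (13) is not reproduced in this record, so the following is only an assumed sketch of the accumulation it describes: a unit vector at the determined angle, scaled by the converted weighted speed, is added to the held composite vector (the norm clamp is an assumption):

```python
import math


def add_vector(composite, angle, weighted_speed, max_norm=1.0):
    """Sketch of composite-vector accumulation: add a unit vector at the
    determined angle, scaled by the converted weighted speed, to the held
    composite vector; clamp the norm so the magnitude stays bounded."""
    cx = composite[0] + weighted_speed * math.cos(angle)
    cy = composite[1] + weighted_speed * math.sin(angle)
    norm = math.hypot(cx, cy)
    if norm > max_norm:
        cx, cy = cx * max_norm / norm, cy * max_norm / norm
    return (cx, cy)
```

Note that reversing direction first shrinks the composite vector before it grows in the new direction, matching the behavior described in the following bullets.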
  • the values of the angle and speed (weighted speed) determined (calculated) by the engine unit 23 can be converted into parameters of the operation target character, such as its motion. Alternatively, they can be converted into direction and magnitude inputs of the virtual controller. This makes it possible to realize a virtual controller that is more responsive and better matches the user's intuition.
  • the addition vector is continuously added while the touch continues; when the finger is rubbed in one direction and then rubbed in the opposite direction, the magnitude of the composite vector first decreases, as shown by equation (13). After that, the composite vector becomes a vector oriented in the new direction with a larger norm, which fits the user's intuition.
  • the engine unit 23 is configured to continuously calculate the angle and the weighted speed based on touch events occurring over an extremely short time. Therefore, the direction and magnitude of the virtual controller can be input, or the parameters of the operation target character calculated, without using past touch coordinates as a reference point.
  • the electronic device 10 calculates the angle and magnitude without using the spatial concept of points, such as the start point (start coordinates) and end point (end coordinates) used by virtual controllers in the prior art.
  • because the input of the electronic device 10 does not depend on the distance the finger has moved from reference coordinates, the operation intended by the user can be realized with a smaller amount of finger movement. Therefore, it can be realized with a smaller mounting area than the prior art; for example, the same operability can be achieved regardless of the size of the touch panel 12.
  • a program that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowcharts, or a computer-readable storage medium storing the program, can also be used.
  • a method that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowcharts can also be used.
  • a server capable of supplying a computer with a program that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowcharts can also be used.
  • it can be a virtual machine that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowchart.
  • the predetermined function is not equation (12); instead, it is configured to determine the value corresponding to the input value by determining a constant whose magnitude depends on whether the input value is greater than or equal to a given threshold.
  • the predetermined function is the activation function f1(s, p) defined by equation (14). s and p are the same as for f(s, p) in equation (12): s, the output value of the CPD function, is the input, and p is the value output immediately before by f1(s, p). (14)
  • the activation function f1(s, p) takes s as input, compares it with the thresholds t1 and t2 (t1 < t2), and outputs (determines) one of the real number a1, the real number b1, and 1.0 (a1 < b1 < 1.0).
  • with s as input, the activation function f1(s, p) determines, by comparison with the thresholds t1 and t2, one of the real numbers a1, b1, and 1.0 as the value o, and outputs the larger of o and p.
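Equation (14) is not reproduced in this record, so the following sketch only encodes the comparison logic described above; the concrete threshold values t1, t2 and levels a1, b1 are illustrative assumptions:

```python
def f1(s, p, t1=0.3, t2=0.7, a1=0.25, b1=0.6):
    """Sketch of the stepwise activation f1(s, p): map s to one of
    a1 < b1 < 1.0 by comparison with the thresholds t1 < t2, then return
    the larger of that value o and the previous output p (ratchet)."""
    if s <= t1:
        o = a1
    elif s <= t2:
        o = b1
    else:
        o = 1.0
    return max(o, p)
```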
  • FIG. 11 is an example showing the relationship between the input s to the activation function f1(s, p) and the value o determined for the input s.
  • the output value of the CPD function, which reflects the speed of discontinuous finger movement, is converted into one of a plurality of stepwise values according to its magnitude, and thereby into a more stable output value.
  • the predetermined function is not equation (12); instead, it determines the value corresponding to the input value by applying a function that maps each input value to one value within a certain range.
  • the predetermined function is the activation function f2(s, p) defined by equation (15). s and p are the same as for f(s, p) in equation (12): s, the output value of the CPD function, is the input, and p is the value output immediately before by f2(s, p). (15)
  • FIG. 12 is an example showing the relationship between the input s to the activation function f2(s, p) and the value o determined for the input s.
  • the predetermined function is the activation function f2(s, p) defined by equation (16).
  • s and p are the same as for f(s, p) in equation (12): s, the output value of the CPD function, is the input, and p is the value output immediately before by f2(s, p).
  • FIG. 13 is an example showing the relationship between the input s to the activation function f2(s, p) and the value o determined for the input s.
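Equations (15) and (16) are not reproduced in this record. As one hedged illustration of "a function that maps each input value to one value within a certain range", a bounded monotone curve such as tanh can play the role of f2 (the choice of tanh is an assumption, not the patented definition), again combined with the ratchet behavior of keeping the previous output p:

```python
import math


def f2(s, p):
    """Sketch of a smooth bounded activation in the spirit of f2(s, p):
    map s into [0, 1) with a bounded one-to-one curve, and never output
    a value smaller than the previous output p."""
    o = math.tanh(max(s, 0.0))  # any nonnegative s maps into [0, 1)
    return max(o, p)
```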
  • the engine unit 23 holds the data point sequence in the second buffer at a predetermined timing, for example, at a predetermined time different from the time corresponding to the frame rate.
  • the engine unit 23 determines the displacement speed at a predetermined timing, for example, at a predetermined time different from the time corresponding to the frame rate, and determines the weighted speed.
  • the controller control unit 24 determines the converted weighted speed at a predetermined timing, for example, at a predetermined time different from the time corresponding to the frame rate.
  • the controller control unit 24 generates a composite vector at a predetermined timing, for example, at a predetermined time different from the time corresponding to the frame rate.
  • the first buffer includes a buffer A for determining the weighted speed and a buffer B for determining the angle.
  • the two buffers A and B are configured to have different retention times.
  • the engine unit 23 holds one or more data points held in the buffer A in the second buffer as a data point sequence.
  • the engine unit 23 defines a variable V; when the variance of the independent variable is V or more, the engine unit 23 calculates and determines the angle using the function aop(x, y), and when the variance of the independent variable is less than V, it determines the held angle without recalculating. A variance of the independent variable less than V means that the n data points P are locally concentrated; therefore, by setting the variable V, the engine unit 23 can ignore excessively fine finger movement and calculate a stable angle. For example, the variable V is set to 0.7.
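This variance gate can be sketched as follows (the function and parameter names are assumptions; V = 0.7 is the example value given above):

```python
def angle_or_hold(xs, held_angle, compute_angle, V=0.7):
    """Sketch of the variance gate: when the variance of the independent
    variable falls below V, the data points are locally concentrated, so
    the held angle is reused instead of recomputing; otherwise the angle
    is computed (here via the supplied compute_angle callable)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    if var < V:
        return held_angle  # ignore excessively fine finger movement
    return compute_angle()
```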
  • the engine unit 23 calculates the slope of the regression line using a known method other than the least squares method. In this case, the engine unit 23 does not determine the rotation amount indicating whether or not to rotate the slope of the determined regression line by 180 degrees, and accordingly does not determine the axes of the independent and dependent variables.
  • an algorithm such as a Kalman filter or a particle filter can be used.
  • the engine unit 23 does not define the variable Da and does not end the holding of data points in the first buffer that are determined to have exceeded the predetermined holding time. In this case, the engine unit 23 determines the angle by referring to the data points stored in a specific time window shifted by a predetermined time.
  • when the engine unit 23 acquires a touch event, it acquires a set of numerical values (x, y) consisting of two variables and does not associate the data point acquisition time t with the two variables.
  • in this case, the set of numerical values (x, y) is stored in the first buffer.
  • the engine unit 23 can store information corresponding to the data point acquisition time t in a memory area or the like of the storage device 14, outside the first buffer, and manage it in association with the data stored in the first buffer.
  • the processes and operations described above can be freely changed as long as no contradiction arises, such as a step using data that should not yet be available at that step.
  • the examples described above are examples for explaining the present invention, and the present invention is not limited to these examples.
  • the present invention can be carried out in various forms as long as it does not deviate from the gist thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides a method that can further improve operability with respect to the operation of an operation target via a touch panel. The present invention relates to a method executed by a computer equipped with a touch panel, the method comprising: a step of holding, once per predetermined period, one or more data points acquired on the basis of a touch event as a data point sequence; a step of determining a displacement speed corresponding to the speed at which a position where a touch event occurred moves in the data point sequence, based on a displacement of the data points in the held data point sequence, and determining a weighted speed of the latest determined displacement speed based at least on the deviation from the average value of the displacement speeds determined before it; and a step of determining a converted weighted speed for determining a parameter of the operation target in a virtual space by inputting the determined weighted speed into a prescribed function.
PCT/JP2021/017826 2020-05-15 2021-05-11 Procédé, programme et dispositif électronique WO2021230228A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180049658.XA CN115867364A (zh) 2020-05-15 2021-05-11 方法、程序和电子装置
US18/054,005 US20230117127A1 (en) 2020-05-15 2022-11-09 Method, program, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020085896A JP7143364B2 (ja) 2020-05-15 2020-05-15 方法、プログラム、及び電子装置
JP2020-085896 2020-05-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/054,005 Continuation US20230117127A1 (en) 2020-05-15 2022-11-09 Method, program, and electronic device

Publications (1)

Publication Number Publication Date
WO2021230228A1 true WO2021230228A1 (fr) 2021-11-18

Family

ID=78511621

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/017826 WO2021230228A1 (fr) 2020-05-15 2021-05-11 Procédé, programme et dispositif électronique

Country Status (4)

Country Link
US (1) US20230117127A1 (fr)
JP (1) JP7143364B2 (fr)
CN (1) CN115867364A (fr)
WO (1) WO2021230228A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006150039A (ja) * 2004-11-02 2006-06-15 Nintendo Co Ltd ゲーム装置及びゲームプログラム
JP6005831B1 (ja) * 2015-12-28 2016-10-12 株式会社Cygames プログラム及び情報処理方法
JP6389581B1 (ja) * 2018-05-16 2018-09-12 株式会社Cygames プログラム、電子装置、及び方法
JP2019133208A (ja) * 2018-01-29 2019-08-08 株式会社カプコン オブジェクト移動制御プログラムおよびゲームシステム
JP6560801B1 (ja) * 2018-09-26 2019-08-14 株式会社Cygames プログラム、電子装置、及び方法
JP2019187624A (ja) * 2018-04-20 2019-10-31 株式会社Cygames プログラム、電子装置、方法、及びシステム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4741909B2 (ja) * 2004-11-02 2011-08-10 任天堂株式会社 ゲーム装置及びゲームプログラム
JP6722099B2 (ja) * 2016-12-09 2020-07-15 株式会社ダイヘン 搬送システム


Also Published As

Publication number Publication date
US20230117127A1 (en) 2023-04-20
JP2021179898A (ja) 2021-11-18
JP7143364B2 (ja) 2022-09-28
CN115867364A (zh) 2023-03-28

Similar Documents

Publication Publication Date Title
JP6389581B1 (ja) プログラム、電子装置、及び方法
CN113168281B (zh) 计算机可读介质、电子装置和方法
US10620720B2 (en) Input controller stabilization techniques for virtual reality systems
US20090153468A1 (en) Virtual Interface System
JP4848515B2 (ja) アバター動作制御システム、そのプログラム及び方法
CN110215685B (zh) 游戏中的虚拟对象控制方法及装置、设备、存储介质
WO2023103615A1 (fr) Procédé et appareil de commutation d'objet virtuel, dispositif, support et produit-programme
US10073609B2 (en) Information-processing device, storage medium, information-processing method and information-processing system for controlling movement of a display area
WO2021230228A1 (fr) Procédé, programme et dispositif électronique
JP6862597B1 (ja) 方法、プログラム、及び電子装置
CN111007930B (zh) 温度控制方法、装置、存储介质及电子设备
CN114428548A (zh) 用户交互的光标的方法和系统
JP6775076B1 (ja) 方法、プログラム、及び電子装置
US9370713B2 (en) Game device, game control method, and game control program for controlling game in which character is moved in three dimensional space
JP7471782B2 (ja) プログラム、電子装置、及び方法
JP6824369B1 (ja) 方法、プログラム、及び電子装置
JP7250451B2 (ja) プログラム、電子装置、及び方法
JP7373090B1 (ja) 情報処理システム、情報処理装置、プログラム及び情報処理方法
CN113769384A (zh) 游戏中视野控制方法、装置、设备及存储介质
JP2019134880A (ja) プログラム及びゲーム装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21804611

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21804611

Country of ref document: EP

Kind code of ref document: A1