WO2015033967A1 - Emulation device, emulation method, program, and information storage medium - Google Patents


Info

Publication number
WO2015033967A1
WO2015033967A1 (PCT application PCT/JP2014/073220)
Authority
WO
WIPO (PCT)
Prior art keywords
input
input data
program
emulated
touch sensor
Prior art date
Application number
PCT/JP2014/073220
Other languages
English (en)
Japanese (ja)
Inventor
山本 徹
田中 利治
武 中川
Original Assignee
株式会社ソニー・コンピュータエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ソニー・コンピュータエンタテインメント (Sony Computer Entertainment Inc.)
Priority to JP2015535497A (patent JP5997388B2)
Publication of WO2015033967A1 publication Critical patent/WO2015033967A1/fr

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens

Definitions

  • the present invention relates to an emulation apparatus, an emulation method, a program, and an information storage medium.
  • Game devices are roughly classified into portable devices and stationary devices.
  • Many portable game devices include a touch sensor having a detection surface that sequentially detects the position of an object.
  • In such devices, operation input via the touch sensor is possible.
  • Operation input to a stationary game device, by contrast, is often performed via a controller.
  • Some such controllers have an operation stick.
  • Among operation sticks, some can be tilted from an upright state, and others can be slid vertically and horizontally.
  • A controller provided with an operation stick transmits to the game device a signal according to the direction and magnitude in which the operation stick is operated.
  • the inventors have developed an emulation device that emulates a game device provided with a touch sensor.
  • the game device to be emulated is referred to as a target device.
  • Programs executable on the target device can also be executed on the emulation device.
  • The inventors have made it possible to execute, by an operation on the operation stick, the processing that is executed when the position of an object is detected on the detection surface. By associating the direction in which the operation stick is operated with the direction in which the detection surface is traced by a finger or stylus, the feel of operating the stick becomes close to the feel of tracing the detection surface, and a good feel of operation is obtained.
  • In this way, the operation on the detection surface can be emulated with a good feel of operation. Further, even when the controller includes a touch sensor, the operation on the detection surface can also be emulated by the operation on the operation stick, which expands the range of available operations.
  • When the target device includes an operation stick, the same applies in the emulation device that emulates it: the operation on the detection surface of the touch sensor included in the target device can be emulated with a good feel of operation by the operation on the operation stick.
  • The present invention has been made in view of the above, and one of its objects is to provide an emulation apparatus, an emulation method, a program, and an information storage medium capable of emulating an operation on a detection surface with a good feel of operation.
  • An emulation apparatus according to the present invention emulates an operation on a detection surface that sequentially detects the position of an object, and includes: an input data acquisition unit that acquires input data associated with the direction in which an operation stick is operated; a determination unit that determines a relationship between a plurality of positions in the detection surface arranged along the direction according to the input data; and a process execution unit that executes the process to be performed when the plurality of positions arranged according to the determined relationship are sequentially detected on the detection surface.
  • The emulation method according to the present invention emulates an operation on a detection surface that sequentially detects the position of an object, and includes: an input data acquisition step of acquiring input data associated with the direction in which an operation stick is operated; a determination step of determining a relationship between a plurality of positions in the detection surface arranged along the direction according to the input data; and a process execution step of executing the process to be performed when the plurality of positions arranged according to the determined relationship are sequentially detected on the detection surface.
  • The program according to the present invention causes a computer emulating an operation on a detection surface that sequentially detects the position of an object to execute: a procedure of acquiring input data associated with the direction in which an operation stick is operated; a procedure of determining a relationship between a plurality of positions in the detection surface arranged along the direction according to the input data; and a procedure of executing the process to be performed when the plurality of positions arranged according to the determined relationship are sequentially detected on the detection surface.
  • The information storage medium according to the present invention is a computer-readable information storage medium storing a program that causes a computer emulating an operation on a detection surface that sequentially detects the position of an object to execute: a procedure of acquiring input data associated with the direction in which an operation stick is operated; a procedure of determining a relationship between a plurality of positions in the detection surface arranged along the direction according to the input data; and a procedure of executing the process to be performed when the plurality of positions arranged according to the determined relationship are sequentially detected on the detection surface.
  • According to the present invention, the relationship between a plurality of positions in the detection surface, arranged according to the input data, is determined, and the process that is executed when the plurality of positions arranged according to the determined relationship are sequentially detected on the detection surface is executed. The operation on the detection surface can thus be emulated with a good feel of operation.
  • In one aspect of the present invention, the input data acquisition unit acquires input data associated with both the direction and the magnitude in which the operation stick is operated, and the determination unit determines the relationship between the plurality of positions such that the larger the magnitude associated with the input data, the longer the interval between the positions.
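The aspect above can be sketched as arranging positions along the stick's tilt direction with a spacing proportional to the tilt magnitude. The function below is a minimal illustration, not the patent's implementation; the function name, argument names, and the `gain` scaling constant are hypothetical.

```python
import math

def positions_along_direction(start, direction_deg, magnitude, count, gain=4.0):
    """Arrange `count` positions along the direction in which the stick
    is tilted, spaced proportionally to the tilt magnitude (a larger
    magnitude yields a longer interval between successive positions)."""
    step = magnitude * gain  # hypothetical pixels-per-position scaling
    dx = math.cos(math.radians(direction_deg)) * step
    dy = math.sin(math.radians(direction_deg)) * step
    x, y = start
    points = []
    for _ in range(count):
        x += dx
        y += dy
        points.append((round(x), round(y)))
    return points
```

For example, a full tilt (magnitude 1.0) to the right from (0, 0) yields points spaced 4 pixels apart, while a half tilt halves that interval.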
  • In one aspect, the input data acquisition unit sequentially acquires the input data, and each time input data is acquired, the determination unit determines a position in the detection surface that is separated from the previously determined position along the direction determined based on the input data. The process execution unit then executes the processing that is executed when the positions determined multiple times by the determination unit are sequentially detected according to the acquisition order of the input data associated with them.
  • The emulation apparatus may further include a display control unit that causes an image to be displayed at a position in the display unit that substantially matches the position in the detection surface determined by the determination unit.
  • When the emulation device emulates an operation on a detection surface capable of multipoint detection, the input data acquisition unit may sequentially acquire input data associated with the direction in which each of a plurality of operation sticks is operated; each time input data is acquired, the determination unit may determine a plurality of positions in the detection surface, each associated with one of the operation sticks; and the process execution unit may execute the process that is performed when the plurality of positions determined multiple times by the determination unit are sequentially detected according to the acquisition order of the input data associated with them.
  • Another emulation apparatus according to the present invention emulates an input operation to a touch sensor, and includes: an execution unit that executes a program; a controller that receives an operation input from an operator; and a conversion unit that converts the received operation input into an operation input to the touch sensor, wherein the execution unit executes the program based on the operation input after conversion by the conversion unit.
  • Another emulation method according to the present invention emulates an input operation to a touch sensor, and includes: an execution step of executing a program; and a conversion step of converting an operation input received by a controller that receives operation input from an operator into an operation input to the touch sensor, wherein the execution step executes the program based on the operation input after conversion in the conversion step.
  • Another program according to the present invention causes a computer emulating an input operation to a touch sensor to execute: a procedure of executing a program to be executed; and a procedure of converting an operation input received by a controller that receives operation input from an operator into an operation input to the touch sensor, wherein the program to be executed is executed based on the operation input after conversion.
  • Another information storage medium according to the present invention is a computer-readable information storage medium storing a program that causes a computer emulating an input operation to a touch sensor to execute: a procedure of executing a program to be executed; and a procedure of converting an operation input received by a controller that receives operation input from an operator into an operation input to the touch sensor, wherein in the executing procedure, the program to be executed is executed based on the operation input after conversion.
  • The program to be executed is a program created on the assumption of an input operation on a touch sensor.
  • The received operation input is associated with at least one of the direction and the magnitude in which the operator is tilted.
  • The emulation apparatus may further include a display control unit that, in response to receiving an emulation start instruction for an input operation on the touch sensor, causes the display unit to display an image indicating a position that emulates the position at which the input operation on the touch sensor is performed.
  • FIG. 1 is a diagram showing an example of the entire configuration of an information processing system 10 according to an embodiment of the present invention.
  • an information processing system 10 according to the present embodiment includes an emulation device 12, a controller 14, and a display 16.
  • the emulation apparatus 12 is an information processing apparatus that emulates a portable information processing apparatus (see FIGS. 4A and 4B) described later.
  • The emulation apparatus 12 includes a control unit 20, a storage unit 22, a communication unit 24, an input / output unit 26, and a card slot 28.
  • the control unit 20 is, for example, a program control device such as a CPU that operates according to a program installed in the emulation apparatus 12.
  • the storage unit 22 is, for example, a storage element such as a ROM or a RAM, a hard disk drive, or the like.
  • the storage unit 22 stores programs executed by the control unit 20 and the like.
  • the communication unit 24 is, for example, a communication interface such as a network board or a wireless LAN module.
  • the input / output unit 26 is an input / output port such as a high-definition multimedia interface (HDMI (registered trademark)) port or a USB port.
  • the card slot 28 is a slot capable of inserting and removing information storage media such as various memory cards.
  • According to an instruction from the control unit 20, the card slot 28 reads programs and data recorded on an information storage medium, such as a memory card, inserted into it, and writes data to the inserted information storage medium.
  • Emulation device 12 may have a plurality of card slots 28, and the plurality of card slots 28 may be capable of inserting and removing information storage media conforming to different standards.
  • the controller 14 according to the present embodiment is an operation input device for performing an operation input to the emulation device 12.
  • FIG. 3A is a perspective view of an example of the controller 14 according to the present embodiment.
  • FIG. 3B is a plan view of the controller 14.
  • FIG. 3C is a rear view of the controller 14.
  • The controller 14 has grip portions GL and GR which project toward the near side (the lower side in FIG. 3B) from the left and right of the horizontally long main body. The user uses the controller 14 while holding the grips GL and GR with the left and right hands.
  • Buttons B1 to B4 and an operation stick SR are provided on the upper right side of the controller 14, at positions the user can operate with the right hand while holding the grip GR.
  • Buttons BL1 and BL2 are provided on the left side of the back of the controller 14 at positions where the user can operate the forefinger or middle finger in a state where the user grips the grip portion GL with the left hand.
  • Buttons BR1 and BR2 are provided on the back right side of the controller 14, at positions the user can operate with the index or middle finger while gripping the grip GR with the right hand. The controller 14 according to the present embodiment is also provided with other operators, such as additional buttons. The user can perform various operation inputs using the controller 14 by pressing the direction keys DK1 to DK4, the buttons B1 to B4, BL1, BL2, BR1, and BR2, or by tilting the operation sticks SL and SR. The controller 14 then outputs input data associated with these operation inputs to the emulation device 12.
  • the operation sticks SL and SR are stick-like operation members erected on the surface of the housing of the controller 14.
  • The operation sticks SL and SR can be tilted from the upright state in any direction, up to a predetermined angle.
  • Here, the housing longitudinal direction of the controller 14 is taken as the X1 axis direction (the right direction in FIG. 3B being positive), and the housing depth direction orthogonal to the X1 axis direction as the Y1 axis direction.
  • The attitudes (operation states) of the operation sticks SL and SR are expressed as attitude data (X1, Y1) representing their inclinations in the X1 axis direction and the Y1 axis direction.
  • Posture data according to the present embodiment is data included as a part of the above-described input data.
  • the emulation device 12 can grasp the current tilt state (posture) of the operation sticks SL and SR.
  • the operation sticks SL and SR are also configured as pressure-sensitive buttons, and can be pressed in the axial direction of the sticks.
  • controller 14 includes sensors such as a gyro sensor that detects an angular velocity and an acceleration sensor that detects an acceleration.
  • the controller 14 according to the present embodiment is provided with a USB port.
  • the controller 14 can output input data to the emulation device 12 by wire connection via the input / output unit 26 by connecting the emulation device 12 with the USB cable.
  • the controller 14 according to the present embodiment includes a wireless communication module or the like, and can output input data to the emulation device 12 wirelessly through the communication unit 24. Further, the controller 14 according to the present embodiment can receive power from the emulation device 12 connected by the USB cable.
  • the display 16 is a liquid crystal display, an organic EL display, or the like.
  • the emulation device 12 and the display 16 are connected via the input / output unit 26 by a cable such as a high-definition multimedia interface (HDMI (registered trademark)) cable.
  • FIGS. 4A and 4B show an example of a portable information processing apparatus to be emulated by the emulation apparatus 12 according to the present embodiment.
  • the information processing apparatus shown in FIGS. 4A and 4B is referred to as a target apparatus 30.
  • FIG. 4A is a front view of an example of the target device 30.
  • FIG. 4B is a rear view of the target device 30.
  • FIG. 5 is a diagram showing an example of the hardware configuration of the target device 30.
  • the emulation device 12 according to the present embodiment emulates the target device 30.
  • the target device 30 has a substantially rectangular flat plate shape as a whole.
  • the lateral direction (width direction) of the housing is taken as the X2 axis direction
  • the longitudinal direction (height direction) as the Y2 axis direction.
  • the direction from left to right when viewed from the front of the housing is the X2 axis positive direction
  • the direction from bottom to top when viewed from the front of the housing is the Y2 axis positive direction.
  • The target device 30 includes a control unit 40, a storage unit 42, a communication unit 44, an input / output unit 46, a card slot 48, a display unit 50, a touch sensor 52, operation keys 54, and a sensor unit 56.
  • The roles of the control unit 40, the storage unit 42, the communication unit 44, the input / output unit 46, and the card slot 48 are similar to those of the control unit 20, the storage unit 22, the communication unit 24, the input / output unit 26, and the card slot 28 in the emulation device 12, respectively.
  • the display unit 50 is a liquid crystal display, an organic EL display, or the like.
  • the touch sensor 52 is a sensor that sequentially detects the contact of an object (for example, a finger or the like) on the detection surface at predetermined time intervals.
  • the target device 30 according to the present embodiment includes two touch sensors 52 (a front surface touch sensor 52a and a rear surface touch sensor 52b).
  • a touch panel 58 in which the display unit 50 and the surface touch sensor 52a are integrated is provided on the front of the housing of the target device 30 according to the present embodiment.
  • The touch sensor 52 according to the present embodiment may be of any type capable of detecting the position of an object on the detection surface, such as a capacitive, pressure-sensitive, or optical type.
  • both the front surface touch sensor 52a and the back surface touch sensor 52b are multipoint detection type touch sensors capable of detecting the contact of an object at a plurality of positions (for example, up to eight positions).
  • the length in the vertical direction of the back surface touch sensor 52b is shorter than the length in the vertical direction of the front surface touch sensor 52a.
  • the back surface touch sensor 52b is offset upward with respect to the front surface touch sensor 52a.
  • the position of the upper edge of the front surface touch sensor 52a and the position of the upper edge of the back surface touch sensor 52b are substantially coincident in the vertical direction.
  • the length in the left-right direction is substantially the same between the front surface touch sensor 52a and the rear surface touch sensor 52b.
  • the front surface touch sensor 52 a and the rear surface touch sensor 52 b are located at the center of the target device 30 in the left-right direction.
  • the operation key 54 is a type of operation unit used for an operation input performed by the user on the target device 30.
  • direction keys DK1 (T) to DK4 (T) are disposed on the front left side of the housing.
  • the buttons B1 (T) to B4 (T) are disposed on the front right side of the housing.
  • Two operation sticks SL (T) and SR (T) are disposed on the left and right sides of the front of the housing.
  • The button BL1 (T) is disposed on the upper left side of the housing as viewed from the front of the housing.
  • the button BR1 (T) is disposed on the upper right side of the housing as viewed from the front of the housing.
  • the operation sticks SL (T) and SR (T) are slidable vertically and horizontally.
  • the sensor unit 56 is a device that detects the attitude of the target device 30.
  • the sensor unit 56 includes, for example, a three-axis gyro sensor, a three-axis motion sensor (three-axis acceleration sensor), and an electronic compass.
  • a program that can be executed by the target device 30 can be executed by the emulation device 12.
  • the program is hereinafter referred to as a target program.
  • the target program is supplied to the target device 30 via a memory card that can be inserted into the card slot 48 of the target device 30.
  • the target program is also supplied to the emulation device 12 via a memory card that can be inserted into the card slot 28 of the emulation device 12.
  • The emulation apparatus 12 acquires input data indicating operation input to the controller 14 at a predetermined frame rate (for example, every 1/60 second). Each time input data is acquired, the emulation apparatus 12 executes the target program to perform processing according to that input data at the predetermined frame rate. For example, data representing a screen corresponding to the input data is generated and output to the display 16 at the predetermined frame rate. In the present embodiment, the display 16 displays the screen represented by that data at a resolution of 960 pixels horizontally by 544 pixels vertically.
  • The target program includes a process of determining whether the device executing it is the target device 30 or the emulation device 12. According to the result of this determination, the target program decides whether to treat an operation input to the controller 14, or an operation input to the touch sensor 52 or the operation keys 54 of the target device 30, as the basis of the processing to be executed.
  • Whether or not the operation on the detection surface of the touch sensor 52 of the target device 30 can be emulated by the emulation device 12 can be set by performing a predetermined operation.
  • A target program being executed by the emulation device 12 may depend on an operation on the detection surface of the target device 30; that is, the target program may be a program created on the assumption of an input operation on the touch sensor. In such a case, if the operation on the detection surface of the target device 30 cannot be emulated in some way, the processing cannot proceed. In the present embodiment, even if such a situation occurs, the processing can proceed, because performing the predetermined operation described above enables emulation of the operation on the detection surface.
  • In the following, it is assumed that the emulation device 12 is set so that the operation on the detection surface of the touch sensor 52 of the target device 30 can be emulated by the emulation device 12.
  • The emulation device 12 can freely switch between two operation modes: the normal mode, in which the operation on the detection surface of the touch sensor 52 cannot be emulated, and the touch pointer mode, in which it can.
  • the normal mode is referred to as N mode
  • the touch pointer mode is referred to as TP mode.
  • When the operation mode is the N mode, the operation on the detection surface of the touch sensor 52 cannot be emulated, but operations on the operation sticks SL (T) and SR (T) can be emulated.
  • When the operation mode is the TP mode, the operation on the detection surface of the touch sensor 52 can be emulated, but operations on the operation sticks SL (T) and SR (T) cannot be emulated.
  • the operation mode at the start of execution of the target program is N mode.
  • In the N mode, pressing the direction keys DK1 to DK4 of the controller 14 executes the process that is executed on the target device 30 when the direction keys DK1 (T) to DK4 (T) are pressed. Likewise, pressing the buttons B1 to B4, BL1, or BL2 of the controller 14 executes the process that is executed when the buttons B1 (T) to B4 (T), BL1 (T), or BL2 (T) are pressed on the target device 30. Further, tilting the operation sticks SL and SR executes the process that is executed when the operation sticks SL (T) and SR (T) of the target device 30 are slid in the direction corresponding to the tilt.
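The N-mode correspondence just described can be summarized as a lookup table. The sketch below covers only the operators named in this description; the string labels and function name are illustrative, not from the patent.

```python
# N-mode correspondence between controller 14 operators and the
# target device 30 operators whose processing they trigger.
N_MODE_MAPPING = {
    "DK1": "DK1(T)", "DK2": "DK2(T)", "DK3": "DK3(T)", "DK4": "DK4(T)",
    "B1": "B1(T)", "B2": "B2(T)", "B3": "B3(T)", "B4": "B4(T)",
    "BL1": "BL1(T)", "BL2": "BL2(T)",
    "SL": "SL(T)", "SR": "SR(T)",  # a tilt emulates a slide of SL(T)/SR(T)
}

def emulated_operator(controller_operator):
    """Return the emulated target device operator, or None if unmapped."""
    return N_MODE_MAPPING.get(controller_operator)
```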
  • When the operation mode is the TP mode, the process that is executed by the target device 30 when the position of an object on the detection surface of its touch sensor 52 is detected by the touch sensor 52 is executed.
  • the position of the object on the detection surface of the touch sensor 52 of the target device 30 will be referred to as a detection position.
  • In the TP mode, an operation can be performed that sets the position in the emulation device 12 at which a detection position on the detection surface is emulated.
  • the position at which the detected position is emulated in the emulation device 12 will be referred to as an emulated position.
  • the emulated position is a position that emulates the position where the input operation on the touch sensor 52 is performed.
  • the emulated position and the detected position are associated on a one-to-one basis.
  • the emulated position is associated with the position in the display 16.
  • The emulated position associated with a detection position on the detection surface of the front surface touch sensor 52a is expressed by coordinate values whose origin is the lower-left corner of the display 16, with the right direction as the X3 axis positive direction and the upward direction as the Y3 axis positive direction.
  • the X3 coordinate value of the emulated position takes any integer value from 0 to 959
  • the Y3 coordinate value takes any integer value from 0 to 543.
  • the detection position of the detection surface of the front surface touch sensor 52a is managed by the X2-Y2 coordinate value whose origin is the lower left coordinate value of the front surface touch sensor 52a when viewed from the front of the housing.
  • The touch sensor 52 according to the present embodiment can specify the detection position in units of 0.5 pixels. Therefore, in the present embodiment, the X2 coordinate value of the detection position takes any integer value from 0 to 1918, and the Y2 coordinate value takes any integer value from 0 to 1086.
  • the relative position of the emulated position in the display 16 and the relative position of the detected position in the surface touch sensor 52a substantially coincide with each other.
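Given the coordinate ranges above (X3: 0 to 959 and Y3: 0 to 543 on the display; X2: 0 to 1918 and Y2: 0 to 1086 in 0.5-pixel units on the front surface touch sensor 52a), the requirement that the relative positions substantially coincide reduces the conversion to a factor of two. A minimal sketch under those assumptions; the function name is illustrative:

```python
def to_detection_position(x3, y3):
    """Convert an emulated position on the 960x544 display to the
    corresponding detection position on the front surface touch
    sensor 52a, which is addressed in 0.5-pixel units."""
    if not (0 <= x3 <= 959 and 0 <= y3 <= 543):
        raise ValueError("emulated position out of display bounds")
    # 1918 / 959 == 1086 / 543 == 2, so the mapping is a doubling.
    return x3 * 2, y3 * 2
```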
  • When the operation mode is the TP mode, a state in which the situation where a detection position is detected by the touch sensor 52 is emulated is referred to as the touch state, and a state in which the situation where the position of an object is not detected by the touch sensor 52 is emulated is referred to as the untouched state.
  • By performing a pressing operation, an emulated position enters the touch state, and by performing a releasing operation, it enters the untouched state.
  • When an emulated position whose coordinate value is (x, y) is in the touch state, the process that the target device 30 executes when the detection position whose coordinate value (x', y') is associated with (x, y) is detected by the front surface touch sensor 52a is executed. Therefore, for example, when the operation of pressing and releasing the predetermined button described above is performed once while the coordinate value of the emulated position is (x, y), the process performed by the target device 30 when a tap operation is performed on the detection position whose coordinate value is (x', y') is performed.
  • Similarly, when the above-mentioned predetermined button is pressed and released twice in succession while the coordinate value of the emulated position is (x, y), the process performed by the target device 30 when the corresponding operation is performed on the detection position whose coordinate value is (x', y') is performed.
  • the emulated position changes based on the acquired posture data (X1, Y1) for each frame. That is, the emulated position can be changed by tilting the operation stick.
  • the direction in which the operation stick is inclined is associated with the direction in which the emulated position changes. For example, when the operation stick is tilted in the X1 axis direction, the emulated position changes in the X3 axis direction. When the operation stick is inclined in the Y1 axis direction, the emulated position changes in the Y3 axis direction.
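The per-frame update of the emulated position can be sketched as below. The sign convention of the Y1 component, the `speed` constant, and the clamping to the display bounds are assumptions for illustration, not the patent's implementation.

```python
def update_emulated_position(pos, posture, speed=8.0):
    """Advance the emulated position by one frame based on the stick's
    posture data (X1, Y1): the tilt direction maps to the X3/Y3
    direction of motion.  `speed` is a hypothetical number of pixels
    per frame at full tilt; the result is clamped to the 960x544 display."""
    x1, y1 = posture  # each component assumed to lie in -1.0 .. 1.0
    x = min(max(pos[0] + x1 * speed, 0), 959)
    y = min(max(pos[1] + y1 * speed, 0), 543)
    return x, y
```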
  • the coordinate value of the emulated position in a certain frame is (x1, y1).
  • the coordinate value of the emulated position in the next frame is assumed to be (x2, y2).
  • the coordinate value of the emulated position in the next frame is (x3, y3).
  • the emulated position is assumed to remain in the touch state throughout.
  • the coordinate values of the detection positions on the detection surface of the surface touch sensor 52a corresponding to the coordinate values (x1, y1), (x2, y2), and (x3, y3) are assumed to be (x1', y1'), (x2', y2'), and (x3', y3').
  • a drag operation or flick operation in which the detection position is detected over three frames in the order of (x1 ', y1'), (x2 ', y2'), and (x3 ', y3') is emulated.
  • processing performed by the target device 30 is performed when the drag operation or the flick operation is performed.
  • a drag operation whose start point is the emulated position when a predetermined button is pressed and whose end point is the emulated position when the button is released is emulated.
  • the drag operation can thus be emulated by pressing the predetermined button, changing the emulated position while the button is held down, and then releasing the button.
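The press-move-release sequence for an emulated drag can be sketched as an event stream. The event names ("down"/"move"/"up") are illustrative; the text only specifies the sequence of detection positions handed to the target program.

```python
def emulate_drag(positions):
    """Turn a sequence of emulated positions, sampled once per frame while
    the predetermined button is held, into a touch-event stream for a drag:
    touch-down at the first position, moves through the intermediate ones,
    touch-up at the last. (Event names are illustrative sketches.)"""
    if not positions:
        return []
    events = [("down", positions[0])]
    events += [("move", p) for p in positions[1:]]
    events.append(("up", positions[-1]))
    return events

ev = emulate_drag([(100, 100), (120, 110), (140, 120)])
# → [('down', (100, 100)), ('move', (120, 110)),
#    ('move', (140, 120)), ('up', (140, 120))]
```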
  • two emulated positions can be set.
  • one emulated position is changed based on the posture data (X1, Y1) input from the operation stick SL
  • the other emulated position is changed based on the attitude data (X1, Y1) of the operation stick SR.
  • the emulated position that changes based on the posture data (X1, Y1) of the operation stick SL is referred to as the first emulated position
  • the emulated position that changes based on the posture data (X1, Y1) of the operation stick SR is referred to as the second emulated position.
  • by pressing a predetermined button, the first emulated position enters the touch state, and by performing the releasing operation, it enters the untouch state. Then, by pressing the button BR2, the second emulated position enters the touch state, and by releasing the button BR2, it enters the untouch state.
  • the touch state and the untouch state can be switched independently for each of the first emulated position and the second emulated position.
  • the coordinate value of the first emulated position in a certain frame is (x11, y11) and the coordinate value of the second emulated position is (x12, y12).
  • the coordinate value of the first emulated position in the next frame is (x21, y21), and the coordinate value of the second emulated position is (x22, y22).
  • the coordinate value of the first emulated position in the next frame is (x31, y31), and the coordinate value of the second emulated position is (x32, y32).
  • the coordinate values of the detection positions on the detection surface of the surface touch sensor 52a corresponding to the coordinate values (x11, y11), (x12, y12), (x13, y13) are assumed to be (x11', y11'), (x12', y12'), and (x13', y13').
  • the coordinate values of the detection positions on the detection surface of the front surface touch sensor 52a corresponding to the coordinate values (x21, y21), (x22, y22), (x23, y23) are assumed to be (x21', y21'), (x22', y22'), and (x23', y23').
  • FIG. 8 shows, for example, a situation where a pinch-in operation, that is, an operation for reducing the size of a displayed image, is emulated.
  • a pinch-out operation, that is, an operation for enlarging the displayed image, can likewise be emulated.
  • a pinch-in operation and the like can be emulated by pressing two predetermined buttons, shortening the distance between the two emulated positions while the two buttons are held down, and then releasing the two buttons.
  • a pinch-out operation and the like can be emulated by pressing two predetermined buttons, increasing the distance between the two emulated positions while the two buttons are held down, and then releasing the two buttons.
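A minimal sketch of classifying the emulated two-position gesture by the change in distance between the positions; the classification labels and the use of straight-line distance are assumptions, not from the text.

```python
import math

def classify_pinch(start_pair, end_pair):
    """Classify a two-finger gesture emulated with sticks SL and SR by the
    change in distance between the two emulated positions while the two
    predetermined buttons are held down. A simplified sketch: the labels
    and the straight-line distance metric are assumptions."""
    d0 = math.dist(*start_pair)  # distance at button press
    d1 = math.dist(*end_pair)    # distance at button release
    if d1 < d0:
        return "pinch-in"    # shrink the displayed image
    if d1 > d0:
        return "pinch-out"   # enlarge the displayed image
    return "none"

print(classify_pinch(((0, 0), (100, 0)), ((0, 0), (50, 0))))  # → pinch-in
```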
  • various operations on the detection surface can be emulated by the operations on the operation sticks SL and SR.
  • the amount of change in the emulated position is managed by a velocity parameter represented as a combination (vx, vy) of a value vx associated with the change amount of the value x and a value vy associated with the change amount of the value y.
  • each of the value vx and the value vy is expressed by a value of ⁇ 1 or more and 1 or less.
  • the acceleration parameter is represented as a combination (ax, ay) of a value ax associated with the change amount of the value vx and a value ay associated with the change amount of the value vy.
  • each of the value ax and the value ay is represented by a value of ⁇ 1 or more and 1 or less.
  • FIG. 9 is a view showing an example of the correspondence between the value of posture data and the value of the acceleration parameter.
  • the acceleration parameter values (ax, ay) are values associated with the acceleration at which the emulated position moves.
  • the relationship between the value X1 and the value ax is the same as the relationship between the value Y1 and the value ay.
  • when the value of the posture data X1 is within any of the ranges 0 to 15, 96 to 160, and 241 to 255, the value ax is 0.
  • likewise, when the value of the posture data Y1 is within any of the ranges 0 to 15, 96 to 160, and 241 to 255, the value ay is 0.
  • the numerical ranges of 0 to 15, 96 to 160, and 241 to 255 will be referred to as dead zones.
  • the numerical ranges of 16 to 95 and 161 to 240 will be referred to as sensitive zones.
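The dead-zone/sensitive-zone split can be sketched as a mapping from a posture-data value to an acceleration-parameter value. The zone boundaries are taken from the text; the linear ramp within the sensitive zones and the sign convention are assumptions, since FIG. 9 is not reproduced here.

```python
def acceleration_from_posture(v):
    """Map a posture-data value v (0..255) to an acceleration-parameter
    value in [-1, 1]. Dead/sensitive zone boundaries follow the text; the
    linear ramp inside the sensitive zones is an assumption."""
    if 0 <= v <= 15 or 96 <= v <= 160 or 241 <= v <= 255:
        return 0.0                   # dead zone: no movement
    if 16 <= v <= 95:                # sensitive zone, negative direction
        return -(96 - v) / 80.0      # v=16 → -1.0, v=95 → -0.0125
    if 161 <= v <= 240:              # sensitive zone, positive direction
        return (v - 160) / 80.0      # v=161 → 0.0125, v=240 → 1.0
    raise ValueError("posture value out of range")

print(acceleration_from_posture(128))  # → 0.0 (stick near neutral)
print(acceleration_from_posture(240))  # → 1.0 (stick tilted fully)
```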
  • the values (vx, vy) representing the velocity parameter are updated in accordance with the rules shown in FIG. 10. Then, based on the updated values (vx, vy), the coordinate value (x, y) of the emulated position is updated according to the rules shown in FIG. 10.
  • the value vx is updated to the current value of the value vx plus a value obtained by multiplying the value ax by 1/45.
  • the updated emulated position in the frame is determined based on the posture data (X1, Y1).
  • the amount of change in the emulated position per frame increases (see (A1) and (A2) in FIG. 10). Further, in the present embodiment, when the values X1 and Y1 of the posture data are within the dead-zone ranges, the emulated position does not change (see (C1) and (C2) in FIG. 10).
  • the absolute value of the value ax or ay of the acceleration parameter becomes larger as the magnitude (for example, the inclination) with which the operation stick is operated becomes larger (see FIG. 9). Therefore, the larger the magnitude with which the operation stick is operated, the larger the absolute value of the value vx (see (A1), (A2), (B1), and (B2) in FIG. 10). As a result, the larger the magnitude with which the operation stick is operated, the longer the distance between the emulated position in the immediately preceding frame and the emulated position in the current frame (see (F1) and (F2) in FIG. 10).
  • the values vx and vy are adjusted so as not to exceed the range of ⁇ 1 or more and 1 or less (see (D1), (D2), (E1), and (E2) in FIG. 10).
  • the value x is adjusted so as not to fall outside the range of 0 to 959; in this case, the value vx is set to 0 and the change of the emulated position in the X3 axis direction is suppressed (see (G1) in FIG. 10).
  • the value y is adjusted so as not to fall outside the range of 0 to 543; in this case, the value vy is set to 0 and the change of the emulated position in the Y3 axis direction is suppressed (see (G2) in FIG. 10).
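The per-frame update rules above (velocity incremented by the acceleration times 1/45, velocity clamped to [-1, 1], position clamped to the display, and the velocity component zeroed at an edge) can be sketched as one step function. The pixel scale applied to the velocity is an assumption, since the text does not state it.

```python
MAX_X, MAX_Y = 959, 543   # display coordinate ranges (from the text)
PIXELS_PER_FRAME = 10     # assumed scale from velocity to pixels

def step(x, y, vx, vy, ax, ay):
    """One per-frame update of the emulated position, following the rules
    described for FIG. 10: v += a/45, v clamped to [-1, 1], the position
    clamped to the display, and the velocity zeroed at an edge. The pixel
    scale applied to the velocity is an assumption."""
    vx = max(-1.0, min(1.0, vx + ax / 45))
    vy = max(-1.0, min(1.0, vy + ay / 45))
    x += vx * PIXELS_PER_FRAME
    y += vy * PIXELS_PER_FRAME
    if x < 0 or x > MAX_X:   # (G1): clamp and stop horizontal motion
        x, vx = min(max(x, 0), MAX_X), 0.0
    if y < 0 or y > MAX_Y:   # (G2): clamp and stop vertical motion
        y, vy = min(max(y, 0), MAX_Y), 0.0
    return x, y, vx, vy
```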
  • an emulated position can be set for the back surface touch sensor 52b instead of the front surface touch sensor 52a.
  • the detection position of the back surface touch sensor 52b is also managed by the above-mentioned X2-Y2 coordinate value.
  • the X2 coordinate value of the detection position takes an integer value of 0 or more and 1918 or less, and the Y2 coordinate value takes an integer value of 306 or more and 1086 or less.
  • setting of emulated positions on both sides of the front surface touch sensor 52a and the rear surface touch sensor 52b can also be performed.
  • a position associated with the same detection position, when viewed along the direction perpendicular to the X2-Y2 plane, is set as the emulated position for the detection surfaces of both the front surface touch sensor 52a and the back surface touch sensor 52b.
  • This is suitable, for example, for emulating an operation of pinching the front surface touch sensor 52a and the back surface touch sensor 52b with fingers and moving them. Such an operation is adopted, for example, in a program that simulates opening an envelope or picking up a character.
  • the back surface touch sensor 52b is offset upward with respect to the front surface touch sensor 52a. Therefore, when the Y3 coordinate value of the emulated position is small (here, for example, smaller than 153), no detection position exists on the detection surface of the back surface touch sensor 52b when viewed along the direction perpendicular to the X2-Y2 plane. Therefore, in this embodiment, when the Y3 coordinate value of the emulated position determined as described above is smaller than 153, it is corrected to 153 (see (G2) in FIG. 10).
  • the TP mode capable of setting the emulated position of the front surface touch sensor 52a will be referred to as a front surface TP mode.
  • a TP mode capable of setting an emulated position of the back surface touch sensor 52b is referred to as a back surface TP mode.
  • the TP mode capable of setting the emulated position for both sides is referred to as a two-sided TP mode.
  • the screen on which the pointer image I is arranged is displayed on the display 16 (see FIGS. 11 to 16).
  • the pointer image I indicates the emulated position.
  • the pointer image I is arranged such that the position of the center of gravity of the circular image included in the pointer image I is the emulated position.
  • the relative position of the emulated position in the display 16 substantially matches the relative position of the detected position in the detection plane.
  • the surface touch sensor 52a of the target device 30 and the display unit 50 have substantially the same size and shape. Therefore, when the operation mode is the front surface TP mode or the two-sided TP mode and the pointer image I is displayed on the display unit 50, the position of the pointer image I substantially overlaps the detection position.
  • the design of the displayed pointer image I differs depending on which of the front side TP mode, the back side TP mode, and the both side TP mode. Further, in the present embodiment, the design of the displayed pointer image I also differs depending on whether it is in the touch state or the untouched state.
  • FIG. 11 shows a screen in which, when the operation mode is the front surface TP mode, the pointer image Ia1(nt) is arranged at the first emulated position in the untouch state and the pointer image Ia2(nt) is arranged at the second emulated position in the untouch state.
  • FIG. 12 shows a screen in which, when the operation mode is the front surface TP mode, the pointer image Ia1(t) is arranged at the first emulated position in the touch state and the pointer image Ia2(t) is arranged at the second emulated position in the touch state.
  • a screen in which, when the operation mode is the two-sided TP mode, the pointer image Ic1(nt) is arranged at the first emulated position in the untouch state and the pointer image Ic2(nt) is arranged at the second emulated position in the untouch state is likewise shown.
  • a screen in which, when the operation mode is the two-sided TP mode, the pointer image Ic1(t) is arranged at the first emulated position in the touch state and the pointer image Ic2(t) is arranged at the second emulated position in the touch state is also shown. These screens are displayed on the display 16.
  • the pointer image I is displayed on the display 16 in a state of being superimposed on the screen displayed by executing the target program.
  • the images generated by executing the target program are omitted.
  • the user can freely switch between the N mode and the TP mode by operating the controller 14.
  • switching between the N mode and the TP mode will be described with reference to FIG.
  • FIG. 17 is a diagram showing an example of transition rules of the operation mode.
  • the N mode and the TP mode are switched according to the transition rule shown in FIG. 17. In the following description, it is assumed that all emulated positions are in the untouch state.
  • the operation mode is changed to the TP mode and one emulated position is set.
  • when the operation stick SL is pressed while the operation mode is the N mode
  • the surface TP mode is set.
  • the pointer image Ia1 is displayed on the display 16.
  • when the operation stick SR is pressed while the operation mode is the N mode
  • the back surface TP mode is set.
  • the pointer image Ib1 is displayed on the display 16.
  • the two-sided TP mode is set. In this case, the pointer image Ic1 is displayed on the display 16.
  • when a predetermined time has elapsed with no operation input on any of the operation members while two emulated positions are set
  • that is, when for the operation sticks SL and SR there is no operation input whose posture data values X1 and Y1 fall within the sensitive-zone ranges (see FIG. 9), and there is no operation input on the direction keys or buttons
  • the number of emulated positions that are set is reduced from two to one. In this case, for example, the pointer image Ia2, Ib2, or Ic2 is erased from the display 16.
  • the operation mode is the TP mode
  • the button B4 is pressed, or the operation stick SL or SR is pressed in the axial direction of the stick
  • the operation mode is changed to the N mode. In this case, all the pointer images I are erased from the display 16.
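The transition rules above can be sketched as a small state function. The trigger for the two-sided TP mode (pressing both sticks) is an assumption; the text states only the SL, SR, and B4/axial-press transitions explicitly.

```python
def next_mode(mode, sl_pressed, sr_pressed, b4_pressed):
    """Simplified sketch of the FIG. 17 transition rule. The combination
    that selects the two-sided TP mode (both sticks pressed) is an
    assumption; SL → front TP, SR → back TP, and B4 (or an axial stick
    press) → N come from the text."""
    if mode == "N":
        if sl_pressed and sr_pressed:
            return "two-sided TP"   # assumed trigger
        if sl_pressed:
            return "front TP"
        if sr_pressed:
            return "back TP"
        return "N"
    # any TP mode: B4 (or pressing a stick in its axial direction) exits
    if b4_pressed:
        return "N"
    return mode
```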
  • FIG. 18 is a functional block diagram showing an example of functions implemented by the emulation device 12 according to the present embodiment.
  • not all the functions shown in FIG. 18 need to be implemented in the emulation device 12 according to the present embodiment, and functions other than those shown in FIG. 18 may be implemented.
  • the emulation device 12 functionally includes, for example, an emulated position data storage unit 60, a velocity parameter data storage unit 62, an operation mode data storage unit 64, an input data acquisition unit 66, an operation mode setting unit 68, a determination unit 70, a display control unit 72, and an emulation process execution unit 74.
  • the emulated position data storage unit 60, the speed parameter data storage unit 62, and the operation mode data storage unit 64 are mainly implemented with the storage unit 22.
  • the input data acquisition unit 66 is mainly implemented with the communication unit 24 or the input / output unit 26.
  • the other functions are implemented mainly by the control unit 20.
  • the above functions are implemented by the control unit 20 executing a program that is installed in the emulation device 12, which is a computer, and that includes commands corresponding to the above functions.
  • This program is supplied to the emulation device 12 via a computer-readable information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or via a computer network such as the Internet.
  • the emulated position data storage unit 60 stores emulated position data indicating coordinate values of the emulated position.
  • the emulated position data storage unit 60 stores first emulated position data indicating coordinate values of the first emulated position and second emulated position data indicating coordinate values of the second emulated position.
  • initial values of the coordinate value of the first emulated position and the coordinate value of the second emulated position are, for example, coordinate values (480, 272) of the center of the display 16.
  • the velocity parameter data storage unit 62 stores velocity parameter data indicating velocity parameters associated with the amount of change in the emulated position per frame.
  • the velocity parameter data storage unit 62 stores, for example, first velocity parameter data indicating the value of the velocity parameter for the first emulated position and second velocity parameter data indicating the value of the velocity parameter for the second emulated position.
  • initial values of the first velocity parameter and the second velocity parameter are, for example, (0, 0).
  • the operation mode data storage unit 64 stores operation mode data indicating the operation mode and the number of emulated positions to be set.
  • when the operation mode is the N mode, the value of the operation mode data is, for example, "N".
  • when the operation mode is the front surface TP mode, the value of the operation mode data is, for example, "front surface TP1" when one emulated position is set, and "front surface TP2" when two are set.
  • when the operation mode is the back surface TP mode, the value of the operation mode data is, for example, "back surface TP1" when one emulated position is set, and "back surface TP2" when two are set.
  • when the operation mode is the two-sided TP mode, the value of the operation mode data is, for example, "both sides TP1" when one emulated position is set, and "both sides TP2" when two are set.
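The naming convention for the operation mode data can be sketched as a small helper; the strings follow the example values quoted in the text, while the helper itself and its argument names are illustrative.

```python
def mode_data_value(mode, n_positions):
    """Compose the operation-mode-data value from the mode and the number
    of set emulated positions, following the example strings in the text
    ("N", "front surface TP1", "both sides TP2", ...). The function and its
    argument names are illustrative sketches."""
    if mode == "N":
        return "N"
    prefix = {"front": "front surface TP",
              "back": "back surface TP",
              "both": "both sides TP"}[mode]
    return f"{prefix}{n_positions}"

print(mode_data_value("front", 1))  # → front surface TP1
print(mode_data_value("both", 2))   # → both sides TP2
```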
  • the input data acquisition unit 66 acquires input data associated with operation input to the controller 14. In the present embodiment, the input data acquisition unit 66 sequentially acquires input data at a predetermined frame rate. The input data acquisition unit 66 acquires, for example, input data including posture data associated with at least one of the direction and the magnitude in which an operation stick is operated; more specifically, for example, posture data associated with at least one of the direction and the magnitude in which the operation stick is tilted. In the present embodiment, the input data acquisition unit 66 acquires, for example, the posture data (X1, Y1) of the operation stick SL and the posture data (X1, Y1) of the operation stick SR. The input data acquisition unit 66 also acquires data indicating whether or not the direction keys DK1 to DK4 and the buttons B1 to B4, BL1, BL2, BR1, BR2, and the like are pressed.
  • in response to the acquisition of input data, the operation mode setting unit 68 determines the operation mode and the number of emulated positions in the frame based on the input data, in accordance with the transition rule shown in FIG. 17.
  • the operation mode setting unit 68 holds transition rule data indicating transition rules of the operation mode shown in FIG. Then, based on the transition rule data and the input data acquired by the input data acquisition unit 66, the operation mode setting unit 68 determines the operation mode and the number of emulated positions in the frame. Then, the operation mode setting unit 68 updates the value of the operation mode data stored in the operation mode data storage unit 64.
  • the determination unit 70 determines the relationship between a plurality of positions in the detection surface that are arranged along a direction corresponding to the input data acquired by the input data acquisition unit 66. In the present embodiment, the determination unit 70 determines the emulated position based on the input data acquired by the input data acquisition unit 66, and plays the role of a conversion unit that converts an operation input received by the controller 14 into an operation input to the touch sensor 52. Further, in the present embodiment, each time input data is acquired, the determination unit 70 determines, as the emulated position in the frame, a position away from the previously determined emulated position along the direction determined based on the posture data. Note that the determination unit 70 may determine a plurality of emulated positions.
  • the determination unit 70 determines the detection position based on the emulated position.
  • the determination unit 70 may determine a plurality of detection positions each time input data is acquired.
  • based on the input data, the determination unit 70 may determine, for example, the position vector of the emulated position in the frame with the previously determined emulated position as the start point; specifically, for example, the orientation of the emulated position in the frame with respect to the previously determined emulated position and the length between the previously determined emulated position and the emulated position in the frame may be determined.
  • the determination unit 70 holds posture-acceleration correspondence data indicating the correspondence between the value of the posture data and the value of the acceleration parameter, as shown in FIG. 9. Then, the determination unit 70 determines the value of the acceleration parameter based on, for example, the posture-acceleration correspondence data and the posture data acquired by the input data acquisition unit 66.
  • the acceleration parameter whose value is determined based on the value of the posture data of the operation stick SL is called the first acceleration parameter
  • the acceleration parameter whose value is determined based on the value of the posture data of the operation stick SR is called the second acceleration parameter.
  • the determination unit 70 holds determination rule data indicating the rules for determining the value of the velocity parameter and the coordinate value of the emulated position, as shown in FIG. 10. Then, the determination unit 70 determines the value of the velocity parameter and the coordinate value of the emulated position based on, for example, the determination rule data and the determined value of the acceleration parameter. Then, the determination unit 70 updates the value of the velocity parameter indicated by the velocity parameter data stored in the velocity parameter data storage unit 62 and the coordinate value of the emulated position indicated by the emulated position data stored in the emulated position data storage unit 60.
  • the determination unit 70 holds coordinate correspondence data indicating the correspondence between the coordinate value of the emulated position and the coordinate value of the detection position on the detection surface of the touch sensor 52. Then, when there is an emulated position in the touch state, the determination unit 70 determines the detection position associated with that emulated position.
  • the determination unit 70 also performs the correction of the Y3 coordinate value of the emulated position indicated by the emulated position data, which may be necessary when the operation mode is the two-sided TP mode (see FIG. 10).
  • the display control unit 72 causes the display 16 to display the pointer image I in response to the determination of the emulated position.
  • the display control unit 72 causes the display 16 to display a screen in which the pointer image I, which is determined based on the operation mode data and on whether the emulated position is in the touch state, is arranged at the position specified based on the emulated position data.
  • the relative position of the emulated position in the display 16 and the relative position of the detected position in the surface touch sensor 52a substantially coincide with each other. Therefore, the display control unit 72 causes the pointer image I to be displayed at a position in the display 16 that substantially matches the relative position of the detection position in the detection plane.
  • the display control unit 72 causes the pointer image I to be displayed on the display 16 in response to receiving an instruction to start emulating input operations to the touch sensor 52, for example, an instruction to shift the operation mode from the N mode to the TP mode.
  • the instruction corresponds to, for example, a press of at least one of the operation stick SL and the operation stick SR received when the operation mode is the N mode.
  • the emulation process execution unit 74 executes a process that is executed when a plurality of positions arranged according to the relationship determined by the determination unit 70 are sequentially detected on the detection surface.
  • the emulation process execution unit 74 executes the process executed by the target device 30 when the touch sensor 52 detects the detection position determined by the determination unit 70.
  • the emulation process execution unit 74 executes the process that the target device 30 executes when detection positions arranged along the direction determined based on the input data acquired by the input data acquisition unit 66 are sequentially detected. Further, as described above, the larger the magnitude with which the operation sticks SL and SR associated with the input data are operated, the longer the interval between the detection positions detected in order. Further, in the present embodiment, the emulation process execution unit 74 executes the target program based on the operation input to the touch sensor 52 determined by the determination unit 70.
  • when both the first emulated position and the second emulated position are in the touch state (see FIGS. 12, 14 and 16), the process that the target device 30 executes when the touch sensor 52 detects the detection positions corresponding to each of them is executed.
  • a plurality of detection positions are determined continuously over a plurality of frames, that is, a plurality of times.
  • in this case, the process that the target device 30 performs when the plurality of detection positions determined for the respective frames are sequentially detected in the acquisition order (frame order) of the input data associated with them is executed (see FIG. 8).
  • the process executed by the target device 30 is executed when the detection position of the detection surface of the surface touch sensor 52a is detected.
  • the processing executed by the target device 30 is executed when the detection position of the detection surface of the back surface touch sensor 52 b is detected.
  • the process executed by the target device 30 is executed when the detection position of the detection surface for both the front surface touch sensor 52a and the rear surface touch sensor 52b is detected.
  • the input data acquisition unit 66 acquires input data (S101). Then, the operation mode setting unit 68 determines the operation mode and the number of emulated positions based on the input data and the transition rule data (S102). Then, the operation mode setting unit 68 updates the value of the operation mode data stored in the operation mode data storage unit 64 so that it takes the value corresponding to the operation mode and the number of emulated positions determined in the process shown in S102 (S103).
  • the operation mode setting unit 68 confirms whether the value of the operation mode data is "N" (S104).
  • when the value of the operation mode data is "N", that is, when the operation mode is the N mode (S104: Y)
  • the emulation process execution unit 74 executes the process that the target device 30 executes when it receives the input data acquired in the process shown in S101 (S105), and the processing shown in this processing example ends. In this case, operations on the operation sticks SL(T) and SR(T) of the target device 30 can be emulated.
  • when the value of the operation mode data is not "N" (S104: N), the determination unit 70 determines the value of the acceleration parameter based on the posture-acceleration correspondence data and the posture data acquired in the process shown in S101 (S106). At this time, when the value of the operation mode data is "front surface TP1", "back surface TP1", or "both sides TP1", the value of the first acceleration parameter is determined. When the value of the operation mode data is "front surface TP2", "back surface TP2", or "both sides TP2", the values of the first acceleration parameter and the second acceleration parameter are determined.
  • then, the value of the velocity parameter in the frame is determined (S107).
  • the determination unit 70 updates the value of the velocity parameter indicated by the velocity parameter data stored in the velocity parameter data storage unit 62 to the value determined in the process shown in S107 (S108).
  • when the value of the operation mode data is "front surface TP1", "back surface TP1", or "both sides TP1", the value of the first velocity parameter is determined and updated
  • when it is "front surface TP2", "back surface TP2", or "both sides TP2", the values of the first velocity parameter and the second velocity parameter are determined and updated.
  • then, the determination unit 70 determines the coordinate value of the emulated position in the frame (S109), and updates the coordinate value of the emulated position indicated by the emulated position data stored in the emulated position data storage unit 60 to the coordinate value determined in the process shown in S109 (S110). At this time, when the value of the operation mode data is "front surface TP1", "back surface TP1", or "both sides TP1", the coordinate value of the first emulated position is determined and updated.
  • when it is "front surface TP2", "back surface TP2", or "both sides TP2", the coordinate values of the first emulated position and the second emulated position are determined and updated.
  • when the operation mode is the two-sided TP mode, the Y3 coordinate value of the emulated position determined in the process shown in S109 is corrected to 153 if it is smaller than 153.
  • the display control unit 72 causes the display 16 to display a screen in which the pointer image I determined based on the operation mode data is located at the position specified based on the emulated position data (S111).
  • the display control unit 72 outputs, for example, data representing the screen to the display 16.
  • the display 16 displays the screen represented by the data.
  • when the value of the operation mode data is "front surface TP1", "back surface TP1", or "both sides TP1", one pointer image I is arranged on the screen
  • when the value of the operation mode data is "front surface TP2", "back surface TP2", or "both sides TP2", two pointer images I are arranged on the screen.
  • then, the determination unit 70 determines the detection position associated with each emulated position in the touch state (S112). Then, the emulation process execution unit 74 executes the process that the target device 30 executes when the detection position determined in the process shown in S112 is detected (S113), and ends the processing shown in the present processing example.
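The S101-S113 flow can be summarized as a per-frame sketch. The class and method names are illustrative; only the order of steps comes from the text.

```python
class FrameSketch:
    """Minimal, illustrative walk-through of the S101-S113 flow. Everything
    here is a sketch: the text defines the order of the steps, not this
    API; the log entries stand in for the real processing."""

    def __init__(self):
        self.mode = "N"
        self.log = []

    def process_frame(self, input_data):
        # S101 is the arrival of input_data; S102-S103 set the mode (stubbed)
        self.mode = input_data.get("mode", self.mode)
        if self.mode == "N":                             # S104
            self.log.append("target-stick-input")        # S105: pass-through
            return
        self.log.append("determine-acceleration")        # S106
        self.log.append("update-velocity")               # S107-S108
        self.log.append("update-emulated-position")      # S109-S110
        self.log.append("display-pointer")               # S111
        self.log.append("determine-detection-position")  # S112
        self.log.append("target-touch-input")            # S113
```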
  • the present invention is not limited to the above-described embodiment.
  • the operation sticks SL and SR provided on the controller 14 need not be tiltable sticks; they may be operation members that can be slid vertically and horizontally. In that case, the posture data included in the input data may be associated with the direction and magnitude of the slide operation.
  • the present embodiment may be applied to other than the emulation device 12 that emulates the target device 30.
  • the operation on the touch sensor 52 may be emulated by operations on the operation sticks SL(T) and SR(T).
  • the emulation device 12 may incorporate the display 16. The emulation device 12 may also be composed of a plurality of casings. Further, the specific character strings and numerical values described above and those in the drawings are merely examples, and the present invention is not limited to these character strings and numerical values.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an emulation device, an emulation method, a program, and an information storage medium that make it possible to emulate operations on a detection surface with a good operational feel. An input data acquisition unit (66) acquires input data associated with the direction in which an operation stick has been operated. A determination unit (70) determines a plurality of positions on the detection surface, arranged along a direction corresponding to the input data. An emulation process execution unit (74) executes the process to be executed when the determined plurality of positions are detected in order on the detection surface.
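The abstract's core idea, converting a stick's operated direction into an ordered sequence of detection-surface positions so that detecting them in order emulates a swipe, can be sketched as follows. The function and parameter names are hypothetical, not taken from the patent:

```python
import math

# Illustrative sketch: generate `count` detection-surface positions arranged
# in order along the direction given by `angle_rad`, starting at `start`.
# Detecting these positions in sequence emulates a swipe in that direction.


def positions_along_direction(start, angle_rad, step, count):
    dx, dy = math.cos(angle_rad), math.sin(angle_rad)
    return [(start[0] + i * step * dx, start[1] + i * step * dy)
            for i in range(count)]
```

Feeding these positions to an emulated touch handler one per frame would reproduce a drag in the stick's direction on the emulated touch sensor.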
PCT/JP2014/073220 2013-09-06 2014-09-03 Emulation device, emulation method, program, and information storage medium WO2015033967A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015535497A JP5997388B2 (ja) 2013-09-06 2014-09-03 Emulation device, emulation method, program, and information storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-185636 2013-09-06
JP2013185636 2013-09-06

Publications (1)

Publication Number Publication Date
WO2015033967A1 (fr) 2015-03-12

Family

ID=52628436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/073220 WO2015033967A1 (fr) 2013-09-06 2014-09-03 Emulation device, emulation method, program, and information storage medium

Country Status (2)

Country Link
JP (1) JP5997388B2 (fr)
WO (1) WO2015033967A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001290592A (ja) * 2000-04-10 2001-10-19 Sharp Corp 指装着型ポインティングデバイス
JP2001306248A (ja) * 2000-04-20 2001-11-02 Cyber Sign Japan Inc 署名入力機能を備えた電子機器
JP2009205685A (ja) * 2008-02-26 2009-09-10 Apple Inc シングルポインティングデバイスによるマルチポイントジェスチャのシミュレーション
JP2011053770A (ja) * 2009-08-31 2011-03-17 Nifty Corp 情報処理装置及び入力処理方法
JP2011257992A (ja) * 2010-06-09 2011-12-22 Fujitsu Component Ltd 変換装置及びプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011107911A (ja) * 2009-11-16 2011-06-02 Fujitsu Component Ltd プログラム、情報処理装置、及び情報処理システム


Also Published As

Publication number Publication date
JPWO2015033967A1 (ja) 2017-03-02
JP5997388B2 (ja) 2016-09-28

Similar Documents

Publication Publication Date Title
US9454834B2 (en) Storage medium storing image processing program for implementing controlled image display according to input coordinate, and information processing device
KR101366813B1 (ko) Control method of information input device, information input device, program, and information storage medium
US8098879B2 (en) Information processing device, image movement instructing method, and information storage medium
US9433857B2 (en) Input control device, input control method, and input control program
US9223422B2 (en) Remote controller and display apparatus, control method thereof
US7922588B2 (en) Storage medium having game program stored thereon and game apparatus
JP5808712B2 (ja) Video display device
US20060214924A1 (en) Touch input program and touch input device
US8669937B2 (en) Information processing apparatus and computer-readable medium
WO2012043079A1 (fr) Dispositif de traitement d'informations
JP2006146556A (ja) Image display processing program and image display processing device
JP2011237838A (ja) Program, information input device, and control method thereof
JP2001043010A (ja) Pointing device using an image to generate a pointing signal
JP6483556B2 (ja) Operation recognition device, operation recognition method, and program
JP5374564B2 (ja) Drawing device, drawing control method, and drawing control program
JP5299892B2 (ja) Display control program and information processing device
JP6270495B2 (ja) Information processing device, information processing method, computer program, and storage medium
JP6100497B2 (ja) Information processing program, information processing device, information processing system, and image display method
EP2538308A2 (fr) Commande basée sur le mouvement d'un dispositif commandé
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
WO2015025874A1 (fr) Cursor position control device, cursor position control method, program, and information storage medium
WO2015033967A1 (fr) Emulation device, emulation method, program, and information storage medium
JP5841023B2 (ja) Information processing device, information processing method, program, and information storage medium
KR20100062628A (ko) Mobile terminal and method for providing virtual keyboard using the same
WO2023221929A1 (fr) Image display method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14842283

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015535497

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14842283

Country of ref document: EP

Kind code of ref document: A1