WO2021220816A1 - Control device, operation device, and operation system ("Dispositif de commande, dispositif d'exploitation et système d'exploitation") - Google Patents

Publication number: WO2021220816A1
Authority: WIPO (PCT)
Application number: PCT/JP2021/015470
Other languages: English (en), Japanese (ja)
Inventors: 智仁 山崎, 幹生 岩村, 昌崇 久我
Original Assignee / Application filed by: 株式会社NTTドコモ (NTT DOCOMO, INC.)
Priority to JP2022517624A, priority patent JPWO2021220816A1/ja
Publication of WO2021220816A1
Prior art keywords: information, pointer, force, unit, generates

Classifications

    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer (G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING; G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements)
    • G06F 3/0338 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G06F 3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention relates to a control device, an operation device, and an operation system.
  • Patent Document 1 discloses the following control device. When the position of the pointer is far from the center position of the button, the control device sets the force that draws the operating device toward the center position of the button to zero. When the position of the pointer approaches the center position of the button, the control device causes the operating device to generate a force that draws the operating device toward the center position of the button.
  • A control device according to one aspect is provided for an operating device that is used to operate a pointer displayed on a display unit in accordance with an operation performed by a user, and that generates a force applied to the user in response to the operation. The control device includes: a first acquisition unit that acquires attitude information regarding the attitude of the operating device; a first generation unit that generates first information indicating the position of the pointer on the display unit based on the attitude information; a second generation unit that generates second information regarding one or both of the speed of the pointer and the acceleration of the pointer based on the attitude information; and a control unit that, when the position of the pointer indicated by the first information is within a first region of the display unit, determines the magnitude of the force based on the second information.
  • An operating device according to another aspect is used to operate a pointer displayed on a display unit, and includes: a force generation unit that generates a force applied to the user in response to an operation performed by the user; a first generation unit that generates first information indicating the position of the pointer on the display unit; a second generation unit that generates second information regarding one or both of the speed of the pointer and the acceleration of the pointer; and a control unit that, when the position of the pointer indicated by the first information is within a first region of the display unit, controls the magnitude of the force based on the second information.
  • An operation system according to yet another aspect includes an operation device used to operate a pointer displayed on a display unit, and a control device for causing the operation device to generate a force applied to the user in response to an operation performed by the user. The control device includes: an acquisition unit that acquires attitude information regarding the posture of the operation device from the operation device; a first generation unit that generates, based on the attitude information, first information indicating the position of the pointer on the display unit; a second generation unit that generates, based on the attitude information, second information regarding one or both of the speed of the pointer and the acceleration of the pointer; a control unit that generates control information based on the first information and the second information; and a first transmission unit that transmits the control information to the operation device. The operation device includes: a posture information generation unit that generates the attitude information; a second transmission unit that transmits the attitude information to the control device; a receiving unit that receives the control information from the control device; and a force generation unit that generates the force based on the control information.
  • the operability of the operating device used for operating the pointer can be improved.
  • The drawings include: a diagram showing the outline of the operation system SYS; a perspective view showing the appearance of the image display device 1 according to the first embodiment; a block diagram showing the configuration of the image display device 1; flowcharts showing the operation of the image display device 1, the posture information transmission processing, and the vibration generation processing; a diagram showing an adjustment example of the magnitude A of the vibration force in the third modification; a diagram showing the functions of the operation system SYS in the sixth modification; and a block diagram showing the configuration of the image display device 1f in the seventh modification.
  • FIG. 1 is a diagram showing an outline of an operation system SYS.
  • The operation system SYS is a system that uses AR (Augmented Reality) technology to display an image in which an auxiliary image, visually recognized by the user U as a virtual image, is superimposed on an outside world image.
  • the operation system SYS has an image display device 1 and an operation device 2.
  • the image display device 1 is attached to the head of the user U.
  • the image display device 1 is a see-through type head-mounted display that displays an image obtained by superimposing an auxiliary image on an external image.
  • the outside world image may be a real outside world image or a virtual outside world image obtained by capturing the surroundings of the user U.
  • a method using an actual external image is called an optical see-through method.
  • a method using a virtual outside world image obtained by photographing the surroundings of the user U is called a video see-through method.
  • the image display device 1 employs an optical see-through method.
  • the image display device 1 is an example of a “control device”.
  • the operating device 2 is gripped by the user U.
  • the operation device 2 is a device that accepts an operation performed by the user U by changing the posture of the operation device 2.
  • the user U is using the operation system SYS outdoors.
  • FIG. 2 is a perspective view showing the appearance of the image display device 1 according to the first embodiment.
  • the image display device 1 includes a temple 94L, a temple 94R, a bridge 96, a projection optical system 98L, a projection optical system 98R, and a sound output device 150.
  • When the left and right elements are distinguished from each other, suffixes such as the "L" in temple 94L and the "R" in temple 94R are used.
  • When the left and right elements are not distinguished, only the common reference numeral without a suffix, such as temple 94, is used.
  • The temple 94 is a rod-shaped part supported by the pinna.
  • the bridge 96 is arranged between the projection optical system 98L and the projection optical system 98R.
  • the projection optical system 98 includes a display device 140, a light guide path 981, and a half mirror 982.
  • the display device 140 is arranged in the temple 94.
  • the display device 140 displays an image.
  • The display device 140 includes a display panel such as a liquid crystal display panel or an organic EL (Electro Luminescence) display panel.
  • the half mirror 982 reflects the light guided by the light guide path 981.
  • the light reflected by the half mirror 982 is projected onto the retina of the user U. With this light, the user U recognizes the image.
  • the half mirror 982 has a surface facing the user. Hereinafter, this surface is referred to as a “display surface SC”.
  • the display surface SC is an example of a “display unit”.
  • the sound output device 150 is arranged on the side surface of the temple 94.
  • the sound output device 150 outputs sound.
  • FIG. 3 is an example of an image displayed on the display surface SC.
  • the display surface SC displays an image in which the pointer P and the character input image INR are superimposed on the actual outside world image.
  • the character input image INR is an example of an auxiliary image.
  • the image displayed on the display surface SC is hereinafter referred to as a “display image”.
  • the pointer P is superimposed on the character input image INR.
  • the display surface SC is a flat surface or a curved surface.
  • the user U can see an image in which the character input image INR and the pointer P are displayed in the actual space.
  • the position of the character input image INR recognized by the user U is determined by the shape of the light guide path 981 and the shape of the half mirror 982.
  • the display surface SC is a flat surface.
  • the X-axis and the Y-axis on the display surface SC are orthogonal to each other.
  • the character input image INR has an input character area Q and a plurality of buttons Z for character input.
  • In order to reduce the complexity of the drawing, only some of the plurality of buttons Z are designated with reference numerals.
  • the input character area Q is an area for displaying the character string input by the user U.
  • the button Z is an image showing that a predetermined process corresponding to the button Z is executed by a tap operation on the button Z.
  • the predetermined process corresponding to the button Z is, for example, a process of writing "1" in the input character area Q, a process of deleting one character string displayed in the input character area Q, and the like.
  • the button Z has an image or a symbol suggesting the content of the predetermined process according to the button Z.
  • FIG. 3 shows a situation in which a user who intends to input "123" has input up to "12".
  • FIG. 3 further shows that the pointer P is located at the button Z indicating "3".
  • the position of the pointer P is determined according to the posture of the operating device 2.
  • the image display device 1 performs a predetermined process according to the button Z indicating "3".
  • the image display device 1 displays the display image indicated by the generated image information on the display surface SC.
  • The selection of the button Z is assisted by the operation device 2 generating vibration according to the operation (such as an operation of changing the posture of the operation device 2) performed by the user U on the operation device 2.
  • the operation device 2 generates vibration based on the control information CI transmitted from the image display device 1.
  • The vibration force, which is the force of the vibration corresponding to the operation performed by the user U on the operation device 2, is an example of "the force applied to the user according to the operation performed by the user".
  • the "force given to the user” is also referred to as the "force acting on the user”.
  • a technique for giving feedback to the skin sensation of the user U by applying a force to the user U in response to an operation performed by the user U is called "haptics".
  • FIG. 4 is a diagram showing an operation example of the operation device 2 and the magnitude A of the vibration force.
  • the position of the pointer P changes according to the posture of the operating device 2. More specifically, the position of the pointer P changes according to the angle of the current longitudinal direction L1 of the operating device 2 with respect to the longitudinal direction of the operating device 2 at the time of initial setting.
  • FIG. 5 is a diagram showing the longitudinal direction of the operating device 2 at the time of initial setting.
  • the operation device 2 shifts to the initial setting mode for storing the longitudinal direction of the operation device 2.
  • The operating device 2 specifies the longitudinal direction LIN of the operating device 2 in the three-dimensional space.
  • The operating device 2 stores information indicating the specified longitudinal direction LIN.
  • The longitudinal direction LIN of the operating device 2 in the three-dimensional space can be specified based on the acceleration applied to the operating device 2.
  • the operating device 2 specifies the direction LV and the direction LH.
  • the direction LV is a direction perpendicular to the longitudinal direction LIN.
  • the direction LV is further the height direction of the operating device 2.
  • the direction LH is a direction perpendicular to the longitudinal direction LIN.
  • the direction LH is further the width direction of the operating device 2.
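The relationship among LIN, LV, and LH above can be sketched in code. The construction below derives LV (height direction) and LH (width direction) from a stored longitudinal direction and the accelerometer's gravity vector; the patent only states that these directions are specified from the measured acceleration, so the exact procedure shown here is an assumption.

```python
import math

def device_frame(lin, gravity):
    """Derive the height direction LV and the width direction LH, both
    perpendicular to the longitudinal direction LIN, using the gravity
    vector from the accelerometer to fix the rotation about LIN.
    (Illustrative construction; not specified by the patent.)"""
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def normalize(v):
        n = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
        return (v[0] / n, v[1] / n, v[2] / n)

    lin = normalize(lin)
    lh = normalize(cross(gravity, lin))  # width direction, perpendicular to LIN
    lv = cross(lh, lin)                  # height direction, perpendicular to LIN and LH
    return lv, lh
```

For example, a device held horizontally along the x-axis with gravity along −z yields LV pointing up (+z) and LH pointing along −y.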
  • FIG. 6 is a diagram showing a rotation example of the operating device 2 having the direction LV as the rotation axis.
  • The position of the pointer P in the X-axis direction changes according to the angle θ1 by which the operation device 2 is rotated, with the direction LV as the rotation axis, from the position of the operation device 2 at the time of initial setting.
  • The angle θ1 can also be said to be the angle formed by the current longitudinal direction L1 of the operating device 2 with respect to the longitudinal direction LIN of the operating device 2 at the time of initial setting, when the operating device 2 is viewed from the direction LV.
  • FIG. 7 is a diagram showing a rotation example of the operating device 2 having the direction LH as the rotation axis.
  • The position of the pointer P in the Y-axis direction changes according to the angle θ2 by which the operation device 2 is rotated, with the direction LH as the rotation axis, from the position of the operation device 2 at the time of initial setting.
  • The angle θ2 can be said to be the angle formed by the current longitudinal direction L1 of the operating device 2 with respect to the longitudinal direction LIN of the operating device 2 at the time of initial setting, when the operating device 2 is viewed from the direction LH.
  • the pointer P is located on a virtual straight line overlapping the longitudinal direction L1 of the operating device 2 in the field of view of the user U.
  • the operating device 2 generates vibration according to the position of the pointer P in the display area of the button Z and the speed V of the pointer P.
  • the display area of the button Z is an example of the "first area”.
  • The image display device 1 monotonically increases the magnitude A of the vibration force as the distance from the center position C of the button Z to the position of the pointer P increases. Further, the image display device 1 monotonically decreases the magnitude A of the vibration force as the speed V of the pointer P increases.
  • the graph g1 of FIG. 4 shows the relationship between the X coordinate value of the pointer P and the magnitude A of the vibration force in the situation where the velocity V of the pointer P is the velocity v1.
  • the X coordinate value of the center position C of the button Z is X0.
  • the X coordinate value of the end of the button Z in the + X direction is X1.
  • In the following description, the Y coordinate value of the pointer P is assumed to match the Y coordinate value of the center position C of the button Z, and the pointer P is assumed to be located in the display area of the button Z.
  • The image display device 1 sets the magnitude A of the vibration force to zero when the X coordinate value of the pointer P is X0.
  • the image display device 1 increases the magnitude A of the vibration force as the distance between the center position C and the position of the pointer P increases.
  • The image display device 1 sets the magnitude A of the vibration force to the maximum when the X coordinate value of the pointer P is X1, in a situation where the velocity V of the pointer P is maintained at the velocity v1.
  • the image display device 1 increases the magnitude A of the vibration force in the ⁇ X direction as the pointer P moves away from the center position C.
  • The image display device 1 sets the magnitude A of the vibration force in a situation where the velocity V of the pointer P is the velocity v2 to be smaller than the magnitude A of the vibration force in a situation where the velocity V of the pointer P is the velocity v1, where the magnitude of the velocity v2 is larger than the magnitude of the velocity v1.
  • FIG. 8 is a block diagram showing the configuration of the image display device 1.
  • the image display device 1 includes a processing device 110, a storage device 120, and a communication device 130 in addition to the display device 140 and the sound output device 150 shown in FIG.
  • the term "device” herein may be read as another term such as a circuit, device or unit.
  • Each element of the image display device 1 is composed of a single device or a plurality of devices. Some elements of the image display device 1 may be omitted.
  • the processing device 110 is a processor that controls the entire image display device 1.
  • the processing device 110 includes, for example, a single chip or a plurality of chips.
  • The processing device 110 includes, for example, a central processing unit (CPU: Central Processing Unit) including an interface for connecting to peripheral devices, an arithmetic unit, registers, and the like. Some or all of the functions of the processing device 110 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 110 executes various processes in parallel or sequentially.
  • the storage device 120 is a recording medium that can be read by the processing device 110.
  • the storage device 120 stores a plurality of programs including the control program PR1 executed by the processing device 110, character input image information INRI indicating the character input image INR, and the like.
  • The storage device 120 includes, for example, at least one of a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
  • the storage device 120 may be called a register, a cache, a main memory (main storage device), or the like.
  • the communication device 130 is hardware (transmission / reception device) for communicating with another device.
  • the communication device 130 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the display device 140 is a device that displays an image.
  • the display device 140 is controlled by the processing device 110.
  • the display device 140 displays various images.
  • FIG. 9 is a block diagram showing the configuration of the operating device 2.
  • the operating device 2 includes a processing device 210, a storage device 220, a communication device 230, a touch sensor 240, an inertial sensor 250, and a vibration generator 260.
  • the vibration generator 260 is an example of a “force generator”.
  • the processing device 210 is a processor that controls the entire operating device 2.
  • the processing device 210 includes, for example, a single chip or a plurality of chips.
  • the processing device 210 includes, for example, a central processing unit (CPU) including an interface for connecting to peripheral devices, an arithmetic unit, registers, and the like. Some or all of the functions of the processing device 210 may be realized by hardware such as DSP, ASIC, PLD, and FPGA.
  • the processing device 210 executes various processes in parallel or sequentially.
  • the storage device 220 is a recording medium that can be read by the processing device 210.
  • the storage device 220 stores a plurality of programs including the control program PR2 executed by the processing device 210, various information used by the processing device 210, and the like.
  • the storage device 220 includes, for example, at least one of ROM, EPROM, EEPROM, RAM and the like.
  • the storage device 220 may be called a register, a cache, a main memory (main storage device), or the like.
  • the communication device 230 is hardware (transmission / reception device) for communicating with other devices.
  • the communication device 230 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the touch sensor 240 detects the tap operation performed by the user U.
  • the processing device 210 generates tap information TI when the touch sensor 240 detects a tap operation.
  • the tap information TI is information indicating that the tap operation performed by the user U has been accepted.
  • the inertial sensor 250 measures the acceleration in each direction of the three axes applied to the operating device 2 in the three-dimensional space and the angular velocity applied to the operating device 2 having each of the three axes as the rotation axis.
  • the processing device 210 generates the attitude information PI based on the measured acceleration in each direction of the three axes and the angular velocity with each of the three axes as the rotation axis.
  • The posture information PI indicates an angle θ1 and an angle θ2.
  • The angle θ1 is the angle obtained by rotating the operating device 2 from the position of the operating device 2 at the time of initial setting with the direction LV as the rotation axis.
  • The angle θ2 is the angle obtained by rotating the operating device 2 from the position of the operating device 2 at the time of initial setting with the direction LH as the rotation axis.
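As a rough illustration of how the angles θ1 and θ2 in the posture information PI could be tracked from the inertial sensor 250, the sketch below integrates the angular velocities about the LV and LH axes over the sampling interval. The patent combines three-axis acceleration and angular velocity but does not specify the fusion method, so pure gyroscope integration is an assumption here.

```python
def update_angles(theta1, theta2, omega_lv, omega_lh, dt):
    """Track the rotation angles theta1 (about LV) and theta2 (about LH)
    relative to the initial-setting posture by integrating the angular
    velocities omega_lv and omega_lh over one sampling interval dt.
    (Sketch only; real sensor fusion would also use the accelerometer.)"""
    return theta1 + omega_lv * dt, theta2 + omega_lh * dt
```

Calling this once per sensor sample keeps the posture information PI up to date between initial settings.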
  • the vibration generator 260 generates vibration based on the control information CI.
  • the vibration generator 260 is an unbalanced mass type, hydraulic type, or electrokinetic type vibration generator.
  • the unbalanced mass type vibration generator 260 generates vibration by centrifugal force generated by rotating a motor having an eccentric weight.
  • the hydraulic vibration generator 260 generates vibration by moving the piston by hydraulic pressure, pneumatic force, electromagnetic force, or the like.
  • the electrokinetic vibration generator 260 generates vibration by utilizing the force generated by passing an electric current through the coil in a magnetic field.
  • the vibration generator 260 generates vibration based on the control information CI received via the communication device 230.
  • the control information CI indicates the magnitude of the vibration force.
  • the processing device 210 adjusts the magnitude of the vibration force by PWM (Pulse Width Modulation) control.
  • For example, when the control information CI indicates a value of 50% of the maximum amplitude that the vibration generator 260 can generate, the processing device 210 supplies the unbalanced mass type vibration generator 260 with a signal in which the length of the on period and the length of the off period have a one-to-one ratio.
  • When the vibration generator 260 is of the hydraulic type, the processing device 210 adjusts the magnitude of the vibration force by, for example, adjusting one or both of the amplitude and the frequency of the piston.
  • When the vibration generator 260 is of the electrokinetic type, the processing device 210 adjusts the magnitude of the vibration force by, for example, adjusting one or both of the magnitude and the frequency of the current flowing through the coil.
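The PWM example described above (a 50% amplitude yielding a one-to-one on/off ratio) can be expressed as a small sketch; the 10 ms PWM period is a hypothetical value, not stated in the text.

```python
def pwm_on_off(amplitude_percent, period_ms=10.0):
    """Translate the vibration-force magnitude indicated by the control
    information CI (as a percentage of the maximum amplitude the vibration
    generator 260 can produce) into PWM on/off durations. A 50% value
    yields equal on and off periods, as in the example above."""
    duty = amplitude_percent / 100.0
    on = period_ms * duty
    return on, period_ms - on
```

For instance, 50% yields (5.0, 5.0) ms, and 25% yields (2.5, 7.5) ms.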
  • FIG. 10 is a diagram showing the functions of the operation system SYS.
  • The processing device 110 reads the control program PR1 from the storage device 120. By executing the control program PR1, the processing device 110 functions as a posture information acquisition unit 11, a position information generation unit 12, a speed information generation unit 13, a control unit 14, a first transmission unit 15, a tap information acquisition unit 16, a button execution unit 17, and a display image generation unit 18.
  • the processing device 210 reads the control program PR2 from the storage device 220. By executing the control program PR2, the processing device 210 functions as a posture information generation unit 21, a second transmission unit 22, a reception unit 24, and a tap information generation unit 27.
  • the position information generation unit 12 is an example of the “first generation unit”.
  • the speed information generation unit 13 is an example of the “second generation unit”.
  • the functions of the operation system SYS will be described for each of the case where the user U changes the posture of the operation device 2 and the case where the user U taps the touch sensor 240.
  • 1-4-1 Function of the operation system SYS when the user U changes the attitude of the operation device 2
  • The inertial sensor 250 measures the acceleration of the operation device 2 in each of the three axial directions and the angular velocity about each of the three axes.
  • the attitude information generation unit 21 generates the attitude information PI based on the acceleration in each direction of the three axes and the angular velocity with each of the three axes as the rotation axis.
  • the second transmission unit 22 transmits the posture information PI to the image display device 1 by using the communication device 230.
  • the posture information acquisition unit 11 acquires the posture information PI from the operation device 2.
  • the position information generation unit 12 generates position information LI indicating the position of the pointer P on the display surface SC based on the posture information PI.
  • the position information LI is an example of "first information".
  • The storage device 120 stores a table showing the correspondence between the angle θ1 and the X coordinate value of the pointer P.
  • The angle θ1 is the angle obtained by rotating the operating device 2 from the position of the operating device 2 at the time of initial setting with the direction LV as the rotation axis.
  • The position information generation unit 12 specifies the X coordinate value of the pointer P corresponding to the angle θ1 included in the posture information PI with reference to the above table.
  • The storage device 120 stores a table showing the correspondence between the angle θ2 and the Y coordinate value of the pointer P.
  • The angle θ2 is the angle obtained by rotating the operating device 2 from the position of the operating device 2 at the time of initial setting with the direction LH as the rotation axis.
  • The position information generation unit 12 specifies the Y coordinate value of the pointer P corresponding to the angle θ2 included in the posture information PI with reference to the above table.
  • the position information generation unit 12 generates the position information LI including the X coordinate value and the Y coordinate value of the specified pointer P.
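The table-based mapping from a rotation angle to a pointer coordinate might look like the following sketch. The table values and the use of linear interpolation between entries are assumptions; the patent only states that a correspondence table is consulted.

```python
import bisect

# Hypothetical correspondence table: rotation angle (degrees) -> X coordinate.
TABLE_X = [(-30.0, 0), (-15.0, 160), (0.0, 320), (15.0, 480), (30.0, 640)]

def lookup_x(theta1):
    """Return the X coordinate value of the pointer P for the angle theta1
    by linear interpolation between the nearest table entries, clamping
    at the table's edges."""
    angles = [a for a, _ in TABLE_X]
    i = bisect.bisect_left(angles, theta1)
    if i == 0:
        return TABLE_X[0][1]
    if i == len(TABLE_X):
        return TABLE_X[-1][1]
    (a0, x0), (a1, x1) = TABLE_X[i - 1], TABLE_X[i]
    return x0 + (x1 - x0) * (theta1 - a0) / (a1 - a0)
```

A second table of the same shape would map θ2 to the Y coordinate value.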
  • The display image generation unit 18 generates display image information DI indicating the display image to be displayed on the display surface SC, based on the character input image information INRI and the position information LI.
  • the display image generation unit 18 causes the display device 140 to display the display image indicated by the display image information DI.
  • The speed information generation unit 13 generates speed information VI indicating the speed V of the pointer P based on the position information LI.
  • the speed information VI is an example of "second information".
  • the speed information VI indicates the speed value of the pointer P in the X-axis direction and the speed value in the Y-axis direction.
  • the speed value in the X-axis direction is a value indicating the speed in the X-axis direction.
  • the speed value in the Y-axis direction is a value indicating the speed in the Y-axis direction.
  • the speed information generation unit 13 calculates the speed value in the X-axis direction by, for example, the following equation (1).
  • Let X(t1) be the X coordinate value indicated by the position information LI at time t1, let X(t2) be the X coordinate value indicated by the position information LI at time t2, and let Vx(t2) be the velocity value in the X-axis direction at time t2.
  • Vx(t2) is given by the following equation (1).
  • Vx(t2) = {X(t2) − X(t1)} / (t2 − t1) … (1)
  • the speed information generation unit 13 calculates the speed value in the Y-axis direction by, for example, the following equation (2).
  • Vy(t2) = {Y(t2) − Y(t1)} / (t2 − t1) … (2)
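Equations (1) and (2) amount to a finite-difference velocity estimate over two consecutive samples of the position information LI:

```python
def pointer_velocity(p1, p2, t1, t2):
    """Velocity values of the pointer P from two samples of the position
    information LI, per equations (1) and (2):
    Vx(t2) = {X(t2) - X(t1)} / (t2 - t1), and likewise for Vy."""
    (x1, y1), (x2, y2) = p1, p2
    dt = t2 - t1
    return (x2 - x1) / dt, (y2 - y1) / dt
```

For example, moving from (0, 0) to (10, −5) in 0.5 s gives Vx = 20 and Vy = −10 (coordinate units per second).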
  • The control unit 14 determines the magnitude A of the vibration force based on the velocity V of the pointer P indicated by the velocity information VI. More specifically, the control unit 14 determines the magnitude A of the vibration force based on the velocity V of the pointer P indicated by the velocity information VI and the position of the pointer P indicated by the position information LI. For example, the control unit 14 calculates the magnitude A of the vibration force based on the following equation (3).
  • α is an arbitrary positive constant.
  • β is a positive constant.
  • |V| is the magnitude of the speed V of the pointer P indicated by the speed information VI.
  • the velocity information generation unit 13 represents the velocity V by a vector having a velocity value in the X-axis direction and a velocity value in the Y-axis direction as components.
  • X and Y indicate an X coordinate value (X coordinate value of the pointer P) and a Y coordinate value (Y coordinate value of the pointer P) with the center position C of the button Z as the origin.
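Equation (3) itself is not reproduced in this text; a hypothetical form consistent with the surrounding description (the magnitude A grows with the distance of the pointer P from the center position C and shrinks with the speed |V|, and cannot be negative) might look like the following. The linear form and the values of α (`alpha`) and β (`beta`) are assumptions:

```python
import math

def vibration_magnitude(x, y, vx, vy, alpha=1.0, beta=0.5):
    """Hypothetical sketch of equation (3): A grows with the distance of the
    pointer P from the button center C (the origin of x, y) and shrinks with
    the pointer speed |V|. The exact formula is not given in the text; only
    these monotonicity properties are stated."""
    distance = math.hypot(x, y)   # distance from the center position C
    speed = math.hypot(vx, vy)    # |V|, magnitude of the velocity vector
    return max(0.0, alpha * distance - beta * speed)
```

For example, a pointer far from C and nearly at rest yields a large A, while the same position with a high speed yields a small (or zero) A.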
  • the control unit 14 generates control information CI indicating the magnitude A of the vibration force.
  • the first transmission unit 15 transmits the control information CI to the operation device 2 by using the communication device 130.
  • the receiving unit 24 receives the control information CI from the image display device 1.
  • the vibration generator 260 generates vibration based on the control information CI.
  • When the touch sensor 240 detects a tap operation performed by the user U, the tap information generation unit 27 generates tap information TI. The tap information generation unit 27 transmits the tap information TI to the image display device 1 by using the communication device 230.
  • the tap information acquisition unit 16 acquires the tap information TI from the operation device 2.
  • the button execution unit 17 refers to the position information LI at the time when the tap information TI is acquired, and determines whether or not the position of the pointer P is included in the display area of the button Z. If the determination result is affirmative, the button execution unit 17 executes a predetermined process corresponding to the button Z. For example, when the pointer P is located in the display area of the button Z indicating "3", the button execution unit 17 generates, as the predetermined process corresponding to that button, image information indicating an image in which "3" is written in the input character area Q.
  • the display image generation unit 18 generates display image information DI indicating a display image to be displayed on the display surface SC based on the character input image information INRI, the position information LI, and the image information generated by the button execution unit 17.
  • the display image generation unit 18 causes the display device 140 to display the image indicated by the display image information DI.
  • FIG. 11 is a flowchart showing the operation of the image display device 1.
  • the processing device 110 of the image display device 1 generates the display image information DI based on the information indicating the character input image INR and the position information LI indicating the initial position of the pointer P (step S1).
  • the process in step S1 corresponds to the process executed by the display image generation unit 18.
  • the processing device 110 causes the display device 140 to display the display image indicated by the display image information DI (step S3).
  • the processing device 110 determines whether or not the attitude information PI has been acquired from the operating device 2 (step S5). If the determination result in step S5 is affirmative, the processing device 110 generates position information LI based on the posture information PI (step S7).
  • the process of acquiring the attitude information PI by the processing device 110 corresponds to the process executed by the attitude information acquisition unit 11.
  • the process in step S7 corresponds to the process executed by the position information generation unit 12.
  • the processing device 110 generates the display image information DI based on the character input image information INRI and the position information LI (step S9).
  • the processing device 110 displays the display image indicated by the display image information DI on the display device 140 (step S11).
  • the processing device 110 generates the speed information VI based on the length of the period obtained by subtracting the time t1 from the time t2 and the posture information PI, as shown in equations (1) and (2) above (step S13).
  • the period obtained by subtracting the time t1 from the time t2 is the posture information transmission period shown in FIG.
  • the process in step S13 corresponds to the process executed by the speed information generation unit 13.
  • the processing device 110 determines whether or not the pointer P is located in the display area of the button Z (step S15). If the determination result in step S15 is negative, the processing device 110 returns the process to step S5.
  • If the determination result in step S15 is affirmative in the situation where the determination result in step S5 is affirmative, the processing device 110 generates the control information CI based on the position information LI and the speed information VI (step S17).
  • the process in step S17 corresponds to the process executed by the control unit 14.
  • the processing device 110 uses the communication device 130 to transmit the control information CI to the operating device 2 (step S19).
  • the process in step S19 corresponds to the process executed by the first transmission unit 15. After the processing in step S19 is completed, the processing apparatus 110 returns the processing to step S5.
  • If the determination result in step S5 is negative, the processing device 110 determines whether or not the tap information TI has been acquired from the operation device 2 (step S21). If the determination result in step S21 is negative in the situation where the determination result in step S5 is negative, the processing device 110 returns the process to step S5.
  • If the determination result in step S21 is affirmative, the processing device 110 determines whether or not the pointer P is located in the display area of the button Z (step S23). If the determination result in step S23 is affirmative, it means that the user U has selected the button Z on which the pointer P is superimposed.
  • the process of acquiring the tap information TI by the processing device 110 corresponds to the process executed by the tap information acquisition unit 16. If the determination result in step S23 is negative in the situation where the determination result in step S21 is affirmative after the determination result in step S5 is negative, the processing device 110 returns the process to step S5.
  • If the determination result in step S23 is affirmative in the situation where the determination result in step S21 is affirmative after the determination result in step S5 is negative, the processing device 110 executes the predetermined process corresponding to the button Z on which the pointer P is located (step S25). Next, the processing device 110 generates the display image information DI based on the character input image information INRI, the position information LI, and the processing result of the predetermined process corresponding to the button Z selected by the user U (step S27). The processing device 110 causes the display device 140 to display the display image indicated by the generated display image information DI (step S29). After the processing in step S29 is completed, the processing device 110 returns the processing to step S5.
  • the operation device 2 executes a posture information transmission process, a tap information transmission process, and a vibration generation process.
  • the posture information transmission process is a process of transmitting the posture information PI to the image display device 1.
  • the tap information transmission process is a process of transmitting the tap information TI to the image display device 1 when the tap operation performed by the user U is detected.
  • the vibration generation process is a process of generating vibration when the control information CI is received from the image display device 1.
  • the processing device 210 executes the posture information transmission process, the tap information transmission process, and the vibration generation process by different threads.
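The three processes running on separate threads can be sketched as follows; the three callables are hypothetical stand-ins for the posture information transmission, tap information transmission, and vibration generation loops:

```python
import threading

def start_operation_device_threads(posture_fn, tap_fn, vibration_fn):
    """Run the three processes of the operating device 2 on separate
    daemon threads, as the text describes. The callables are hypothetical
    stand-ins for the actual loops of the processing device 210."""
    threads = [threading.Thread(target=fn, daemon=True)
               for fn in (posture_fn, tap_fn, vibration_fn)]
    for t in threads:
        t.start()
    return threads
```

Each loop then runs independently, so a slow transmission never blocks vibration generation.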
  • FIG. 12 shows a flowchart showing the posture information transmission process.
  • the processing device 210 transmits the posture information PI for each posture information transmission period.
  • the attitude information transmission period is determined, for example, by the designer of the operating device 2.
  • the length of the posture information transmission period is appropriately determined.
  • the processing device 210 determines whether the attitude information PI has never been transmitted, or whether the attitude information transmission period has elapsed since the previous time the attitude information PI was transmitted (step S31). If the determination result in step S31 is negative, that is, if the attitude information transmission period has not elapsed since the last transmission of the attitude information PI in the situation where the attitude information PI has been transmitted one or more times, the processing device 210 executes the process in step S31 again.
  • If the determination result in step S31 is affirmative, the processing device 210 generates the attitude information PI based on the acceleration and the angular velocity measured by the inertial sensor 250 (step S33). Next, the processing device 210 transmits the posture information PI to the image display device 1 using the communication device 230 (step S35). After the processing in step S35 is completed, the processing device 210 returns the processing to step S31.
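The FIG. 12 loop (steps S31 through S35) can be sketched as follows; `read_imu`, `send`, and `stop` are hypothetical callables standing in for the inertial sensor 250, the communication device 230, and a shutdown condition:

```python
import time

def posture_transmission_loop(read_imu, send, period_s, stop):
    """Sketch of the FIG. 12 loop: transmit the posture information PI once
    per posture information transmission period (period_s seconds)."""
    last_sent = None
    while not stop():
        now = time.monotonic()
        # Step S31: PI never sent, or a full period elapsed since the last send
        if last_sent is None or now - last_sent >= period_s:
            pi = read_imu()   # step S33: build PI from acceleration / angular velocity
            send(pi)          # step S35: transmit PI to the image display device 1
            last_sent = now
```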
  • FIG. 13 shows a flowchart showing the tap information transmission process.
  • the processing device 210 determines whether or not the touch sensor 240 has detected a tap operation (step S41). If the determination result in step S41 is affirmative, the processing device 210 generates tap information TI (step S43). Then, the processing device 210 transmits the tap information TI to the image display device 1 by using the communication device 230 (step S45). After the processing in step S45 is completed, the processing device 210 returns the processing to step S41. If the determination result in step S41 is negative, the processing device 210 returns the processing to step S41.
  • FIG. 14 shows a flowchart showing the vibration generation process.
  • the processing device 210 determines whether or not the control information CI has been received from the image display device 1 (step S51). If the determination result in step S51 is affirmative, the processing device 210 uses the vibration generator 260 to generate vibration based on the control information CI (step S53). After the processing in step S53 is completed, the processing apparatus 210 returns the processing to step S51. If the determination result in step S51 is negative, the processing device 210 returns the processing to step S51.
  • the image display device 1 causes the operation device 2, which is used for operating the pointer P displayed on the display surface SC, to generate a force applied to the user U according to the operation performed by the user U.
  • the image display device 1 includes a posture information acquisition unit 11 (an example of an “acquisition unit”), a position information generation unit 12 (an example of a “first generation unit”), a speed information generation unit 13 (an example of a “second generation unit”), and a control unit 14.
  • the posture information acquisition unit 11 acquires the posture information PI related to the posture of the operation device 2.
  • the position information generation unit 12 generates position information LI (an example of "first information”) indicating the position of the pointer P on the display surface SC based on the posture information PI.
  • the speed information generation unit 13 generates the speed information VI indicating the speed V of the pointer P based on the posture information PI (an example of “second information regarding one or both of the speed of the pointer and the acceleration of the pointer based on the posture information”).
  • the control unit 14 determines the magnitude A of the vibration force (an example of a “force”) based on the speed information VI. According to this embodiment, since the intention of the user U can be reflected in the magnitude A of the vibration force, the operability of the operating device 2 can be improved.
  • When the speed V of the pointer P is large, the user U is more likely to intend to move the pointer P from the position of the button Z on which it is located to the position of another button Z than when the speed V of the pointer P is small.
  • the vibration is also used to warn the user U that the pointer P is moving away from a button Z. Therefore, when the speed V of the pointer P is large, reducing the vibration force to reflect the intention of the user U makes it easy for the user U to move the pointer P to another button Z.
  • When the speed V of the pointer P is small, it is highly likely that the user U intends to keep the pointer P in the display area of the button Z including the position of the pointer P. Therefore, when the speed V of the pointer P is small, increasing the vibration force to reflect the intention of the user U makes it easy to keep the pointer P on the button Z on which it is located.
  • When the position of the pointer P indicated by the position information LI is included in the display area of the button Z, the control unit 14 determines the magnitude of the vibration force based on the speed V indicated by the speed information VI (an example of “one or both of the velocity and acceleration of the pointer P”) and the position of the pointer P indicated by the position information LI.
  • When the position of the pointer P is included in the display area of the button Z, the user U may intend to keep the pointer P in the display area of the button Z.
  • When the position of the pointer P is close to the edge of the button Z, the pointer P is more likely to leave the button Z contrary to the intention of the user U than when the position of the pointer P is close to the center position of the button Z.
  • Accordingly, the magnitude of the vibration force is set to a value that reflects the intention of the user U, and the operability of the operating device 2 is improved.
  • the operation system SYS includes an operation device 2 and an image display device 1.
  • the operating device 2 is used for operating the pointer P displayed on the display surface SC.
  • the image display device 1 causes the operation device 2 to generate a force applied to the user U in response to an operation performed by the user U.
  • the image display device 1 includes a posture information acquisition unit 11, a position information generation unit 12, a speed information generation unit 13, a control unit 14, and a first transmission unit 15.
  • the posture information acquisition unit 11 acquires the posture information PI related to the posture of the operation device 2 from the operation device 2.
  • the position information generation unit 12 generates position information LI indicating the position of the pointer P on the display surface SC based on the posture information PI.
  • the speed information generation unit 13 generates speed information VI indicating the speed V of the pointer P based on the attitude information PI.
  • When the position of the pointer P indicated by the position information LI is included in the display area of the button Z on the display surface SC, the control unit 14 generates the control information CI indicating the magnitude of the force based on the speed information VI.
  • the first transmission unit 15 transmits the control information CI to the operation device 2.
  • the operation device 2 includes a posture information generation unit 21, a second transmission unit 22, a receiving unit 24, and a force generation unit.
  • the posture information generation unit 21 generates the posture information PI.
  • the second transmission unit 22 transmits the posture information PI to the image display device 1.
  • the receiving unit 24 receives the control information CI from the image display device 1.
  • the force generating unit generates a vibrating force based on the control information CI.
  • the magnitude A of the vibration force is controlled based on the velocity V of the pointer P. Therefore, the intention of the user U can be reflected in the magnitude A of the vibration force, and the operability of the operating device 2 can be improved.
  • the control unit 14 determines the magnitude A of the vibration force based on the speed V of the pointer P and the position of the pointer P.
  • the control unit 14 is not limited to this configuration.
  • the control unit 14 may determine the magnitude A of the vibration force, which is an example of the force applied to the user U, based on the acceleration of the pointer P and the position of the pointer P.
  • the processing device 110 may function as an acceleration information generation unit instead of functioning as the velocity information generation unit 13 shown in FIG.
  • the acceleration information generation unit generates acceleration information indicating the acceleration of the pointer P based on the position information LI. Acceleration information is an example of "second information”.
  • the acceleration information generation unit is an example of the “second generation unit”.
  • Suppose the pointer P is located at the center position C of the display area of the button Z. When the user U operates the operating device 2 from this state to move the pointer P out of the display area of the button Z, the speed of the pointer P is small but the acceleration of the pointer P is large in the period shortly after the pointer P starts moving. Therefore, the magnitude A of the vibration force applied to the user U can be made smaller by using the acceleration rather than the speed. The point that the magnitude A of the vibration force increases as the distance between the center position C of the display area of the button Z and the position of the pointer P increases is the same as in the embodiment described above.
  • the control unit 14 may determine the magnitude A of the vibration force, which is an example of the force applied to the user U, based on the speed of the pointer P, the acceleration of the pointer P, and the position of the pointer P.
  • a second generation unit may be used instead of the speed information generation unit 13 shown in FIG.
  • the second generation unit generates second information indicating the speed of the pointer P and the acceleration of the pointer P based on the position information LI.
  • the control unit 14 may make the method of calculating the magnitude A of the vibration force when the acceleration indicated by the second information is positive different from the method of calculating the magnitude A of the vibration force when the acceleration is negative.
  • For example, when the acceleration is negative, the control unit 14 calculates the magnitude A of the vibration force by equation (3) above.
  • When the acceleration is positive, the control unit 14 calculates the magnitude A of the vibration force by the following equation (4).
  • the control unit 14 may calculate the magnitude A of the vibrating force using either the equation (3) or the equation (4).
  • The constant in equation (4) corresponding to β is positive and larger than β. Therefore, even if the position of the pointer P and the velocity of the pointer P are the same, the magnitude A of the vibration force becomes smaller when the acceleration of the pointer P is positive than when the acceleration of the pointer P is negative.
  • When the acceleration is positive, the velocity of the pointer P is increasing.
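The switch by the sign of the acceleration can be sketched as follows, under the assumption that equations (3) and (4) differ only in the constant weighting the speed term (the equations themselves are not reproduced in this text, and all constants here are illustrative):

```python
def vibration_magnitude_signed(distance, speed, accel,
                               alpha=1.0, beta_neg=0.5, beta_pos=1.5):
    """Assumed sketch: when the pointer acceleration is positive, the speed
    term uses a larger constant (beta_pos > beta_neg), so the resulting
    magnitude A is smaller for the same position and speed."""
    beta = beta_pos if accel > 0 else beta_neg
    return max(0.0, alpha * distance - beta * speed)
```

With the same distance and speed, a positive acceleration (the user speeding the pointer up, likely moving toward another button) thus yields a weaker vibration than a negative one.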
  • the control unit 14 uses the position of the pointer P as a parameter when determining the magnitude A of the vibration force.
  • the present disclosure is not limited to this configuration.
  • the control unit 14 may determine the magnitude A of the vibration force based on the speed of the pointer P indicated by the speed information VI.
  • the control unit 14 may calculate the magnitude A of the vibration force using equation (5) shown below; that is, the position of the pointer P is not a parameter of the magnitude A of the vibration force.
  • the control unit 14 may determine the magnitude A of the vibration force based on the acceleration of the pointer P indicated by the acceleration information.
  • the processing device 110 may function as an acceleration information generation unit instead of functioning as the velocity information generation unit 13 shown in FIG.
  • the acceleration information generation unit generates acceleration information indicating the acceleration of the pointer P based on the position information LI. Acceleration information is an example of "second information”.
  • the acceleration information generation unit is an example of the “second generation unit”.
  • the control unit 14 may determine the magnitude A of the vibration force, which is an example of the force applied to the user U, based on the speed of the pointer P and the acceleration of the pointer P.
  • the processing device 110 may function as a second generation unit instead of functioning as the speed information generation unit 13 shown in FIG.
  • the second generation unit generates second information indicating the speed of the pointer P and the acceleration of the pointer P based on the position information LI.
  • the control unit 14 may make the method of calculating the magnitude A of the vibration force when the acceleration indicated by the second information is positive different from the method of calculating the magnitude A of the vibration force when the acceleration is negative.
  • For example, when the acceleration is negative, the control unit 14 calculates the magnitude A of the vibration force by equation (5) above.
  • When the acceleration is positive, the control unit 14 calculates the magnitude A of the vibration force by the following equation (6).
  • the magnitude A of the vibration force is increased according to the increase in the distance from the center position C of the button Z to the position of the pointer P.
  • the present disclosure is not limited to this configuration.
  • the magnitude A of the vibration force may be monotonically increased as the distance from the center position C of the button Z to the position of the pointer P increases.
  • The above-mentioned "monotonically increasing" means monotonic increase in the broad sense; that is, there may be a section in which the magnitude A of the vibration force does not increase but remains constant as the distance increases.
  • FIG. 15 is a diagram showing an example of adjusting the magnitude A of the vibration force in the third modification.
  • the Y coordinate value of the pointer P will be described as matching the Y coordinate value of the center position C of the button Z.
  • the control unit 14 in the third modification sets the magnitude A of the vibration force to zero when the X coordinate value of the pointer P is X0.
  • When the X coordinate value of the pointer P is in the range from X0 to X2, the control unit 14 in the third modification increases the magnitude A of the vibration force as the pointer P moves away from the center position C.
  • When the X coordinate value of the pointer P is in the range from X2 to X3, the control unit 14 in the third modification keeps the magnitude A of the vibration force constant.
  • When the X coordinate value of the pointer P is in the range from X3 to X1, the control unit 14 in the third modification again increases the magnitude A of the vibration force as the pointer P moves away from the center position C.
  • X0, X1, X2, and X3 have the relationship X0 < X2 < X3 < X1.
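The broad-sense monotone profile described above (rising from X0 to X2, constant from X2 to X3, rising again from X3 to X1) can be sketched as a piecewise function; the breakpoint values and the unit slope are illustrative choices satisfying X0 < X2 < X3 < X1:

```python
def vibration_profile(x, x0=0.0, x2=2.0, x3=4.0, x1=6.0):
    """Broad-sense monotone profile of the third modification along the X
    axis: A is zero at x0, rises from x0 to x2, stays constant from x2 to
    x3, and rises again from x3 to x1."""
    d = min(abs(x - x0), x1 - x0)       # distance from x0, clamped to the range
    if d <= x2 - x0:
        return d                        # rising section
    if d <= x3 - x0:
        return x2 - x0                  # constant section (broad-sense monotone)
    return (x2 - x0) + (d - (x3 - x0))  # rising section again
```

While the pointer stays in the constant section, the magnitude A does not change, which is what allows the control information CI to be transmitted less often.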
  • the control unit 14 monotonically increases the magnitude of the force as the distance from the center position C of the button Z to the position of the pointer P increases. Further, the control unit 14 monotonically reduces the magnitude of the vibration force as the speed V of the pointer P increases.
  • The above-mentioned "monotonically decreasing" means monotonic decrease in the broad sense.
  • the image display device 1 transmits the control information CI every time the distance from the center position C of the button Z to the position of the pointer P changes.
  • the operating device 2 generates vibration based on the received control information CI.
  • In the third modification, while the magnitude A of the vibration force is kept constant, the image display device 1 does not need to transmit the control information CI. Therefore, according to the third modification, the processing load for controlling the magnitude of the vibration force can be reduced compared with the embodiment.
  • the magnitude A of the vibration force increases as the distance from the center position C of the button Z to the position of the pointer P increases. However, in the vicinity of the center position C of the button Z, the magnitude A of the vibration force may be set to zero.
  • FIG. 16 is a diagram showing the magnitude A of the vibration force in the fourth modification.
  • the control unit 14 in the fourth modification determines, based on the position information LI, whether or not the position of the pointer P exists in the predetermined area R1 including the center position C of the button Z.
  • the predetermined area R1 is a part of the display area of the button Z. When the determination result is affirmative, the control unit 14 in the fourth modification sets the magnitude of the vibration force to zero.
  • the predetermined region R1 is an example of the “second region”.
  • the predetermined area R1 may include the center position C of the button Z.
  • the predetermined region R1 shown in FIG. 16 is circular.
  • the predetermined area R1 is not limited to a circle, and may be, for example, a reduced display area of the button Z.
  • A specific example of the magnitude A of the vibration force will be described for the X-axis direction.
  • the Y coordinate value of the pointer P will be described as matching the Y coordinate value of the center position C of the button Z.
  • the control unit 14 in the fourth modification sets the magnitude A of the vibration force to zero when the X coordinate value of the pointer P is included in the range from X0 to X4 in the + X direction.
  • the control unit 14 in the fourth modification increases the vibration force according to the increase in the distance between the center position C and the position of the pointer P.
  • X4 is the X coordinate value of the end portion of the predetermined region R1 in the + X direction.
  • the control unit 14 in the fourth modification sets the magnitude A of the vibration force to zero when the X coordinate value of the pointer P is included in the range from X0 to X5 in the −X direction.
  • the control unit 14 in the fourth modification increases the vibration force according to the increase in the distance between the center position C and the position of the pointer P.
  • X5 is the X coordinate value of the end portion of the predetermined region R1 in the −X direction.
  • X6 is the X coordinate value of the end of the button Z in the −X direction.
  • the control unit 14 in the fourth modification determines, based on the position information LI, whether or not the position of the pointer P exists in the predetermined area R1 including the center position C of the display area of the button Z. When the determination result is affirmative, the control unit 14 in the fourth modification sets the magnitude of the vibration force to zero. Continuously generated vibration is unpleasant for the user U, so it is preferable not to generate vibration when support for the operation performed by the user U is unnecessary. When the pointer P is included in the predetermined area R1, the user U can be regarded as having already selected the desired button Z. Therefore, when the position of the pointer P is included in the predetermined region R1, the magnitude of the vibration force is set to zero. As a result, in the fourth modification, discomfort to the user U caused by the vibration of the operating device 2 can be suppressed.
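The dead-zone behavior of the fourth modification can be sketched as follows, assuming a circular R1 as in FIG. 16; the radius `r1` and the scale constant `alpha` are illustrative:

```python
import math

def vibration_with_dead_zone(x, y, r1=1.0, alpha=1.0):
    """Fourth-modification sketch: the magnitude A is zero while the pointer
    P lies inside the predetermined area R1 around the button center C
    (assumed circular with radius r1), and grows with the distance from C
    outside R1."""
    d = math.hypot(x, y)        # distance from the center position C
    if d <= r1:
        return 0.0              # inside R1: no vibration
    return alpha * (d - r1)     # outside R1: increases with distance
```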
  • the magnitude A of the vibration force increases as the distance from the center position C of the button Z to the position of the pointer P increases.
  • When the change in the moving direction of the pointer P is large, the magnitude A of the vibration force may be made smaller than when the change in the moving direction of the pointer P is small.
  • the control unit 14 determines whether or not the angle formed by the current moving direction of the pointer P and the moving direction of the pointer P one posture information transmission period before the present is equal to or greater than a predetermined value. When the determination result is affirmative, the control unit 14 sets the magnitude A of the vibration force smaller than when the determination result is negative.
  • FIGS. 17 and 18 are diagrams showing the magnitude A of the vibration force in the fifth modification.
  • FIG. 17 shows that the position of the pointer P at time t1 is the position L_t1. Further, FIG. 17 shows that the position of the pointer P at the time t2 is the position L_t2.
  • the time t2 is the time when the posture information transmission period has elapsed from the time t1. Further, FIG. 17 shows that the position of the pointer P at the time t3 is the position L_t3.
  • the time t3 is the time when the posture information transmission period has elapsed from the time t2.
  • the velocity information VI represents the velocity V2 at time t2 by a vector starting from the position L_t1 and ending at the position L_t2.
  • the velocity information VI represents the velocity V3 at time t3 by a vector starting from position L_t2 and ending at position L_t3.
  • FIG. 18 shows that the position of the pointer P at the time t4 is the position L_t4.
  • the time t4 is a time after the time t3 shown in FIG.
  • FIG. 18 shows that the position of the pointer P at time t5 is the position L_t5.
  • the time t5 is the time when the posture information transmission period has elapsed from the time t4.
  • FIG. 18 shows that the position of the pointer P at time t6 is the position L_t6.
  • the time t6 is the time when the posture information transmission period has elapsed from the time t5.
  • the velocity information VI represents the velocity V5 at time t5 by a vector starting from position L_t4 and ending at position L_t5.
  • the velocity information VI represents the velocity V6 at time t6 by a vector starting from the position L_t5 and ending at the position L_t6.
  • In FIG. 17, the control unit 14 in the fifth modification determines whether or not the angle θd1 formed by the direction of the vector representing the velocity V2 and the direction of the vector representing the velocity V3 is equal to or greater than a predetermined value THθ. In FIG. 17, it is assumed that the angle θd1 is equal to or greater than the predetermined value THθ. In FIG. 18, the control unit 14 in the fifth modification determines whether or not the angle θd2 formed by the direction of the vector representing the velocity V5 and the direction of the vector representing the velocity V6 is equal to or greater than the predetermined value THθ. In FIG. 18, it is assumed that the angle θd2 is smaller than the predetermined value THθ.
  • the control unit 14 in the fifth modification sets the magnitude A of the vibration force.
  • the magnitude A of the vibrating force at the time t3 when the determination result is affirmative is smaller than the magnitude A of the vibrating force at the time t6 when the determination result is negative.
  • in this way, the control unit 14 determines whether or not the angle formed by the direction of the vector indicated by the velocity information VI at a first time point (for example, time t3) and the direction of the vector indicated by the velocity information VI at a second time point one attitude information transmission period before the first time point (for example, time t2) is equal to or greater than a predetermined value.
  • the control unit 14 makes the magnitude A of the vibrating force when the determination result is affirmative smaller than the magnitude A of the vibrating force when the determination result is negative.
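The determination and the setting of the vibrating-force magnitude described above can be sketched as follows (an illustrative sketch; the threshold value and the two magnitudes are hypothetical, since the specification does not give concrete values for THθ or A):

```python
import math

# Illustrative sketch of the determination: compare the angle formed by two
# consecutive velocity vectors with a threshold TH_theta, and make the
# vibrating-force magnitude A smaller when the determination is affirmative.
# Threshold and magnitude values are hypothetical.

def angle_between(v1, v2):
    """Angle in degrees formed by the directions of two 2-D vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

def vibration_magnitude(v_prev, v_curr, th_theta=90.0, a_small=0.2, a_large=1.0):
    # Affirmative determination (angle >= threshold): the pointer is likely
    # moving due to vibration of the user's arm, so the smaller magnitude
    # of the vibrating force is applied.
    if angle_between(v_prev, v_curr) >= th_theta:
        return a_small
    return a_large
```

A near-reversal of direction (as between V2 and V3 in FIG. 17) yields the smaller magnitude; a near-straight continuation (as between V5 and V6 in FIG. 18) yields the larger one.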
  • when the determination result is affirmative, it is more likely than when the determination result is negative that the user U intends to keep the pointer P in the display area of the button Z including the current position of the pointer P.
  • when the determination result is affirmative, it is more likely than when the determination result is negative that the pointer P is moving due to vibration of the arm of the user U.
  • in the period from time t4 to time t6, the pointer P moves in a direction approximately 45 degrees clockwise from the −X direction. Therefore, it is highly likely that the user U intends to move the pointer P to the outside of the button Z.
  • in FIG. 17, there is no suggestion that the pointer P moves to the outside of the button Z during the period from time t1 to time t3. Therefore, it is highly likely that the user U intends to keep the pointer P in the display area of the button Z including the current position of the pointer P.
  • in the fifth modification, the magnitude A of the vibrating force is determined based on the angle formed by the direction of the vector indicated by the velocity information VI at the first time point and the direction of the vector indicated by the velocity information VI at the second time point, one attitude information transmission period before the first time point.
  • the intention of the user U can be reflected in the magnitude A of the vibration force. Therefore, the operability of the operating device 2 can be improved.
  • the processing device 110 included in the image display device 1 functions as a position information generation unit 12, a speed information generation unit 13, and a control unit 14.
  • the processing device 210 included in the operation device 2 may function as the position information generation unit 12, the speed information generation unit 13, and the control unit 14.
  • FIG. 19 is a block diagram showing the configuration of the image display device 1e in the sixth modification.
  • the image display device 1e included in the operation system SYSTEM in the sixth modification includes a processing device 110e, a storage device 120e, a communication device 130, a display device 140, and a sound output device 150.
  • the processing device 110e is a processor that controls the entire image display device 1e.
  • the processing device 110e includes, for example, a single chip or a plurality of chips.
  • the storage device 120e is a recording medium that can be read by the processing device 110e.
  • the storage device 120e stores a plurality of programs including the control program PR1e executed by the processing device 110e, character input image information INRI indicating the character input image INR, and the like.
  • FIG. 20 is a block diagram showing the configuration of the operating device 2e in the sixth modification.
  • the operation device 2e included in the operation system SYSTEM includes a processing device 210e, a storage device 220e, a communication device 230, a touch sensor 240, an inertial sensor 250, and a vibration generator 260.
  • the processing device 210e is a processor that controls the entire operating device 2e.
  • the processing apparatus 210e includes, for example, a single chip or a plurality of chips.
  • the storage device 220e is a recording medium that can be read by the processing device 210e.
  • the storage device 220e stores a plurality of programs including the control program PR2e executed by the processing device 210e, various information used by the processing device 210e, and the like.
  • FIG. 21 is a diagram showing the function of the operation system SYSTEM in the sixth modification.
  • the processing device 110e reads the control program PR1e from the storage device 120e. By executing the control program PR1e, the processing device 110e functions as a position information acquisition unit 19, a tap information acquisition unit 16, a button execution unit 17, and a display image generation unit 18.
  • the processing device 210e reads the control program PR2e from the storage device 220e. By executing the control program PR2e, the processing device 210e functions as a posture information generation unit 21, a position information generation unit 12e, a speed information generation unit 13, a control unit 14, and a tap information generation unit 27.
  • the content of the process executed by the attitude information generation unit 21 when the user U changes the attitude of the operation device 2e is the same as the content of the process executed by the attitude information generation unit 21 of the embodiment.
  • the position information generation unit 12e executes the same processing as the processing executed by the position information generation unit 12.
  • the position information generation unit 12e further transmits the position information LI to the image display device 1e by using the communication device 230.
  • the position information acquisition unit 19 acquires the position information LI from the operation device 2e. Since the subsequent processing is the same as that of the embodiment, the description thereof will be omitted.
  • the intention of the user U can be reflected in the magnitude A of the vibration force as in the embodiment. Therefore, the operability of the operating device 2e can be improved.
  • the processing device 210e included in the operation device 2e functions as a position information generation unit 12e, a speed information generation unit 13, and a control unit 14. Therefore, the load applied to the processing device 110e can be suppressed as compared with the processing device 110 in the embodiment.
  • the operating device 2 has a vibration generator 260.
  • the operating device 2 may have a reaction force generator.
  • FIG. 22 is a block diagram showing the configuration of the image display device 1f in the seventh modification.
  • the image display device 1f included in the operation system SYSf in the seventh modification includes a processing device 110f, a storage device 120f, a communication device 130, a display device 140, and a sound output device 150.
  • the processing device 110f is a processor that controls the entire image display device 1f.
  • the processing device 110f includes, for example, a single chip or a plurality of chips.
  • the storage device 120f is a recording medium that can be read by the processing device 110f.
  • the storage device 120f stores a plurality of programs including the control program PR1f executed by the processing device 110f, character input image information INRI indicating the character input image INR, and the like.
  • FIG. 23 is a perspective view showing the appearance of the operating device 2f in the seventh modification.
  • the operating device 2f included in the operating system SYSf is, for example, a joystick-type pointing device.
  • the operating device 2f has a pedestal 82 and a lever 84.
  • the lever 84 has a touch sensor 240.
  • the pedestal 82 is a member for installing the operating device 2f on a flat surface such as a desk.
  • the operation device 2f will be described on the assumption that it is set on a horizontal plane.
  • the lever 84 is gripped by the user U.
  • the posture of the lever 84 changes depending on the operation performed by the user U.
  • the position of the pointer P changes according to the posture of the lever 84.
  • FIG. 24 is a block diagram showing the configuration of the operating device 2f in the seventh modification.
  • the operating device 2f includes a processing device 210f, a storage device 220f, a communication device 230, a touch sensor 240, an inertial sensor 250f, and a reaction force generator 270.
  • the reaction force generator 270 corresponds to the "force generator" in the seventh modification.
  • the processing device 210f is a processor that controls the entire operating device 2f.
  • the processing device 210f includes, for example, a single chip or a plurality of chips.
  • the storage device 220f is a recording medium that can be read by the processing device 210f.
  • the storage device 220f stores a plurality of programs including the control program PR2f executed by the processing device 210f, various information used by the processing device 210f, and the like.
  • the inertial sensor 250f measures the acceleration in each direction of the three axes applied to the lever 84 in the three-dimensional space and the angular velocity applied to the lever 84 having each of the three axes as the rotation axis.
  • the reaction force generator 270 generates a force in the direction opposite to the direction in which the posture of the operating device 2f changes.
  • the force in the direction opposite to the direction in which the posture of the operating device 2f changes is a reaction force.
  • the reaction force has a direction and a magnitude.
  • the posture of the operating device 2f will be described with reference to FIGS. 25 and 26. FIGS. 25 and 26 show the appearance of the operating device 2f in simplified form in order to avoid cluttering the drawings.
  • FIG. 25 shows a plan view of the operating device 2f as viewed from a direction opposite to the direction of gravity.
  • the x-axis and the y-axis are defined orthogonally to each other in the plane parallel to the bottom surface of the pedestal 82, that is, the horizontal plane. Further, the direction of gravity is defined as the −z direction, and the direction opposite to the −z direction is defined as the +z direction.
  • FIG. 26 shows an elevational view of the operating device 2f as viewed from the ⁇ x direction.
  • the processing device 210f generates attitude information PIf based on the acceleration in each direction of the three axes measured by the inertial sensor 250f and the angular velocity with each of the three axes as the rotation axis.
  • the posture information PIf indicates, for example, an angle ⁇ f1 and an angle ⁇ f2.
  • the angle ⁇ f1 is an angle in the longitudinal direction of the lever 84 with respect to the xz plane.
  • the angle ⁇ f2 is the angle of the lever 84 in the longitudinal direction Lf1 with respect to the xy plane.
  • FIG. 27 is a diagram showing the function of the operation system SYSf in the seventh modification.
  • the processing device 110f reads the control program PR1f from the storage device 120f. By executing the control program PR1f, the processing device 110f functions as a posture information acquisition unit 11f, a position information generation unit 12, a speed information generation unit 13, a control unit 14f, a first transmission unit 15f, a tap information acquisition unit 16, a button execution unit 17, and a display image generation unit 18.
  • the processing device 210f reads the control program PR2f from the storage device 220f. By executing the control program PR2f, the processing device 210f functions as a posture information generation unit 21f, a second transmission unit 22f, a reception unit 24f, and a tap information generation unit 27.
  • the attitude information generation unit 21f generates the attitude information PIf based on the acceleration in each direction of the three axes measured by the inertial sensor 250f and the angular velocity with each of the three axes as the rotation axis.
  • the second transmission unit 22f transmits the posture information PIf to the image display device 1f by using the communication device 230.
  • the attitude information acquisition unit 11f acquires the attitude information PIf from the operation device 2f.
  • the control unit 14f controls the magnitude of the reaction force, instead of the magnitude A of the vibrating force, by the same process as that by which the control unit 14 of the embodiment determines the magnitude A of the vibrating force. Further, the control unit 14f controls the direction of the reaction force. Specifically, the control unit 14f specifies, based on the posture information PIf, the direction opposite to the direction in which the posture of the operating device 2f changes, and sets the direction of the reaction force to the specified direction. For example, the control unit 14f specifies the angle Δθf1 and the angle Δθf2 as the direction opposite to the direction in which the posture of the operating device 2f changes.
  • the angle Δθf1 is an angle obtained by subtracting the angle θf1 indicated by the attitude information PIf at the current time from the angle θf1 indicated by the attitude information PIf one attitude information transmission period before the current time.
  • the angle Δθf2 is an angle obtained by subtracting the angle θf2 indicated by the attitude information PIf at the current time from the angle θf2 indicated by the attitude information PIf one attitude information transmission period before the current time.
  • the control unit 14f generates control information CIf including a value indicating the magnitude of the reaction force, the angle Δθf1, and the angle Δθf2.
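A minimal sketch of how the control unit 14f could derive the control information CIf (illustrative names and values; the specification does not give a concrete data format for CIf):

```python
# Illustrative sketch: the reaction-force direction is obtained by
# subtracting the current lever angles from the angles one transmission
# period earlier, i.e. the direction opposite to the posture change.

def reaction_direction(prev_angles, curr_angles):
    """(Δθf1, Δθf2): previous angles minus current angles."""
    d1 = prev_angles[0] - curr_angles[0]
    d2 = prev_angles[1] - curr_angles[1]
    return (d1, d2)

def control_info(magnitude, prev_angles, curr_angles):
    # Bundle the magnitude and direction, as the control information CIf does.
    d1, d2 = reaction_direction(prev_angles, curr_angles)
    return {"magnitude": magnitude, "delta_theta_f1": d1, "delta_theta_f2": d2}

# If the lever tilted from (30.0, 10.0) to (35.0, 12.0) degrees within one
# period, the reaction force points along (-5.0, -2.0).
```

The sign convention matches the description above: the resulting angles point opposite to the direction in which the lever posture changed.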
  • the first transmission unit 15f transmits the control information CIf to the operation device 2f by using the communication device 130.
  • the receiving unit 24f receives the control information CIf from the image display device 1f.
  • the reaction force generator 270 generates a reaction force of the magnitude indicated by the control information CIf in the direction indicated by the control information CIf.
  • the pointer P can be moved to the center position C of the display area of the button Z by the reaction force. Therefore, the user U does not have to perform fine alignment of the pointer P. Therefore, the operability of the operating device 2f is improved.
  • the speed information generation unit 13 generates the speed information VI based on the position information LI.
  • the speed information generation unit 13 may generate the speed information VI based on the attitude information PI. For example, the storage device 120 stores a table showing the correspondence between the angle by which the operation device 2, viewed from the direction LV, has been rotated about the direction LV as the rotation axis from its position at the time of initial setting, and the amount of movement in the X-axis direction.
  • the speed information generation unit 13 calculates the rotation angle Δθ1 in the attitude information transmission period by subtracting the angle θ1 included in the attitude information PI one attitude information transmission period before the current time from the angle θ1 included in the current attitude information PI.
  • the velocity information generation unit 13 refers to the above table and specifies the amount of movement in the X-axis direction corresponding to the calculated Δθ1.
  • the velocity information generation unit 13 calculates the velocity V of the pointer P in the X-axis direction by dividing the specified movement amount in the X-axis direction by the length of the attitude information transmission period.
  • the speed information generation unit 13 may calculate the speed V of the pointer P in the Y-axis direction in the same manner as the speed V of the pointer P in the X-axis direction.
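The table lookup described above can be sketched as follows (the table contents, period length, and function names are hypothetical; the specification only states that such a correspondence table is stored):

```python
# Illustrative sketch: the rotation angle Δθ1 about the line-of-sight axis is
# mapped to an X-axis movement amount via a stored table, and the movement
# amount is divided by the transmission period to obtain the speed V.
# The table entries below are hypothetical.

ANGLE_TO_X_MOVEMENT = {5: 10.0, 10: 25.0, 15: 45.0}  # degrees → pixels

def pointer_speed_x(theta_prev, theta_curr, period_s):
    delta = round(theta_curr - theta_prev)          # rotation angle Δθ1
    movement = ANGLE_TO_X_MOVEMENT.get(delta, 0.0)  # table lookup
    return movement / period_s                      # speed V in X direction

# Example: a 10-degree rotation over a 0.05 s transmission period → 500 px/s.
```

The Y-axis speed would be obtained the same way with a corresponding table for the second rotation angle.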
  • an acceleration information generation unit may be used instead of the velocity information generation unit 13.
  • the acceleration information generation unit may also generate acceleration information indicating the acceleration of the pointer P or second information indicating the speed of the pointer P and the acceleration of the pointer P based on the attitude information PI.
  • the display area of the button Z is an example of the “first area”.
  • the first area is not limited to the display area of the button Z.
  • the first area may be a display area of a selection menu.
  • the block diagrams used in the description of the above-described embodiments show blocks in functional units.
  • These functional blocks are realized by any combination of hardware and / or software.
  • the means for realizing each functional block is not particularly limited. That is, each functional block may be realized by one physically and/or logically coupled device, or by two or more physically and/or logically separated devices that are directly and/or indirectly connected (for example, by wire and/or wirelessly).
  • the input / output information and the like may be stored in a specific place (for example, a memory) or may be managed by a management table. Input / output information and the like can be overwritten, updated, or added. The output information and the like may be deleted. The input information or the like may be transmitted to another device.
  • the determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
  • the storage device 120 is a recording medium that can be read by the processing device 110; examples thereof include ROM, RAM, optical disks, Blu-ray (registered trademark) disks, smart cards, flash memory devices (for example, cards, sticks, key drives), CD-ROM (Compact Disc-ROM), registers, removable disks, hard disks, floppy (registered trademark) disks, magnetic strips, databases, servers, and other suitable storage media.
  • the program may also be transmitted from the network.
  • the program may also be transmitted from the communication network via a telecommunication line.
  • the storage device 220 is the same as the storage device 120.
  • Each of the above-described aspects may be applied to systems utilizing LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-WideBand), Bluetooth (registered trademark), and other suitable systems, and/or to next-generation systems extended based on these.
  • the information, signals, and the like described may be represented using any of a variety of different techniques.
  • data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination thereof.
  • the terms described herein and / or the terms necessary for understanding the present specification may be replaced with terms having the same or similar meanings.
  • Each of the functions illustrated in FIGS. 10, 21, and 27 is realized by any combination of hardware and software. Further, each function may be realized by a single device, or may be realized by two or more devices configured as separate bodies from each other.
  • when the software is transmitted from a website, server, or other remote source using wired technology such as coaxial cable, fiber-optic cable, twisted pair, and digital subscriber line (DSL) and/or wireless technology such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of transmission medium.
  • the information, parameters, and the like may be represented by absolute values, by relative values from predetermined values, or by other corresponding information.
  • the image display device 1 and the operation device 2 may be mobile stations.
  • Mobile stations may also be referred to by those skilled in the art as subscriber stations, mobile units, subscriber units, wireless units, remote units, mobile devices, wireless devices, wireless communication devices, remote devices, mobile subscriber stations, access terminals, mobile terminals, wireless terminals, remote terminals, handsets, user agents, mobile clients, clients, or by some other suitable term.

Abstract

This control device causes an operation device 2, which is used for operations performed on a pointer P displayed on a display unit, to generate a force to be applied to a user U in accordance with an operation performed by the user. The control device comprises: an acquisition unit that acquires attitude information PI relating to the attitude of the operation device 2; a first generation unit that generates first information indicating the position of the pointer P on the display unit; a second generation unit that generates, on the basis of the attitude information PI, second information relating to the speed of the pointer P and/or the acceleration of the pointer P; and a control unit 14 that determines the magnitude of the force on the basis of the second information when the position of the pointer P indicated by the first information is included in a first region of the display unit.
PCT/JP2021/015470 2020-04-27 2021-04-14 Dispositif de commande, dispositif d'exploitation et système d'exploitation WO2021220816A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022517624A JPWO2021220816A1 (fr) 2020-04-27 2021-04-14

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020078468 2020-04-27
JP2020-078468 2020-04-27

Publications (1)

Publication Number Publication Date
WO2021220816A1 true WO2021220816A1 (fr) 2021-11-04

Family

ID=78373511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015470 WO2021220816A1 (fr) 2020-04-27 2021-04-14 Dispositif de commande, dispositif d'exploitation et système d'exploitation

Country Status (2)

Country Link
JP (1) JPWO2021220816A1 (fr)
WO (1) WO2021220816A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002057885A2 (fr) * 2001-01-16 2002-07-25 Immersion Corporation Commande d'effet de retour haptique pour l'amelioration de la navigation dans un environnement graphique
JP2011227735A (ja) * 2010-04-20 2011-11-10 Tokai Rika Co Ltd 遠隔入力装置
JP2011235780A (ja) * 2010-05-11 2011-11-24 Tokai Rika Co Ltd 遠隔入力装置
JP2012252398A (ja) * 2011-05-31 2012-12-20 Sony Corp ポインティングシステム、ポインティングデバイス及びポインティング制御方法


Also Published As

Publication number Publication date
JPWO2021220816A1 (fr) 2021-11-04

Similar Documents

Publication Publication Date Title
CN109348020B (zh) 一种拍照方法及移动终端
CN110286865A (zh) 一种触摸屏的显示方法及电子设备
KR20200045660A (ko) 사용자 인터페이스를 제어하는 폴더블 전자 장치 및 그의 동작 방법
EP2955610A1 (fr) Procédé de contrôle de fonctionnement avec un visiocasque
KR20150137828A (ko) 데이터 처리 방법 및 그 전자 장치
CN110312073B (zh) 一种拍摄参数的调节方法及移动终端
CN107952242B (zh) 一种终端软件体验方法、终端及计算机可读存储介质
CN110263617B (zh) 三维人脸模型获取方法及装置
KR20160056133A (ko) 이미지 표시 제어 방법 및 이를 지원하는 장치
JP2016511875A (ja) 画像サムネイルの生成方法、装置、端末、プログラム、及び記録媒体
JP5342040B1 (ja) 表示装置、表示方法及びプログラム
CN108174109B (zh) 一种拍照方法及移动终端
CN109240413A (zh) 屏幕发声方法、装置、电子装置及存储介质
WO2021220816A1 (fr) Dispositif de commande, dispositif d'exploitation et système d'exploitation
CN109117466B (zh) 表格格式转换方法、装置、设备及存储介质
CN107913519B (zh) 2d游戏的渲染方法及移动终端
CN111275607B (zh) 界面显示方法、装置、计算机设备及存储介质
CN110168599B (zh) 一种数据处理方法及终端
JP7117451B2 (ja) 視線とジェスチャを用いた情報表示装置
CN114143280B (zh) 会话显示方法、装置、电子设备及存储介质
JP6999822B2 (ja) 端末装置および端末装置の制御方法
JP2022113692A (ja) 仮想オブジェクト操作方法およびヘッドマウントディスプレイ
JP6999821B2 (ja) 端末装置および端末装置の制御方法
JP2022163813A (ja) 装着型の情報端末、その制御方法及びプログラム
CN112329909B (zh) 生成神经网络模型的方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21796725

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022517624

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21796725

Country of ref document: EP

Kind code of ref document: A1